# feature:test

Run the test suite for one or all features of the active product, with granular level selection.

## Usage

```shell
splent feature:test [<feature_name>] [--unit] [--integration] [--functional] [--e2e] [--load] [-k KEYWORD] [-v]
```
| Argument / Option | Description |
|---|---|
| `<feature_name>` | Optional. Run tests only for this feature (e.g. `auth`). If omitted, all declared features are tested. |
| `--unit` | Run only unit tests. |
| `--integration` | Run only integration tests. |
| `--functional` | Run only functional tests. |
| `--e2e` | Run only end-to-end (Selenium) tests. |
| `--load` | Run only load (Locust) tests. |
| `-k KEYWORD` | Only run tests whose name matches this keyword (passed to pytest). |
| `-v` | Verbose pytest output. |
When no level flags are given, the default is `--unit --integration --functional`.
## Examples

Test all features (default levels):

```shell
splent feature:test
```

Test a single feature:

```shell
splent feature:test auth
```

Only unit tests (fastest: no DB, no HTTP):

```shell
splent feature:test --unit
```

Only integration tests for a specific feature:

```shell
splent feature:test profile --integration -v
```

Keyword filter across all features:

```shell
splent feature:test -k test_login
```
## Example output

```
🧪 Running tests for profile [unit, integration, functional]...
▶ splent_feature_profile
  unit          4 passed in 0.15s
  integration   3 passed in 1.39s
  functional    1 passed in 1.22s
✅ 3 passed.
```
## Test structure

Each feature follows a layered test structure:

```
tests/
├── conftest.py              # Shared fixtures (imports from splent_framework.fixtures)
├── unit/                    # Mocked dependencies, no DB, no HTTP: fast
│   └── test_services.py
├── integration/             # Real test DB, repositories, services
│   └── test_repositories.py
├── functional/              # Flask test_client, full HTTP request/response cycles
│   └── test_routes.py
├── e2e/                     # Selenium browser automation (real server required)
│   └── test_browser.py
└── load/                    # Locust performance tests
    └── locustfile.py
```
| Level | What it tests | Dependencies | Speed |
|---|---|---|---|
| unit | Services, helpers in isolation | Mocks only | ~0.1s |
| integration | Repositories, DB queries, relationships | Test database | ~1s |
| functional | HTTP endpoints, redirects, rendered HTML | Flask test client + DB | ~1s |
| e2e | Full browser flows | Running server + Selenium | ~10s |
| load | Performance under concurrent users | Running server + Locust | varies |
## Writing tests at each level

Unit tests should mock all external dependencies:

```python
from unittest.mock import MagicMock

from splent_io.splent_feature_auth.services import AuthenticationService


def test_login_calls_repository():
    repo = MagicMock()
    repo.get_by_email.return_value = MagicMock(check_password=MagicMock(return_value=True))
    service = AuthenticationService(user_repository=repo)
    assert service.login("user@test.com", "pass") is True
```
Integration tests use the real database via the `test_client` fixture:

```python
from splent_framework.db import db
from splent_io.splent_feature_auth.models import User


def test_create_user(test_client):
    with test_client.application.app_context():
        user = User(email="test@test.com", active=True)
        db.session.add(user)
        db.session.commit()
        assert user.id is not None
```
Functional tests exercise HTTP endpoints through the Flask test client:

```python
def test_login_page_renders(test_client):
    response = test_client.get("/login")
    assert response.status_code == 200
    assert b"Login" in response.data
```
## Cross-feature imports

The `PYTHONPATH` for every pytest session includes the `src/` directory of all features (workspace root + cache), not just the one being tested. This means a feature whose tests import models from another feature (e.g. `splent_feature_profile` importing `splent_feature_auth.models`) will resolve correctly.
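The path assembly can be sketched as follows; `build_pythonpath` and the `<feature>/src` layout are illustrative assumptions, not splent's actual implementation:

```python
import os


def build_pythonpath(feature_roots):
    """Join the src/ directory of every feature into one PYTHONPATH value.

    feature_roots: iterable of feature checkout directories.
    Hypothetical helper illustrating the documented behavior.
    """
    src_dirs = [os.path.join(root, "src") for root in feature_roots]
    return os.pathsep.join(src_dirs)


# With every feature's src/ on the path, a test in splent_feature_profile
# can import splent_feature_auth.models without extra configuration.
```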
## Requirements

The command validates the testing environment before running:

- **Production check**: If `SPLENT_ENV=prod` is set in the product's `.env` file or in the running web container, the command refuses to run and exits with a clear error message suggesting to stop the production deployment first.
- **TESTING flag**: `TESTING` must be `True` in the Flask app configuration.
- **Test database**: `SQLALCHEMY_DATABASE_URI` must contain `test` to prevent running against a production database.

Run this command inside the development container where the test database is available, not on the host machine.
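The three checks can be sketched as a single guard function (hypothetical names and messages; the command's actual implementation may differ):

```python
def validate_test_environment(splent_env, testing, db_uri):
    """Refuse to run the suite against anything that looks like production.

    Illustrative sketch of the documented checks, not splent's real code.
    """
    if splent_env == "prod":
        raise RuntimeError(
            "SPLENT_ENV=prod detected: stop the production deployment before running tests."
        )
    if testing is not True:
        raise RuntimeError("TESTING must be True in the Flask app configuration.")
    if "test" not in db_uri:
        raise RuntimeError("SQLALCHEMY_DATABASE_URI must contain 'test'.")


# Passes silently for a development environment with a test database:
validate_test_environment("dev", True, "mysql+pymysql://user@db/app_test")
```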
## Notes

- Directories with no `test_*.py` files are skipped silently.
- Features not found on disk are skipped with a warning.
- Exit code `1` if any test suite fails: suitable for CI pipelines.
- `feature:create` scaffolds the full test directory structure automatically.
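The CI-friendly exit-code behavior can be mirrored with a small aggregator; `aggregate_exit_code` is an illustrative name, not part of splent:

```python
def aggregate_exit_code(suite_results):
    """Return 1 if any per-feature suite failed, else 0.

    suite_results: mapping of feature name -> pytest exit code.
    Hypothetical helper illustrating the documented exit-code rule.
    """
    return 1 if any(code != 0 for code in suite_results.values()) else 0


# A single failing feature fails the whole run, which is what CI wants.
aggregate_exit_code({"auth": 0, "profile": 1})  # returns 1
```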