feature:test

Run the test suite for one feature or for every feature of the active product, with granular selection of test levels.


Usage

splent feature:test [<feature_name>] [--unit] [--integration] [--functional] [--e2e] [--load] [-k KEYWORD] [-v]
Argument / Option   Description
-----------------   ------------------------------------------------------------------
<feature_name>      Optional. Run tests only for this feature (e.g. auth). If omitted,
                    all declared features are tested.
--unit              Run only unit tests.
--integration       Run only integration tests.
--functional        Run only functional tests.
--e2e               Run only end-to-end (Selenium) tests.
--load              Run only load (Locust) tests.
-k KEYWORD          Run only tests whose name matches this keyword (passed to pytest).
-v                  Verbose pytest output.

When no level flags are given, the default is --unit --integration --functional.


Examples

Test all features (default levels):

splent feature:test

Test a single feature:

splent feature:test auth

Only unit tests (fastest β€” no DB, no HTTP):

splent feature:test --unit

Only integration tests for a specific feature:

splent feature:test profile --integration -v

Keyword filter across all features:

splent feature:test -k test_login

Example output

πŸ§ͺ Running tests for profile [unit, integration, functional]...

  β–Ά  splent_feature_profile
     unit
        4 passed in 0.15s
     integration
        3 passed in 1.39s
     functional
        1 passed in 1.22s

βœ… 3 passed.

Test structure

Each feature follows a layered test structure:

tests/
β”œβ”€β”€ conftest.py           # Shared fixtures (imports from splent_framework.fixtures)
β”œβ”€β”€ unit/                 # Mocked dependencies, no DB, no HTTP β€” fast
β”‚   └── test_services.py
β”œβ”€β”€ integration/          # Real test DB, repositories, services
β”‚   └── test_repositories.py
β”œβ”€β”€ functional/           # Flask test_client, full HTTP request/response cycles
β”‚   └── test_routes.py
β”œβ”€β”€ e2e/                  # Selenium browser automation (real server required)
β”‚   └── test_browser.py
└── load/                 # Locust performance tests
    └── locustfile.py
Level         What it tests                               Dependencies                Speed
-----------   -----------------------------------------   -------------------------   ------
unit          Services, helpers in isolation              Mocks only                  ~0.1s
integration   Repositories, DB queries, relationships     Test database               ~1s
functional    HTTP endpoints, redirects, rendered HTML    Flask test client + DB      ~1s
e2e           Full browser flows                          Running server + Selenium   ~10s
load          Performance under concurrent users          Running server + Locust     varies
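The relationship between level flags and test directories can be pictured with a short sketch (illustrative only; `select_level_dirs` and `DEFAULT_LEVELS` are hypothetical names, not part of splent):

```python
from pathlib import Path

# Default levels when no flags are passed.
DEFAULT_LEVELS = ("unit", "integration", "functional")

def select_level_dirs(tests_root: Path, levels=DEFAULT_LEVELS):
    """Return the level directories that exist and contain test_*.py files.

    Directories without test files are skipped silently, matching the
    behaviour described in the Notes section below.
    """
    return [
        tests_root / level
        for level in levels
        if (tests_root / level).is_dir()
        and any((tests_root / level).glob("test_*.py"))
    ]
```

Passing `--e2e` or `--load` would simply swap a different tuple of level names into the same selection step.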

Writing tests at each level

Unit tests should mock all external dependencies:

from unittest.mock import MagicMock
from splent_io.splent_feature_auth.services import AuthenticationService

def test_login_calls_repository():
    repo = MagicMock()
    repo.get_by_email.return_value = MagicMock(check_password=MagicMock(return_value=True))
    service = AuthenticationService(user_repository=repo)
    assert service.login("user@test.com", "pass") is True

Integration tests use the real database via the test_client fixture:

from splent_framework.db import db
from splent_io.splent_feature_auth.models import User

def test_create_user(test_client):
    with test_client.application.app_context():
        user = User(email="test@test.com", active=True)
        db.session.add(user)
        db.session.commit()
        assert user.id is not None

Functional tests exercise HTTP endpoints:

def test_login_page_renders(test_client):
    response = test_client.get("/login")
    assert response.status_code == 200
    assert b"Login" in response.data

Cross-feature imports

The PYTHONPATH for every pytest session includes the src/ directory of all features (workspace root + cache), not just the one being tested. This means that tests which import models from another feature (e.g. splent_feature_profile importing splent_feature_auth.models) will resolve correctly.
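The mechanism can be demonstrated with plain `sys.path` manipulation (a self-contained simulation with throwaway package names, not splent's actual loader):

```python
import sys
import tempfile
from pathlib import Path

# Create two fake features, each with its own src/ layout, and put every
# src/ directory on the path -- as the test runner does for real features.
root = Path(tempfile.mkdtemp())
for feature in ("feature_auth", "feature_profile"):
    pkg = root / feature / "src" / feature
    pkg.mkdir(parents=True)
    (pkg / "__init__.py").write_text("")
    (pkg / "models.py").write_text(f"FEATURE = {feature!r}")
    sys.path.insert(0, str(root / feature / "src"))

# A test living in feature_profile can now import from feature_auth,
# because both src/ directories are on the path.
from feature_auth.models import FEATURE
print(FEATURE)  # prints feature_auth
```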


Requirements

The command validates the testing environment before running:

  1. Production check β€” If SPLENT_ENV=prod is set in the product’s .env file or in the running web container, the command refuses to run and exits with a clear error message suggesting you stop the production deployment first.
  2. TESTING flag β€” TESTING must be True in the Flask app configuration.
  3. Test database β€” SQLALCHEMY_DATABASE_URI must contain test to prevent running against a production database.

Run this command inside the development container where the test database is available, not on the host machine.
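Conceptually, the validation amounts to three guard checks. A minimal sketch, assuming the relevant settings are available as a plain dict (`validate_test_environment` is a hypothetical name, not splent's actual code):

```python
def validate_test_environment(config: dict) -> list[str]:
    """Return a list of safety violations; an empty list means tests may run."""
    errors = []
    # 1. Refuse to run against a production deployment.
    if config.get("SPLENT_ENV") == "prod":
        errors.append("SPLENT_ENV=prod is set; stop the production deployment first")
    # 2. The Flask app must be in testing mode.
    if config.get("TESTING") is not True:
        errors.append("TESTING must be True in the Flask configuration")
    # 3. Never point the suite at a non-test database.
    if "test" not in config.get("SQLALCHEMY_DATABASE_URI", ""):
        errors.append("SQLALCHEMY_DATABASE_URI must reference a test database")
    return errors
```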


Notes

  • Directories with no test_*.py files are skipped silently.
  • Features not found on disk are skipped with a warning.
  • Exit code 1 if any test suite fails β€” suitable for CI pipelines.
  • feature:create scaffolds the full test directory structure automatically.


splent. Distributed under the LGPL v3 license. Contact us: drorganvidez@us.es