Add cache size limit support #300


Draft: wants to merge 34 commits into base: master
Commits (34)
17bf244
Update default params test
shaypal5 Jul 18, 2025
53c1e32
Add LRU cache eviction for pickle backend
shaypal5 Jul 18, 2025
fcee1d8
add lru policy support to the redis core + test parallelization
shaypal5 Jul 18, 2025
402a579
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Jul 18, 2025
85decc5
parallel test runs for all test suites
shaypal5 Jul 18, 2025
419b202
cool test fixture no isolate schems of redis unit tests enabling para…
shaypal5 Jul 18, 2025
d3b98ee
fix commit-ci errors
shaypal5 Jul 18, 2025
e590e3d
stabilize flaky tests
shaypal5 Jul 18, 2025
79c6fe5
defuq
shaypal5 Jul 18, 2025
8417b55
make flaky tests pass
shaypal5 Jul 19, 2025
0be0d99
more flakiness
shaypal5 Jul 19, 2025
dba24f8
more test isolation
shaypal5 Jul 19, 2025
d06c94d
fixed coverage report combination for local tests
shaypal5 Jul 19, 2025
4478c9e
single pytest call for local tests
shaypal5 Jul 19, 2025
a08834f
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Jul 19, 2025
369b5da
diagnose the no data to combine coverage error
shaypal5 Jul 19, 2025
da8c995
formatting fixes
shaypal5 Jul 19, 2025
e34e75c
remove covereage combine
shaypal5 Jul 19, 2025
250873d
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Jul 19, 2025
9de8c3b
fix more flaky tets
shaypal5 Jul 19, 2025
130d590
try fix
shaypal5 Jul 19, 2025
7a81c30
run local tests using threading sequentially
shaypal5 Jul 20, 2025
a6d7a7f
update local test script to respect localserial
shaypal5 Jul 20, 2025
273e6d3
accomodate flaky tests
shaypal5 Jul 20, 2025
8134c43
upping coverage
shaypal5 Jul 20, 2025
193982e
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Jul 20, 2025
9a9bc2d
pre-commit fixes
shaypal5 Jul 20, 2025
e9ada5a
end of day commit
shaypal5 Jul 20, 2025
9b7f6cf
fix failing redis tests
shaypal5 Jul 21, 2025
0601473
fixing some tests
shaypal5 Jul 21, 2025
5c3c147
fix _custom_mongetter
shaypal5 Jul 21, 2025
59a22a1
linting and formatting
shaypal5 Jul 21, 2025
9b16622
more test isolation for mongodb
shaypal5 Jul 21, 2025
b84b9f8
cleaning test pymongo clients properly
shaypal5 Jul 21, 2025
Files changed
19 changes: 15 additions & 4 deletions .github/workflows/ci-test.yml
@@ -66,7 +66,18 @@ jobs:

- name: Unit tests (local)
if: matrix.backend == 'local'
run: pytest -m "not mongo and not sql and not redis" --cov=cachier --cov-report=term --cov-report=xml:cov.xml
run: |
# Run all local tests in parallel (pickle tests have isolation via conftest.py)
pytest -m "not mongo and not sql and not redis and not seriallocal" -n auto --cov=cachier --cov-report=term --cov-report=xml:cov.xml
# Run seriallocal tests in serial (no parallelization), and append to the same .coverage file
pytest -m "seriallocal" -n0 --cov=cachier --cov-report=term --cov-report=xml:cov.xml --cov-append

# Generate coverage reports (pytest-cov already combined the data
# from different workers into a single .coverage file for the first
# pytest command, and --cov-append used the same .coverage file for
# the second one)
coverage report
coverage xml -o cov.xml

- name: Setup docker (missing on MacOS)
if: runner.os == 'macOS' && matrix.backend == 'mongodb'
@@ -100,7 +111,7 @@ jobs:

- name: Unit tests (DB)
if: matrix.backend == 'mongodb'
run: pytest -m "mongo" --cov=cachier --cov-report=term --cov-report=xml:cov.xml
run: pytest -m "mongo" -n auto --cov=cachier --cov-report=term --cov-report=xml:cov.xml
- name: Speed eval
run: python tests/speed_eval.py

@@ -126,7 +137,7 @@
if: matrix.backend == 'postgres'
env:
SQLALCHEMY_DATABASE_URL: postgresql://testuser:testpass@localhost:5432/testdb
run: pytest -m sql --cov=cachier --cov-report=term --cov-report=xml:cov.xml
run: pytest -m sql -n auto --cov=cachier --cov-report=term --cov-report=xml:cov.xml

- name: Start Redis in docker
if: matrix.backend == 'redis'
@@ -145,7 +156,7 @@

- name: Unit tests (Redis)
if: matrix.backend == 'redis'
run: pytest -m redis --cov=cachier --cov-report=term --cov-report=xml:cov.xml
run: pytest -m redis -n auto --cov=cachier --cov-report=term --cov-report=xml:cov.xml

- name: Upload coverage to Codecov (non PRs)
continue-on-error: true
18 changes: 18 additions & 0 deletions CLAUDE.md
@@ -518,6 +518,24 @@ ______________________________________________________________________
- **CI matrix:** See `.github/workflows/` for details on OS/backend combinations.
- **Local testing:** Use specific requirement files for backends you want to test.

### 🚨 CRITICAL: Test Execution Rules

**ALWAYS run tests using `uv run ./scripts/test-local.sh`** - NEVER run pytest directly!

Examples:

- `uv run ./scripts/test-local.sh sql` - Run SQL tests
- `uv run ./scripts/test-local.sh sql -p` - Run SQL tests in parallel
- `uv run ./scripts/test-local.sh all -p` - Run all tests in parallel
- `uv run ./scripts/test-local.sh mongo redis` - Run specific backends

This ensures:

- Correct virtual environment activation
- Proper dependency installation
- Docker container management for backend services
- Correct test markers and filtering

______________________________________________________________________

## 📝 Documentation & Examples
51 changes: 51 additions & 0 deletions README.rst
@@ -286,6 +286,19 @@ human readable string like ``"200MB"``.
When ``cachier__verbose=True`` is passed to a call that returns a value
exceeding the limit, an informative message is printed.
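
A minimal sketch of that behavior (parameter and flag names are taken from
this section; the exact informative message cachier prints is not
reproduced here):

.. code-block:: python

    from cachier import cachier

    @cachier(entry_size_limit="1MB")
    def big_result(n):
        # Returns an n-byte payload.
        return b"x" * n

    # A 2MB result exceeds the 1MB entry limit: it is returned to the caller
    # but not cached, and with cachier__verbose=True an informative message
    # is printed.
    big_result(2_000_000, cachier__verbose=True)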

Cache Size Limit
~~~~~~~~~~~~~~~~
``cache_size_limit`` constrains the total size of the cache. When the
limit is exceeded, entries are evicted according to the chosen
``replacement_policy``. Currently, an ``"lru"`` policy is implemented for
the memory and pickle backends.

.. code-block:: python

    @cachier(cache_size_limit="100KB")
    def heavy(x):
        return x * 2
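
The limit can also be configured globally. A minimal sketch, assuming the new
``cache_size_limit`` and ``replacement_policy`` parameters are settable
through cachier's global-parameters helper (the ``Params`` additions in this
PR suggest this, but the exact call shown here is an assumption):

.. code-block:: python

    import cachier

    # Assumed usage: cap the total cache at 500MB for every decorated
    # function in this process, evicting least-recently-used entries first.
    cachier.set_global_params(
        cache_size_limit="500MB",
        replacement_policy="lru",
    )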

Ignore Cache
~~~~~~~~~~~~

@@ -646,6 +659,44 @@ To test all cachier backends (MongoDB, Redis, SQL, Memory, Pickle) locally with
The unified test script automatically manages Docker containers, installs required dependencies, and runs the appropriate test suites. The ``-f`` / ``--files`` option allows you to run specific test files instead of the entire test suite. See ``scripts/README-local-testing.md`` for detailed documentation.


Writing Tests - Important Best Practices
----------------------------------------

When writing tests for cachier, follow these critical guidelines to ensure test isolation:

**Test Function Isolation Rule:** Never share cachier-decorated functions between multiple test functions. Each test must use its own cachier-decorated function to ensure proper test isolation, especially when running tests in parallel.

.. code-block:: python

    # GOOD: Each test has its own decorated function
    def test_feature_a():
        @cachier()
        def my_func_a(x):
            return x * 2

        assert my_func_a(5) == 10


    def test_feature_b():
        @cachier()
        def my_func_b(x):  # Different function for different test
            return x * 2

        assert my_func_b(5) == 10


    # BAD: Sharing a decorated function between tests
    @cachier()
    def shared_func(x):  # Don't do this!
        return x * 2


    def test_feature_a():
        assert shared_func(5) == 10


    def test_feature_b():
        assert shared_func(5) == 10  # This may conflict with test_feature_a

This isolation is crucial because cachier's function identification mechanism uses the full module path and function name as cache keys. Sharing functions between tests can lead to cache conflicts, especially when tests run in parallel with pytest-xdist.

For more detailed testing guidelines, see ``tests/README.md``.
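
One way to achieve this isolation for the pickle core in parallel runs is a
per-test cache directory. The fixture below is an illustrative sketch only
(the actual fixture in ``tests/conftest.py`` may differ, and it assumes the
cache directory can be redirected via ``cachier.set_global_params``):

.. code-block:: python

    # tests/conftest.py (hypothetical sketch)
    import pytest

    import cachier


    @pytest.fixture(autouse=True)
    def isolated_pickle_cache(tmp_path):
        """Point the pickle core at a per-test temp directory so parallel
        pytest-xdist workers never share cache files."""
        cachier.set_global_params(cache_dir=str(tmp_path))
        yield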


Running pre-commit hooks locally
--------------------------------

10 changes: 10 additions & 0 deletions pyproject.toml
@@ -130,6 +130,7 @@ lint.per-file-ignores."tests/**" = [
"D401",
"S101",
"S105",
"S110",
"S311",
"S603",
]
@@ -172,6 +173,7 @@ addopts = [
"-v",
"-s",
"-W error",
# Note: parallel execution is opt-in via --parallel flag or -n option
]
markers = [
"mongo: test the MongoDB core",
@@ -180,12 +182,20 @@ markers = [
"redis: test the Redis core",
"sql: test the SQL core",
"maxage: test the max_age functionality",
"seriallocal: local core tests that should run serially",
]

# Parallel test execution configuration
# Use: pytest -n auto (for automatic worker detection)
# Or: pytest -n 4 (for specific number of workers)
# Memory tests are safe to run in parallel by default
# Pickle tests require isolation (handled by conftest.py fixture)

# --- coverage ---

[tool.coverage.run]
branch = true
parallel = true
# dynamic_context = "test_function"
omit = [
"tests/*",
18 changes: 17 additions & 1 deletion scripts/README-local-testing.md
@@ -41,6 +41,8 @@ This guide explains how to run cachier tests locally with Docker containers for
- `-k, --keep-running` - Keep Docker containers running after tests
- `-h, --html-coverage` - Generate HTML coverage report
- `-f, --files` - Specify test files to run (can be used multiple times)
- `-p, --parallel` - Run tests in parallel using pytest-xdist
- `-w, --workers` - Number of parallel workers (default: auto)
- `--help` - Show help message

## Examples
@@ -96,6 +98,18 @@ CACHIER_TEST_CORES="mongo redis" ./scripts/test-local.sh

# Combine file selection with other options
./scripts/test-local.sh redis sql -f tests/test_sql_core.py -v -k

# Run tests in parallel with automatic worker detection
./scripts/test-local.sh all -p

# Run tests in parallel with 4 workers
./scripts/test-local.sh external -p -w 4

# Run local tests in parallel (memory and pickle)
./scripts/test-local.sh memory pickle -p

# Combine parallel testing with other options
./scripts/test-local.sh mongo redis -p -v -k
```

### Docker Compose
@@ -193,10 +207,12 @@ The script automatically sets the required environment variables:
2. **For quick iteration**: Use memory and pickle tests (no Docker required)
3. **For debugging**: Use `-k` to keep containers running and inspect them
4. **For CI parity**: Test with the same backends that CI uses
5. **For faster test runs**: Use `-p` to run tests in parallel, especially when testing multiple backends
6. **For parallel testing**: The script automatically installs pytest-xdist when needed
7. **Worker count**: Use `-w auto` (default) to let pytest-xdist determine optimal workers, or specify a number based on your CPU cores

## Future Enhancements

- Add MySQL/MariaDB support
- Add Elasticsearch support
- Add performance benchmarking mode
- Add parallel test execution for multiple backends
74 changes: 73 additions & 1 deletion scripts/test-local.sh
@@ -26,6 +26,8 @@ KEEP_RUNNING=false
SELECTED_CORES=""
INCLUDE_LOCAL_CORES=false
TEST_FILES=""
PARALLEL=false
PARALLEL_WORKERS="auto"

# Function to print colored messages
print_message() {
@@ -56,6 +58,8 @@ OPTIONS:
-k, --keep-running Keep containers running after tests
-h, --html-coverage Generate HTML coverage report
-f, --files Specify test files to run (can be used multiple times)
-p, --parallel Run tests in parallel using pytest-xdist
-w, --workers Number of parallel workers (default: auto)
--help Show this help message

EXAMPLES:
@@ -65,6 +69,8 @@ EXAMPLES:
$0 external -k # Run external backends, keep containers
$0 mongo memory -v # Run MongoDB and memory tests verbosely
$0 all -f tests/test_main.py -f tests/test_redis_core_coverage.py # Run specific test files
$0 memory pickle -p # Run local tests in parallel
$0 all -p -w 4 # Run all tests with 4 parallel workers

ENVIRONMENT:
You can also set cores via CACHIER_TEST_CORES environment variable:
@@ -102,6 +108,20 @@ while [[ $# -gt 0 ]]; do
usage
exit 0
;;
-p|--parallel)
PARALLEL=true
shift
;;
-w|--workers)
shift
if [[ $# -eq 0 ]] || [[ "$1" == -* ]]; then
print_message $RED "Error: -w/--workers requires a number argument"
usage
exit 1
fi
PARALLEL_WORKERS="$1"
shift
;;
-*)
print_message $RED "Unknown option: $1"
usage
@@ -232,6 +252,17 @@ check_dependencies() {
}
fi

# Check for pytest-xdist if parallel testing is requested
if [ "$PARALLEL" = true ]; then
if ! python -c "import xdist" 2>/dev/null; then
print_message $YELLOW "Installing pytest-xdist for parallel testing..."
pip install pytest-xdist || {
print_message $RED "Failed to install pytest-xdist"
exit 1
}
fi
fi

# Check MongoDB dependencies if testing MongoDB
if echo "$SELECTED_CORES" | grep -qw "mongo"; then
if ! python -c "import pymongo" 2>/dev/null; then
@@ -410,14 +441,20 @@ main() {
# Check and install dependencies
check_dependencies

# Check if we need Docker
# Check if we need Docker, and if we should run serial pickle tests
needs_docker=false
run_serial_local_tests=false
for core in $SELECTED_CORES; do
case $core in
mongo|redis|sql)
needs_docker=true
;;
esac
case $core in
pickle|all)
run_serial_local_tests=true
;;
esac
done

if [ "$needs_docker" = true ]; then
@@ -484,15 +521,20 @@
sql) test_sql ;;
esac
done
pytest_markers="$pytest_markers and not seriallocal"

# Run pytest
# Build pytest command
PYTEST_CMD="pytest"
# and the specific pytest command for running serial pickle tests
SERIAL_PYTEST_CMD="pytest -m seriallocal -n0"

# Add test files if specified
if [ -n "$TEST_FILES" ]; then
PYTEST_CMD="$PYTEST_CMD $TEST_FILES"
print_message $BLUE "Test files specified: $TEST_FILES"
# and turn off serial local tests, so we run only selected files
run_serial_local_tests=false
fi

# Add markers if needed (only if no specific test files were given)
@@ -504,6 +546,10 @@ main() {

if [ "$selected_sorted" != "$all_sorted" ]; then
PYTEST_CMD="$PYTEST_CMD -m \"$pytest_markers\""
else
print_message $BLUE "Running all tests without markers since all cores are selected"
PYTEST_CMD="$PYTEST_CMD -m \"not seriallocal\""
run_serial_local_tests=true
fi
else
# When test files are specified, still apply markers if not running all cores
@@ -519,15 +565,41 @@
# Add verbose flag if needed
if [ "$VERBOSE" = true ]; then
PYTEST_CMD="$PYTEST_CMD -v"
SERIAL_PYTEST_CMD="$SERIAL_PYTEST_CMD -v"
fi

# Add parallel testing options if requested
if [ "$PARALLEL" = true ]; then
PYTEST_CMD="$PYTEST_CMD -n $PARALLEL_WORKERS"

# Show parallel testing info
if [ "$PARALLEL_WORKERS" = "auto" ]; then
print_message $BLUE "Running tests in parallel with automatic worker detection"
else
print_message $BLUE "Running tests in parallel with $PARALLEL_WORKERS workers"
fi

# Special note for pickle tests
if echo "$SELECTED_CORES" | grep -qw "pickle"; then
print_message $YELLOW "Note: Pickle tests will use isolated cache directories for parallel safety"
fi
fi

# Add coverage options
PYTEST_CMD="$PYTEST_CMD --cov=cachier --cov-report=$COVERAGE_REPORT"
SERIAL_PYTEST_CMD="$SERIAL_PYTEST_CMD --cov=cachier --cov-report=$COVERAGE_REPORT --cov-append"

# Print and run the command
print_message $BLUE "Running: $PYTEST_CMD"
eval $PYTEST_CMD

if [ "$run_serial_local_tests" = true ]; then
print_message $BLUE "Running serial local tests (pickle, memory) with: $SERIAL_PYTEST_CMD"
eval $SERIAL_PYTEST_CMD
else
print_message $BLUE "Skipping serial local tests (pickle, memory) since not requested"
fi

TEST_EXIT_CODE=$?

if [ $TEST_EXIT_CODE -eq 0 ]; then
2 changes: 2 additions & 0 deletions src/cachier/config.py
@@ -66,6 +66,8 @@ class Params:
cleanup_stale: bool = False
cleanup_interval: timedelta = timedelta(days=1)
entry_size_limit: Optional[int] = None
cache_size_limit: Optional[int] = None
replacement_policy: str = "lru"


_global_params = Params()