Managing dependency versions
As per our policy, we aim to support versions of major dependencies such as NumPy for at least 24 months.
In practice, we may support versions for much longer when it is trivial to do so ("doing nothing" is easier than "doing something"), in which case we may only drop support for old versions when they become a nuisance or when we remove support for old Python versions.
As a rule, we should test against all versions of dependencies that we claim to support.
We do this by randomly selecting versions in the "Update env" task in `.github/workflows/test_and_build.yml`.
Hence, managing versions of dependencies primarily means updating the versions there
and in the `dependencies` section of `pyproject.toml`. Randomly choosing versions like this tends
to work pretty well, because packages tend to be compatible with different versions of each other.
Exceptions can be handled as needed.
For dependencies that follow semantic versioning (such as `MAJOR.MINOR.BUGFIX`),
it is usually sufficient to only randomly choose between `MAJOR.MINOR` versions.
`BUGFIX` should be included only when there is reason to, such as if there was a regression.
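To make the mechanism concrete, here is a minimal Python sketch of the idea; the packages and version lists are made up for illustration, and the real selection happens in the "Update env" task of `.github/workflows/test_and_build.yml`:

```python
# A minimal sketch of randomly pinning MAJOR.MINOR versions for a test run.
# The version lists below are illustrative only; the real lists are maintained
# in the "Update env" task of .github/workflows/test_and_build.yml.
import random

supported = {
    "numpy": ["1.24", "1.25", "1.26", "2.0"],   # hypothetical MAJOR.MINOR pins
    "pandas": ["1.5", "2.0", "2.1", "2.2"],     # hypothetical MAJOR.MINOR pins
}

# Choose one MAJOR.MINOR per dependency; the resolver picks the latest BUGFIX.
pins = {pkg: random.choice(versions) for pkg, versions in supported.items()}
print(" ".join(f"{pkg}={ver}" for pkg, ver in pins.items()))
```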
`python-suitesparse-graphblas` is a special dependency that we handle slightly differently.
In CI, we install it from conda-forge, from wheels and sdist on PyPI, and build the latest
source from upstream. We try to support versions of it for as long as is reasonable, but
we sometimes need to be tightly integrated with it and only support new versions
(such as if there is an update to the GraphBLAS C API spec).
If code in `python-graphblas` needs to branch for different versions of a dependency,
this can be done by checking behavior (such as using `hasattr`) or by checking versions directly.
Use your best judgement for which to use. When checking against behaviors, please include
a `# MAINT:` comment with a short description so future maintainers can find and act on
these breadcrumbs. When checking versions directly, it's usually sufficient
to do something like `if np.__version__[:5] == "1.22."`,
which can easily be found via `git grep __version__`.
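For illustration, a sketch of both styles is below; `some_new_function` and the specific workaround are hypothetical placeholders, not taken from the actual codebase:

```python
# A hedged sketch of both branching styles; "some_new_function" and the
# version-specific workaround are placeholders, not real python-graphblas code.
import numpy as np

# Branch on behavior:
# MAINT: older numpy may lack `some_new_function`; drop this fallback once the
# oldest supported numpy provides it.
if hasattr(np, "some_new_function"):
    compute = np.some_new_function
else:
    compute = np.add  # illustrative fallback only

# Branch on version directly (easy to find later via `git grep __version__`):
if np.__version__[:5] == "1.22.":
    # Work around a hypothetical regression specific to numpy 1.22.x
    pass
```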
Dependencies often raise `DeprecationWarning` or `FutureWarning`, which will cause our
tests to fail if we don't explicitly manage them. We do this by adding specific "ignore"
directives to the `filterwarnings` list under `[tool.pytest.ini_options]` in `pyproject.toml`.
Instructions for how to add new filters are in `pyproject.toml`. When adding a new filter,
please describe what the filter is for, such as the version of the package, the version of Python, or a date.
These should be documented well enough that a future maintainer can determine when it's safe
to delete each filter.
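Since pytest's `filterwarnings` entries use the same syntax as Python's standard warning filters, the effect of an "ignore" directive can be sketched with the `warnings` module directly; the warning message below is hypothetical, and real filters belong in `pyproject.toml` with an explanatory comment:

```python
# Illustration only: this mirrors what a pytest "ignore" directive does at the
# warnings level. The message is hypothetical; real filters belong in the
# filterwarnings list of [tool.pytest.ini_options] in pyproject.toml.
import warnings

# Roughly equivalent in spirit to an entry like
#   "ignore:some_old_function is deprecated:DeprecationWarning"
warnings.filterwarnings(
    "ignore",
    message="some_old_function is deprecated",  # regex matched against the message
    category=DeprecationWarning,
)

# With the filter in place, this warning is silently discarded instead of
# failing a test run that treats warnings as errors.
warnings.warn("some_old_function is deprecated", DeprecationWarning)
```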
To check whether new versions of dependencies are available, run:
$ ./scripts/check_versions.sh
This shows the latest known version and any new versions for all dependencies.
If there are new versions available that we want to support, then add them to
`.github/workflows/test_and_build.yml` and update `./scripts/check_versions.sh`
to the latest versions. It may be good practice to run this script before
doing a release.
When adding (or removing) entire dependencies, they may need to be added (or removed) in many places:
- `.github/workflows/test_and_build.yml` to randomly test versions
- `pyproject.toml` in `dependencies` or `optional-dependencies`
- `environment.yml`
- `dev-requirements.txt`
- `binder/environment.yml` if it should be installed in the binder environment
- `docs/env.yml` only if required to render docs correctly
- `docs/getting_started/index.rst` if appropriate to add as an optional dependency
- `scripts/check_versions.sh`
There is no expectation for how often we update versions in `.pre-commit-config.yaml`.
Hopefully at least once a year, but this is up to the current maintainers.
Most versions can be updated by running:
$ pre-commit autoupdate
Versions in `&flake8_dependencies`, such as `flake8-bugbear` and `flake8-simplify`,
need to be updated manually. As with other dependencies, we can see whether new versions
of these are available by running `./scripts/check_versions.sh`.
Updating linters will likely require updating configuration in `pyproject.toml` and
updating code to pass the linters. Best judgement should be used when adopting new
linting rules.
Current maintainers are free to add, remove, or modify any pre-commit hook to try to follow current best practices with the goal of improving code quality and making the project more maintainable.
Care should be taken when updating versions of packages such as `pydata-sphinx-theme` or `sphinx-panels`,
because new versions will likely change the appearance of the documentation website.
Managing docs should be a separate wiki document.
`.github/dependabot.yml` hopefully updates versions of GitHub Actions for us automatically
by creating new PRs. This usually does not require much attention.
If we learn any lessons, please note them here.