From 281523c87e46da349f85599eadce3c43a02c3b8b Mon Sep 17 00:00:00 2001 From: Koustav Ghosh Date: Thu, 21 Dec 2023 00:05:54 +0530 Subject: [PATCH 001/554] DOC add dropdown menu for Section 2.5 Decomposing signals in components (#27551) --- doc/modules/decomposition.rst | 40 ++++++++++++++++++++++++++++++----- 1 file changed, 35 insertions(+), 5 deletions(-) diff --git a/doc/modules/decomposition.rst b/doc/modules/decomposition.rst index c1e317d2ff7d3..a151eda636e7b 100644 --- a/doc/modules/decomposition.rst +++ b/doc/modules/decomposition.rst @@ -319,6 +319,11 @@ is eigendecomposed in the Kernel PCA fitting process has an effective rank that is much smaller than its size. This is a situation where approximate eigensolvers can provide speedup with very low precision loss. + +|details-start| +**Eigensolvers** +|details-split| + The optional parameter ``eigen_solver='randomized'`` can be used to *significantly* reduce the computation time when the number of requested ``n_components`` is small compared with the number of samples. It relies on @@ -343,6 +348,7 @@ is extremely small. It is enabled by default when the desired number of components is less than 10 (strict) and the number of samples is more than 200 (strict). See :class:`KernelPCA` for details. + .. topic:: References: * *dense* solver: @@ -365,6 +371,8 @@ components is less than 10 (strict) and the number of samples is more than 200 `_ R. B. Lehoucq, D. C. Sorensen, and C. Yang, (1998) +|details-end| + .. _LSA: @@ -375,6 +383,16 @@ Truncated singular value decomposition and latent semantic analysis (SVD) that only computes the :math:`k` largest singular values, where :math:`k` is a user-specified parameter. +:class:`TruncatedSVD` is very similar to :class:`PCA`, but differs +in that the matrix :math:`X` does not need to be centered. +When the columnwise (per-feature) means of :math:`X` +are subtracted from the feature values, +truncated SVD on the resulting matrix is equivalent to PCA. + +|details-start| +**About truncated SVD and latent semantic analysis (LSA)** +|details-split| + When truncated SVD is applied to term-document matrices (as returned by :class:`~sklearn.feature_extraction.text.CountVectorizer` or :class:`~sklearn.feature_extraction.text.TfidfVectorizer`), @@ -415,11 +433,6 @@ To also transform a test set :math:`X`, we multiply it with :math:`V_k`: We present LSA in a different way that matches the scikit-learn API better, but the singular values found are the same. -:class:`TruncatedSVD` is very similar to :class:`PCA`, but differs -in that the matrix :math:`X` does not need to be centered. -When the columnwise (per-feature) means of :math:`X` -are subtracted from the feature values, -truncated SVD on the resulting matrix is equivalent to PCA. While the :class:`TruncatedSVD` transformer works with any feature matrix, @@ -430,6 +443,8 @@ should be turned on (``sublinear_tf=True, use_idf=True``) to bring the feature values closer to a Gaussian distribution, compensating for LSA's erroneous assumptions about textual data. +|details-end| + .. topic:: Examples: * :ref:`sphx_glr_auto_examples_text_plot_document_clustering.py` @@ -442,6 +457,7 @@ compensating for LSA's erroneous assumptions about textual data. `_ + .. _DictionaryLearning: Dictionary Learning @@ -883,6 +899,10 @@ Note that this definition is not valid if :math:`\beta \in (0; 1)`, yet it can be continuously extended to the definitions of :math:`d_{KL}` and :math:`d_{IS}` respectively. 
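Concretely, the beta-divergence reduces to the Kullback-Leibler and Itakura-Saito divergences in the limit, which is what makes the continuous extension well defined:

.. math::
    \lim_{\beta \to 1} d_\beta(X, Y) = d_{KL}(X, Y), \qquad
    \lim_{\beta \to 0} d_\beta(X, Y) = d_{IS}(X, Y)

A minimal sketch of fitting :class:`NMF` under a non-Frobenius beta-divergence follows (it uses the 'mu' solver, which, as the section just below notes, is the only solver supporting general beta-losses; the toy data and parameter values are illustrative):

    >>> import numpy as np
    >>> from sklearn.decomposition import NMF
    >>> # NMF requires non-negative input data
    >>> X = np.abs(np.random.RandomState(0).randn(6, 4))
    >>> # beta_loss accepts 'frobenius' (beta=2), 'kullback-leibler' (beta=1),
    >>> # 'itakura-saito' (beta=0), or any float value of beta
    >>> model = NMF(n_components=2, solver='mu', beta_loss='kullback-leibler',
    ...             init='random', random_state=0, max_iter=500)
    >>> W = model.fit_transform(X)
    >>> H = model.components_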
+|details-start| +**NMF implemented solvers** +|details-split| + :class:`NMF` implements two solvers, using Coordinate Descent ('cd') [5]_, and Multiplicative Update ('mu') [6]_. The 'mu' solver can optimize every beta-divergence, including of course the Frobenius norm (:math:`\beta=2`), the @@ -896,6 +916,8 @@ The 'cd' solver can only optimize the Frobenius norm. Due to the underlying non-convexity of NMF, the different solvers may converge to different minima, even when optimizing the same distance function. +|details-end| + NMF is best used with the ``fit_transform`` method, which returns the matrix W. The matrix H is stored into the fitted model in the ``components_`` attribute; the method ``transform`` will decompose a new matrix X_new based on these @@ -910,6 +932,8 @@ stored components:: >>> X_new = np.array([[1, 0], [1, 6.1], [1, 0], [1, 4], [3.2, 1], [0, 4]]) >>> W_new = model.transform(X_new) + + .. topic:: Examples: * :ref:`sphx_glr_auto_examples_decomposition_plot_faces_decomposition.py` @@ -996,6 +1020,10 @@ of topics in the corpus and the distribution of words in the documents. The goal of LDA is to use the observed words to infer the hidden topic structure. +|details-start| +**Details on modeling text corpora** +|details-split| + When modeling text corpora, the model assumes the following generative process for a corpus with :math:`D` documents and :math:`K` topics, with :math:`K` corresponding to `n_components` in the API: @@ -1036,6 +1064,8 @@ Maximizing ELBO is equivalent to minimizing the Kullback-Leibler(KL) divergence between :math:`q(z,\theta,\beta)` and the true posterior :math:`p(z, \theta, \beta |w, \alpha, \eta)`. +|details-end| + :class:`LatentDirichletAllocation` implements the online variational Bayes algorithm and supports both online and batch update methods. While the batch method updates variational variables after each full pass through From b8d783d2b648fc1f194370d8a9bf22bcc00bf138 Mon Sep 17 00:00:00 2001 From: "Thomas J. Fan" Date: Thu, 21 Dec 2023 09:38:23 -0500 Subject: [PATCH 002/554] DOC Fixes pipeline docstring for pandas output (#27992) --- sklearn/pipeline.py | 1 + 1 file changed, 1 insertion(+) diff --git a/sklearn/pipeline.py b/sklearn/pipeline.py index 0bd81a3c57918..4df21618be4ee 100644 --- a/sklearn/pipeline.py +++ b/sklearn/pipeline.py @@ -182,6 +182,7 @@ def set_output(self, *, transform=None): Configure output of `transform` and `fit_transform`. - `"default"`: Default output format of a transformer + - `"pandas"`: DataFrame output - `"polars"`: Polars output - `None`: Transform configuration is unchanged From 2164bb16b978fa06be2e773332a4b0c3ddd90f8e Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?J=C3=A9r=C3=A9mie=20du=20Boisberranger?= <34657725+jeremiedbb@users.noreply.github.com> Date: Fri, 22 Dec 2023 10:40:08 +0100 Subject: [PATCH 003/554] MAINT Bump dev version (#27995) --- doc/whats_new.rst | 1 + doc/whats_new/v1.5.rst | 34 ++++++++++++++++++++++++++++++++++ sklearn/__init__.py | 2 +- 3 files changed, 36 insertions(+), 1 deletion(-) create mode 100644 doc/whats_new/v1.5.rst diff --git a/doc/whats_new.rst b/doc/whats_new.rst index 210d27cc075e5..8fa4c7007e0fd 100644 --- a/doc/whats_new.rst +++ b/doc/whats_new.rst @@ -12,6 +12,7 @@ on libraries.io to be notified when new versions are released. .. 
toctree:: :maxdepth: 1 + Version 1.5 Version 1.4 Version 1.3 Version 1.2 diff --git a/doc/whats_new/v1.5.rst b/doc/whats_new/v1.5.rst new file mode 100644 index 0000000000000..d07f1412635aa --- /dev/null +++ b/doc/whats_new/v1.5.rst @@ -0,0 +1,34 @@ +.. include:: _contributors.rst + +.. currentmodule:: sklearn + +.. _changes_1_5: + +Version 1.5.0 +============= + +**In Development** + +.. include:: changelog_legend.inc + +Changelog +--------- + +.. + Entries should be grouped by module (in alphabetic order) and prefixed with + one of the labels: |MajorFeature|, |Feature|, |Efficiency|, |Enhancement|, + |Fix| or |API| (see whats_new.rst for descriptions). + Entries should be ordered by those labels (e.g. |Fix| after |Efficiency|). + Changes not specific to a module should be listed under *Multiple Modules* + or *Miscellaneous*. + Entries should end with: + :pr:`123456` by :user:`Joe Bloggs `. + where 123455 is the *pull request* number, not the issue number. + +Code and Documentation Contributors +----------------------------------- + +Thanks to everyone who has contributed to the maintenance and improvement of +the project since version 1.4, including: + +TODO: update at the time of the release. diff --git a/sklearn/__init__.py b/sklearn/__init__.py index ecb32f9dc0da3..673031649a265 100644 --- a/sklearn/__init__.py +++ b/sklearn/__init__.py @@ -42,7 +42,7 @@ # Dev branch marker is: 'X.Y.dev' or 'X.Y.devN' where N is an integer. # 'X.Y.dev0' is the canonical version of 'X.Y.dev' # -__version__ = "1.4.dev0" +__version__ = "1.5.dev0" # On OSX, we can get a runtime error due to multiple OpenMP libraries loaded From 0d4a88e311997c8f70d54be472b23860521d74b3 Mon Sep 17 00:00:00 2001 From: Lukas Geiger Date: Mon, 25 Dec 2023 17:47:33 +0100 Subject: [PATCH 004/554] MAINT: Prefer `np.fill_diagonal` over `diag_indices` (#27965) --- sklearn/metrics/pairwise.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/sklearn/metrics/pairwise.py b/sklearn/metrics/pairwise.py index 5d848dae5b11f..4e5c37dff0091 100644 --- a/sklearn/metrics/pairwise.py +++ b/sklearn/metrics/pairwise.py @@ -1084,7 +1084,7 @@ def cosine_distances(X, Y=None): if X is Y or Y is None: # Ensure that distances between vectors and themselves are set to 0.0. # This may not be the case due to floating point rounding errors. - S[np.diag_indices_from(S)] = 0.0 + np.fill_diagonal(S, 0.0) return S From 678e3999eeaadcaaef523d1d0d2f52a25986d460 Mon Sep 17 00:00:00 2001 From: "Thomas J. Fan" Date: Wed, 27 Dec 2023 05:41:46 -0500 Subject: [PATCH 005/554] FEAT Adds __getitem__ to ColumnTransformer (#27990) Co-authored-by: Guillaume Lemaitre --- doc/whats_new/v1.5.rst | 6 ++++++ sklearn/compose/_column_transformer.py | 10 ++++++++++ .../compose/tests/test_column_transformer.py | 18 ++++++++++++++++++ 3 files changed, 34 insertions(+) diff --git a/doc/whats_new/v1.5.rst b/doc/whats_new/v1.5.rst index d07f1412635aa..fbd8a3f83b1dd 100644 --- a/doc/whats_new/v1.5.rst +++ b/doc/whats_new/v1.5.rst @@ -32,3 +32,9 @@ Thanks to everyone who has contributed to the maintenance and improvement of the project since version 1.4, including: TODO: update at the time of the release. + +:mod:`sklearn.compose` +...................... + +- |Feature| A fitted :class:`compose.ColumnTransformer` now implements `__getitem__` + which returns the fitted transformers by name. :pr:`27990` by `Thomas Fan`_. 
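A minimal usage sketch of the behaviour this changelog entry describes, mirroring the diff and tests below (the transformer names 'scale' and 'minmax' and the toy data are illustrative):

    >>> import numpy as np
    >>> from sklearn.compose import ColumnTransformer
    >>> from sklearn.preprocessing import MinMaxScaler, StandardScaler
    >>> X = np.array([[0., 1., 2.], [3., 4., 5.]])
    >>> ct = ColumnTransformer([('scale', StandardScaler(), [0, 1]),
    ...                         ('minmax', MinMaxScaler(), [1, 2])])
    >>> _ = ct.fit(X)
    >>> # once fitted, indexing by name returns the fitted transformer,
    >>> # identical to the entry in named_transformers_
    >>> ct['scale'] is ct.named_transformers_['scale']
    True

Indexing an unfitted ``ColumnTransformer`` raises ``TypeError``, and an unknown name raises ``KeyError``, as the new tests assert.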
diff --git a/sklearn/compose/_column_transformer.py b/sklearn/compose/_column_transformer.py index a53ad2348fe94..6740bdf4e8993 100644 --- a/sklearn/compose/_column_transformer.py +++ b/sklearn/compose/_column_transformer.py @@ -1082,6 +1082,16 @@ def _sk_visual_block_(self): "parallel", transformers, names=names, name_details=name_details ) + def __getitem__(self, key): + try: + return self.named_transformers_[key] + except AttributeError as e: + raise TypeError( + "ColumnTransformer is subscriptable after it is fitted" + ) from e + except KeyError as e: + raise KeyError(f"'{key}' is not a valid transformer name") from e + def _get_empty_routing(self): """Return empty routing. diff --git a/sklearn/compose/tests/test_column_transformer.py b/sklearn/compose/tests/test_column_transformer.py index 1ceaad3ec1737..aa7dfe62fc1a8 100644 --- a/sklearn/compose/tests/test_column_transformer.py +++ b/sklearn/compose/tests/test_column_transformer.py @@ -2301,6 +2301,24 @@ def test_dataframe_different_dataframe_libraries(): assert_array_equal(out_pd_in, X_test_np) +def test_column_transformer__getitem__(): + """Check __getitem__ for ColumnTransformer.""" + X = np.array([[0, 1, 2], [3, 4, 5]]) + ct = ColumnTransformer([("t1", Trans(), [0, 1]), ("t2", Trans(), [1, 2])]) + + msg = "ColumnTransformer is subscriptable after it is fitted" + with pytest.raises(TypeError, match=msg): + ct["t1"] + + ct.fit(X) + assert ct["t1"] is ct.named_transformers_["t1"] + assert ct["t2"] is ct.named_transformers_["t2"] + + msg = "'does_not_exist' is not a valid transformer name" + with pytest.raises(KeyError, match=msg): + ct["does_not_exist"] + + # Metadata Routing Tests # ====================== From 5ffbba4e5cab81002e66362b1c51410392d3839a Mon Sep 17 00:00:00 2001 From: "Thomas J. 
Fan" Date: Tue, 2 Jan 2024 05:01:25 -0500 Subject: [PATCH 006/554] CI Use gh for assign and unassign instead of curl (#28031) --- .github/workflows/assign.yml | 4 ++-- .github/workflows/unassign.yml | 2 +- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/.github/workflows/assign.yml b/.github/workflows/assign.yml index 9f87b8fa7e0f9..5d725d76b0b1b 100644 --- a/.github/workflows/assign.yml +++ b/.github/workflows/assign.yml @@ -20,5 +20,5 @@ jobs: steps: - run: | echo "Assigning issue ${{ github.event.issue.number }} to ${{ github.event.comment.user.login }}" - curl -H "Authorization: token ${{ secrets.GITHUB_TOKEN }}" -d '{"assignees": ["${{ github.event.comment.user.login }}"]}' https://api.github.com/repos/${{ github.repository }}/issues/${{ github.event.issue.number }}/assignees - curl -H "Authorization: token ${{ secrets.GITHUB_TOKEN }}" -X "DELETE" https://api.github.com/repos/${{ github.repository }}/issues/${{ github.event.issue.number }}/labels/help%20wanted + gh issue edit ${{ github.event.issue.number }} --add-assignee ${{ github.event.comment.user.login }} + gh issue edit ${{ github.event.issue.number }} --remove-label "help wanted" diff --git a/.github/workflows/unassign.yml b/.github/workflows/unassign.yml index c73b854530ff7..9cb1616cc0c1e 100644 --- a/.github/workflows/unassign.yml +++ b/.github/workflows/unassign.yml @@ -18,4 +18,4 @@ jobs: if: github.event.issue.state == 'open' run: | echo "Marking issue ${{ github.event.issue.number }} as help wanted" - curl -H "Authorization: token ${{ secrets.GITHUB_TOKEN }}" -d '{"labels": ["help wanted"]}' https://api.github.com/repos/${{ github.repository }}/issues/${{ github.event.issue.number }}/labels + gh issue edit ${{ github.event.issue.number }} --add-label "help wanted" From 3d35fa9e49eee96c3bfe2ae8543981a92dbc35b3 Mon Sep 17 00:00:00 2001 From: scikit-learn-bot Date: Tue, 2 Jan 2024 11:31:46 +0100 Subject: [PATCH 007/554] :lock: :robot: CI Update lock files for cirrus-arm CI build(s) :lock: :robot: (#28015) Co-authored-by: Lock file bot --- .../cirrus/pymin_conda_forge_linux-aarch64_conda.lock | 11 ++++++----- 1 file changed, 6 insertions(+), 5 deletions(-) diff --git a/build_tools/cirrus/pymin_conda_forge_linux-aarch64_conda.lock b/build_tools/cirrus/pymin_conda_forge_linux-aarch64_conda.lock index c838fefdafef0..d622189ad1ac3 100644 --- a/build_tools/cirrus/pymin_conda_forge_linux-aarch64_conda.lock +++ b/build_tools/cirrus/pymin_conda_forge_linux-aarch64_conda.lock @@ -6,7 +6,7 @@ https://conda.anaconda.org/conda-forge/linux-aarch64/ca-certificates-2023.11.17- https://conda.anaconda.org/conda-forge/linux-aarch64/ld_impl_linux-aarch64-2.40-h2d8c526_0.conda#16246d69e945d0b1969a6099e7c5d457 https://conda.anaconda.org/conda-forge/linux-aarch64/libstdcxx-ng-13.2.0-h9a76618_3.conda#7ad2164936c4975d94ca883d34809c0f https://conda.anaconda.org/conda-forge/linux-aarch64/python_abi-3.9-4_cp39.conda#c191905a08694e4a5cb1238e90233878 -https://conda.anaconda.org/conda-forge/noarch/tzdata-2023c-h71feb2d_0.conda#939e3e74d8be4dac89ce83b20de2492a +https://conda.anaconda.org/conda-forge/noarch/tzdata-2023d-h0c530f3_0.conda#8dee24b8be2d9ff81e7bd4d7d97ff1b0 https://conda.anaconda.org/conda-forge/linux-aarch64/_openmp_mutex-4.5-2_kmp_llvm.tar.bz2#98a1185182fec3c434069fa74e6473d6 https://conda.anaconda.org/conda-forge/linux-aarch64/libgcc-ng-13.2.0-hf8544c7_3.conda#00f021ee1a24c798ae53c87ee79597f1 https://conda.anaconda.org/conda-forge/linux-aarch64/bzip2-1.0.8-h31becfc_5.conda#a64e35f01e0b7a2a152eca87d33b9c87 @@ -19,6 +19,7 @@ 
https://conda.anaconda.org/conda-forge/linux-aarch64/libjpeg-turbo-3.0.0-h31becf https://conda.anaconda.org/conda-forge/linux-aarch64/libnsl-2.0.1-h31becfc_0.conda#c14f32510f694e3185704d89967ec422 https://conda.anaconda.org/conda-forge/linux-aarch64/libuuid-2.38.1-hb4cce97_0.conda#000e30b09db0b7c775b21695dff30969 https://conda.anaconda.org/conda-forge/linux-aarch64/libwebp-base-1.3.2-h31becfc_0.conda#1490de434d2a2c06a98af27641a2ffff +https://conda.anaconda.org/conda-forge/linux-aarch64/libxcrypt-4.4.36-h31becfc_1.conda#b4df5d7d4b63579d081fd3a4cf99740e https://conda.anaconda.org/conda-forge/linux-aarch64/libzlib-1.2.13-h31becfc_5.conda#b213aa87eea9491ef7b129179322e955 https://conda.anaconda.org/conda-forge/linux-aarch64/ncurses-6.4-h0425590_2.conda#4ff0a396150dedad4269e16e5810f769 https://conda.anaconda.org/conda-forge/linux-aarch64/openssl-3.2.0-h31becfc_1.conda#b24247441ed7ce138382de2ec51200e4 @@ -41,13 +42,13 @@ https://conda.anaconda.org/conda-forge/linux-aarch64/libhiredis-1.0.2-h05efe27_0 https://conda.anaconda.org/conda-forge/linux-aarch64/libopenblas-0.3.25-pthreads_h5a5ec62_0.conda#60e86bc93e3f213278dc5081115fb63b https://conda.anaconda.org/conda-forge/linux-aarch64/libtiff-4.6.0-h1708d11_2.conda#d5638e110e7f22e2602a8edd20656720 https://conda.anaconda.org/conda-forge/linux-aarch64/llvm-openmp-17.0.6-h8b0cb96_0.conda#48337e980ec89cd22dd87ced0a0aa878 -https://conda.anaconda.org/conda-forge/linux-aarch64/python-3.9.18-h4ac3b42_0_cpython.conda#4d36e157278470ac06508579c6d36555 +https://conda.anaconda.org/conda-forge/linux-aarch64/python-3.9.18-h4ac3b42_1_cpython.conda#6ba2858e603df9b6ab7ad172b15be15f https://conda.anaconda.org/conda-forge/linux-aarch64/brotli-1.1.0-h31becfc_1.conda#e41f5862ac746428407f3fd44d2ed01f https://conda.anaconda.org/conda-forge/linux-aarch64/ccache-4.8.1-h6552966_0.conda#5b436a19e818f05fe0c9ab4f5ac61233 https://conda.anaconda.org/conda-forge/noarch/certifi-2023.11.17-pyhd8ed1ab_0.conda#2011bcf45376341dd1d690263fdbc789 https://conda.anaconda.org/conda-forge/noarch/colorama-0.4.6-pyhd8ed1ab_0.tar.bz2#3faab06a954c2a04039983f2c4a50d99 https://conda.anaconda.org/conda-forge/noarch/cycler-0.12.1-pyhd8ed1ab_0.conda#5cd86562580f274031ede6aa6aa24441 -https://conda.anaconda.org/conda-forge/linux-aarch64/cython-3.0.6-py39h387a81e_0.conda#49d46c249d4e1d6ccc302059537c9ef9 +https://conda.anaconda.org/conda-forge/linux-aarch64/cython-3.0.7-py39h387a81e_0.conda#e5495f92998c2dca45221dbe10c49999 https://conda.anaconda.org/conda-forge/noarch/exceptiongroup-1.2.0-pyhd8ed1ab_0.conda#f6c211fee3c98229652b60a9a42ef363 https://conda.anaconda.org/conda-forge/noarch/execnet-2.0.2-pyhd8ed1ab_0.conda#67de0d8241e1060a479e3c37793e26f9 https://conda.anaconda.org/conda-forge/noarch/iniconfig-2.0.0-pyhd8ed1ab_0.conda#f800d2da156d08e289b14e87e43c1ae5 @@ -69,14 +70,14 @@ https://conda.anaconda.org/conda-forge/linux-aarch64/tornado-6.3.3-py39h7cc1d5f_ https://conda.anaconda.org/conda-forge/linux-aarch64/unicodedata2-15.1.0-py39h898b7ef_0.conda#8c072c9329aeea97a46005625267a851 https://conda.anaconda.org/conda-forge/noarch/wheel-0.42.0-pyhd8ed1ab_0.conda#1cdea58981c5cbc17b51973bcaddcea7 https://conda.anaconda.org/conda-forge/noarch/zipp-3.17.0-pyhd8ed1ab_0.conda#2e4d6bc0b14e10f895fc6791a7d9b26a -https://conda.anaconda.org/conda-forge/linux-aarch64/fonttools-4.46.0-py39h898b7ef_0.conda#515b31b7bba3302949c9be091b1945e2 +https://conda.anaconda.org/conda-forge/linux-aarch64/fonttools-4.47.0-py39h898b7ef_0.conda#c1104ffe473cef5d35af62e0b6351de3 
https://conda.anaconda.org/conda-forge/noarch/importlib_resources-6.1.1-pyhd8ed1ab_0.conda#3d5fa25cf42f3f32a12b2d874ace8574 https://conda.anaconda.org/conda-forge/noarch/joblib-1.3.2-pyhd8ed1ab_0.conda#4da50d410f553db77e62ab62ffaa1abc https://conda.anaconda.org/conda-forge/linux-aarch64/libcblas-3.9.0-20_linuxaarch64_openblas.conda#b41e55ae2cb9d3518da2cbe3677b3b3b https://conda.anaconda.org/conda-forge/linux-aarch64/liblapack-3.9.0-20_linuxaarch64_openblas.conda#e7412a592d9ee7c92026eb1189687271 https://conda.anaconda.org/conda-forge/linux-aarch64/pillow-10.1.0-py39h8ce38d7_0.conda#afedc0abb518dac535cb861f24585160 https://conda.anaconda.org/conda-forge/noarch/pip-23.3.2-pyhd8ed1ab_0.conda#8591c748f98dcc02253003533bc2e4b1 -https://conda.anaconda.org/conda-forge/noarch/pytest-7.4.3-pyhd8ed1ab_0.conda#5bdca0aca30b0ee62bb84854e027eae0 +https://conda.anaconda.org/conda-forge/noarch/pytest-7.4.4-pyhd8ed1ab_0.conda#a9d145de8c5f064b5fa68fb34725d9f4 https://conda.anaconda.org/conda-forge/noarch/python-dateutil-2.8.2-pyhd8ed1ab_0.tar.bz2#dd999d1cc9f79e67dbb855c8924c7984 https://conda.anaconda.org/conda-forge/noarch/importlib-resources-6.1.1-pyhd8ed1ab_0.conda#d04bd1b5bed9177dd7c3cef15e2b6710 https://conda.anaconda.org/conda-forge/linux-aarch64/liblapacke-3.9.0-20_linuxaarch64_openblas.conda#1b8192f036a2dc41fec67700bb8bacef From 9da38ba1ef317f353c75f58c257dea52b32b56c9 Mon Sep 17 00:00:00 2001 From: "Thomas J. Fan" Date: Tue, 2 Jan 2024 09:08:44 -0500 Subject: [PATCH 008/554] CI Fix assign/unassign workflow (#28044) --- .github/workflows/assign.yml | 7 +++++-- .github/workflows/unassign.yml | 5 ++++- 2 files changed, 9 insertions(+), 3 deletions(-) diff --git a/.github/workflows/assign.yml b/.github/workflows/assign.yml index 5d725d76b0b1b..fa3b6f95a5e95 100644 --- a/.github/workflows/assign.yml +++ b/.github/workflows/assign.yml @@ -20,5 +20,8 @@ jobs: steps: - run: | echo "Assigning issue ${{ github.event.issue.number }} to ${{ github.event.comment.user.login }}" - gh issue edit ${{ github.event.issue.number }} --add-assignee ${{ github.event.comment.user.login }} - gh issue edit ${{ github.event.issue.number }} --remove-label "help wanted" + gh issue edit $ISSUE --add-assignee ${{ github.event.comment.user.login }} + gh issue edit $ISSUE --remove-label "help wanted" + env: + GH_TOKEN: ${{ github.token }} + ISSUE: ${{ github.event.issue.html_url }} diff --git a/.github/workflows/unassign.yml b/.github/workflows/unassign.yml index 9cb1616cc0c1e..94a50d49839d6 100644 --- a/.github/workflows/unassign.yml +++ b/.github/workflows/unassign.yml @@ -18,4 +18,7 @@ jobs: if: github.event.issue.state == 'open' run: | echo "Marking issue ${{ github.event.issue.number }} as help wanted" - gh issue edit ${{ github.event.issue.number }} --add-label "help wanted" + gh issue edit $ISSUE --add-label "help wanted" + env: + GH_TOKEN: ${{ github.token }} + ISSUE: ${{ github.event.issue.html_url }} From ffe4e789dab9237230530d49dd210aeb84e8608d Mon Sep 17 00:00:00 2001 From: scikit-learn-bot Date: Tue, 2 Jan 2024 16:29:06 +0100 Subject: [PATCH 009/554] :lock: :robot: CI Update lock files for pypy CI build(s) :lock: :robot: (#28017) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Co-authored-by: Lock file bot Co-authored-by: Loïc Estève --- build_tools/azure/pypy3_linux-64_conda.lock | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/build_tools/azure/pypy3_linux-64_conda.lock b/build_tools/azure/pypy3_linux-64_conda.lock index 7446b1acce459..5fd5f84fcf17e 
100644 --- a/build_tools/azure/pypy3_linux-64_conda.lock +++ b/build_tools/azure/pypy3_linux-64_conda.lock @@ -6,7 +6,7 @@ https://conda.anaconda.org/conda-forge/linux-64/_libgcc_mutex-0.1-conda_forge.ta https://conda.anaconda.org/conda-forge/linux-64/ca-certificates-2023.11.17-hbcca054_0.conda#01ffc8d36f9eba0ce0b3c1955fa780ee https://conda.anaconda.org/conda-forge/linux-64/libstdcxx-ng-13.2.0-h7e041cc_3.conda#937eaed008f6bf2191c5fe76f87755e9 https://conda.anaconda.org/conda-forge/linux-64/python_abi-3.9-4_pypy39_pp73.conda#c1b2f29111681a4036ed21eaa3f44620 -https://conda.anaconda.org/conda-forge/noarch/tzdata-2023c-h71feb2d_0.conda#939e3e74d8be4dac89ce83b20de2492a +https://conda.anaconda.org/conda-forge/noarch/tzdata-2023d-h0c530f3_0.conda#8dee24b8be2d9ff81e7bd4d7d97ff1b0 https://conda.anaconda.org/conda-forge/linux-64/_openmp_mutex-4.5-2_kmp_llvm.tar.bz2#562b26ba2e19059551a811e72ab7f793 https://conda.anaconda.org/conda-forge/linux-64/libgcc-ng-13.2.0-h807b86a_3.conda#23fdf1fef05baeb7eadc2aed5fb0011f https://conda.anaconda.org/conda-forge/linux-64/bzip2-1.0.8-hd590300_5.conda#69b8b6202a07720f448be700e300ccf4 @@ -61,7 +61,7 @@ https://conda.anaconda.org/conda-forge/linux-64/python-3.9.18-0_73_pypy.conda#aa https://conda.anaconda.org/conda-forge/noarch/certifi-2023.11.17-pyhd8ed1ab_0.conda#2011bcf45376341dd1d690263fdbc789 https://conda.anaconda.org/conda-forge/noarch/colorama-0.4.6-pyhd8ed1ab_0.tar.bz2#3faab06a954c2a04039983f2c4a50d99 https://conda.anaconda.org/conda-forge/noarch/cycler-0.12.1-pyhd8ed1ab_0.conda#5cd86562580f274031ede6aa6aa24441 -https://conda.anaconda.org/conda-forge/linux-64/cython-3.0.6-py39hc10206b_0.conda#5fc69db5e035852a013acfa22dd8e18b +https://conda.anaconda.org/conda-forge/linux-64/cython-3.0.7-py39hc10206b_0.conda#4068d9f575989a3482032d526cf42d5a https://conda.anaconda.org/conda-forge/noarch/exceptiongroup-1.2.0-pyhd8ed1ab_0.conda#f6c211fee3c98229652b60a9a42ef363 https://conda.anaconda.org/conda-forge/noarch/execnet-2.0.2-pyhd8ed1ab_0.conda#67de0d8241e1060a479e3c37793e26f9 https://conda.anaconda.org/conda-forge/noarch/iniconfig-2.0.0-pyhd8ed1ab_0.conda#f800d2da156d08e289b14e87e43c1ae5 @@ -84,10 +84,10 @@ https://conda.anaconda.org/conda-forge/linux-64/unicodedata2-15.1.0-py39hf860d4a https://conda.anaconda.org/conda-forge/noarch/zipp-3.17.0-pyhd8ed1ab_0.conda#2e4d6bc0b14e10f895fc6791a7d9b26a https://conda.anaconda.org/conda-forge/linux-64/blas-devel-3.9.0-20_linux64_openblas.conda#9932a1d4e9ecf2d35fb19475446e361e https://conda.anaconda.org/conda-forge/linux-64/contourpy-1.2.0-py39ha90811c_0.conda#f3b2afc64bf0cbe901a9b00d44611c61 -https://conda.anaconda.org/conda-forge/linux-64/fonttools-4.46.0-py39hf860d4a_0.conda#05d7d08eaa9678ac1dd33d1fc3e1c6dd +https://conda.anaconda.org/conda-forge/linux-64/fonttools-4.47.0-py39hf860d4a_0.conda#ebe895da6a30d81da5433696f008389d https://conda.anaconda.org/conda-forge/noarch/importlib_resources-6.1.1-pyhd8ed1ab_0.conda#3d5fa25cf42f3f32a12b2d874ace8574 https://conda.anaconda.org/conda-forge/noarch/joblib-1.3.2-pyhd8ed1ab_0.conda#4da50d410f553db77e62ab62ffaa1abc -https://conda.anaconda.org/conda-forge/noarch/pytest-7.4.3-pyhd8ed1ab_0.conda#5bdca0aca30b0ee62bb84854e027eae0 +https://conda.anaconda.org/conda-forge/noarch/pytest-7.4.4-pyhd8ed1ab_0.conda#a9d145de8c5f064b5fa68fb34725d9f4 https://conda.anaconda.org/conda-forge/noarch/python-dateutil-2.8.2-pyhd8ed1ab_0.tar.bz2#dd999d1cc9f79e67dbb855c8924c7984 https://conda.anaconda.org/conda-forge/linux-64/scipy-1.11.4-py39h6dedee3_0.conda#066da96b1c7587d85b572f97d631ce1a 
https://conda.anaconda.org/conda-forge/linux-64/blas-2.120-openblas.conda#c8f6916a81a340650078171b1d852574 From 4ce8e19859cb8b2f2bef197ed5b28beea44ee4b4 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Lo=C3=AFc=20Est=C3=A8ve?= Date: Tue, 2 Jan 2024 17:29:01 +0100 Subject: [PATCH 010/554] MNT Update main lock files (#28045) --- build_tools/azure/debian_atlas_32bit_lock.txt | 4 +- ...latest_conda_forge_mkl_linux-64_conda.lock | 23 +++++----- ..._forge_mkl_no_coverage_linux-64_conda.lock | 19 ++++---- ...pylatest_conda_forge_mkl_osx-64_conda.lock | 16 +++---- ...test_conda_mkl_no_openmp_osx-64_conda.lock | 6 +-- ...st_pip_openblas_pandas_linux-64_conda.lock | 20 ++++---- ...onda_defaults_openblas_linux-64_conda.lock | 4 +- .../pymin_conda_forge_mkl_win-64_conda.lock | 18 ++++---- ...e_openblas_ubuntu_2204_linux-64_conda.lock | 21 +++++---- build_tools/azure/ubuntu_atlas_lock.txt | 2 +- build_tools/circle/doc_environment.yml | 1 + build_tools/circle/doc_linux-64_conda.lock | 46 +++++++++---------- .../doc_min_dependencies_linux-64_conda.lock | 19 ++++---- .../update_environments_and_lock_files.py | 6 +++ 14 files changed, 108 insertions(+), 97 deletions(-) diff --git a/build_tools/azure/debian_atlas_32bit_lock.txt b/build_tools/azure/debian_atlas_32bit_lock.txt index 8de0a2fda8bac..02b9100e3dd6b 100644 --- a/build_tools/azure/debian_atlas_32bit_lock.txt +++ b/build_tools/azure/debian_atlas_32bit_lock.txt @@ -4,9 +4,9 @@ # # pip-compile --output-file=build_tools/azure/debian_atlas_32bit_lock.txt build_tools/azure/debian_atlas_32bit_requirements.txt # -attrs==23.1.0 +attrs==23.2.0 # via pytest -coverage==7.3.3 +coverage==7.4.0 # via pytest-cov cython==0.29.33 # via -r build_tools/azure/debian_atlas_32bit_requirements.txt diff --git a/build_tools/azure/pylatest_conda_forge_mkl_linux-64_conda.lock b/build_tools/azure/pylatest_conda_forge_mkl_linux-64_conda.lock index 3cb84a4b0bd9a..188936db093a6 100644 --- a/build_tools/azure/pylatest_conda_forge_mkl_linux-64_conda.lock +++ b/build_tools/azure/pylatest_conda_forge_mkl_linux-64_conda.lock @@ -1,6 +1,6 @@ # Generated by conda-lock. 
# platform: linux-64 -# input_hash: 7aa55d66dfbd0f6267a9aff8c750d1e9f42cd339726c8f9c4d1299341b064849 +# input_hash: 06a1abd91fe199d0e020e5ac38efba4bc3d4a7752e01cf91e4b046c5d0ba8a93 @EXPLICIT https://conda.anaconda.org/conda-forge/linux-64/_libgcc_mutex-0.1-conda_forge.tar.bz2#d7c89558ba9fa0495403155b64376d81 https://conda.anaconda.org/conda-forge/linux-64/ca-certificates-2023.11.17-hbcca054_0.conda#01ffc8d36f9eba0ce0b3c1955fa780ee @@ -11,7 +11,7 @@ https://conda.anaconda.org/conda-forge/noarch/font-ttf-ubuntu-0.83-h77eed37_1.co https://conda.anaconda.org/conda-forge/linux-64/ld_impl_linux-64-2.40-h41732ed_0.conda#7aca3059a1729aa76c597603f10b0dd3 https://conda.anaconda.org/conda-forge/linux-64/libstdcxx-ng-13.2.0-h7e041cc_3.conda#937eaed008f6bf2191c5fe76f87755e9 https://conda.anaconda.org/conda-forge/linux-64/python_abi-3.11-4_cp311.conda#d786502c97404c94d7d58d258a445a65 -https://conda.anaconda.org/conda-forge/noarch/tzdata-2023c-h71feb2d_0.conda#939e3e74d8be4dac89ce83b20de2492a +https://conda.anaconda.org/conda-forge/noarch/tzdata-2023d-h0c530f3_0.conda#8dee24b8be2d9ff81e7bd4d7d97ff1b0 https://conda.anaconda.org/conda-forge/noarch/fonts-conda-forge-1-0.tar.bz2#f766549260d6815b0c52253f1fb1bb29 https://conda.anaconda.org/conda-forge/noarch/fonts-conda-ecosystem-1-0.tar.bz2#fee5683a3f04bd15cbd8318b096a27ab https://conda.anaconda.org/conda-forge/linux-64/_openmp_mutex-4.5-2_kmp_llvm.tar.bz2#562b26ba2e19059551a811e72ab7f793 @@ -45,6 +45,7 @@ https://conda.anaconda.org/conda-forge/linux-64/libopus-1.3.1-h7f98852_1.tar.bz2 https://conda.anaconda.org/conda-forge/linux-64/libutf8proc-2.8.0-h166bdaf_0.tar.bz2#ede4266dc02e875fe1ea77b25dd43747 https://conda.anaconda.org/conda-forge/linux-64/libuuid-2.38.1-h0b41bf4_0.conda#40b61aab5c7ba9ff276c41cfffe6b80b https://conda.anaconda.org/conda-forge/linux-64/libwebp-base-1.3.2-hd590300_0.conda#30de3fd9b3b602f7473f30e684eeea8c +https://conda.anaconda.org/conda-forge/linux-64/libxcrypt-4.4.36-hd590300_1.conda#5aa797f8787fe7a17d1b0821485b5adc https://conda.anaconda.org/conda-forge/linux-64/libzlib-1.2.13-hd590300_5.conda#f36c115f1ee199da648e0597ec2047ad https://conda.anaconda.org/conda-forge/linux-64/lz4-c-1.9.4-hcb278e6_0.conda#318b08df404f9c9be5712aaa5a6f0bb0 https://conda.anaconda.org/conda-forge/linux-64/mpg123-1.32.3-h59595ed_0.conda#bdadff838d5437aea83607ced8b37f75 @@ -114,7 +115,7 @@ https://conda.anaconda.org/conda-forge/linux-64/llvm-openmp-17.0.6-h4dfa4b3_0.co https://conda.anaconda.org/conda-forge/linux-64/mysql-libs-8.0.33-hca2cd23_6.conda#e87530d1b12dd7f4e0f856dc07358d60 https://conda.anaconda.org/conda-forge/linux-64/nss-3.96-h1d7d5a4_0.conda#1c8f8b8eb041ecd54053fc4b6ad57957 https://conda.anaconda.org/conda-forge/linux-64/orc-1.9.0-h2f23424_1.conda#9571eb3eb0f7fe8b59956a7786babbcd -https://conda.anaconda.org/conda-forge/linux-64/python-3.11.7-hab00c5b_0_cpython.conda#bf281a975393266ab95734a8cfd532ec +https://conda.anaconda.org/conda-forge/linux-64/python-3.11.7-hab00c5b_1_cpython.conda#27cf681282c11dba7b0b1fd266e8f289 https://conda.anaconda.org/conda-forge/linux-64/xcb-util-0.4.0-hd590300_1.conda#9bfac7ccd94d54fd21a0501296d60424 https://conda.anaconda.org/conda-forge/linux-64/xcb-util-keysyms-0.4.0-h8ee46fc_1.conda#632413adcd8bc16b515cab87a2932913 https://conda.anaconda.org/conda-forge/linux-64/xcb-util-renderutil-0.3.9-hd590300_1.conda#e995b155d938b6779da6ace6c6b13816 @@ -128,7 +129,7 @@ https://conda.anaconda.org/conda-forge/linux-64/ccache-4.8.1-h1fcd64f_0.conda#fd 
https://conda.anaconda.org/conda-forge/noarch/certifi-2023.11.17-pyhd8ed1ab_0.conda#2011bcf45376341dd1d690263fdbc789 https://conda.anaconda.org/conda-forge/noarch/colorama-0.4.6-pyhd8ed1ab_0.tar.bz2#3faab06a954c2a04039983f2c4a50d99 https://conda.anaconda.org/conda-forge/noarch/cycler-0.12.1-pyhd8ed1ab_0.conda#5cd86562580f274031ede6aa6aa24441 -https://conda.anaconda.org/conda-forge/linux-64/cython-3.0.6-py311hb755f60_0.conda#88cc84238dda72e11285d9cfcbe43e51 +https://conda.anaconda.org/conda-forge/linux-64/cython-3.0.7-py311hb755f60_0.conda#97b12677eec6c2fd23c7867db1c7a87d https://conda.anaconda.org/conda-forge/linux-64/dbus-1.13.6-h5008d03_3.tar.bz2#ecfff944ba3960ecb334b9a2663d708d https://conda.anaconda.org/conda-forge/noarch/exceptiongroup-1.2.0-pyhd8ed1ab_0.conda#f6c211fee3c98229652b60a9a42ef363 https://conda.anaconda.org/conda-forge/noarch/execnet-2.0.2-pyhd8ed1ab_0.conda#67de0d8241e1060a479e3c37793e26f9 @@ -149,7 +150,7 @@ https://conda.anaconda.org/conda-forge/noarch/pluggy-1.3.0-pyhd8ed1ab_0.conda#23 https://conda.anaconda.org/conda-forge/noarch/ply-3.11-py_1.tar.bz2#7205635cd71531943440fbfe3b6b5727 https://conda.anaconda.org/conda-forge/noarch/py-1.11.0-pyh6c4a22f_0.tar.bz2#b4613d7e7a493916d867842a6a148054 https://conda.anaconda.org/conda-forge/noarch/pyparsing-3.1.1-pyhd8ed1ab_0.conda#176f7d56f0cfe9008bdf1bccd7de02fb -https://conda.anaconda.org/conda-forge/noarch/python-tzdata-2023.3-pyhd8ed1ab_0.conda#2590495f608a63625e165915fb4e2e34 +https://conda.anaconda.org/conda-forge/noarch/python-tzdata-2023.4-pyhd8ed1ab_0.conda#c79cacf8a06a51552fc651652f170208 https://conda.anaconda.org/conda-forge/noarch/pytz-2023.3.post1-pyhd8ed1ab_0.conda#c93346b446cd08c169d843ae5fc0da97 https://conda.anaconda.org/conda-forge/noarch/setuptools-68.2.2-pyhd8ed1ab_0.conda#fc2166155db840c634a1291a5c35a709 https://conda.anaconda.org/conda-forge/noarch/six-1.16.0-pyh6c4a22f_0.tar.bz2#e5f25f8dbc060e9a8d912e432202afc2 @@ -166,8 +167,8 @@ https://conda.anaconda.org/conda-forge/linux-64/xorg-libxrender-0.9.11-hd590300_ https://conda.anaconda.org/conda-forge/linux-64/aws-c-auth-0.7.3-h28f7589_1.conda#97503d3e565004697f1651753aa95b9e https://conda.anaconda.org/conda-forge/linux-64/aws-c-mqtt-0.9.3-hb447be9_1.conda#c520669eb0be9269a5f0d8ef62531882 https://conda.anaconda.org/conda-forge/linux-64/cairo-1.18.0-h3faef2a_0.conda#f907bb958910dc404647326ca80c263e -https://conda.anaconda.org/conda-forge/linux-64/coverage-7.3.3-py311h459d7ec_0.conda#9db2c1316e96068c0189beaeb716f3fe -https://conda.anaconda.org/conda-forge/linux-64/fonttools-4.46.0-py311h459d7ec_0.conda#a14114f70e23f7fd5ab9941fec45b095 +https://conda.anaconda.org/conda-forge/linux-64/coverage-7.4.0-py311h459d7ec_0.conda#bbaf0376ed2f153a90f167ad908da3d0 +https://conda.anaconda.org/conda-forge/linux-64/fonttools-4.47.0-py311h459d7ec_0.conda#f7ec87c448f714f53519fe9c87ba1747 https://conda.anaconda.org/conda-forge/linux-64/glib-2.78.3-hfc55251_0.conda#e08e51acc7d1ae8dbe13255e7b4c64ac https://conda.anaconda.org/conda-forge/noarch/joblib-1.3.2-pyhd8ed1ab_0.conda#4da50d410f553db77e62ab62ffaa1abc https://conda.anaconda.org/conda-forge/linux-64/libclang-15.0.7-default_hb11cfb5_4.conda#c90f4cbb57839c98fef8f830e4b9972f @@ -176,19 +177,19 @@ https://conda.anaconda.org/conda-forge/linux-64/libxkbcommon-1.6.0-h5d7e998_0.co https://conda.anaconda.org/conda-forge/linux-64/mkl-2022.2.1-h84fe81f_16997.conda#a7ce56d5757f5b57e7daabe703ade5bb https://conda.anaconda.org/conda-forge/linux-64/pillow-10.1.0-py311ha6c5da5_0.conda#83a988daf5c49e57f7d2086fb6781fe8 
https://conda.anaconda.org/conda-forge/linux-64/pulseaudio-client-16.1-hb77b528_5.conda#ac902ff3c1c6d750dd0dfc93a974ab74 -https://conda.anaconda.org/conda-forge/noarch/pytest-7.4.3-pyhd8ed1ab_0.conda#5bdca0aca30b0ee62bb84854e027eae0 +https://conda.anaconda.org/conda-forge/noarch/pytest-7.4.4-pyhd8ed1ab_0.conda#a9d145de8c5f064b5fa68fb34725d9f4 https://conda.anaconda.org/conda-forge/noarch/python-dateutil-2.8.2-pyhd8ed1ab_0.tar.bz2#dd999d1cc9f79e67dbb855c8924c7984 https://conda.anaconda.org/conda-forge/linux-64/sip-6.7.12-py311hb755f60_0.conda#02336abab4cb5dd794010ef53c54bd09 https://conda.anaconda.org/conda-forge/linux-64/aws-c-s3-0.3.14-hf3aad02_1.conda#a968ffa7e9fe0c257628033d393e512f https://conda.anaconda.org/conda-forge/linux-64/blas-1.0-mkl.tar.bz2#349aef876b1d8c9dccae01de20d5b385 -https://conda.anaconda.org/conda-forge/linux-64/gstreamer-1.22.7-h98fc4e7_1.conda#a8d71f6705ed1f70d7099a6bd1c078ac +https://conda.anaconda.org/conda-forge/linux-64/gstreamer-1.22.8-h98fc4e7_0.conda#a068fe1588dda3d29f568d536eeebae7 https://conda.anaconda.org/conda-forge/linux-64/harfbuzz-8.3.0-h3d44ed6_0.conda#5a6f6c00ef982a9bc83558d9ac8f64a0 https://conda.anaconda.org/conda-forge/linux-64/libblas-3.9.0-16_linux64_mkl.tar.bz2#85f61af03fd291dae33150ffe89dc09a https://conda.anaconda.org/conda-forge/linux-64/pyqt5-sip-12.12.2-py311hb755f60_5.conda#e4d262cc3600e70b505a6761d29f6207 https://conda.anaconda.org/conda-forge/noarch/pytest-cov-4.1.0-pyhd8ed1ab_0.conda#06eb685a3a0b146347a58dda979485da https://conda.anaconda.org/conda-forge/noarch/pytest-forked-1.6.0-pyhd8ed1ab_0.conda#a46947638b6e005b63d2d6271da529b0 https://conda.anaconda.org/conda-forge/linux-64/aws-crt-cpp-0.21.0-hb942446_5.conda#07d92ed5403ad7b5c66ffd7d5b8f7e57 -https://conda.anaconda.org/conda-forge/linux-64/gst-plugins-base-1.22.7-h8e1006c_1.conda#89cd9374d5fc7371db238e4ef5c5f258 +https://conda.anaconda.org/conda-forge/linux-64/gst-plugins-base-1.22.8-h8e1006c_0.conda#307cf29b6c19238c17182f30ddaf1a50 https://conda.anaconda.org/conda-forge/linux-64/libcblas-3.9.0-16_linux64_mkl.tar.bz2#361bf757b95488de76c4f123805742d3 https://conda.anaconda.org/conda-forge/linux-64/liblapack-3.9.0-16_linux64_mkl.tar.bz2#a2f166748917d6d6e4707841ca1f519e https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-2.5.0-pyhd8ed1ab_0.tar.bz2#1fdd1f3baccf0deb647385c677a1a48e @@ -198,7 +199,7 @@ https://conda.anaconda.org/conda-forge/linux-64/qt-main-5.15.8-h82b777d_17.conda https://conda.anaconda.org/conda-forge/linux-64/contourpy-1.2.0-py311h9547e67_0.conda#40828c5b36ef52433e21f89943e09f33 https://conda.anaconda.org/conda-forge/linux-64/libarrow-12.0.1-hb87d912_8_cpu.conda#3f3b11398fe79b578e3c44dd00a44e4a https://conda.anaconda.org/conda-forge/linux-64/pandas-2.1.4-py311h320fe9a_0.conda#e44ccb61b6621bf3f8053ae66eba7397 -https://conda.anaconda.org/conda-forge/linux-64/polars-0.20.0-py311hf926cbc_0.conda#fe3a6de20a7326780b494372ca3f4ec2 +https://conda.anaconda.org/conda-forge/linux-64/polars-0.20.2-py311hf926cbc_0.conda#18f12d27741769ae5432dacce21acc93 https://conda.anaconda.org/conda-forge/linux-64/pyqt-5.15.9-py311hf0fb5b6_5.conda#ec7e45bc76d9d0b69a74a2075932b8e8 https://conda.anaconda.org/conda-forge/linux-64/pytorch-1.13.1-cpu_py311h410fd25_1.conda#ddd2fadddf89e3dc3d541a2537fce010 https://conda.anaconda.org/conda-forge/linux-64/scipy-1.11.4-py311h64a7726_0.conda#9ac5334f1b5ed072d3dbc342503d7868 diff --git a/build_tools/azure/pylatest_conda_forge_mkl_no_coverage_linux-64_conda.lock b/build_tools/azure/pylatest_conda_forge_mkl_no_coverage_linux-64_conda.lock index 
605257904f571..1f4ef37ac52c2 100644 --- a/build_tools/azure/pylatest_conda_forge_mkl_no_coverage_linux-64_conda.lock +++ b/build_tools/azure/pylatest_conda_forge_mkl_no_coverage_linux-64_conda.lock @@ -1,6 +1,6 @@ # Generated by conda-lock. # platform: linux-64 -# input_hash: 223cf367742008b437f38ff4642c0e70494f665cf9434d4da5c6483c757397fd +# input_hash: 66cbc7b263fbf4db3cc89cc53f522739390cbf324ab81cff43bff8bd3630c49d @EXPLICIT https://conda.anaconda.org/conda-forge/linux-64/_libgcc_mutex-0.1-conda_forge.tar.bz2#d7c89558ba9fa0495403155b64376d81 https://conda.anaconda.org/conda-forge/linux-64/ca-certificates-2023.11.17-hbcca054_0.conda#01ffc8d36f9eba0ce0b3c1955fa780ee @@ -12,7 +12,7 @@ https://conda.anaconda.org/conda-forge/linux-64/ld_impl_linux-64-2.40-h41732ed_0 https://conda.anaconda.org/conda-forge/linux-64/libstdcxx-ng-13.2.0-h7e041cc_3.conda#937eaed008f6bf2191c5fe76f87755e9 https://conda.anaconda.org/conda-forge/linux-64/mkl-include-2023.2.0-h84fe81f_50496.conda#7af9fd0b2d7219f4a4200a34561340f6 https://conda.anaconda.org/conda-forge/linux-64/python_abi-3.12-4_cp312.conda#dccc2d142812964fcc6abdc97b672dff -https://conda.anaconda.org/conda-forge/noarch/tzdata-2023c-h71feb2d_0.conda#939e3e74d8be4dac89ce83b20de2492a +https://conda.anaconda.org/conda-forge/noarch/tzdata-2023d-h0c530f3_0.conda#8dee24b8be2d9ff81e7bd4d7d97ff1b0 https://conda.anaconda.org/conda-forge/noarch/fonts-conda-forge-1-0.tar.bz2#f766549260d6815b0c52253f1fb1bb29 https://conda.anaconda.org/conda-forge/noarch/fonts-conda-ecosystem-1-0.tar.bz2#fee5683a3f04bd15cbd8318b096a27ab https://conda.anaconda.org/conda-forge/linux-64/_openmp_mutex-4.5-2_kmp_llvm.tar.bz2#562b26ba2e19059551a811e72ab7f793 @@ -38,6 +38,7 @@ https://conda.anaconda.org/conda-forge/linux-64/libogg-1.3.4-h7f98852_1.tar.bz2# https://conda.anaconda.org/conda-forge/linux-64/libopus-1.3.1-h7f98852_1.tar.bz2#15345e56d527b330e1cacbdf58676e8f https://conda.anaconda.org/conda-forge/linux-64/libuuid-2.38.1-h0b41bf4_0.conda#40b61aab5c7ba9ff276c41cfffe6b80b https://conda.anaconda.org/conda-forge/linux-64/libwebp-base-1.3.2-hd590300_0.conda#30de3fd9b3b602f7473f30e684eeea8c +https://conda.anaconda.org/conda-forge/linux-64/libxcrypt-4.4.36-hd590300_1.conda#5aa797f8787fe7a17d1b0821485b5adc https://conda.anaconda.org/conda-forge/linux-64/libzlib-1.2.13-hd590300_5.conda#f36c115f1ee199da648e0597ec2047ad https://conda.anaconda.org/conda-forge/linux-64/lz4-c-1.9.4-hcb278e6_0.conda#318b08df404f9c9be5712aaa5a6f0bb0 https://conda.anaconda.org/conda-forge/linux-64/mpg123-1.32.3-h59595ed_0.conda#bdadff838d5437aea83607ced8b37f75 @@ -89,7 +90,7 @@ https://conda.anaconda.org/conda-forge/linux-64/libtiff-4.6.0-ha9c0a0a_2.conda#5 https://conda.anaconda.org/conda-forge/linux-64/llvm-openmp-17.0.6-h4dfa4b3_0.conda#c1665f9c1c9f6c93d8b4e492a6a39056 https://conda.anaconda.org/conda-forge/linux-64/mysql-libs-8.0.33-hca2cd23_6.conda#e87530d1b12dd7f4e0f856dc07358d60 https://conda.anaconda.org/conda-forge/linux-64/nss-3.96-h1d7d5a4_0.conda#1c8f8b8eb041ecd54053fc4b6ad57957 -https://conda.anaconda.org/conda-forge/linux-64/python-3.12.0-hab00c5b_0_cpython.conda#7f97faab5bebcc2580f4f299285323da +https://conda.anaconda.org/conda-forge/linux-64/python-3.12.1-hab00c5b_1_cpython.conda#0bab699354cbd66959550eb9b9866620 https://conda.anaconda.org/conda-forge/linux-64/xcb-util-0.4.0-hd590300_1.conda#9bfac7ccd94d54fd21a0501296d60424 https://conda.anaconda.org/conda-forge/linux-64/xcb-util-keysyms-0.4.0-h8ee46fc_1.conda#632413adcd8bc16b515cab87a2932913 
https://conda.anaconda.org/conda-forge/linux-64/xcb-util-renderutil-0.3.9-hd590300_1.conda#e995b155d938b6779da6ace6c6b13816 @@ -100,7 +101,7 @@ https://conda.anaconda.org/conda-forge/linux-64/ccache-4.8.1-h1fcd64f_0.conda#fd https://conda.anaconda.org/conda-forge/noarch/certifi-2023.11.17-pyhd8ed1ab_0.conda#2011bcf45376341dd1d690263fdbc789 https://conda.anaconda.org/conda-forge/noarch/colorama-0.4.6-pyhd8ed1ab_0.tar.bz2#3faab06a954c2a04039983f2c4a50d99 https://conda.anaconda.org/conda-forge/noarch/cycler-0.12.1-pyhd8ed1ab_0.conda#5cd86562580f274031ede6aa6aa24441 -https://conda.anaconda.org/conda-forge/linux-64/cython-3.0.6-py312h30efb56_0.conda#d677efb974cd83fbc0c3d2fe3b6770ab +https://conda.anaconda.org/conda-forge/linux-64/cython-3.0.7-py312h30efb56_0.conda#2b97b8193bd02c72ebd57c5bf88a0457 https://conda.anaconda.org/conda-forge/linux-64/dbus-1.13.6-h5008d03_3.tar.bz2#ecfff944ba3960ecb334b9a2663d708d https://conda.anaconda.org/conda-forge/noarch/exceptiongroup-1.2.0-pyhd8ed1ab_0.conda#f6c211fee3c98229652b60a9a42ef363 https://conda.anaconda.org/conda-forge/noarch/execnet-2.0.2-pyhd8ed1ab_0.conda#67de0d8241e1060a479e3c37793e26f9 @@ -121,7 +122,7 @@ https://conda.anaconda.org/conda-forge/noarch/pluggy-1.3.0-pyhd8ed1ab_0.conda#23 https://conda.anaconda.org/conda-forge/noarch/ply-3.11-py_1.tar.bz2#7205635cd71531943440fbfe3b6b5727 https://conda.anaconda.org/conda-forge/noarch/py-1.11.0-pyh6c4a22f_0.tar.bz2#b4613d7e7a493916d867842a6a148054 https://conda.anaconda.org/conda-forge/noarch/pyparsing-3.1.1-pyhd8ed1ab_0.conda#176f7d56f0cfe9008bdf1bccd7de02fb -https://conda.anaconda.org/conda-forge/noarch/python-tzdata-2023.3-pyhd8ed1ab_0.conda#2590495f608a63625e165915fb4e2e34 +https://conda.anaconda.org/conda-forge/noarch/python-tzdata-2023.4-pyhd8ed1ab_0.conda#c79cacf8a06a51552fc651652f170208 https://conda.anaconda.org/conda-forge/noarch/pytz-2023.3.post1-pyhd8ed1ab_0.conda#c93346b446cd08c169d843ae5fc0da97 https://conda.anaconda.org/conda-forge/noarch/setuptools-68.2.2-pyhd8ed1ab_0.conda#fc2166155db840c634a1291a5c35a709 https://conda.anaconda.org/conda-forge/noarch/six-1.16.0-pyh6c4a22f_0.tar.bz2#e5f25f8dbc060e9a8d912e432202afc2 @@ -134,7 +135,7 @@ https://conda.anaconda.org/conda-forge/linux-64/xkeyboard-config-2.40-hd590300_0 https://conda.anaconda.org/conda-forge/linux-64/xorg-libxext-1.3.4-h0b41bf4_2.conda#82b6df12252e6f32402b96dacc656fec https://conda.anaconda.org/conda-forge/linux-64/xorg-libxrender-0.9.11-hd590300_0.conda#ed67c36f215b310412b2af935bf3e530 https://conda.anaconda.org/conda-forge/linux-64/cairo-1.18.0-h3faef2a_0.conda#f907bb958910dc404647326ca80c263e -https://conda.anaconda.org/conda-forge/linux-64/fonttools-4.46.0-py312h98912ed_0.conda#2b76aa1ec66928a4295235c29ae9d978 +https://conda.anaconda.org/conda-forge/linux-64/fonttools-4.47.0-py312h98912ed_0.conda#37998571aee0938fff9047691bda0b26 https://conda.anaconda.org/conda-forge/linux-64/glib-2.78.3-hfc55251_0.conda#e08e51acc7d1ae8dbe13255e7b4c64ac https://conda.anaconda.org/conda-forge/noarch/joblib-1.3.2-pyhd8ed1ab_0.conda#4da50d410f553db77e62ab62ffaa1abc https://conda.anaconda.org/conda-forge/linux-64/libblas-3.9.0-20_linux64_mkl.conda#8bf521f6007b0b0eb91515a1165b5d85 @@ -143,16 +144,16 @@ https://conda.anaconda.org/conda-forge/linux-64/libxkbcommon-1.6.0-hd429924_1.co https://conda.anaconda.org/conda-forge/linux-64/mkl-devel-2023.2.0-ha770c72_50496.conda#3b4c50e31ff098b18a450e4f5f860adf https://conda.anaconda.org/conda-forge/linux-64/pillow-10.1.0-py312hf3581a9_0.conda#c04d3de9d831a69a5fdfab1413ec2fb6 
https://conda.anaconda.org/conda-forge/linux-64/pulseaudio-client-16.1-hb77b528_5.conda#ac902ff3c1c6d750dd0dfc93a974ab74 -https://conda.anaconda.org/conda-forge/noarch/pytest-7.4.3-pyhd8ed1ab_0.conda#5bdca0aca30b0ee62bb84854e027eae0 +https://conda.anaconda.org/conda-forge/noarch/pytest-7.4.4-pyhd8ed1ab_0.conda#a9d145de8c5f064b5fa68fb34725d9f4 https://conda.anaconda.org/conda-forge/noarch/python-dateutil-2.8.2-pyhd8ed1ab_0.tar.bz2#dd999d1cc9f79e67dbb855c8924c7984 https://conda.anaconda.org/conda-forge/linux-64/sip-6.7.12-py312h30efb56_0.conda#32633871002ee9902f747d2236e0d122 -https://conda.anaconda.org/conda-forge/linux-64/gstreamer-1.22.7-h98fc4e7_1.conda#a8d71f6705ed1f70d7099a6bd1c078ac +https://conda.anaconda.org/conda-forge/linux-64/gstreamer-1.22.8-h98fc4e7_0.conda#a068fe1588dda3d29f568d536eeebae7 https://conda.anaconda.org/conda-forge/linux-64/harfbuzz-8.3.0-h3d44ed6_0.conda#5a6f6c00ef982a9bc83558d9ac8f64a0 https://conda.anaconda.org/conda-forge/linux-64/libcblas-3.9.0-20_linux64_mkl.conda#7a2972758a03adc92d856072c71c9170 https://conda.anaconda.org/conda-forge/linux-64/liblapack-3.9.0-20_linux64_mkl.conda#4db0cd03efcdab535f6f066aca4cddbb https://conda.anaconda.org/conda-forge/linux-64/pyqt5-sip-12.12.2-py312h30efb56_5.conda#8a2a122dc4fe14d8cff38f1cf426381f https://conda.anaconda.org/conda-forge/noarch/pytest-forked-1.6.0-pyhd8ed1ab_0.conda#a46947638b6e005b63d2d6271da529b0 -https://conda.anaconda.org/conda-forge/linux-64/gst-plugins-base-1.22.7-h8e1006c_1.conda#89cd9374d5fc7371db238e4ef5c5f258 +https://conda.anaconda.org/conda-forge/linux-64/gst-plugins-base-1.22.8-h8e1006c_0.conda#307cf29b6c19238c17182f30ddaf1a50 https://conda.anaconda.org/conda-forge/linux-64/liblapacke-3.9.0-20_linux64_mkl.conda#3dea5e9be386b963d7f4368966e238b3 https://conda.anaconda.org/conda-forge/linux-64/numpy-1.26.2-py312heda63a1_0.conda#6d7b0ae4472449b7893345c015f486d3 https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-2.5.0-pyhd8ed1ab_0.tar.bz2#1fdd1f3baccf0deb647385c677a1a48e diff --git a/build_tools/azure/pylatest_conda_forge_mkl_osx-64_conda.lock b/build_tools/azure/pylatest_conda_forge_mkl_osx-64_conda.lock index 01105897892fc..3f1ea3d25b2ce 100644 --- a/build_tools/azure/pylatest_conda_forge_mkl_osx-64_conda.lock +++ b/build_tools/azure/pylatest_conda_forge_mkl_osx-64_conda.lock @@ -1,6 +1,6 @@ # Generated by conda-lock. 
# platform: osx-64 -# input_hash: 02abef27514db5e5119c3cdc253e84a06374c1b308495298b46bdb14dcc52ae9 +# input_hash: 1c061d421872c406aaefcd63aa475f5decae7806dd07d710dc5d742da72de61a @EXPLICIT https://conda.anaconda.org/conda-forge/osx-64/bzip2-1.0.8-h10d778d_5.conda#6097a6ca9ada32699b5fc4312dd6ef18 https://conda.anaconda.org/conda-forge/osx-64/ca-certificates-2023.11.17-h8857fd0_0.conda#c687e9d14c49e3d3946d50a413cdbf16 @@ -20,7 +20,7 @@ https://conda.anaconda.org/conda-forge/osx-64/mkl-include-2023.2.0-h6bab518_5050 https://conda.anaconda.org/conda-forge/osx-64/pthread-stubs-0.4-hc929b4f_1001.tar.bz2#addd19059de62181cd11ae8f4ef26084 https://conda.anaconda.org/conda-forge/osx-64/python_abi-3.12-4_cp312.conda#87201ac4314b911b74197e588cca3639 https://conda.anaconda.org/conda-forge/osx-64/tbb-2021.10.0-h1c7c39f_2.conda#73434bcf87082942e938352afae9b0fa -https://conda.anaconda.org/conda-forge/noarch/tzdata-2023c-h71feb2d_0.conda#939e3e74d8be4dac89ce83b20de2492a +https://conda.anaconda.org/conda-forge/noarch/tzdata-2023d-h0c530f3_0.conda#8dee24b8be2d9ff81e7bd4d7d97ff1b0 https://conda.anaconda.org/conda-forge/osx-64/xorg-libxau-1.0.11-h0dc2134_0.conda#9566b4c29274125b0266d0177b5eb97b https://conda.anaconda.org/conda-forge/osx-64/xorg-libxdmcp-1.1.3-h35c211d_0.tar.bz2#86ac76d6bf1cbb9621943eb3bd9ae36e https://conda.anaconda.org/conda-forge/osx-64/xz-5.2.6-h775f41a_0.tar.bz2#a72f9d4ea13d55d745ff1ed594747f10 @@ -61,14 +61,14 @@ https://conda.anaconda.org/conda-forge/osx-64/liblapack-3.9.0-20_osx64_mkl.conda https://conda.anaconda.org/conda-forge/osx-64/llvm-tools-16.0.6-hbedff68_3.conda#e9356b0807462e8f84c1384a8da539a5 https://conda.anaconda.org/conda-forge/osx-64/mpc-1.3.1-h81bd1dd_0.conda#c752c0eb6c250919559172c011e5f65b https://conda.anaconda.org/conda-forge/osx-64/openjpeg-2.5.0-ha4da562_3.conda#40a36f8e9a6fdf6a78c6428ee6c44188 -https://conda.anaconda.org/conda-forge/osx-64/python-3.12.0-h30d4d87_0_cpython.conda#d11dc8f4551011fb6baa2865f1ead48f +https://conda.anaconda.org/conda-forge/osx-64/python-3.12.1-h9f0c242_1_cpython.conda#41d5549764b9f37199e6255e5e9daee6 https://conda.anaconda.org/conda-forge/osx-64/ccache-4.8.1-h28e096f_0.conda#dcc8cc97fdab7a5fad9e1a6bbad9ed0e https://conda.anaconda.org/conda-forge/osx-64/cctools_osx-64-973.0.1-ha1c5b94_15.conda#c9dbe505cd17a5a4a6a787dbceea2dba https://conda.anaconda.org/conda-forge/noarch/certifi-2023.11.17-pyhd8ed1ab_0.conda#2011bcf45376341dd1d690263fdbc789 https://conda.anaconda.org/conda-forge/osx-64/clang-16-16.0.6-default_h6b1ee41_3.conda#07654411a331ea916e6f93ae0d8363b7 https://conda.anaconda.org/conda-forge/noarch/colorama-0.4.6-pyhd8ed1ab_0.tar.bz2#3faab06a954c2a04039983f2c4a50d99 https://conda.anaconda.org/conda-forge/noarch/cycler-0.12.1-pyhd8ed1ab_0.conda#5cd86562580f274031ede6aa6aa24441 -https://conda.anaconda.org/conda-forge/osx-64/cython-3.0.6-py312h444b7ae_0.conda#3a38f4e03fe33a698b5bf5f56e63256c +https://conda.anaconda.org/conda-forge/osx-64/cython-3.0.7-py312hede676d_0.conda#89a76a23df8d704d26a3f27e0a1c372d https://conda.anaconda.org/conda-forge/noarch/exceptiongroup-1.2.0-pyhd8ed1ab_0.conda#f6c211fee3c98229652b60a9a42ef363 https://conda.anaconda.org/conda-forge/noarch/execnet-2.0.2-pyhd8ed1ab_0.conda#67de0d8241e1060a479e3c37793e26f9 https://conda.anaconda.org/conda-forge/osx-64/gfortran_impl_osx-64-12.3.0-h54fd467_1.conda#5f4d40236e204c6e62cd0a316244f316 @@ -83,7 +83,7 @@ https://conda.anaconda.org/conda-forge/osx-64/pillow-10.1.0-py312h0c70c2f_0.cond 
https://conda.anaconda.org/conda-forge/noarch/pluggy-1.3.0-pyhd8ed1ab_0.conda#2390bd10bed1f3fdc7a537fb5a447d8d https://conda.anaconda.org/conda-forge/noarch/py-1.11.0-pyh6c4a22f_0.tar.bz2#b4613d7e7a493916d867842a6a148054 https://conda.anaconda.org/conda-forge/noarch/pyparsing-3.1.1-pyhd8ed1ab_0.conda#176f7d56f0cfe9008bdf1bccd7de02fb -https://conda.anaconda.org/conda-forge/noarch/python-tzdata-2023.3-pyhd8ed1ab_0.conda#2590495f608a63625e165915fb4e2e34 +https://conda.anaconda.org/conda-forge/noarch/python-tzdata-2023.4-pyhd8ed1ab_0.conda#c79cacf8a06a51552fc651652f170208 https://conda.anaconda.org/conda-forge/noarch/pytz-2023.3.post1-pyhd8ed1ab_0.conda#c93346b446cd08c169d843ae5fc0da97 https://conda.anaconda.org/conda-forge/noarch/setuptools-68.2.2-pyhd8ed1ab_0.conda#fc2166155db840c634a1291a5c35a709 https://conda.anaconda.org/conda-forge/noarch/six-1.16.0-pyh6c4a22f_0.tar.bz2#e5f25f8dbc060e9a8d912e432202afc2 @@ -95,10 +95,10 @@ https://conda.anaconda.org/conda-forge/osx-64/blas-devel-3.9.0-20_osx64_mkl.cond https://conda.anaconda.org/conda-forge/osx-64/cctools-973.0.1-h40f6528_15.conda#bc85aa6ab5eea61c47f39015dbe34a88 https://conda.anaconda.org/conda-forge/osx-64/clang-16.0.6-hac416ee_3.conda#b143a7f213c0d25ced055089a2baef46 https://conda.anaconda.org/conda-forge/osx-64/contourpy-1.2.0-py312hbf0bb39_0.conda#74190e06053cda7139a0cb71f3e618fd -https://conda.anaconda.org/conda-forge/osx-64/coverage-7.3.3-py312h41838bb_0.conda#8d722deb062da474f20bd37f5518d920 -https://conda.anaconda.org/conda-forge/osx-64/fonttools-4.46.0-py312h41838bb_0.conda#d5cc686fe3a5971312ac3ff9fd4f1557 +https://conda.anaconda.org/conda-forge/osx-64/coverage-7.4.0-py312h41838bb_0.conda#8fdd619940b64e33b0702cb46d701f6e +https://conda.anaconda.org/conda-forge/osx-64/fonttools-4.47.0-py312h41838bb_0.conda#73605f0b5026ee8445b68fceafb53941 https://conda.anaconda.org/conda-forge/noarch/joblib-1.3.2-pyhd8ed1ab_0.conda#4da50d410f553db77e62ab62ffaa1abc -https://conda.anaconda.org/conda-forge/noarch/pytest-7.4.3-pyhd8ed1ab_0.conda#5bdca0aca30b0ee62bb84854e027eae0 +https://conda.anaconda.org/conda-forge/noarch/pytest-7.4.4-pyhd8ed1ab_0.conda#a9d145de8c5f064b5fa68fb34725d9f4 https://conda.anaconda.org/conda-forge/noarch/python-dateutil-2.8.2-pyhd8ed1ab_0.tar.bz2#dd999d1cc9f79e67dbb855c8924c7984 https://conda.anaconda.org/conda-forge/osx-64/scipy-1.11.4-py312heccc6a5_0.conda#b7b422b49ae2e5c8276bffd05f3ba63c https://conda.anaconda.org/conda-forge/osx-64/blas-2.120-mkl.conda#b041a7677a412f3d925d8208936cb1e2 diff --git a/build_tools/azure/pylatest_conda_mkl_no_openmp_osx-64_conda.lock b/build_tools/azure/pylatest_conda_mkl_no_openmp_osx-64_conda.lock index fb19597cc1392..a89638ebbdd83 100644 --- a/build_tools/azure/pylatest_conda_mkl_no_openmp_osx-64_conda.lock +++ b/build_tools/azure/pylatest_conda_mkl_no_openmp_osx-64_conda.lock @@ -1,10 +1,10 @@ # Generated by conda-lock. 
# platform: osx-64 -# input_hash: 03f7604aefb9752d2367c457bdf4e4923158be96db35ac0dd1d5dc60a9981cd1 +# input_hash: c8fdd08f1a9a3d91ec09f211e4444ef33921a111f684fa63428591be5ca1eb68 @EXPLICIT https://repo.anaconda.com/pkgs/main/osx-64/blas-1.0-mkl.conda#cb2c87e85ac8e0ceae776d26d4214c8a https://repo.anaconda.com/pkgs/main/osx-64/bzip2-1.0.8-h1de35cc_0.conda#19fcb113b170fe2a0be96b47801fed7d -https://repo.anaconda.com/pkgs/main/osx-64/ca-certificates-2023.08.22-hecd8cb5_0.conda#62e40f0ed4b9adcf54eb2da76acbaf63 +https://repo.anaconda.com/pkgs/main/osx-64/ca-certificates-2023.12.12-hecd8cb5_0.conda#1f885715539fba0c408ab58d1bda6c8e https://repo.anaconda.com/pkgs/main/osx-64/giflib-5.2.1-h6c40b1e_3.conda#a5ab49bdb6fdc875fb965221241e3bcf https://repo.anaconda.com/pkgs/main/osx-64/jpeg-9e-h6c40b1e_1.conda#fc3e61fa41309946c9283fe8737d7f41 https://repo.anaconda.com/pkgs/main/osx-64/libbrotlicommon-1.0.9-hca72f7f_7.conda#6c865b9e76fa2fad0c8ac32aa0f01f75 @@ -40,7 +40,7 @@ https://repo.anaconda.com/pkgs/main/osx-64/libtiff-4.5.1-hcec6c5f_0.conda#e127a8 https://repo.anaconda.com/pkgs/main/osx-64/python-3.11.5-hf27a42d_0.conda#f088169d190325a14aaa0dcb53a9864f https://repo.anaconda.com/pkgs/main/osx-64/coverage-7.2.2-py311h6c40b1e_0.conda#e15605553450156cf75c3ae38a920475 https://repo.anaconda.com/pkgs/main/noarch/cycler-0.11.0-pyhd3eb1b0_0.conda#f5e365d2cdb66d547eb8c3ab93843aab -https://repo.anaconda.com/pkgs/main/osx-64/cython-3.0.0-py311h6c40b1e_0.conda#f1831f4c643b4653ecb777477763f9cc +https://repo.anaconda.com/pkgs/main/osx-64/cython-3.0.6-py311h6c40b1e_0.conda#6c8a140209eb4814de054f52627f543c https://repo.anaconda.com/pkgs/main/noarch/execnet-1.9.0-pyhd3eb1b0_0.conda#f895937671af67cebb8af617494b3513 https://repo.anaconda.com/pkgs/main/noarch/iniconfig-1.1.1-pyhd3eb1b0_0.tar.bz2#e40edff2c5708f342cef43c7f280c507 https://repo.anaconda.com/pkgs/main/osx-64/joblib-1.2.0-py311hecd8cb5_0.conda#af8c1fcd4e8e0c6fa2a4f4ecda261dc9 diff --git a/build_tools/azure/pylatest_pip_openblas_pandas_linux-64_conda.lock b/build_tools/azure/pylatest_pip_openblas_pandas_linux-64_conda.lock index 0d0108160f7bb..5a314f7a7df3b 100644 --- a/build_tools/azure/pylatest_pip_openblas_pandas_linux-64_conda.lock +++ b/build_tools/azure/pylatest_pip_openblas_pandas_linux-64_conda.lock @@ -1,9 +1,9 @@ # Generated by conda-lock. 
# platform: linux-64 -# input_hash: d01d23bd27bcd50d2b3643492f966c8e390822d72b69f31bf66c2fe98a265a4c +# input_hash: 51f374bd6034467b82c190398f401712163436d283f9536c2e5a1d07e9f7b1e2 @EXPLICIT https://repo.anaconda.com/pkgs/main/linux-64/_libgcc_mutex-0.1-main.conda#c3473ff8bdb3d124ed5ff11ec380d6f9 -https://repo.anaconda.com/pkgs/main/linux-64/ca-certificates-2023.08.22-h06a4308_0.conda#243d5065a09a3e85ab888c05f5b6445a +https://repo.anaconda.com/pkgs/main/linux-64/ca-certificates-2023.12.12-h06a4308_0.conda#12bf7315c3f5ca50300e8b48d1b4ef2e https://repo.anaconda.com/pkgs/main/linux-64/ld_impl_linux-64-2.38-h1181459_1.conda#68eedfd9c06f2b0e6888d8db345b7f5b https://repo.anaconda.com/pkgs/main/noarch/tzdata-2023c-h04d1e81_0.conda#29db02adf8808f7c64642cead3e28acd https://repo.anaconda.com/pkgs/main/linux-64/libgomp-11.2.0-h1234567_1.conda#b372c0eea9b60732fdae4b817a63c8cd @@ -28,11 +28,11 @@ https://repo.anaconda.com/pkgs/main/linux-64/pip-23.3.1-py39h06a4308_0.conda#685 # pip certifi @ https://files.pythonhosted.org/packages/64/62/428ef076be88fa93716b576e4a01f919d25968913e817077a386fcbe4f42/certifi-2023.11.17-py3-none-any.whl#sha256=e036ab49d5b79556f99cfc2d9320b34cfbe5be05c5871b51de9329f0603b0474 # pip charset-normalizer @ https://files.pythonhosted.org/packages/98/69/5d8751b4b670d623aa7a47bef061d69c279e9f922f6705147983aa76c3ce/charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=b261ccdec7821281dade748d088bb6e9b69e6d15b30652b74cbbac25e280b796 # pip cycler @ https://files.pythonhosted.org/packages/e7/05/c19819d5e3d95294a6f5947fb9b9629efb316b96de511b418c53d245aae6/cycler-0.12.1-py3-none-any.whl#sha256=85cef7cff222d8644161529808465972e51340599459b8ac3ccbac5a854e0d30 -# pip cython @ https://files.pythonhosted.org/packages/4e/0c/c796b64bb889e980a9b066249f65da5105110e4fbaf53885180313012ad3/Cython-3.0.6-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=1074e84752cd0daf3226823ddbc37cca8bc45f61c94a1db2a34e641f2b9b0797 +# pip cython @ https://files.pythonhosted.org/packages/32/63/b947d620e99250ab9b920d3bfdbeab305124e9d39afbe260a85906943e59/Cython-3.0.7-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=b9d0dae6dccd349b8ccf197c10ef2d05c711ca36a649c7eddbab1de2c90b63a1 # pip docutils @ https://files.pythonhosted.org/packages/26/87/f238c0670b94533ac0353a4e2a1a771a0cc73277b88bff23d3ae35a256c1/docutils-0.20.1-py3-none-any.whl#sha256=96f387a2c5562db4476f09f13bbab2192e764cac08ebbf3a34a95d9b1e4a59d6 # pip exceptiongroup @ https://files.pythonhosted.org/packages/b8/9a/5028fd52db10e600f1c4674441b968cf2ea4959085bfb5b99fb1250e5f68/exceptiongroup-1.2.0-py3-none-any.whl#sha256=4bfd3996ac73b41e9b9628b04e079f193850720ea5945fc96a08633c66912f14 # pip execnet @ https://files.pythonhosted.org/packages/e8/9c/a079946da30fac4924d92dbc617e5367d454954494cf1e71567bcc4e00ee/execnet-2.0.2-py3-none-any.whl#sha256=88256416ae766bc9e8895c76a87928c0012183da3cc4fc18016e6f050e025f41 -# pip fonttools @ https://files.pythonhosted.org/packages/ad/94/6cc0d252b4e8e6c61c971a8c50e38229c34a61147a059aafd308d1587b9f/fonttools-4.46.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=d00fc63131dcac6b25f50a5a129758438317e54e3ce5587163f7058de4b0e933 +# pip fonttools @ https://files.pythonhosted.org/packages/55/a7/f08f063c6ff1b2d3abd68cc4a6872143fbc0f99a83cc44b96944ff11f817/fonttools-4.47.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=253bb46bab970e8aae254cebf2ae3db98a4ef6bd034707aa68a239027d2b198d # pip idna @ 
https://files.pythonhosted.org/packages/c2/e7/a82b05cf63a603df6e68d59ae6a68bf5064484a0718ea5033660af4b54a9/idna-3.6-py3-none-any.whl#sha256=c05567e9c24a6b9faaa835c4821bad0590fbb9d5779e7caa6e1cc4978e7eb24f # pip imagesize @ https://files.pythonhosted.org/packages/ff/62/85c4c919272577931d407be5ba5d71c20f0b616d31a0befe0ae45bb79abd/imagesize-1.4.1-py2.py3-none-any.whl#sha256=0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b # pip iniconfig @ https://files.pythonhosted.org/packages/ef/a6/62565a6e1cf69e10f5727360368e451d4b7f58beeac6173dc9db836a5b46/iniconfig-2.0.0-py3-none-any.whl#sha256=b6a85871a79d2e3b22d2d1b94ac2824226a63c6b741c88f7ae975f18b6778374 @@ -43,7 +43,7 @@ https://repo.anaconda.com/pkgs/main/linux-64/pip-23.3.1-py39h06a4308_0.conda#685 # pip networkx @ https://files.pythonhosted.org/packages/d5/f0/8fbc882ca80cf077f1b246c0e3c3465f7f415439bdea6b899f6b19f61f70/networkx-3.2.1-py3-none-any.whl#sha256=f18c69adc97877c42332c170849c96cefa91881c99a7cb3e95b7c659ebdc1ec2 # pip numpy @ https://files.pythonhosted.org/packages/2f/75/f007cc0e6a373207818bef17f463d3305e9dd380a70db0e523e7660bf21f/numpy-1.26.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=baf8aab04a2c0e859da118f0b38617e5ee65d75b83795055fb66c0d5e9e9b818 # pip packaging @ https://files.pythonhosted.org/packages/ec/1a/610693ac4ee14fcdf2d9bf3c493370e4f2ef7ae2e19217d7a237ff42367d/packaging-23.2-py3-none-any.whl#sha256=8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7 -# pip pillow @ https://files.pythonhosted.org/packages/5c/dc/acccca38a87272cb2eed372f112595439418dfb6119770b04dc06d3b78bd/Pillow-10.1.0-cp39-cp39-manylinux_2_28_x86_64.whl#sha256=b4005fee46ed9be0b8fb42be0c20e79411533d1fd58edabebc0dd24626882cfd +# pip pillow @ https://files.pythonhosted.org/packages/87/0d/8f5136a5481731c342a901ff155c587ce7804114db069345e1894ab4978a/pillow-10.2.0-cp39-cp39-manylinux_2_28_x86_64.whl#sha256=b6f491cdf80ae540738859d9766783e3b3c8e5bd37f5dfa0b76abdecc5081f13 # pip pluggy @ https://files.pythonhosted.org/packages/05/b8/42ed91898d4784546c5f06c60506400548db3f7a4b3fb441cba4e5c17952/pluggy-1.3.0-py3-none-any.whl#sha256=d89c696a773f8bd377d18e5ecda92b7a3793cbe66c87060a6fb58c7b6e1061f7 # pip py @ https://files.pythonhosted.org/packages/f6/f0/10642828a8dfb741e5f3fbaac830550a518a775c7fff6f04a007259b0548/py-1.11.0-py2.py3-none-any.whl#sha256=607c53218732647dff4acdfcd50cb62615cedf612e72d1724fb1a0cc6405b378 # pip pygments @ https://files.pythonhosted.org/packages/97/9c/372fef8377a6e340b1704768d20daaded98bf13282b5327beb2e2fe2c7ef/pygments-2.17.2-py3-none-any.whl#sha256=b27c2826c47d0f3219f29554824c30c5e8945175d888647acd804ddd04af846c @@ -55,21 +55,21 @@ https://repo.anaconda.com/pkgs/main/linux-64/pip-23.3.1-py39h06a4308_0.conda#685 # pip tabulate @ https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl#sha256=024ca478df22e9340661486f85298cff5f6dcdba14f3813e8830015b9ed1948f # pip threadpoolctl @ https://files.pythonhosted.org/packages/81/12/fd4dea011af9d69e1cad05c75f3f7202cdcbeac9b712eea58ca779a72865/threadpoolctl-3.2.0-py3-none-any.whl#sha256=2b7818516e423bdaebb97c723f86a7c6b0a83d3f3b0970328d66f4d9104dc032 # pip tomli @ https://files.pythonhosted.org/packages/97/75/10a9ebee3fd790d20926a90a2547f0bf78f371b2f13aa822c759680ca7b9/tomli-2.0.1-py3-none-any.whl#sha256=939de3e7a6161af0c887ef91b7d41a53e7c5a1ca976325f429cb46ea9bc30ecc -# pip tzdata @ 
https://files.pythonhosted.org/packages/d5/fb/a79efcab32b8a1f1ddca7f35109a50e4a80d42ac1c9187ab46522b2407d7/tzdata-2023.3-py2.py3-none-any.whl#sha256=7e65763eef3120314099b6939b5546db7adce1e7d6f2e179e3df563c70511eda +# pip tzdata @ https://files.pythonhosted.org/packages/a3/fb/52b62131e21b24ee297e4e95ed41eba29647dad0e0051a92bb66b43c70ff/tzdata-2023.4-py2.py3-none-any.whl#sha256=aa3ace4329eeacda5b7beb7ea08ece826c28d761cda36e747cfbf97996d39bf3 # pip urllib3 @ https://files.pythonhosted.org/packages/96/94/c31f58c7a7f470d5665935262ebd7455c7e4c7782eb525658d3dbf4b9403/urllib3-2.1.0-py3-none-any.whl#sha256=55901e917a5896a349ff771be919f8bd99aff50b79fe58fec595eb37bbc56bb3 # pip zipp @ https://files.pythonhosted.org/packages/d9/66/48866fc6b158c81cc2bfecc04c480f105c6040e8b077bc54c634b4a67926/zipp-3.17.0-py3-none-any.whl#sha256=0e923e726174922dce09c53c59ad483ff7bbb8e572e00c7f7c46b88556409f31 # pip contourpy @ https://files.pythonhosted.org/packages/a9/ba/d8fd1380876f1e9114157606302e3644c85f6d116aeba354c212ee13edc7/contourpy-1.2.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=11f8d2554e52f459918f7b8e6aa20ec2a3bce35ce95c1f0ef4ba36fbda306df5 -# pip coverage @ https://files.pythonhosted.org/packages/49/74/be94444c0458e68334dcd97d520b998cbef61eb71aead22fe105852c601b/coverage-7.3.3-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=f3bfd2c2f0e5384276e12b14882bf2c7621f97c35320c3e7132c156ce18436a1 +# pip coverage @ https://files.pythonhosted.org/packages/dc/9a/825705f435ef469c780045746c725f974ca8b059380df28b6331995a2ae1/coverage-7.4.0-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=bf635a52fc1ea401baf88843ae8708591aa4adff875e5c23220de43b1ccf575c # pip imageio @ https://files.pythonhosted.org/packages/c0/69/3aaa69cb0748e33e644fda114c9abd3186ce369edd4fca11107e9f39c6a7/imageio-2.33.1-py3-none-any.whl#sha256=c5094c48ccf6b2e6da8b4061cd95e1209380afafcbeae4a4e280938cce227e1d -# pip importlib-metadata @ https://files.pythonhosted.org/packages/73/26/9777cfe0cdc8181a32eaf542f4a2a435e5aba5dd38f41cfc0a532dc51027/importlib_metadata-7.0.0-py3-none-any.whl#sha256=d97503976bb81f40a193d41ee6570868479c69d5068651eb039c40d850c59d67 +# pip importlib-metadata @ https://files.pythonhosted.org/packages/c0/8b/d8427f023c081a8303e6ac7209c16e6878f2765d5b59667f3903fbcfd365/importlib_metadata-7.0.1-py3-none-any.whl#sha256=4805911c3a4ec7c3966410053e9ec6a1fecd629117df5adee56dfc9432a1081e # pip importlib-resources @ https://files.pythonhosted.org/packages/93/e8/facde510585869b5ec694e8e0363ffe4eba067cb357a8398a55f6a1f8023/importlib_resources-6.1.1-py3-none-any.whl#sha256=e8bf90d8213b486f428c9c39714b920041cb02c184686a3dee24905aaa8105d6 # pip jinja2 @ https://files.pythonhosted.org/packages/bc/c3/f068337a370801f372f2f8f6bad74a5c140f6fda3d9de154052708dd3c65/Jinja2-3.1.2-py3-none-any.whl#sha256=6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61 -# pip pytest @ https://files.pythonhosted.org/packages/f3/8c/f16efd81ca8e293b2cc78f111190a79ee539d0d5d36ccd49975cb3beac60/pytest-7.4.3-py3-none-any.whl#sha256=0d009c083ea859a71b76adf7c1d502e4bc170b80a8ef002da5806527b9591fac +# pip pytest @ https://files.pythonhosted.org/packages/51/ff/f6e8b8f39e08547faece4bd80f89d5a8de68a38b2d179cc1c4490ffa3286/pytest-7.4.4-py3-none-any.whl#sha256=b090cdf5ed60bf4c45261be03239c2c1c22df034fbffe691abe93cd80cea01d8 # pip python-dateutil @ 
https://files.pythonhosted.org/packages/36/7a/87837f39d0296e723bb9b62bbb257d0355c7f6128853c78955f57342a56d/python_dateutil-2.8.2-py2.py3-none-any.whl#sha256=961d03dc3453ebbc59dbdea9e4e11c5651520a876d0f4db161e8674aae935da9 # pip requests @ https://files.pythonhosted.org/packages/70/8e/0e2d847013cb52cd35b38c009bb167a1a26b2ce6cd6965bf26b47bc0bf44/requests-2.31.0-py3-none-any.whl#sha256=58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f # pip scipy @ https://files.pythonhosted.org/packages/db/86/bf3f01f003224c00dd94d9443d676023ed65d63ea2e34356888dc7fa8f48/scipy-1.11.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=91af76a68eeae0064887a48e25c4e616fa519fa0d38602eda7e0f97d65d57937 # pip tifffile @ https://files.pythonhosted.org/packages/54/a4/569fc717831969cf48bced350bdaf070cdeab06918d179429899e144358d/tifffile-2023.12.9-py3-none-any.whl#sha256=9b066e4b1a900891ea42ffd33dab8ba34c537935618b9893ddef42d7d422692f -# pip lightgbm @ https://files.pythonhosted.org/packages/b8/9d/1ce80cee7c5ef60f2fcc7e9fa97f29f7a8de3dc5a08922b3b2f1e9106481/lightgbm-4.1.0-py3-none-manylinux_2_28_x86_64.whl#sha256=47578cff4bc8116b62adc02437bf2b49dcc7ad4e8e3dd8dad3fe88e694d74d93 +# pip lightgbm @ https://files.pythonhosted.org/packages/a6/11/5171f6a1ecf7f008648fef6ef780d92414763ff5ba50a796657b9275dc1e/lightgbm-4.2.0-py3-none-manylinux_2_28_x86_64.whl#sha256=4a767795253ea5872abc7cc4e0892120af9b48a10e151c03cd62116bc2f099ab # pip matplotlib @ https://files.pythonhosted.org/packages/53/1f/653d60d2ec81a6095fa3e571cf2de57742bab8a51a5c01de26730ce3dc53/matplotlib-3.8.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=5864bdd7da445e4e5e011b199bb67168cdad10b501750367c496420f2ad00843 # pip pandas @ https://files.pythonhosted.org/packages/bc/f8/2aa75ae200bdb9dc6967712f26628a06bf45d3ad94cbbf6fb4962ada15a3/pandas-2.1.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=1ebfd771110b50055712b3b711b51bee5d50135429364d0498e1213a7adc2be8 # pip pyamg @ https://files.pythonhosted.org/packages/35/1c/8b2aa6fbb2bae258ab6cdb35b09635bf50865ac2bcdaf220db3d972cc0d8/pyamg-5.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=1332acec6d5ede9440c8ced0ef20952f5b766387116f254b79880ce29fdecee7 diff --git a/build_tools/azure/pymin_conda_defaults_openblas_linux-64_conda.lock b/build_tools/azure/pymin_conda_defaults_openblas_linux-64_conda.lock index 9c2e4e5ef0b33..7cdaba97d29c6 100644 --- a/build_tools/azure/pymin_conda_defaults_openblas_linux-64_conda.lock +++ b/build_tools/azure/pymin_conda_defaults_openblas_linux-64_conda.lock @@ -1,6 +1,6 @@ # Generated by conda-lock. 
# platform: linux-64 -# input_hash: b4bfe38c127d42c34beb5fbcbb6d7a983e7063f8a6ec415182acb410dfc68d8d +# input_hash: c63ec98efe67f85fd681c6634249719a3658c65049b5eeb017b5f0259990901a @EXPLICIT https://repo.anaconda.com/pkgs/main/linux-64/_libgcc_mutex-0.1-main.conda#c3473ff8bdb3d124ed5ff11ec380d6f9 https://repo.anaconda.com/pkgs/main/linux-64/blas-1.0-openblas.conda#9ddfcaef10d79366c90128f5dc444be8 @@ -34,7 +34,7 @@ https://repo.anaconda.com/pkgs/main/linux-64/zlib-1.2.13-h5eee18b_0.conda#333e31 https://repo.anaconda.com/pkgs/main/linux-64/ccache-3.7.9-hfe4627d_0.conda#bef6fc681c273bb7bd0c67d1a591365e https://repo.anaconda.com/pkgs/main/linux-64/glib-2.69.1-he621ea3_2.conda#51cf1899782b3f3744aedd143fbc07f3 https://repo.anaconda.com/pkgs/main/linux-64/libcups-2.4.2-h2d74bed_1.conda#3f265c2172a9e8c90a74037b6fa13685 -https://repo.anaconda.com/pkgs/main/linux-64/libedit-3.1.20221030-h5eee18b_0.conda#7c724a17739aceaf9d1633ff06962137 +https://repo.anaconda.com/pkgs/main/linux-64/libedit-3.1.20230828-h5eee18b_0.conda#850eb5a9d2d7d3c66cce12e84406ca08 https://repo.anaconda.com/pkgs/main/linux-64/libllvm14-14.0.6-hdb19cb5_3.conda#aefea2b45cf32f12b4f1ffaa70aa3201 https://repo.anaconda.com/pkgs/main/linux-64/libpng-1.6.39-h5eee18b_0.conda#f6aee38184512eb05b06c2e94d39ab22 https://repo.anaconda.com/pkgs/main/linux-64/libxml2-2.10.4-hf1b16e4_1.conda#e87849ce513f9968794f20bba620e6a4 diff --git a/build_tools/azure/pymin_conda_forge_mkl_win-64_conda.lock b/build_tools/azure/pymin_conda_forge_mkl_win-64_conda.lock index 27d314a248b4c..f0f5a1834d75b 100644 --- a/build_tools/azure/pymin_conda_forge_mkl_win-64_conda.lock +++ b/build_tools/azure/pymin_conda_forge_mkl_win-64_conda.lock @@ -1,13 +1,13 @@ # Generated by conda-lock. # platform: win-64 -# input_hash: af544b6135127d0b6abf1eedcc8ba32a4d5e2e1d2904d4592abc7f3dba338569 +# input_hash: 74fe5aa9801e09d66b9a87902cfa12e2e9343f9b8337d0126093f48d00544ab6 @EXPLICIT https://conda.anaconda.org/conda-forge/win-64/ca-certificates-2023.11.17-h56e8100_0.conda#1163114b483f26761f993c709e65271f https://conda.anaconda.org/conda-forge/win-64/intel-openmp-2023.2.0-h57928b3_50497.conda#a401f3cae152deb75bbed766a90a6312 https://conda.anaconda.org/conda-forge/win-64/mkl-include-2023.2.0-h6a75c08_50497.conda#02fd1f15c56cc902aeaf3df3497cf266 https://conda.anaconda.org/conda-forge/win-64/msys2-conda-epoch-20160418-1.tar.bz2#b0309b72560df66f71a9d5e34a5efdfa https://conda.anaconda.org/conda-forge/win-64/python_abi-3.9-4_cp39.conda#948b0d93d4ab1372d8fd45e1560afd47 -https://conda.anaconda.org/conda-forge/noarch/tzdata-2023c-h71feb2d_0.conda#939e3e74d8be4dac89ce83b20de2492a +https://conda.anaconda.org/conda-forge/noarch/tzdata-2023d-h0c530f3_0.conda#8dee24b8be2d9ff81e7bd4d7d97ff1b0 https://conda.anaconda.org/conda-forge/win-64/ucrt-10.0.22621.0-h57928b3_0.tar.bz2#72608f6cd3e5898229c3ea16deb1ac43 https://conda.anaconda.org/conda-forge/win-64/m2w64-gmp-6.1.0-2.tar.bz2#53a1c73e1e3d185516d7e3af177596d9 https://conda.anaconda.org/conda-forge/win-64/m2w64-libwinpthread-git-5.0.0.4634.697f757-2.tar.bz2#774130a326dee16f1ceb05cc687ee4f0 @@ -42,13 +42,13 @@ https://conda.anaconda.org/conda-forge/win-64/libvorbis-1.3.7-h0e60522_0.tar.bz2 https://conda.anaconda.org/conda-forge/win-64/libxml2-2.11.6-hc3477c8_0.conda#08ffbb4c22dd3622e122058368f8b708 https://conda.anaconda.org/conda-forge/win-64/m2w64-gcc-libs-5.3.0-7.tar.bz2#fe759119b8b3bfa720b8762c6fdc35de https://conda.anaconda.org/conda-forge/win-64/pcre2-10.42-h17e33f8_0.conda#59610c61da3af020289a806ec9c6a7fd 
-https://conda.anaconda.org/conda-forge/win-64/python-3.9.18-h4de0772_0_cpython.conda#ab83d6883a06de9c783c9aba765226c9 +https://conda.anaconda.org/conda-forge/win-64/python-3.9.18-h4de0772_1_cpython.conda#c0bc0080c5ec044edae6dbfa97ab337f https://conda.anaconda.org/conda-forge/win-64/zstd-1.5.5-h12be248_0.conda#792bb5da68bf0a6cac6a6072ecb8dbeb https://conda.anaconda.org/conda-forge/win-64/brotli-bin-1.1.0-hcfcfb64_1.conda#0105229d7c5fabaa840043a86c10ec64 https://conda.anaconda.org/conda-forge/noarch/certifi-2023.11.17-pyhd8ed1ab_0.conda#2011bcf45376341dd1d690263fdbc789 https://conda.anaconda.org/conda-forge/noarch/colorama-0.4.6-pyhd8ed1ab_0.tar.bz2#3faab06a954c2a04039983f2c4a50d99 https://conda.anaconda.org/conda-forge/noarch/cycler-0.12.1-pyhd8ed1ab_0.conda#5cd86562580f274031ede6aa6aa24441 -https://conda.anaconda.org/conda-forge/win-64/cython-3.0.6-py39h99910a6_0.conda#eff4ff92d5839706ca82770ffdd12c36 +https://conda.anaconda.org/conda-forge/win-64/cython-3.0.7-py39h99910a6_0.conda#1b2dc7e2a329356c29d63f655c7b0c56 https://conda.anaconda.org/conda-forge/noarch/exceptiongroup-1.2.0-pyhd8ed1ab_0.conda#f6c211fee3c98229652b60a9a42ef363 https://conda.anaconda.org/conda-forge/noarch/execnet-2.0.2-pyhd8ed1ab_0.conda#67de0d8241e1060a479e3c37793e26f9 https://conda.anaconda.org/conda-forge/win-64/freetype-2.12.1-hdaf720e_2.conda#3761b23693f768dc75a8fd0a73ca053f @@ -77,7 +77,7 @@ https://conda.anaconda.org/conda-forge/win-64/xorg-libxau-1.0.11-hcd874cb_0.cond https://conda.anaconda.org/conda-forge/win-64/xorg-libxdmcp-1.1.3-hcd874cb_0.tar.bz2#46878ebb6b9cbd8afcf8088d7ef00ece https://conda.anaconda.org/conda-forge/noarch/zipp-3.17.0-pyhd8ed1ab_0.conda#2e4d6bc0b14e10f895fc6791a7d9b26a https://conda.anaconda.org/conda-forge/win-64/brotli-1.1.0-hcfcfb64_1.conda#f47f6db2528e38321fb00ae31674c133 -https://conda.anaconda.org/conda-forge/win-64/coverage-7.3.3-py39ha55989b_0.conda#6339bf1eb399f0add59cd5b4efe32417 +https://conda.anaconda.org/conda-forge/win-64/coverage-7.4.0-py39ha55989b_0.conda#ba8293a942069b021cbbef98f8df62ea https://conda.anaconda.org/conda-forge/win-64/glib-tools-2.78.3-h12be248_0.conda#03c45e65dbac2ba6c247dfd4896b664c https://conda.anaconda.org/conda-forge/noarch/importlib_resources-6.1.1-pyhd8ed1ab_0.conda#3d5fa25cf42f3f32a12b2d874ace8574 https://conda.anaconda.org/conda-forge/noarch/joblib-1.3.2-pyhd8ed1ab_0.conda#4da50d410f553db77e62ab62ffaa1abc @@ -85,11 +85,11 @@ https://conda.anaconda.org/conda-forge/win-64/lcms2-2.16-h67d730c_0.conda#d35924 https://conda.anaconda.org/conda-forge/win-64/libxcb-1.15-hcd874cb_0.conda#090d91b69396f14afef450c285f9758c https://conda.anaconda.org/conda-forge/win-64/openjpeg-2.5.0-h3d672ee_3.conda#45a9628a04efb6fc326fff0a8f47b799 https://conda.anaconda.org/conda-forge/noarch/pip-23.3.2-pyhd8ed1ab_0.conda#8591c748f98dcc02253003533bc2e4b1 -https://conda.anaconda.org/conda-forge/noarch/pytest-7.4.3-pyhd8ed1ab_0.conda#5bdca0aca30b0ee62bb84854e027eae0 +https://conda.anaconda.org/conda-forge/noarch/pytest-7.4.4-pyhd8ed1ab_0.conda#a9d145de8c5f064b5fa68fb34725d9f4 https://conda.anaconda.org/conda-forge/noarch/python-dateutil-2.8.2-pyhd8ed1ab_0.tar.bz2#dd999d1cc9f79e67dbb855c8924c7984 https://conda.anaconda.org/conda-forge/win-64/sip-6.7.12-py39h99910a6_0.conda#0cc5774390ada632ed7975203057c91c https://conda.anaconda.org/conda-forge/win-64/tbb-2021.11.0-h91493d7_0.conda#517c08eba817fb0e56cfd411ed198261 -https://conda.anaconda.org/conda-forge/win-64/fonttools-4.46.0-py39ha55989b_0.conda#af11b744b5913ff4ea2c500c2990c4f2 
+https://conda.anaconda.org/conda-forge/win-64/fonttools-4.47.0-py39ha55989b_0.conda#77a71ca5ece414b48eab4f436e5f1113 https://conda.anaconda.org/conda-forge/win-64/glib-2.78.3-h12be248_0.conda#a14440f1d004a2ddccd9c1354dbeffdf https://conda.anaconda.org/conda-forge/noarch/importlib-resources-6.1.1-pyhd8ed1ab_0.conda#d04bd1b5bed9177dd7c3cef15e2b6710 https://conda.anaconda.org/conda-forge/win-64/mkl-2023.2.0-h6a75c08_50497.conda#064cea9f45531e7b53584acf4bd8b044 @@ -97,11 +97,11 @@ https://conda.anaconda.org/conda-forge/win-64/pillow-10.1.0-py39h368b509_0.conda https://conda.anaconda.org/conda-forge/win-64/pyqt5-sip-12.12.2-py39h99910a6_5.conda#dffbcea794c524c471772a5f697c2aea https://conda.anaconda.org/conda-forge/noarch/pytest-cov-4.1.0-pyhd8ed1ab_0.conda#06eb685a3a0b146347a58dda979485da https://conda.anaconda.org/conda-forge/noarch/pytest-forked-1.6.0-pyhd8ed1ab_0.conda#a46947638b6e005b63d2d6271da529b0 -https://conda.anaconda.org/conda-forge/win-64/gstreamer-1.22.7-hb4038d2_1.conda#ae1bffda04b64c19f0cf3ac66473f3ab +https://conda.anaconda.org/conda-forge/win-64/gstreamer-1.22.8-hb4038d2_0.conda#498ec8375c067d237a6c85771f395138 https://conda.anaconda.org/conda-forge/win-64/libblas-3.9.0-20_win64_mkl.conda#6cad6cd2fbdeef4d651b8f752a4da960 https://conda.anaconda.org/conda-forge/win-64/mkl-devel-2023.2.0-h57928b3_50497.conda#0d52cfab24361c77268b54920c11903c https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-2.5.0-pyhd8ed1ab_0.tar.bz2#1fdd1f3baccf0deb647385c677a1a48e -https://conda.anaconda.org/conda-forge/win-64/gst-plugins-base-1.22.7-h001b923_1.conda#9f180ad66d7cec7a626c8283412a51cb +https://conda.anaconda.org/conda-forge/win-64/gst-plugins-base-1.22.8-h001b923_0.conda#4871a223a0b53452cbd34fd4c0c518e6 https://conda.anaconda.org/conda-forge/win-64/libcblas-3.9.0-20_win64_mkl.conda#e6d36cfcb2f2dff0f659d2aa0813eb2d https://conda.anaconda.org/conda-forge/win-64/liblapack-3.9.0-20_win64_mkl.conda#9510d07424d70fcac553d86b3e4a7c14 https://conda.anaconda.org/conda-forge/win-64/liblapacke-3.9.0-20_win64_mkl.conda#960008cd6e9827a5c9b68e77fdf3d29f diff --git a/build_tools/azure/pymin_conda_forge_openblas_ubuntu_2204_linux-64_conda.lock b/build_tools/azure/pymin_conda_forge_openblas_ubuntu_2204_linux-64_conda.lock index d6dfdb57b0be0..c864e4e354f2e 100644 --- a/build_tools/azure/pymin_conda_forge_openblas_ubuntu_2204_linux-64_conda.lock +++ b/build_tools/azure/pymin_conda_forge_openblas_ubuntu_2204_linux-64_conda.lock @@ -1,6 +1,6 @@ # Generated by conda-lock. 
# platform: linux-64 -# input_hash: d70964a380150a9fdd34471eab9c13547ec7744156a6719ec0e4b97fc7d298fa +# input_hash: dfda5c3b73321eb2a8bdc6c50490846e4a7a71dc4c8229f1f1b7a175acd8de80 @EXPLICIT https://conda.anaconda.org/conda-forge/linux-64/_libgcc_mutex-0.1-conda_forge.tar.bz2#d7c89558ba9fa0495403155b64376d81 https://conda.anaconda.org/conda-forge/linux-64/ca-certificates-2023.11.17-hbcca054_0.conda#01ffc8d36f9eba0ce0b3c1955fa780ee @@ -11,7 +11,7 @@ https://conda.anaconda.org/conda-forge/noarch/font-ttf-ubuntu-0.83-h77eed37_1.co https://conda.anaconda.org/conda-forge/linux-64/ld_impl_linux-64-2.40-h41732ed_0.conda#7aca3059a1729aa76c597603f10b0dd3 https://conda.anaconda.org/conda-forge/linux-64/libstdcxx-ng-13.2.0-h7e041cc_3.conda#937eaed008f6bf2191c5fe76f87755e9 https://conda.anaconda.org/conda-forge/linux-64/python_abi-3.9-4_cp39.conda#bfe4b3259a8ac6cdf0037752904da6a7 -https://conda.anaconda.org/conda-forge/noarch/tzdata-2023c-h71feb2d_0.conda#939e3e74d8be4dac89ce83b20de2492a +https://conda.anaconda.org/conda-forge/noarch/tzdata-2023d-h0c530f3_0.conda#8dee24b8be2d9ff81e7bd4d7d97ff1b0 https://conda.anaconda.org/conda-forge/noarch/fonts-conda-forge-1-0.tar.bz2#f766549260d6815b0c52253f1fb1bb29 https://conda.anaconda.org/conda-forge/noarch/fonts-conda-ecosystem-1-0.tar.bz2#fee5683a3f04bd15cbd8318b096a27ab https://conda.anaconda.org/conda-forge/linux-64/_openmp_mutex-4.5-2_kmp_llvm.tar.bz2#562b26ba2e19059551a811e72ab7f793 @@ -37,6 +37,7 @@ https://conda.anaconda.org/conda-forge/linux-64/libogg-1.3.4-h7f98852_1.tar.bz2# https://conda.anaconda.org/conda-forge/linux-64/libopus-1.3.1-h7f98852_1.tar.bz2#15345e56d527b330e1cacbdf58676e8f https://conda.anaconda.org/conda-forge/linux-64/libuuid-2.38.1-h0b41bf4_0.conda#40b61aab5c7ba9ff276c41cfffe6b80b https://conda.anaconda.org/conda-forge/linux-64/libwebp-base-1.3.2-hd590300_0.conda#30de3fd9b3b602f7473f30e684eeea8c +https://conda.anaconda.org/conda-forge/linux-64/libxcrypt-4.4.36-hd590300_1.conda#5aa797f8787fe7a17d1b0821485b5adc https://conda.anaconda.org/conda-forge/linux-64/libzlib-1.2.13-hd590300_5.conda#f36c115f1ee199da648e0597ec2047ad https://conda.anaconda.org/conda-forge/linux-64/lz4-c-1.9.4-hcb278e6_0.conda#318b08df404f9c9be5712aaa5a6f0bb0 https://conda.anaconda.org/conda-forge/linux-64/mpg123-1.32.3-h59595ed_0.conda#bdadff838d5437aea83607ced8b37f75 @@ -88,7 +89,7 @@ https://conda.anaconda.org/conda-forge/linux-64/libtiff-4.6.0-ha9c0a0a_2.conda#5 https://conda.anaconda.org/conda-forge/linux-64/llvm-openmp-17.0.6-h4dfa4b3_0.conda#c1665f9c1c9f6c93d8b4e492a6a39056 https://conda.anaconda.org/conda-forge/linux-64/mysql-libs-8.0.33-hca2cd23_6.conda#e87530d1b12dd7f4e0f856dc07358d60 https://conda.anaconda.org/conda-forge/linux-64/nss-3.96-h1d7d5a4_0.conda#1c8f8b8eb041ecd54053fc4b6ad57957 -https://conda.anaconda.org/conda-forge/linux-64/python-3.9.18-h0755675_0_cpython.conda#3ede353bc605068d9677e700b1847382 +https://conda.anaconda.org/conda-forge/linux-64/python-3.9.18-h0755675_1_cpython.conda#255a7002aeec7a067ff19b545aca6328 https://conda.anaconda.org/conda-forge/linux-64/xcb-util-0.4.0-hd590300_1.conda#9bfac7ccd94d54fd21a0501296d60424 https://conda.anaconda.org/conda-forge/linux-64/xcb-util-keysyms-0.4.0-h8ee46fc_1.conda#632413adcd8bc16b515cab87a2932913 https://conda.anaconda.org/conda-forge/linux-64/xcb-util-renderutil-0.3.9-hd590300_1.conda#e995b155d938b6779da6ace6c6b13816 @@ -102,7 +103,7 @@ https://conda.anaconda.org/conda-forge/noarch/certifi-2023.11.17-pyhd8ed1ab_0.co 
https://conda.anaconda.org/conda-forge/noarch/charset-normalizer-3.3.2-pyhd8ed1ab_0.conda#7f4a9e3fcff3f6356ae99244a014da6a https://conda.anaconda.org/conda-forge/noarch/colorama-0.4.6-pyhd8ed1ab_0.tar.bz2#3faab06a954c2a04039983f2c4a50d99 https://conda.anaconda.org/conda-forge/noarch/cycler-0.12.1-pyhd8ed1ab_0.conda#5cd86562580f274031ede6aa6aa24441 -https://conda.anaconda.org/conda-forge/linux-64/cython-3.0.6-py39h3d6467e_0.conda#bfde3cf098e298b81d1c1cbc9c79ab59 +https://conda.anaconda.org/conda-forge/linux-64/cython-3.0.7-py39h3d6467e_0.conda#04866e62ce30cff8f6f9c2ea9460eb09 https://conda.anaconda.org/conda-forge/linux-64/dbus-1.13.6-h5008d03_3.tar.bz2#ecfff944ba3960ecb334b9a2663d708d https://conda.anaconda.org/conda-forge/linux-64/docutils-0.20.1-py39hf3d152e_3.conda#09a48956e1c155907fd0d626f3e80f2e https://conda.anaconda.org/conda-forge/noarch/exceptiongroup-1.2.0-pyhd8ed1ab_0.conda#f6c211fee3c98229652b60a9a42ef363 @@ -130,7 +131,7 @@ https://conda.anaconda.org/conda-forge/noarch/py-1.11.0-pyh6c4a22f_0.tar.bz2#b46 https://conda.anaconda.org/conda-forge/noarch/pygments-2.17.2-pyhd8ed1ab_0.conda#140a7f159396547e9799aa98f9f0742e https://conda.anaconda.org/conda-forge/noarch/pyparsing-3.1.1-pyhd8ed1ab_0.conda#176f7d56f0cfe9008bdf1bccd7de02fb https://conda.anaconda.org/conda-forge/noarch/pysocks-1.7.1-pyha2e5f31_6.tar.bz2#2a7de29fb590ca14b5243c4c812c8025 -https://conda.anaconda.org/conda-forge/noarch/python-tzdata-2023.3-pyhd8ed1ab_0.conda#2590495f608a63625e165915fb4e2e34 +https://conda.anaconda.org/conda-forge/noarch/python-tzdata-2023.4-pyhd8ed1ab_0.conda#c79cacf8a06a51552fc651652f170208 https://conda.anaconda.org/conda-forge/noarch/pytz-2023.3.post1-pyhd8ed1ab_0.conda#c93346b446cd08c169d843ae5fc0da97 https://conda.anaconda.org/conda-forge/noarch/setuptools-68.2.2-pyhd8ed1ab_0.conda#fc2166155db840c634a1291a5c35a709 https://conda.anaconda.org/conda-forge/noarch/six-1.16.0-pyh6c4a22f_0.tar.bz2#e5f25f8dbc060e9a8d912e432202afc2 @@ -149,9 +150,9 @@ https://conda.anaconda.org/conda-forge/linux-64/xorg-libxrender-0.9.11-hd590300_ https://conda.anaconda.org/conda-forge/noarch/zipp-3.17.0-pyhd8ed1ab_0.conda#2e4d6bc0b14e10f895fc6791a7d9b26a https://conda.anaconda.org/conda-forge/noarch/babel-2.14.0-pyhd8ed1ab_0.conda#9669586875baeced8fc30c0826c3270e https://conda.anaconda.org/conda-forge/linux-64/cairo-1.18.0-h3faef2a_0.conda#f907bb958910dc404647326ca80c263e -https://conda.anaconda.org/conda-forge/linux-64/fonttools-4.46.0-py39hd1e30aa_0.conda#9b58e5973dd3d786253f4ca9534b1aba +https://conda.anaconda.org/conda-forge/linux-64/fonttools-4.47.0-py39hd1e30aa_0.conda#01eba09d574310de928abf121f89b116 https://conda.anaconda.org/conda-forge/linux-64/glib-2.78.3-hfc55251_0.conda#e08e51acc7d1ae8dbe13255e7b4c64ac -https://conda.anaconda.org/conda-forge/noarch/importlib-metadata-7.0.0-pyha770c72_0.conda#a941237cd06538837b25cd245fcd25d8 +https://conda.anaconda.org/conda-forge/noarch/importlib-metadata-7.0.1-pyha770c72_0.conda#746623a787e06191d80a2133e5daff17 https://conda.anaconda.org/conda-forge/noarch/importlib_resources-6.1.1-pyhd8ed1ab_0.conda#3d5fa25cf42f3f32a12b2d874ace8574 https://conda.anaconda.org/conda-forge/noarch/jinja2-3.1.2-pyhd8ed1ab_1.tar.bz2#c8490ed5c70966d232fdd389d0dbed37 https://conda.anaconda.org/conda-forge/noarch/joblib-1.3.2-pyhd8ed1ab_0.conda#4da50d410f553db77e62ab62ffaa1abc @@ -161,11 +162,11 @@ https://conda.anaconda.org/conda-forge/linux-64/liblapack-3.9.0-20_linux64_openb 
https://conda.anaconda.org/conda-forge/linux-64/libxkbcommon-1.6.0-hd429924_1.conda#1dbcc04604fdf1e526e6d1b0b6938396 https://conda.anaconda.org/conda-forge/linux-64/pillow-10.1.0-py39had0adad_0.conda#eeaa413fddccecb2ab7f747bdb55b07f https://conda.anaconda.org/conda-forge/linux-64/pulseaudio-client-16.1-hb77b528_5.conda#ac902ff3c1c6d750dd0dfc93a974ab74 -https://conda.anaconda.org/conda-forge/noarch/pytest-7.4.3-pyhd8ed1ab_0.conda#5bdca0aca30b0ee62bb84854e027eae0 +https://conda.anaconda.org/conda-forge/noarch/pytest-7.4.4-pyhd8ed1ab_0.conda#a9d145de8c5f064b5fa68fb34725d9f4 https://conda.anaconda.org/conda-forge/noarch/python-dateutil-2.8.2-pyhd8ed1ab_0.tar.bz2#dd999d1cc9f79e67dbb855c8924c7984 https://conda.anaconda.org/conda-forge/linux-64/sip-6.7.12-py39h3d6467e_0.conda#e667a3ab0df62c54e60e1843d2e6defb https://conda.anaconda.org/conda-forge/noarch/urllib3-2.1.0-pyhd8ed1ab_0.conda#f8ced8ee63830dec7ecc1be048d1470a -https://conda.anaconda.org/conda-forge/linux-64/gstreamer-1.22.7-h98fc4e7_1.conda#a8d71f6705ed1f70d7099a6bd1c078ac +https://conda.anaconda.org/conda-forge/linux-64/gstreamer-1.22.8-h98fc4e7_0.conda#a068fe1588dda3d29f568d536eeebae7 https://conda.anaconda.org/conda-forge/linux-64/harfbuzz-8.3.0-h3d44ed6_0.conda#5a6f6c00ef982a9bc83558d9ac8f64a0 https://conda.anaconda.org/conda-forge/noarch/importlib-resources-6.1.1-pyhd8ed1ab_0.conda#d04bd1b5bed9177dd7c3cef15e2b6710 https://conda.anaconda.org/conda-forge/linux-64/liblapacke-3.9.0-20_linux64_openblas.conda#05c5862c7dc25e65ba6c471d96429dae @@ -175,7 +176,7 @@ https://conda.anaconda.org/conda-forge/noarch/pytest-forked-1.6.0-pyhd8ed1ab_0.c https://conda.anaconda.org/conda-forge/noarch/requests-2.31.0-pyhd8ed1ab_0.conda#a30144e4156cdbb236f99ebb49828f8b https://conda.anaconda.org/conda-forge/linux-64/blas-devel-3.9.0-20_linux64_openblas.conda#9932a1d4e9ecf2d35fb19475446e361e https://conda.anaconda.org/conda-forge/linux-64/contourpy-1.2.0-py39h7633fee_0.conda#ed71ad3e30eb03da363fb797419cce98 -https://conda.anaconda.org/conda-forge/linux-64/gst-plugins-base-1.22.7-h8e1006c_1.conda#89cd9374d5fc7371db238e4ef5c5f258 +https://conda.anaconda.org/conda-forge/linux-64/gst-plugins-base-1.22.8-h8e1006c_0.conda#307cf29b6c19238c17182f30ddaf1a50 https://conda.anaconda.org/conda-forge/linux-64/pandas-2.1.4-py39hddac248_0.conda#dcfd2f15c6f8f0bbf234412b18a2a5d0 https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-2.5.0-pyhd8ed1ab_0.tar.bz2#1fdd1f3baccf0deb647385c677a1a48e https://conda.anaconda.org/conda-forge/linux-64/scipy-1.11.4-py39h474f0d3_0.conda#4b401c1516417b4b14aa1249d2f7929d diff --git a/build_tools/azure/ubuntu_atlas_lock.txt b/build_tools/azure/ubuntu_atlas_lock.txt index 65ffa33bd87ff..680f90207abe8 100644 --- a/build_tools/azure/ubuntu_atlas_lock.txt +++ b/build_tools/azure/ubuntu_atlas_lock.txt @@ -20,7 +20,7 @@ pluggy==1.3.0 # via pytest py==1.11.0 # via pytest-forked -pytest==7.4.3 +pytest==7.4.4 # via # -r build_tools/azure/ubuntu_atlas_requirements.txt # pytest-forked diff --git a/build_tools/circle/doc_environment.yml b/build_tools/circle/doc_environment.yml index 1567503ac45de..5789d2dfeabd1 100644 --- a/build_tools/circle/doc_environment.yml +++ b/build_tools/circle/doc_environment.yml @@ -20,6 +20,7 @@ dependencies: - setuptools - scikit-image - seaborn + - patsy=0.5.4 - memory_profiler - compilers - sphinx diff --git a/build_tools/circle/doc_linux-64_conda.lock b/build_tools/circle/doc_linux-64_conda.lock index 0f32504a121cd..ba98b2e05d26b 100644 --- a/build_tools/circle/doc_linux-64_conda.lock +++ 
b/build_tools/circle/doc_linux-64_conda.lock @@ -1,6 +1,6 @@ # Generated by conda-lock. # platform: linux-64 -# input_hash: 0d62c56444fc81a1e285d3657990a983d2c40ceb6fb44130975b4e8e72626137 +# input_hash: 2220bb165ad69a88fcb9db817451817fe8405e7de686ad25121b4e4239916e10 @EXPLICIT https://conda.anaconda.org/conda-forge/linux-64/_libgcc_mutex-0.1-conda_forge.tar.bz2#d7c89558ba9fa0495403155b64376d81 https://conda.anaconda.org/conda-forge/linux-64/ca-certificates-2023.11.17-hbcca054_0.conda#01ffc8d36f9eba0ce0b3c1955fa780ee @@ -14,7 +14,7 @@ https://conda.anaconda.org/conda-forge/noarch/libgcc-devel_linux-64-12.3.0-h8bca https://conda.anaconda.org/conda-forge/noarch/libstdcxx-devel_linux-64-12.3.0-h8bca6fd_103.conda#3f784d2c059e960156d1ab3858cbf200 https://conda.anaconda.org/conda-forge/linux-64/libstdcxx-ng-13.2.0-h7e041cc_3.conda#937eaed008f6bf2191c5fe76f87755e9 https://conda.anaconda.org/conda-forge/linux-64/python_abi-3.9-4_cp39.conda#bfe4b3259a8ac6cdf0037752904da6a7 -https://conda.anaconda.org/conda-forge/noarch/tzdata-2023c-h71feb2d_0.conda#939e3e74d8be4dac89ce83b20de2492a +https://conda.anaconda.org/conda-forge/noarch/tzdata-2023d-h0c530f3_0.conda#8dee24b8be2d9ff81e7bd4d7d97ff1b0 https://conda.anaconda.org/conda-forge/noarch/fonts-conda-forge-1-0.tar.bz2#f766549260d6815b0c52253f1fb1bb29 https://conda.anaconda.org/conda-forge/linux-64/libgomp-13.2.0-h807b86a_3.conda#7124cbb46b13d395bdde68f2d215c989 https://conda.anaconda.org/conda-forge/noarch/sysroot_linux-64-2.12-he073ed8_16.conda#071ea8dceff4d30ac511f4a2f8437cd1 @@ -34,7 +34,7 @@ https://conda.anaconda.org/conda-forge/linux-64/gettext-0.21.1-h27087fc_0.tar.bz https://conda.anaconda.org/conda-forge/linux-64/giflib-5.2.1-h0b41bf4_3.conda#96f3b11872ef6fad973eac856cd2624f https://conda.anaconda.org/conda-forge/linux-64/graphite2-1.3.13-h58526e2_1001.tar.bz2#8c54672728e8ec6aa6db90cf2806d220 https://conda.anaconda.org/conda-forge/linux-64/icu-73.2-h59595ed_0.conda#cc47e1facc155f91abd89b11e48e72ff -https://conda.anaconda.org/conda-forge/linux-64/jxrlib-1.1-h7f98852_2.tar.bz2#8e787b08fe19986d99d034b839df2961 +https://conda.anaconda.org/conda-forge/linux-64/jxrlib-1.1-hd590300_3.conda#5aeabe88534ea4169d4c49998f293d6c https://conda.anaconda.org/conda-forge/linux-64/keyutils-1.6.1-h166bdaf_0.tar.bz2#30186d27e2c9fa62b45fb1476b7200e3 https://conda.anaconda.org/conda-forge/linux-64/lame-3.100-h166bdaf_1003.tar.bz2#a8832b479f93521a9e7b5b743803be51 https://conda.anaconda.org/conda-forge/linux-64/lerc-4.0.0-h27087fc_0.tar.bz2#76bbff344f0134279f225174e9064c8f @@ -52,6 +52,7 @@ https://conda.anaconda.org/conda-forge/linux-64/libopus-1.3.1-h7f98852_1.tar.bz2 https://conda.anaconda.org/conda-forge/linux-64/libsanitizer-12.3.0-h0f45ef3_3.conda#eda05ab0db8f8490945fd99244183e3a https://conda.anaconda.org/conda-forge/linux-64/libuuid-2.38.1-h0b41bf4_0.conda#40b61aab5c7ba9ff276c41cfffe6b80b https://conda.anaconda.org/conda-forge/linux-64/libwebp-base-1.3.2-hd590300_0.conda#30de3fd9b3b602f7473f30e684eeea8c +https://conda.anaconda.org/conda-forge/linux-64/libxcrypt-4.4.36-hd590300_1.conda#5aa797f8787fe7a17d1b0821485b5adc https://conda.anaconda.org/conda-forge/linux-64/libzlib-1.2.13-hd590300_5.conda#f36c115f1ee199da648e0597ec2047ad https://conda.anaconda.org/conda-forge/linux-64/libzopfli-1.0.3-h9c3ff4c_0.tar.bz2#c66fe2d123249af7651ebde8984c51c2 https://conda.anaconda.org/conda-forge/linux-64/lz4-c-1.9.4-hcb278e6_0.conda#318b08df404f9c9be5712aaa5a6f0bb0 @@ -100,7 +101,7 @@ https://conda.anaconda.org/conda-forge/linux-64/zlib-1.2.13-hd590300_5.conda#68c 
https://conda.anaconda.org/conda-forge/linux-64/zstd-1.5.5-hfc55251_0.conda#04b88013080254850d6c01ed54810589 https://conda.anaconda.org/conda-forge/linux-64/blosc-1.21.5-h0f2a231_0.conda#009521b7ed97cca25f8f997f9e745976 https://conda.anaconda.org/conda-forge/linux-64/brotli-bin-1.1.0-hd590300_1.conda#39f910d205726805a958da408ca194ba -https://conda.anaconda.org/conda-forge/linux-64/c-blosc2-2.11.3-hb4ffafa_0.conda#f394ac64ab0e1fcb0152cc9c16df3d85 +https://conda.anaconda.org/conda-forge/linux-64/c-blosc2-2.12.0-hb4ffafa_0.conda#1a9b16afb84d734a1bb2d196c308d477 https://conda.anaconda.org/conda-forge/linux-64/freetype-2.12.1-h267a509_2.conda#9ae35c3d96db2c94ce0cef86efdfa2cb https://conda.anaconda.org/conda-forge/linux-64/gcc-12.3.0-h8d2909c_2.conda#e2f2f81f367e14ca1f77a870bda2fe59 https://conda.anaconda.org/conda-forge/linux-64/gcc_linux-64-12.3.0-h76fc315_2.conda#11517e7b5c910c5b5d6985c0c7eb7f50 @@ -116,7 +117,7 @@ https://conda.anaconda.org/conda-forge/linux-64/libtiff-4.6.0-ha9c0a0a_2.conda#5 https://conda.anaconda.org/conda-forge/linux-64/llvm-openmp-17.0.6-h4dfa4b3_0.conda#c1665f9c1c9f6c93d8b4e492a6a39056 https://conda.anaconda.org/conda-forge/linux-64/mysql-libs-8.0.33-hca2cd23_6.conda#e87530d1b12dd7f4e0f856dc07358d60 https://conda.anaconda.org/conda-forge/linux-64/nss-3.96-h1d7d5a4_0.conda#1c8f8b8eb041ecd54053fc4b6ad57957 -https://conda.anaconda.org/conda-forge/linux-64/python-3.9.18-h0755675_0_cpython.conda#3ede353bc605068d9677e700b1847382 +https://conda.anaconda.org/conda-forge/linux-64/python-3.9.18-h0755675_1_cpython.conda#255a7002aeec7a067ff19b545aca6328 https://conda.anaconda.org/conda-forge/linux-64/xcb-util-0.4.0-hd590300_1.conda#9bfac7ccd94d54fd21a0501296d60424 https://conda.anaconda.org/conda-forge/linux-64/xcb-util-keysyms-0.4.0-h8ee46fc_1.conda#632413adcd8bc16b515cab87a2932913 https://conda.anaconda.org/conda-forge/linux-64/xcb-util-renderutil-0.3.9-hd590300_1.conda#e995b155d938b6779da6ace6c6b13816 @@ -130,7 +131,7 @@ https://conda.anaconda.org/conda-forge/noarch/certifi-2023.11.17-pyhd8ed1ab_0.co https://conda.anaconda.org/conda-forge/noarch/charset-normalizer-3.3.2-pyhd8ed1ab_0.conda#7f4a9e3fcff3f6356ae99244a014da6a https://conda.anaconda.org/conda-forge/noarch/colorama-0.4.6-pyhd8ed1ab_0.tar.bz2#3faab06a954c2a04039983f2c4a50d99 https://conda.anaconda.org/conda-forge/noarch/cycler-0.12.1-pyhd8ed1ab_0.conda#5cd86562580f274031ede6aa6aa24441 -https://conda.anaconda.org/conda-forge/linux-64/cython-3.0.6-py39h3d6467e_0.conda#bfde3cf098e298b81d1c1cbc9c79ab59 +https://conda.anaconda.org/conda-forge/linux-64/cython-3.0.7-py39h3d6467e_0.conda#04866e62ce30cff8f6f9c2ea9460eb09 https://conda.anaconda.org/conda-forge/linux-64/dbus-1.13.6-h5008d03_3.tar.bz2#ecfff944ba3960ecb334b9a2663d708d https://conda.anaconda.org/conda-forge/linux-64/docutils-0.20.1-py39hf3d152e_3.conda#09a48956e1c155907fd0d626f3e80f2e https://conda.anaconda.org/conda-forge/noarch/exceptiongroup-1.2.0-pyhd8ed1ab_0.conda#f6c211fee3c98229652b60a9a42ef363 @@ -166,7 +167,7 @@ https://conda.anaconda.org/conda-forge/noarch/py-1.11.0-pyh6c4a22f_0.tar.bz2#b46 https://conda.anaconda.org/conda-forge/noarch/pygments-2.17.2-pyhd8ed1ab_0.conda#140a7f159396547e9799aa98f9f0742e https://conda.anaconda.org/conda-forge/noarch/pyparsing-3.1.1-pyhd8ed1ab_0.conda#176f7d56f0cfe9008bdf1bccd7de02fb https://conda.anaconda.org/conda-forge/noarch/pysocks-1.7.1-pyha2e5f31_6.tar.bz2#2a7de29fb590ca14b5243c4c812c8025 -https://conda.anaconda.org/conda-forge/noarch/python-tzdata-2023.3-pyhd8ed1ab_0.conda#2590495f608a63625e165915fb4e2e34 
+https://conda.anaconda.org/conda-forge/noarch/python-tzdata-2023.4-pyhd8ed1ab_0.conda#c79cacf8a06a51552fc651652f170208 https://conda.anaconda.org/conda-forge/noarch/pytz-2023.3.post1-pyhd8ed1ab_0.conda#c93346b446cd08c169d843ae5fc0da97 https://conda.anaconda.org/conda-forge/noarch/setuptools-68.2.2-pyhd8ed1ab_0.conda#fc2166155db840c634a1291a5c35a709 https://conda.anaconda.org/conda-forge/noarch/six-1.16.0-pyh6c4a22f_0.tar.bz2#e5f25f8dbc060e9a8d912e432202afc2 @@ -190,10 +191,10 @@ https://conda.anaconda.org/conda-forge/noarch/babel-2.14.0-pyhd8ed1ab_0.conda#96 https://conda.anaconda.org/conda-forge/linux-64/brunsli-0.1-h9c3ff4c_0.tar.bz2#c1ac6229d0bfd14f8354ff9ad2a26cad https://conda.anaconda.org/conda-forge/linux-64/cairo-1.18.0-h3faef2a_0.conda#f907bb958910dc404647326ca80c263e https://conda.anaconda.org/conda-forge/linux-64/cxx-compiler-1.7.0-h00ab1b0_0.conda#b4537c98cb59f8725b0e1e65816b4a28 -https://conda.anaconda.org/conda-forge/linux-64/fonttools-4.46.0-py39hd1e30aa_0.conda#9b58e5973dd3d786253f4ca9534b1aba +https://conda.anaconda.org/conda-forge/linux-64/fonttools-4.47.0-py39hd1e30aa_0.conda#01eba09d574310de928abf121f89b116 https://conda.anaconda.org/conda-forge/linux-64/fortran-compiler-1.7.0-heb67821_0.conda#7ef7c0f111dad1c8006504a0f1ccd820 https://conda.anaconda.org/conda-forge/linux-64/glib-2.78.3-hfc55251_0.conda#e08e51acc7d1ae8dbe13255e7b4c64ac -https://conda.anaconda.org/conda-forge/noarch/importlib-metadata-7.0.0-pyha770c72_0.conda#a941237cd06538837b25cd245fcd25d8 +https://conda.anaconda.org/conda-forge/noarch/importlib-metadata-7.0.1-pyha770c72_0.conda#746623a787e06191d80a2133e5daff17 https://conda.anaconda.org/conda-forge/noarch/importlib_resources-6.1.1-pyhd8ed1ab_0.conda#3d5fa25cf42f3f32a12b2d874ace8574 https://conda.anaconda.org/conda-forge/noarch/jinja2-3.1.2-pyhd8ed1ab_1.tar.bz2#c8490ed5c70966d232fdd389d0dbed37 https://conda.anaconda.org/conda-forge/noarch/joblib-1.3.2-pyhd8ed1ab_0.conda#4da50d410f553db77e62ab62ffaa1abc @@ -206,12 +207,12 @@ https://conda.anaconda.org/conda-forge/linux-64/pillow-10.1.0-py39had0adad_0.con https://conda.anaconda.org/conda-forge/noarch/pip-23.3.2-pyhd8ed1ab_0.conda#8591c748f98dcc02253003533bc2e4b1 https://conda.anaconda.org/conda-forge/noarch/plotly-5.18.0-pyhd8ed1ab_0.conda#9f6a8664f1fe752f79473eeb9bf33a60 https://conda.anaconda.org/conda-forge/linux-64/pulseaudio-client-16.1-hb77b528_5.conda#ac902ff3c1c6d750dd0dfc93a974ab74 -https://conda.anaconda.org/conda-forge/noarch/pytest-7.4.3-pyhd8ed1ab_0.conda#5bdca0aca30b0ee62bb84854e027eae0 +https://conda.anaconda.org/conda-forge/noarch/pytest-7.4.4-pyhd8ed1ab_0.conda#a9d145de8c5f064b5fa68fb34725d9f4 https://conda.anaconda.org/conda-forge/noarch/python-dateutil-2.8.2-pyhd8ed1ab_0.tar.bz2#dd999d1cc9f79e67dbb855c8924c7984 https://conda.anaconda.org/conda-forge/linux-64/sip-6.7.12-py39h3d6467e_0.conda#e667a3ab0df62c54e60e1843d2e6defb https://conda.anaconda.org/conda-forge/noarch/urllib3-2.1.0-pyhd8ed1ab_0.conda#f8ced8ee63830dec7ecc1be048d1470a https://conda.anaconda.org/conda-forge/linux-64/compilers-1.7.0-ha770c72_0.conda#81458b3aed8ab8711951ec3c0c04e097 -https://conda.anaconda.org/conda-forge/linux-64/gstreamer-1.22.7-h98fc4e7_1.conda#a8d71f6705ed1f70d7099a6bd1c078ac +https://conda.anaconda.org/conda-forge/linux-64/gstreamer-1.22.8-h98fc4e7_0.conda#a068fe1588dda3d29f568d536eeebae7 https://conda.anaconda.org/conda-forge/linux-64/harfbuzz-8.3.0-h3d44ed6_0.conda#5a6f6c00ef982a9bc83558d9ac8f64a0 
https://conda.anaconda.org/conda-forge/noarch/importlib-resources-6.1.1-pyhd8ed1ab_0.conda#d04bd1b5bed9177dd7c3cef15e2b6710 https://conda.anaconda.org/conda-forge/linux-64/liblapacke-3.9.0-20_linux64_openblas.conda#05c5862c7dc25e65ba6c471d96429dae @@ -221,12 +222,12 @@ https://conda.anaconda.org/conda-forge/noarch/pytest-forked-1.6.0-pyhd8ed1ab_0.c https://conda.anaconda.org/conda-forge/noarch/requests-2.31.0-pyhd8ed1ab_0.conda#a30144e4156cdbb236f99ebb49828f8b https://conda.anaconda.org/conda-forge/linux-64/blas-devel-3.9.0-20_linux64_openblas.conda#9932a1d4e9ecf2d35fb19475446e361e https://conda.anaconda.org/conda-forge/linux-64/contourpy-1.2.0-py39h7633fee_0.conda#ed71ad3e30eb03da363fb797419cce98 -https://conda.anaconda.org/conda-forge/linux-64/gst-plugins-base-1.22.7-h8e1006c_1.conda#89cd9374d5fc7371db238e4ef5c5f258 -https://conda.anaconda.org/conda-forge/linux-64/imagecodecs-2023.9.18-py39hf9b8f0e_2.conda#38f576a701ea508ed210087c711a06ee +https://conda.anaconda.org/conda-forge/linux-64/gst-plugins-base-1.22.8-h8e1006c_0.conda#307cf29b6c19238c17182f30ddaf1a50 +https://conda.anaconda.org/conda-forge/linux-64/imagecodecs-2024.1.1-py39hf9b8f0e_0.conda#9ddd29852457d1152ca235eb87bc74fb https://conda.anaconda.org/conda-forge/noarch/imageio-2.33.1-pyh8c1a49c_0.conda#1c34d58ac469a34e7e96832861368bce https://conda.anaconda.org/conda-forge/linux-64/pandas-2.1.4-py39hddac248_0.conda#dcfd2f15c6f8f0bbf234412b18a2a5d0 https://conda.anaconda.org/conda-forge/noarch/patsy-0.5.4-pyhd8ed1ab_0.conda#1184267eddebb57e47f8e1419c225595 -https://conda.anaconda.org/conda-forge/linux-64/polars-0.19.19-py39h90d8ae4_0.conda#9cefe0d7ce9208c3afbbac29951aff59 +https://conda.anaconda.org/conda-forge/linux-64/polars-0.20.2-py39h90d8ae4_0.conda#8e63cf0a9bfbdb45c794de1aa6ff6806 https://conda.anaconda.org/conda-forge/noarch/pooch-1.8.0-pyhd8ed1ab_0.conda#134b2b57b7865d2316a7cce1915a51ed https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-2.5.0-pyhd8ed1ab_0.tar.bz2#1fdd1f3baccf0deb647385c677a1a48e https://conda.anaconda.org/conda-forge/linux-64/pywavelets-1.4.1-py39h44dd56e_1.conda#d037c20e3da2e85f03ebd20ad480c359 @@ -253,10 +254,10 @@ https://conda.anaconda.org/conda-forge/noarch/sphinxcontrib-qthelp-1.0.6-pyhd8ed https://conda.anaconda.org/conda-forge/noarch/sphinx-7.2.6-pyhd8ed1ab_0.conda#bbfd1120d1824d2d073bc65935f0e4c0 https://conda.anaconda.org/conda-forge/noarch/sphinxcontrib-serializinghtml-1.1.9-pyhd8ed1ab_0.conda#0612e497d7860728f2cda421ea2aec09 https://conda.anaconda.org/conda-forge/noarch/sphinxext-opengraph-0.9.1-pyhd8ed1ab_0.conda#286283e05a1eff606f55e7cd70f6d7f7 -# pip attrs @ https://files.pythonhosted.org/packages/f0/eb/fcb708c7bf5056045e9e98f62b93bd7467eb718b0202e7698eb11d66416c/attrs-23.1.0-py3-none-any.whl#sha256=1f28b4522cdc2fb4256ac1a020c78acf9cba2c6b461ccd2c126f3aa8e8335d04 +# pip attrs @ https://files.pythonhosted.org/packages/e0/44/827b2a91a5816512fcaf3cc4ebc465ccd5d598c45cefa6703fcf4a79018f/attrs-23.2.0-py3-none-any.whl#sha256=99b87a485a5820b23b879f04c2305b44b951b502fd64be915879d77a7e8fc6f1 # pip cloudpickle @ https://files.pythonhosted.org/packages/96/43/dae06432d0c4b1dc9e9149ad37b4ca8384cf6eb7700cd9215b177b914f0a/cloudpickle-3.0.0-py3-none-any.whl#sha256=246ee7d0c295602a036e86369c77fecda4ab17b506496730f2f576d9016fd9c7 # pip defusedxml @ https://files.pythonhosted.org/packages/07/6c/aa3f2f849e01cb6a001cd8554a88d4c77c5c1a31c95bdf1cf9301e6d9ef4/defusedxml-0.7.1-py2.py3-none-any.whl#sha256=a352e7e428770286cc899e2542b6cdaedb2b4953ff269a210103ec58f6198a61 -# pip fastjsonschema @ 
https://files.pythonhosted.org/packages/63/e9/d3dca06ea6b8e58e65716973bc7d9bee9bc39ce233595aa04d04e89a1089/fastjsonschema-2.19.0-py3-none-any.whl#sha256=b9fd1a2dd6971dbc7fee280a95bd199ae0dd9ce22beb91cc75e9c1c528a5170e +# pip fastjsonschema @ https://files.pythonhosted.org/packages/9c/b9/79691036d4a8f9857e74d1728b23f34f583b81350a27492edda58d5604e1/fastjsonschema-2.19.1-py3-none-any.whl#sha256=3672b47bc94178c9f23dbb654bf47440155d4db9df5f7bc47643315f9c405cd0 # pip fqdn @ https://files.pythonhosted.org/packages/cf/58/8acf1b3e91c58313ce5cb67df61001fc9dcd21be4fadb76c1a2d540e09ed/fqdn-1.5.1-py3-none-any.whl#sha256=3a179af3761e4df6eb2e026ff9e1a3033d3587bf980a0b1b2e1e5d08d7358014 # pip json5 @ https://files.pythonhosted.org/packages/70/ba/fa37123a86ae8287d6678535a944f9c3377d8165e536310ed6f6cb0f0c0e/json5-0.9.14-py2.py3-none-any.whl#sha256=740c7f1b9e584a468dbb2939d8d458db3427f2c93ae2139d05f47e453eae964f # pip jsonpointer @ https://files.pythonhosted.org/packages/12/f6/0232cc0c617e195f06f810534d00b74d2f348fe71b2118009ad8ad31f878/jsonpointer-2.4-py2.py3-none-any.whl#sha256=15d51bba20eea3165644553647711d150376234112651b4f1811022aecad7d7a @@ -271,13 +272,12 @@ https://conda.anaconda.org/conda-forge/noarch/sphinxext-opengraph-0.9.1-pyhd8ed1 # pip python-json-logger @ https://files.pythonhosted.org/packages/35/a6/145655273568ee78a581e734cf35beb9e33a370b29c5d3c8fee3744de29f/python_json_logger-2.0.7-py3-none-any.whl#sha256=f380b826a991ebbe3de4d897aeec42760035ac760345e57b812938dc8b35e2bd # pip pyyaml @ https://files.pythonhosted.org/packages/7d/39/472f2554a0f1e825bd7c5afc11c817cd7a2f3657460f7159f691fbb37c51/PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c # pip rfc3986-validator @ https://files.pythonhosted.org/packages/9e/51/17023c0f8f1869d8806b979a2bffa3f861f26a3f1a66b094288323fba52f/rfc3986_validator-0.1.1-py2.py3-none-any.whl#sha256=2f235c432ef459970b4306369336b9d5dbdda31b510ca1e327636e01f528bfa9 -# pip rpds-py @ https://files.pythonhosted.org/packages/13/de/2cf650e81270f3da659bfb38fd75d895c56588363afc459f8490e11caed7/rpds_py-0.15.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=2dccc623725d0b298f557d869a68496a2fd2a9e9c41107f234fa5f7a37d278ac +# pip rpds-py @ https://files.pythonhosted.org/packages/5e/e3/8a2d5cfb6c77c5897e72793b6bdc769fd55e4ce349569a4faf8e076eb775/rpds_py-0.16.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=80443fe2f7b3ea3934c5d75fb0e04a5dbb4a8e943e5ff2de0dec059202b70a8b # pip send2trash @ https://files.pythonhosted.org/packages/a9/78/e4df1e080ed790acf3a704edf521006dd96b9841bd2e2a462c0d255e0565/Send2Trash-1.8.2-py3-none-any.whl#sha256=a384719d99c07ce1eefd6905d2decb6f8b7ed054025bb0e618919f945de4f679 # pip sniffio @ https://files.pythonhosted.org/packages/c3/a0/5dba8ed157b0136607c7f2151db695885606968d1fae123dc3391e0cfdbf/sniffio-1.3.0-py3-none-any.whl#sha256=eecefdce1e5bbfb7ad2eeaabf7c1eeb404d7757c379bd1f7e5cce9d8bf425384 # pip soupsieve @ https://files.pythonhosted.org/packages/4c/f3/038b302fdfbe3be7da016777069f26ceefe11a681055ea1f7817546508e3/soupsieve-2.5-py3-none-any.whl#sha256=eaa337ff55a1579b6549dc679565eac1e3d000563bcb1c8ab0d0fefbc0c2cdc7 # pip traitlets @ https://files.pythonhosted.org/packages/a7/1d/7d07e1b152b419a8a9c7f812eeefd408a0610d869489ee2e86973486713f/traitlets-5.14.0-py3-none-any.whl#sha256=f14949d23829023013c47df20b4a76ccd1a85effb786dc060f34de7948361b33 # pip types-python-dateutil @ 
https://files.pythonhosted.org/packages/1c/af/5af2e2a02bc464c1c7818c260606343020b96c0d5b64f637d9e91aee24fe/types_python_dateutil-2.8.19.14-py3-none-any.whl#sha256=f977b8de27787639986b4e28963263fd0e5158942b3ecef91b9335c130cb1ce9 -# pip typing-extensions @ https://files.pythonhosted.org/packages/b7/f4/6a90020cd2d93349b442bfcb657d0dc91eee65491600b2cb1d388bc98e6b/typing_extensions-4.9.0-py3-none-any.whl#sha256=af72aea155e91adfc61c3ae9e0e342dbc0cba726d6cba4b6c72c1f34e47291cd # pip uri-template @ https://files.pythonhosted.org/packages/e7/00/3fca040d7cf8a32776d3d81a00c8ee7457e00f80c649f1e4a863c8321ae9/uri_template-1.3.0-py3-none-any.whl#sha256=a44a133ea12d44a0c0f06d7d42a52d71282e77e2f937d8abd5655b8d56fc1363 # pip webcolors @ https://files.pythonhosted.org/packages/d5/e1/3e9013159b4cbb71df9bd7611cbf90dc2c621c8aeeb677fc41dad72f2261/webcolors-1.13-py3-none-any.whl#sha256=29bc7e8752c0a1bd4a1f03c14d6e6a72e93d82193738fa860cbff59d0fcc11bf # pip webencodings @ https://files.pythonhosted.org/packages/f4/24/2a3e3df732393fed8b3ebf2ec078f05546de641fe1b667ee316ec1dcf3b7/webencodings-0.5.1-py2.py3-none-any.whl#sha256=a0af1213f3c2226497a97e2b3aa01a7e4bee4f403f95be16fc9acd2947514a78 @@ -288,15 +288,15 @@ https://conda.anaconda.org/conda-forge/noarch/sphinxext-opengraph-0.9.1-pyhd8ed1 # pip bleach @ https://files.pythonhosted.org/packages/ea/63/da7237f805089ecc28a3f36bca6a21c31fcbc2eb380f3b8f1be3312abd14/bleach-6.1.0-py3-none-any.whl#sha256=3225f354cfc436b9789c66c4ee030194bee0568fbf9cbdad3bc8b5c26c5f12b6 # pip cffi @ https://files.pythonhosted.org/packages/ea/ac/e9e77bc385729035143e54cc8c4785bd480eaca9df17565963556b0b7a93/cffi-1.16.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=8f8e709127c6c77446a8c0a8c8bf3c8ee706a06cd44b1e827c3e6a2ee6b8c098 # pip doit @ https://files.pythonhosted.org/packages/44/83/a2960d2c975836daa629a73995134fd86520c101412578c57da3d2aa71ee/doit-0.36.0-py3-none-any.whl#sha256=ebc285f6666871b5300091c26eafdff3de968a6bd60ea35dd1e3fc6f2e32479a -# pip jupyter-core @ https://files.pythonhosted.org/packages/ab/ea/af6508f71d2bcbf4db538940120cc3d3f10287f62105e756bd315aa345b5/jupyter_core-5.5.0-py3-none-any.whl#sha256=e11e02cd8ae0a9de5c6c44abf5727df9f2581055afe00b22183f621ba3585805 +# pip jupyter-core @ https://files.pythonhosted.org/packages/9d/27/38fa0cac8acc54a202dd432f98553ddd1826da9633fe875e72b09a9e2b98/jupyter_core-5.6.1-py3-none-any.whl#sha256=3d16aec2e1ec84b69f7794e49c32830c1d950ad149526aec954c100047c5f3a7 # pip referencing @ https://files.pythonhosted.org/packages/b4/11/d121780c173336c9bc3a5b8240ed31f518957cc22f6311c76259cb0fcf32/referencing-0.32.0-py3-none-any.whl#sha256=bdcd3efb936f82ff86f993093f6da7435c7de69a3b3a5a06678a6050184bee99 # pip rfc3339-validator @ https://files.pythonhosted.org/packages/7b/44/4e421b96b67b2daff264473f7465db72fbdf36a07e05494f50300cc7b0c6/rfc3339_validator-0.1.4-py2.py3-none-any.whl#sha256=24f6ec1eda14ef823da9e36ec7113124b39c04d50a4d3d3a3c2859577e7791fa # pip terminado @ https://files.pythonhosted.org/packages/69/df/deebc9fb14a49062a3330f673e80b100e665b54d998163b3f62620b6240c/terminado-0.18.0-py3-none-any.whl#sha256=87b0d96642d0fe5f5abd7783857b9cab167f221a39ff98e3b9619a788a3c0f2e # pip tinycss2 @ https://files.pythonhosted.org/packages/da/99/fd23634d6962c2791fb8cb6ccae1f05dcbfc39bce36bba8b1c9a8d92eae8/tinycss2-1.2.1-py3-none-any.whl#sha256=2b80a96d41e7c3914b8cda8bc7f705a4d9c49275616e886103dd839dfc847847 # pip argon2-cffi-bindings @ 
https://files.pythonhosted.org/packages/ec/f7/378254e6dd7ae6f31fe40c8649eea7d4832a42243acaf0f1fff9083b2bed/argon2_cffi_bindings-21.2.0-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=b746dba803a79238e925d9046a63aa26bf86ab2a2fe74ce6b009a1c3f5c8f2ae # pip isoduration @ https://files.pythonhosted.org/packages/7b/55/e5326141505c5d5e34c5e0935d2908a74e4561eca44108fbfb9c13d2911a/isoduration-20.11.0-py3-none-any.whl#sha256=b2904c2a4228c3d44f409c8ae8e2370eb21a26f7ac2ec5446df141dde3452042 -# pip jsonschema-specifications @ https://files.pythonhosted.org/packages/d7/48/b62ccba8f4ac91817d6a11b340e63806175dafb10234a8cf7140bd389da5/jsonschema_specifications-2023.11.2-py3-none-any.whl#sha256=e74ba7c0a65e8cb49dc26837d6cfe576557084a8b423ed16a420984228104f93 -# pip jupyter-server-terminals @ https://files.pythonhosted.org/packages/63/9a/98d252b7977ac3aa0aa4152b87b356e2048d4b193f38840c0e00dd85fadc/jupyter_server_terminals-0.5.0-py3-none-any.whl#sha256=2fc0692c883bfd891f4fba0c4b4a684a37234b0ba472f2e97ed0a3888f46e1e4 +# pip jsonschema-specifications @ https://files.pythonhosted.org/packages/ee/07/44bd408781594c4d0a027666ef27fab1e441b109dc3b76b4f836f8fd04fe/jsonschema_specifications-2023.12.1-py3-none-any.whl#sha256=87e4fdf3a94858b8a2ba2778d9ba57d8a9cafca7c7489c46ba0d30a8bc6a9c3c +# pip jupyter-server-terminals @ https://files.pythonhosted.org/packages/13/50/9e4688558eb1a20d16e99171af9026be27d31a8b212c241595241736811a/jupyter_server_terminals-0.5.1-py3-none-any.whl#sha256=5e63e947ddd97bb2832db5ef837a258d9ccd4192cd608c1270850ad947ae5dd7 # pip jupyterlite-core @ https://files.pythonhosted.org/packages/2f/0b/58eb568cbce3bbaa8702c6ce297870402828b222598a1db10e23e7190f52/jupyterlite_core-0.2.1-py3-none-any.whl#sha256=3f6161c4ad609bca913a42598005ff577611daae8dce448292fbb2c15db6b393 # pip pyzmq @ https://files.pythonhosted.org/packages/76/8b/6fca99e22c6316917de32b17be299dea431544209d619da16b6d9ec85c83/pyzmq-25.1.2-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl#sha256=c0b5ca88a8928147b7b1e2dfa09f3b6c256bc1135a1338536cbc9ea13d3b7add # pip argon2-cffi @ https://files.pythonhosted.org/packages/a4/6a/e8a041599e78b6b3752da48000b14c8d1e8a04ded09c88c714ba047f34f5/argon2_cffi-23.1.0-py3-none-any.whl#sha256=c670642b78ba29641818ab2e68bd4e6a78ba53b7eff7b4c3815ae16abf91c7ea @@ -306,7 +306,7 @@ https://conda.anaconda.org/conda-forge/noarch/sphinxext-opengraph-0.9.1-pyhd8ed1 # pip jupyter-events @ https://files.pythonhosted.org/packages/e3/55/0c1aa72f4317e826a471dc4adc3036acd11d496ded68c4bbac2a88551519/jupyter_events-0.9.0-py3-none-any.whl#sha256=d853b3c10273ff9bc8bb8b30076d65e2c9685579db736873de6c2232dde148bf # pip nbformat @ https://files.pythonhosted.org/packages/f4/e7/ef30a90b70eba39e675689b9eaaa92530a71d7435ab8f9cae520814e0caf/nbformat-5.9.2-py3-none-any.whl#sha256=1c5172d786a41b82bcfd0c23f9e6b6f072e8fb49c39250219e4acfff1efe89e9 # pip nbclient @ https://files.pythonhosted.org/packages/6b/3a/607149974149f847125c38a62b9ea2b8267eb74823bbf8d8c54ae0212a00/nbclient-0.9.0-py3-none-any.whl#sha256=a3a1ddfb34d4a9d17fc744d655962714a866639acd30130e9be84191cd97cd15 -# pip nbconvert @ https://files.pythonhosted.org/packages/f4/c8/b2b201d67d8fbe6e33865bf32b84104a77e6ace7f1e12614d686a1130033/nbconvert-7.12.0-py3-none-any.whl#sha256=5b6c848194d270cc55fb691169202620d7b52a12fec259508d142ecbe4219310 +# pip nbconvert @ 
https://files.pythonhosted.org/packages/7f/ba/3a8a9870a8b42e63e8f5e770adedd191d5adc2348f3097fc0e7c83a39439/nbconvert-7.14.0-py3-none-any.whl#sha256=483dde47facdaa4875903d651305ad53cd76e2255ae3c61efe412a95f2d22a24 # pip jupyter-server @ https://files.pythonhosted.org/packages/ed/20/2437a3865083360103b0218e82a910c4c35f3bf7248c5cdae6934ba4d01c/jupyter_server-2.12.1-py3-none-any.whl#sha256=fd030dd7be1ca572e4598203f718df6630c12bd28a599d7f1791c4d7938e1010 # pip jupyterlab-server @ https://files.pythonhosted.org/packages/a2/97/abbbe35fc67b6f9423309988f2e411f7cb117b08321866d3d8b720f4c0d4/jupyterlab_server-2.25.2-py3-none-any.whl#sha256=5b1798c9cc6a44f65c757de9f97fc06fc3d42535afbf47d2ace5e964ab447aaf -# pip jupyterlite-sphinx @ https://files.pythonhosted.org/packages/fa/f9/ad6d7164eca7ab9d523fc9b8c8a4a5508b424ee051f44a01797be224aeaa/jupyterlite_sphinx-0.10.0-py3-none-any.whl#sha256=72f332bf2748902802b719fbce598234e27facfcdc9aec020bf8cf025b12ba62 +# pip jupyterlite-sphinx @ https://files.pythonhosted.org/packages/9c/bd/1695eebeb376315c9fc5cbd41c54fb84bb69c68e69651bfc6f03aa4fe659/jupyterlite_sphinx-0.11.0-py3-none-any.whl#sha256=2a0762167e89ec6acd267c73bb90b528728fdba5e30390ea4fe37ddcec277191 diff --git a/build_tools/circle/doc_min_dependencies_linux-64_conda.lock b/build_tools/circle/doc_min_dependencies_linux-64_conda.lock index 1a990a87c9e46..b105d3629d947 100644 --- a/build_tools/circle/doc_min_dependencies_linux-64_conda.lock +++ b/build_tools/circle/doc_min_dependencies_linux-64_conda.lock @@ -1,6 +1,6 @@ # Generated by conda-lock. # platform: linux-64 -# input_hash: 63e92fdc759dcf030bf7e6d4a5d86bec102c98562cfb7ebd4d3d4991c895678b +# input_hash: 38f0008ad0777e0e6c0aed8337cd71641123af41d0a9025d70195fbb550b1f6f @EXPLICIT https://conda.anaconda.org/conda-forge/linux-64/_libgcc_mutex-0.1-conda_forge.tar.bz2#d7c89558ba9fa0495403155b64376d81 https://conda.anaconda.org/conda-forge/linux-64/ca-certificates-2023.11.17-hbcca054_0.conda#01ffc8d36f9eba0ce0b3c1955fa780ee @@ -14,7 +14,7 @@ https://conda.anaconda.org/conda-forge/noarch/libgcc-devel_linux-64-12.3.0-h8bca https://conda.anaconda.org/conda-forge/noarch/libstdcxx-devel_linux-64-12.3.0-h8bca6fd_103.conda#3f784d2c059e960156d1ab3858cbf200 https://conda.anaconda.org/conda-forge/linux-64/libstdcxx-ng-13.2.0-h7e041cc_3.conda#937eaed008f6bf2191c5fe76f87755e9 https://conda.anaconda.org/conda-forge/linux-64/python_abi-3.9-4_cp39.conda#bfe4b3259a8ac6cdf0037752904da6a7 -https://conda.anaconda.org/conda-forge/noarch/tzdata-2023c-h71feb2d_0.conda#939e3e74d8be4dac89ce83b20de2492a +https://conda.anaconda.org/conda-forge/noarch/tzdata-2023d-h0c530f3_0.conda#8dee24b8be2d9ff81e7bd4d7d97ff1b0 https://conda.anaconda.org/conda-forge/noarch/fonts-conda-forge-1-0.tar.bz2#f766549260d6815b0c52253f1fb1bb29 https://conda.anaconda.org/conda-forge/linux-64/libgomp-13.2.0-h807b86a_3.conda#7124cbb46b13d395bdde68f2d215c989 https://conda.anaconda.org/conda-forge/noarch/sysroot_linux-64-2.12-he073ed8_16.conda#071ea8dceff4d30ac511f4a2f8437cd1 @@ -45,6 +45,7 @@ https://conda.anaconda.org/conda-forge/linux-64/libopus-1.3.1-h7f98852_1.tar.bz2 https://conda.anaconda.org/conda-forge/linux-64/libsanitizer-12.3.0-h0f45ef3_3.conda#eda05ab0db8f8490945fd99244183e3a https://conda.anaconda.org/conda-forge/linux-64/libuuid-2.38.1-h0b41bf4_0.conda#40b61aab5c7ba9ff276c41cfffe6b80b https://conda.anaconda.org/conda-forge/linux-64/libwebp-base-1.3.2-hd590300_0.conda#30de3fd9b3b602f7473f30e684eeea8c 
+https://conda.anaconda.org/conda-forge/linux-64/libxcrypt-4.4.36-hd590300_1.conda#5aa797f8787fe7a17d1b0821485b5adc https://conda.anaconda.org/conda-forge/linux-64/libzlib-1.2.13-hd590300_5.conda#f36c115f1ee199da648e0597ec2047ad https://conda.anaconda.org/conda-forge/linux-64/lz4-c-1.9.4-hcb278e6_0.conda#318b08df404f9c9be5712aaa5a6f0bb0 https://conda.anaconda.org/conda-forge/linux-64/mpg123-1.32.3-h59595ed_0.conda#bdadff838d5437aea83607ced8b37f75 @@ -98,7 +99,7 @@ https://conda.anaconda.org/conda-forge/linux-64/libtiff-4.6.0-ha9c0a0a_2.conda#5 https://conda.anaconda.org/conda-forge/linux-64/llvm-openmp-17.0.6-h4dfa4b3_0.conda#c1665f9c1c9f6c93d8b4e492a6a39056 https://conda.anaconda.org/conda-forge/linux-64/mysql-libs-8.0.33-hca2cd23_6.conda#e87530d1b12dd7f4e0f856dc07358d60 https://conda.anaconda.org/conda-forge/linux-64/nss-3.96-h1d7d5a4_0.conda#1c8f8b8eb041ecd54053fc4b6ad57957 -https://conda.anaconda.org/conda-forge/linux-64/python-3.9.18-h0755675_0_cpython.conda#3ede353bc605068d9677e700b1847382 +https://conda.anaconda.org/conda-forge/linux-64/python-3.9.18-h0755675_1_cpython.conda#255a7002aeec7a067ff19b545aca6328 https://conda.anaconda.org/conda-forge/linux-64/xcb-util-0.4.0-hd590300_1.conda#9bfac7ccd94d54fd21a0501296d60424 https://conda.anaconda.org/conda-forge/linux-64/xcb-util-keysyms-0.4.0-h8ee46fc_1.conda#632413adcd8bc16b515cab87a2932913 https://conda.anaconda.org/conda-forge/linux-64/xcb-util-renderutil-0.3.9-hd590300_1.conda#e995b155d938b6779da6ace6c6b13816 @@ -174,7 +175,7 @@ https://conda.anaconda.org/conda-forge/linux-64/cxx-compiler-1.7.0-h00ab1b0_0.co https://conda.anaconda.org/conda-forge/linux-64/cytoolz-0.12.2-py39hd1e30aa_1.conda#e5b62f0c1f96413116f16d33973f1a44 https://conda.anaconda.org/conda-forge/linux-64/fortran-compiler-1.7.0-heb67821_0.conda#7ef7c0f111dad1c8006504a0f1ccd820 https://conda.anaconda.org/conda-forge/linux-64/glib-2.78.3-hfc55251_0.conda#e08e51acc7d1ae8dbe13255e7b4c64ac -https://conda.anaconda.org/conda-forge/noarch/importlib-metadata-7.0.0-pyha770c72_0.conda#a941237cd06538837b25cd245fcd25d8 +https://conda.anaconda.org/conda-forge/noarch/importlib-metadata-7.0.1-pyha770c72_0.conda#746623a787e06191d80a2133e5daff17 https://conda.anaconda.org/conda-forge/noarch/jinja2-3.1.2-pyhd8ed1ab_1.tar.bz2#c8490ed5c70966d232fdd389d0dbed37 https://conda.anaconda.org/conda-forge/noarch/joblib-1.3.2-pyhd8ed1ab_0.conda#4da50d410f553db77e62ab62ffaa1abc https://conda.anaconda.org/conda-forge/linux-64/libcblas-3.9.0-20_linux64_openblas.conda#36d486d72ab64ffea932329a1d3729a3 @@ -187,14 +188,14 @@ https://conda.anaconda.org/conda-forge/linux-64/pillow-10.1.0-py39had0adad_0.con https://conda.anaconda.org/conda-forge/noarch/pip-23.3.2-pyhd8ed1ab_0.conda#8591c748f98dcc02253003533bc2e4b1 https://conda.anaconda.org/conda-forge/noarch/plotly-5.14.0-pyhd8ed1ab_0.conda#6a7bcc42ef58dd6cf3da9333ea102433 https://conda.anaconda.org/conda-forge/linux-64/pulseaudio-client-16.1-hb77b528_5.conda#ac902ff3c1c6d750dd0dfc93a974ab74 -https://conda.anaconda.org/conda-forge/noarch/pytest-7.4.3-pyhd8ed1ab_0.conda#5bdca0aca30b0ee62bb84854e027eae0 +https://conda.anaconda.org/conda-forge/noarch/pytest-7.4.4-pyhd8ed1ab_0.conda#a9d145de8c5f064b5fa68fb34725d9f4 https://conda.anaconda.org/conda-forge/noarch/python-dateutil-2.8.2-pyhd8ed1ab_0.tar.bz2#dd999d1cc9f79e67dbb855c8924c7984 https://conda.anaconda.org/conda-forge/linux-64/sip-6.7.12-py39h3d6467e_0.conda#e667a3ab0df62c54e60e1843d2e6defb https://conda.anaconda.org/conda-forge/noarch/urllib3-2.1.0-pyhd8ed1ab_0.conda#f8ced8ee63830dec7ecc1be048d1470a 
https://conda.anaconda.org/conda-forge/linux-64/compilers-1.7.0-ha770c72_0.conda#81458b3aed8ab8711951ec3c0c04e097 -https://conda.anaconda.org/conda-forge/linux-64/gstreamer-1.22.7-h98fc4e7_1.conda#a8d71f6705ed1f70d7099a6bd1c078ac +https://conda.anaconda.org/conda-forge/linux-64/gstreamer-1.22.8-h98fc4e7_0.conda#a068fe1588dda3d29f568d536eeebae7 https://conda.anaconda.org/conda-forge/linux-64/harfbuzz-8.3.0-h3d44ed6_0.conda#5a6f6c00ef982a9bc83558d9ac8f64a0 -https://conda.anaconda.org/conda-forge/noarch/importlib_metadata-7.0.0-hd8ed1ab_0.conda#12aff14f84c337be5e5636bf612f4140 +https://conda.anaconda.org/conda-forge/noarch/importlib_metadata-7.0.1-hd8ed1ab_0.conda#4a2f43a20fa404b998859c6a470ba316 https://conda.anaconda.org/conda-forge/linux-64/liblapacke-3.9.0-20_linux64_openblas.conda#05c5862c7dc25e65ba6c471d96429dae https://conda.anaconda.org/conda-forge/linux-64/numpy-1.19.5-py39hd249d9e_3.tar.bz2#0cf333996ebdeeba8d1c8c1c0ee9eff9 https://conda.anaconda.org/conda-forge/linux-64/pyqt5-sip-12.12.2-py39h3d6467e_5.conda#93aff412f3e49fdb43361c0215cbd72d @@ -202,12 +203,12 @@ https://conda.anaconda.org/conda-forge/noarch/pytest-forked-1.6.0-pyhd8ed1ab_0.c https://conda.anaconda.org/conda-forge/noarch/requests-2.31.0-pyhd8ed1ab_0.conda#a30144e4156cdbb236f99ebb49828f8b https://conda.anaconda.org/conda-forge/linux-64/blas-devel-3.9.0-20_linux64_openblas.conda#9932a1d4e9ecf2d35fb19475446e361e https://conda.anaconda.org/conda-forge/noarch/dask-core-2023.12.1-pyhd8ed1ab_0.conda#bf6ad72d882bc3f04e6a0fb50fd2cce8 -https://conda.anaconda.org/conda-forge/linux-64/gst-plugins-base-1.22.7-h8e1006c_1.conda#89cd9374d5fc7371db238e4ef5c5f258 +https://conda.anaconda.org/conda-forge/linux-64/gst-plugins-base-1.22.8-h8e1006c_0.conda#307cf29b6c19238c17182f30ddaf1a50 https://conda.anaconda.org/conda-forge/linux-64/imagecodecs-lite-2019.12.3-py39hd257fcd_5.tar.bz2#32dba66d6abc2b4b5b019c9e54307312 https://conda.anaconda.org/conda-forge/noarch/imageio-2.33.1-pyh8c1a49c_0.conda#1c34d58ac469a34e7e96832861368bce https://conda.anaconda.org/conda-forge/linux-64/matplotlib-base-3.3.4-py39h2fa2bec_0.tar.bz2#9ec0b2186fab9121c54f4844f93ee5b7 https://conda.anaconda.org/conda-forge/linux-64/pandas-1.1.5-py39hde0f152_0.tar.bz2#79fc4b5b3a865b90dd3701cecf1ad33c -https://conda.anaconda.org/conda-forge/noarch/patsy-0.5.4-pyhd8ed1ab_0.conda#1184267eddebb57e47f8e1419c225595 +https://conda.anaconda.org/conda-forge/noarch/patsy-0.5.5-pyhd8ed1ab_0.conda#f266f66ba1dcae0dbcc771a491acbea4 https://conda.anaconda.org/conda-forge/linux-64/polars-0.19.12-py39h90d8ae4_0.conda#191828961c95f8d59fa2b86a590f9905 https://conda.anaconda.org/conda-forge/noarch/pooch-1.8.0-pyhd8ed1ab_0.conda#134b2b57b7865d2316a7cce1915a51ed https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-2.5.0-pyhd8ed1ab_0.tar.bz2#1fdd1f3baccf0deb647385c677a1a48e diff --git a/build_tools/update_environments_and_lock_files.py b/build_tools/update_environments_and_lock_files.py index abd91512759f2..a5b3068d3964b 100644 --- a/build_tools/update_environments_and_lock_files.py +++ b/build_tools/update_environments_and_lock_files.py @@ -306,6 +306,8 @@ def remove_from(alist, to_remove): "conda_dependencies": common_dependencies_without_coverage + [ "scikit-image", "seaborn", + # TODO Remove when patsy pin is not needed anymore, see below + "patsy", "memory_profiler", "compilers", "sphinx", @@ -321,6 +323,10 @@ def remove_from(alist, to_remove): "pip_dependencies": ["jupyterlite-sphinx", "jupyterlite-pyodide-kernel"], "package_constraints": { "python": "3.9", + # TODO: Remove pin when 
issue is fixed in patsy, see + # https://github.com/pydata/patsy/issues/198. patsy 0.5.5 + # introduced a DeprecationWarning at import-time. + "patsy": "0.5.4", }, }, { From b5827cbf7a4c232a77c18b904c0a7cf69c082346 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Lo=C3=AFc=20Est=C3=A8ve?= Date: Thu, 4 Jan 2024 14:28:45 +0100 Subject: [PATCH 011/554] CI Fix scipy-dev build (#28047) --- sklearn/datasets/tests/test_svmlight_format.py | 2 +- sklearn/ensemble/tests/test_forest.py | 2 +- sklearn/metrics/tests/test_dist_metrics.py | 2 +- sklearn/neighbors/tests/test_neighbors.py | 4 ++-- sklearn/utils/_testing.py | 10 +++++----- sklearn/utils/tests/test_class_weight.py | 2 +- sklearn/utils/tests/test_utils.py | 2 +- sklearn/utils/tests/test_validation.py | 18 +++++++++++++++--- 8 files changed, 27 insertions(+), 15 deletions(-) diff --git a/sklearn/datasets/tests/test_svmlight_format.py b/sklearn/datasets/tests/test_svmlight_format.py index 78a006f8f228b..10b0e29810ef7 100644 --- a/sklearn/datasets/tests/test_svmlight_format.py +++ b/sklearn/datasets/tests/test_svmlight_format.py @@ -261,7 +261,7 @@ def test_invalid_filename(): def test_dump(csr_container): X_sparse, y_dense = _load_svmlight_local_test_file(datafile) X_dense = X_sparse.toarray() - y_sparse = csr_container(y_dense) + y_sparse = csr_container(np.atleast_2d(y_dense)) # slicing a csr_matrix can unsort its .indices, so test that we sort # those correctly diff --git a/sklearn/ensemble/tests/test_forest.py b/sklearn/ensemble/tests/test_forest.py index 0eced91418278..2468f8fc5b590 100644 --- a/sklearn/ensemble/tests/test_forest.py +++ b/sklearn/ensemble/tests/test_forest.py @@ -1595,7 +1595,7 @@ def test_max_samples_boundary_classifiers(name): @pytest.mark.parametrize("csr_container", CSR_CONTAINERS) def test_forest_y_sparse(csr_container): X = [[1, 2, 3]] - y = csr_container([4, 5, 6]) + y = csr_container([[4, 5, 6]]) est = RandomForestClassifier() msg = "sparse multilabel-indicator for y is not supported." with pytest.raises(ValueError, match=msg): diff --git a/sklearn/metrics/tests/test_dist_metrics.py b/sklearn/metrics/tests/test_dist_metrics.py index b7b2e04b11396..baaf447d3909b 100644 --- a/sklearn/metrics/tests/test_dist_metrics.py +++ b/sklearn/metrics/tests/test_dist_metrics.py @@ -366,7 +366,7 @@ def test_readonly_kwargs(): (np.array([1, 1.5, np.nan]), ValueError, "w contains NaN"), *[ ( - csr_container([1, 1.5, 1]), + csr_container([[1, 1.5, 1]]), TypeError, "Sparse data was passed for w, but dense data is required", ) diff --git a/sklearn/neighbors/tests/test_neighbors.py b/sklearn/neighbors/tests/test_neighbors.py index 00c53734c9576..2be0237cd5f7e 100644 --- a/sklearn/neighbors/tests/test_neighbors.py +++ b/sklearn/neighbors/tests/test_neighbors.py @@ -476,8 +476,8 @@ def test_is_sorted_by_data(csr_container): # _is_sorted_by_data should return True when entries are sorted by data, # and False in all other cases. 
- # Test with sorted 1D array - X = csr_container(np.arange(10)) + # Test with sorted single row sparse array + X = csr_container(np.arange(10).reshape(1, 10)) assert _is_sorted_by_data(X) # Test with unsorted 1D array X[0, 2] = 5 diff --git a/sklearn/utils/_testing.py b/sklearn/utils/_testing.py index a9ecaa8cd2d9d..5411c4dacf766 100644 --- a/sklearn/utils/_testing.py +++ b/sklearn/utils/_testing.py @@ -765,7 +765,7 @@ def _convert_container( elif constructor_name == "array": return np.asarray(container, dtype=dtype) elif constructor_name == "sparse": - return sp.sparse.csr_matrix(container, dtype=dtype) + return sp.sparse.csr_matrix(np.atleast_2d(container), dtype=dtype) elif constructor_name in ("pandas", "dataframe"): pd = pytest.importorskip("pandas", minversion=minversion) result = pd.DataFrame(container, columns=columns_name, dtype=dtype, copy=False) @@ -803,18 +803,18 @@ def _convert_container( elif constructor_name == "slice": return slice(container[0], container[1]) elif constructor_name == "sparse_csr": - return sp.sparse.csr_matrix(container, dtype=dtype) + return sp.sparse.csr_matrix(np.atleast_2d(container), dtype=dtype) elif constructor_name == "sparse_csr_array": if sp_version >= parse_version("1.8"): - return sp.sparse.csr_array(container, dtype=dtype) + return sp.sparse.csr_array(np.atleast_2d(container), dtype=dtype) raise ValueError( f"sparse_csr_array is only available with scipy>=1.8.0, got {sp_version}" ) elif constructor_name == "sparse_csc": - return sp.sparse.csc_matrix(container, dtype=dtype) + return sp.sparse.csc_matrix(np.atleast_2d(container), dtype=dtype) elif constructor_name == "sparse_csc_array": if sp_version >= parse_version("1.8"): - return sp.sparse.csc_array(container, dtype=dtype) + return sp.sparse.csc_array(np.atleast_2d(container), dtype=dtype) raise ValueError( f"sparse_csc_array is only available with scipy>=1.8.0, got {sp_version}" ) diff --git a/sklearn/utils/tests/test_class_weight.py b/sklearn/utils/tests/test_class_weight.py index d1deeae8ebd20..b98ce6be05658 100644 --- a/sklearn/utils/tests/test_class_weight.py +++ b/sklearn/utils/tests/test_class_weight.py @@ -311,6 +311,6 @@ def test_class_weight_does_not_contains_more_classes(): @pytest.mark.parametrize("csc_container", CSC_CONTAINERS) def test_compute_sample_weight_sparse(csc_container): """Check that we can compute weight for sparse `y`.""" - y = csc_container(np.asarray([0, 1, 1])).T + y = csc_container(np.asarray([[0], [1], [1]])) sample_weight = compute_sample_weight("balanced", y) assert_allclose(sample_weight, [1.5, 0.75, 0.75]) diff --git a/sklearn/utils/tests/test_utils.py b/sklearn/utils/tests/test_utils.py index 89ab73582cefc..5f3fe72c0f7ef 100644 --- a/sklearn/utils/tests/test_utils.py +++ b/sklearn/utils/tests/test_utils.py @@ -168,7 +168,7 @@ def test_resample_stratify_sparse_error(csr_container): n_samples = 100 X = rng.normal(size=(n_samples, 2)) y = rng.randint(0, 2, size=n_samples) - stratify = csr_container(y) + stratify = csr_container(y.reshape(-1, 1)) with pytest.raises(TypeError, match="Sparse data was passed"): X, y = resample(X, y, n_samples=50, random_state=rng, stratify=stratify) diff --git a/sklearn/utils/tests/test_validation.py b/sklearn/utils/tests/test_validation.py index 1f847dbd55d62..b627c55a7ef12 100644 --- a/sklearn/utils/tests/test_validation.py +++ b/sklearn/utils/tests/test_validation.py @@ -639,9 +639,21 @@ def test_check_array_accept_sparse_no_exception(): @pytest.fixture(params=["csr", "csc", "coo", "bsr"]) def X_64bit(request): X = 
sp.rand(20, 10, format=request.param)
-    for attr in ["indices", "indptr", "row", "col"]:
-        if hasattr(X, attr):
-            setattr(X, attr, getattr(X, attr).astype("int64"))
+
+    if request.param == "coo":
+        if hasattr(X, "indices"):
+            # for scipy >= 1.13 .indices is a new attribute and is a tuple. The
+            # .col and .row attributes do not seem to be able to change the
+            # dtype, for more details see https://github.com/scipy/scipy/pull/18530/
+            X.indices = tuple(v.astype("int64") for v in X.indices)
+        else:
+            # scipy < 1.13
+            X.row = X.row.astype("int64")
+            X.col = X.col.astype("int64")
+    else:
+        X.indices = X.indices.astype("int64")
+        X.indptr = X.indptr.astype("int64")
+
     yield X

From e2b3785b6a1f96989c3992bffa2a05ef5c048a7e Mon Sep 17 00:00:00 2001
From: Mark Elliot <123787712+mark-thm@users.noreply.github.com>
Date: Thu, 4 Jan 2024 10:04:30 -0500
Subject: [PATCH 012/554] FEAT Add custom imputation strategy to SimpleImputer
 (#28053)

Co-authored-by: Joel Nothman
---
 doc/whats_new/v1.5.rst              |  6 +++++
 sklearn/impute/_base.py             | 24 ++++++++++++++++++--
 sklearn/impute/tests/test_impute.py | 34 +++++++++++++++++++++++++++++
 3 files changed, 62 insertions(+), 2 deletions(-)

diff --git a/doc/whats_new/v1.5.rst b/doc/whats_new/v1.5.rst
index fbd8a3f83b1dd..f7a521ca4f0d0 100644
--- a/doc/whats_new/v1.5.rst
+++ b/doc/whats_new/v1.5.rst
@@ -25,6 +25,12 @@ Changelog
 :pr:`123456` by :user:`Joe Bloggs `. where 123455 is the
 *pull request* number, not the issue number.
 
+:mod:`sklearn.impute`
+.....................
+- |Enhancement| :class:`impute.SimpleImputer` now supports custom strategies
+  by passing a function in place of a strategy name.
+  :pr:`28053` by :user:`Mark Elliot <mark-thm>`.
+
 Code and Documentation Contributors
 -----------------------------------
 
diff --git a/sklearn/impute/_base.py b/sklearn/impute/_base.py
index 4202cd5feb799..dff39d4734554 100644
--- a/sklearn/impute/_base.py
+++ b/sklearn/impute/_base.py
@@ -6,6 +6,7 @@
 import warnings
 from collections import Counter
 from functools import partial
+from typing import Callable
 
 import numpy as np
 import numpy.ma as ma
@@ -163,7 +164,7 @@ class SimpleImputer(_BaseImputer):
         nullable integer dtypes with missing values, `missing_values`
         can be set to either `np.nan` or `pd.NA`.
 
-    strategy : str, default='mean'
+    strategy : str or Callable, default='mean'
         The imputation strategy.
 
         - If "mean", then replace missing values using the mean along
@@ -175,10 +176,16 @@
           If there is more than one such value, only the smallest is returned.
         - If "constant", then replace missing values with fill_value. Can be
           used with strings or numeric data.
+        - If an instance of Callable, then replace missing values using the
+          scalar statistic returned by running the callable over a dense 1d
+          array containing non-missing values of each column.
 
         .. versionadded:: 0.20
           strategy="constant" for fixed value imputation.
 
+        .. versionadded:: 1.5
+          strategy=callable for custom value imputation.
+
    fill_value : str or numerical value, default=None
        When strategy == "constant", `fill_value` is used to replace all
        occurrences of missing_values. 
For string or object data types, @@ -270,7 +277,10 @@ class SimpleImputer(_BaseImputer): _parameter_constraints: dict = { **_BaseImputer._parameter_constraints, - "strategy": [StrOptions({"mean", "median", "most_frequent", "constant"})], + "strategy": [ + StrOptions({"mean", "median", "most_frequent", "constant"}), + callable, + ], "fill_value": "no_validation", # any object is valid "copy": ["boolean"], } @@ -456,6 +466,9 @@ def _sparse_fit(self, X, strategy, missing_values, fill_value): elif strategy == "most_frequent": statistics[i] = _most_frequent(column, 0, n_zeros) + elif isinstance(strategy, Callable): + statistics[i] = self.strategy(column) + super()._fit_indicator(missing_mask) return statistics @@ -518,6 +531,13 @@ def _dense_fit(self, X, strategy, missing_values, fill_value): # fill_value in each column return np.full(X.shape[1], fill_value, dtype=X.dtype) + # Custom + elif isinstance(strategy, Callable): + statistics = np.empty(masked_X.shape[1]) + for i in range(masked_X.shape[1]): + statistics[i] = self.strategy(masked_X[:, i].compressed()) + return statistics + def transform(self, X): """Impute all missing values in `X`. diff --git a/sklearn/impute/tests/test_impute.py b/sklearn/impute/tests/test_impute.py index c499dc3b89d32..2128c796e4800 100644 --- a/sklearn/impute/tests/test_impute.py +++ b/sklearn/impute/tests/test_impute.py @@ -1710,3 +1710,37 @@ def test_simple_imputer_keep_empty_features(strategy, array_type, keep_empty_fea assert_array_equal(constant_feature, 0) else: assert X_imputed.shape == (X.shape[0], X.shape[1] - 1) + + +@pytest.mark.parametrize("csc_container", CSC_CONTAINERS) +def test_imputation_custom(csc_container): + X = np.array( + [ + [1.1, 1.1, 1.1], + [3.9, 1.2, np.nan], + [np.nan, 1.3, np.nan], + [0.1, 1.4, 1.4], + [4.9, 1.5, 1.5], + [np.nan, 1.6, 1.6], + ] + ) + + X_true = np.array( + [ + [1.1, 1.1, 1.1], + [3.9, 1.2, 1.1], + [0.1, 1.3, 1.1], + [0.1, 1.4, 1.4], + [4.9, 1.5, 1.5], + [0.1, 1.6, 1.6], + ] + ) + + imputer = SimpleImputer(missing_values=np.nan, strategy=np.min) + X_trans = imputer.fit_transform(X) + assert_array_equal(X_trans, X_true) + + # Sparse matrix + imputer = SimpleImputer(missing_values=np.nan, strategy=np.min) + X_trans = imputer.fit_transform(csc_container(X)) + assert_array_equal(X_trans.toarray(), X_true) From 3c759c599090cc3a695d84e737e320767d45cb96 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Lo=C3=AFc=20Est=C3=A8ve?= Date: Thu, 4 Jan 2024 20:58:45 +0100 Subject: [PATCH 013/554] CI Remove unused mkl_no_coverage lock file (#28061) --- ...onda_forge_mkl_no_coverage_environment.yml | 21 --- ..._forge_mkl_no_coverage_linux-64_conda.lock | 169 ------------------ .../update_environments_and_lock_files.py | 10 -- 3 files changed, 200 deletions(-) delete mode 100644 build_tools/azure/pylatest_conda_forge_mkl_no_coverage_environment.yml delete mode 100644 build_tools/azure/pylatest_conda_forge_mkl_no_coverage_linux-64_conda.lock diff --git a/build_tools/azure/pylatest_conda_forge_mkl_no_coverage_environment.yml b/build_tools/azure/pylatest_conda_forge_mkl_no_coverage_environment.yml deleted file mode 100644 index 02392a4e05aa8..0000000000000 --- a/build_tools/azure/pylatest_conda_forge_mkl_no_coverage_environment.yml +++ /dev/null @@ -1,21 +0,0 @@ -# DO NOT EDIT: this file is generated from the specification found in the -# following script to centralize the configuration for CI builds: -# build_tools/update_environments_and_lock_files.py -channels: - - conda-forge -dependencies: - - python - - numpy - - blas[build=mkl] - - scipy - - 
cython - - joblib - - threadpoolctl - - matplotlib - - pandas - - pyamg - - pytest - - pytest-xdist=2.5.0 - - pillow - - setuptools - - ccache diff --git a/build_tools/azure/pylatest_conda_forge_mkl_no_coverage_linux-64_conda.lock b/build_tools/azure/pylatest_conda_forge_mkl_no_coverage_linux-64_conda.lock deleted file mode 100644 index 1f4ef37ac52c2..0000000000000 --- a/build_tools/azure/pylatest_conda_forge_mkl_no_coverage_linux-64_conda.lock +++ /dev/null @@ -1,169 +0,0 @@ -# Generated by conda-lock. -# platform: linux-64 -# input_hash: 66cbc7b263fbf4db3cc89cc53f522739390cbf324ab81cff43bff8bd3630c49d -@EXPLICIT -https://conda.anaconda.org/conda-forge/linux-64/_libgcc_mutex-0.1-conda_forge.tar.bz2#d7c89558ba9fa0495403155b64376d81 -https://conda.anaconda.org/conda-forge/linux-64/ca-certificates-2023.11.17-hbcca054_0.conda#01ffc8d36f9eba0ce0b3c1955fa780ee -https://conda.anaconda.org/conda-forge/noarch/font-ttf-dejavu-sans-mono-2.37-hab24e00_0.tar.bz2#0c96522c6bdaed4b1566d11387caaf45 -https://conda.anaconda.org/conda-forge/noarch/font-ttf-inconsolata-3.000-h77eed37_0.tar.bz2#34893075a5c9e55cdafac56607368fc6 -https://conda.anaconda.org/conda-forge/noarch/font-ttf-source-code-pro-2.038-h77eed37_0.tar.bz2#4d59c254e01d9cde7957100457e2d5fb -https://conda.anaconda.org/conda-forge/noarch/font-ttf-ubuntu-0.83-h77eed37_1.conda#6185f640c43843e5ad6fd1c5372c3f80 -https://conda.anaconda.org/conda-forge/linux-64/ld_impl_linux-64-2.40-h41732ed_0.conda#7aca3059a1729aa76c597603f10b0dd3 -https://conda.anaconda.org/conda-forge/linux-64/libstdcxx-ng-13.2.0-h7e041cc_3.conda#937eaed008f6bf2191c5fe76f87755e9 -https://conda.anaconda.org/conda-forge/linux-64/mkl-include-2023.2.0-h84fe81f_50496.conda#7af9fd0b2d7219f4a4200a34561340f6 -https://conda.anaconda.org/conda-forge/linux-64/python_abi-3.12-4_cp312.conda#dccc2d142812964fcc6abdc97b672dff -https://conda.anaconda.org/conda-forge/noarch/tzdata-2023d-h0c530f3_0.conda#8dee24b8be2d9ff81e7bd4d7d97ff1b0 -https://conda.anaconda.org/conda-forge/noarch/fonts-conda-forge-1-0.tar.bz2#f766549260d6815b0c52253f1fb1bb29 -https://conda.anaconda.org/conda-forge/noarch/fonts-conda-ecosystem-1-0.tar.bz2#fee5683a3f04bd15cbd8318b096a27ab -https://conda.anaconda.org/conda-forge/linux-64/_openmp_mutex-4.5-2_kmp_llvm.tar.bz2#562b26ba2e19059551a811e72ab7f793 -https://conda.anaconda.org/conda-forge/linux-64/libgcc-ng-13.2.0-h807b86a_3.conda#23fdf1fef05baeb7eadc2aed5fb0011f -https://conda.anaconda.org/conda-forge/linux-64/alsa-lib-1.2.10-hd590300_0.conda#75dae9a4201732aa78a530b826ee5fe0 -https://conda.anaconda.org/conda-forge/linux-64/attr-2.5.1-h166bdaf_1.tar.bz2#d9c69a24ad678ffce24c6543a0176b00 -https://conda.anaconda.org/conda-forge/linux-64/bzip2-1.0.8-hd590300_5.conda#69b8b6202a07720f448be700e300ccf4 -https://conda.anaconda.org/conda-forge/linux-64/gettext-0.21.1-h27087fc_0.tar.bz2#14947d8770185e5153fdd04d4673ed37 -https://conda.anaconda.org/conda-forge/linux-64/graphite2-1.3.13-h58526e2_1001.tar.bz2#8c54672728e8ec6aa6db90cf2806d220 -https://conda.anaconda.org/conda-forge/linux-64/icu-73.2-h59595ed_0.conda#cc47e1facc155f91abd89b11e48e72ff -https://conda.anaconda.org/conda-forge/linux-64/keyutils-1.6.1-h166bdaf_0.tar.bz2#30186d27e2c9fa62b45fb1476b7200e3 -https://conda.anaconda.org/conda-forge/linux-64/lame-3.100-h166bdaf_1003.tar.bz2#a8832b479f93521a9e7b5b743803be51 -https://conda.anaconda.org/conda-forge/linux-64/lerc-4.0.0-h27087fc_0.tar.bz2#76bbff344f0134279f225174e9064c8f 
-https://conda.anaconda.org/conda-forge/linux-64/libbrotlicommon-1.1.0-hd590300_1.conda#aec6c91c7371c26392a06708a73c70e5 -https://conda.anaconda.org/conda-forge/linux-64/libdeflate-1.19-hd590300_0.conda#1635570038840ee3f9c71d22aa5b8b6d -https://conda.anaconda.org/conda-forge/linux-64/libexpat-2.5.0-hcb278e6_1.conda#6305a3dd2752c76335295da4e581f2fd -https://conda.anaconda.org/conda-forge/linux-64/libffi-3.4.2-h7f98852_5.tar.bz2#d645c6d2ac96843a2bfaccd2d62b3ac3 -https://conda.anaconda.org/conda-forge/linux-64/libgfortran5-13.2.0-ha4646dd_3.conda#c714d905cdfa0e70200f68b80cc04764 -https://conda.anaconda.org/conda-forge/linux-64/libiconv-1.17-hd590300_2.conda#d66573916ffcf376178462f1b61c941e -https://conda.anaconda.org/conda-forge/linux-64/libjpeg-turbo-3.0.0-hd590300_1.conda#ea25936bb4080d843790b586850f82b8 -https://conda.anaconda.org/conda-forge/linux-64/libnsl-2.0.1-hd590300_0.conda#30fd6e37fe21f86f4bd26d6ee73eeec7 -https://conda.anaconda.org/conda-forge/linux-64/libogg-1.3.4-h7f98852_1.tar.bz2#6e8cc2173440d77708196c5b93771680 -https://conda.anaconda.org/conda-forge/linux-64/libopus-1.3.1-h7f98852_1.tar.bz2#15345e56d527b330e1cacbdf58676e8f -https://conda.anaconda.org/conda-forge/linux-64/libuuid-2.38.1-h0b41bf4_0.conda#40b61aab5c7ba9ff276c41cfffe6b80b -https://conda.anaconda.org/conda-forge/linux-64/libwebp-base-1.3.2-hd590300_0.conda#30de3fd9b3b602f7473f30e684eeea8c -https://conda.anaconda.org/conda-forge/linux-64/libxcrypt-4.4.36-hd590300_1.conda#5aa797f8787fe7a17d1b0821485b5adc -https://conda.anaconda.org/conda-forge/linux-64/libzlib-1.2.13-hd590300_5.conda#f36c115f1ee199da648e0597ec2047ad -https://conda.anaconda.org/conda-forge/linux-64/lz4-c-1.9.4-hcb278e6_0.conda#318b08df404f9c9be5712aaa5a6f0bb0 -https://conda.anaconda.org/conda-forge/linux-64/mpg123-1.32.3-h59595ed_0.conda#bdadff838d5437aea83607ced8b37f75 -https://conda.anaconda.org/conda-forge/linux-64/ncurses-6.4-h59595ed_2.conda#7dbaa197d7ba6032caf7ae7f32c1efa0 -https://conda.anaconda.org/conda-forge/linux-64/nspr-4.35-h27087fc_0.conda#da0ec11a6454ae19bff5b02ed881a2b1 -https://conda.anaconda.org/conda-forge/linux-64/openssl-3.2.0-hd590300_1.conda#603827b39ea2b835268adb8c821b8570 -https://conda.anaconda.org/conda-forge/linux-64/pixman-0.42.2-h59595ed_0.conda#700edd63ccd5fc66b70b1c028cea9a68 -https://conda.anaconda.org/conda-forge/linux-64/pthread-stubs-0.4-h36c2ea0_1001.tar.bz2#22dad4df6e8630e8dff2428f6f6a7036 -https://conda.anaconda.org/conda-forge/linux-64/tbb-2021.7.0-h924138e_0.tar.bz2#819421f81b127a5547bf96ad57eccdd9 -https://conda.anaconda.org/conda-forge/linux-64/xorg-kbproto-1.0.7-h7f98852_1002.tar.bz2#4b230e8381279d76131116660f5a241a -https://conda.anaconda.org/conda-forge/linux-64/xorg-libice-1.1.1-hd590300_0.conda#b462a33c0be1421532f28bfe8f4a7514 -https://conda.anaconda.org/conda-forge/linux-64/xorg-libxau-1.0.11-hd590300_0.conda#2c80dc38fface310c9bd81b17037fee5 -https://conda.anaconda.org/conda-forge/linux-64/xorg-libxdmcp-1.1.3-h7f98852_0.tar.bz2#be93aabceefa2fac576e971aef407908 -https://conda.anaconda.org/conda-forge/linux-64/xorg-renderproto-0.11.1-h7f98852_1002.tar.bz2#06feff3d2634e3097ce2fe681474b534 -https://conda.anaconda.org/conda-forge/linux-64/xorg-xextproto-7.3.0-h0b41bf4_1003.conda#bce9f945da8ad2ae9b1d7165a64d0f87 -https://conda.anaconda.org/conda-forge/linux-64/xorg-xf86vidmodeproto-2.3.1-h7f98852_1002.tar.bz2#3ceea9668625c18f19530de98b15d5b0 -https://conda.anaconda.org/conda-forge/linux-64/xorg-xproto-7.0.31-h7f98852_1007.tar.bz2#b4a4381d54784606820704f7b5f05a15 
-https://conda.anaconda.org/conda-forge/linux-64/xz-5.2.6-h166bdaf_0.tar.bz2#2161070d867d1b1204ea749c8eec4ef0 -https://conda.anaconda.org/conda-forge/linux-64/expat-2.5.0-hcb278e6_1.conda#8b9b5aca60558d02ddaa09d599e55920 -https://conda.anaconda.org/conda-forge/linux-64/libbrotlidec-1.1.0-hd590300_1.conda#f07002e225d7a60a694d42a7bf5ff53f -https://conda.anaconda.org/conda-forge/linux-64/libbrotlienc-1.1.0-hd590300_1.conda#5fc11c6020d421960607d821310fcd4d -https://conda.anaconda.org/conda-forge/linux-64/libcap-2.69-h0f662aa_0.conda#25cb5999faa414e5ccb2c1388f62d3d5 -https://conda.anaconda.org/conda-forge/linux-64/libedit-3.1.20191231-he28a2e2_2.tar.bz2#4d331e44109e3f0e19b4cb8f9b82f3e1 -https://conda.anaconda.org/conda-forge/linux-64/libevent-2.1.12-hf998b51_1.conda#a1cfcc585f0c42bf8d5546bb1dfb668d -https://conda.anaconda.org/conda-forge/linux-64/libflac-1.4.3-h59595ed_0.conda#ee48bf17cc83a00f59ca1494d5646869 -https://conda.anaconda.org/conda-forge/linux-64/libgfortran-ng-13.2.0-h69a702a_3.conda#73031c79546ad06f1fe62e57fdd021bc -https://conda.anaconda.org/conda-forge/linux-64/libgpg-error-1.47-h71f35ed_0.conda#c2097d0b46367996f09b4e8e4920384a -https://conda.anaconda.org/conda-forge/linux-64/libpng-1.6.39-h753d276_0.conda#e1c890aebdebbfbf87e2c917187b4416 -https://conda.anaconda.org/conda-forge/linux-64/libsqlite-3.44.2-h2797004_0.conda#3b6a9f225c3dbe0d24f4fedd4625c5bf -https://conda.anaconda.org/conda-forge/linux-64/libvorbis-1.3.7-h9c3ff4c_0.tar.bz2#309dec04b70a3cc0f1e84a4013683bc0 -https://conda.anaconda.org/conda-forge/linux-64/libxcb-1.15-h0b41bf4_0.conda#33277193f5b92bad9fdd230eb700929c -https://conda.anaconda.org/conda-forge/linux-64/libxml2-2.12.3-h232c23b_0.conda#bc6ac4c0cea148d924f621985bc3892b -https://conda.anaconda.org/conda-forge/linux-64/mysql-common-8.0.33-hf1915f5_6.conda#80bf3b277c120dd294b51d404b931a75 -https://conda.anaconda.org/conda-forge/linux-64/pcre2-10.42-hcad00b1_0.conda#679c8961826aa4b50653bce17ee52abe -https://conda.anaconda.org/conda-forge/linux-64/readline-8.2-h8228510_1.conda#47d31b792659ce70f470b5c82fdfb7a4 -https://conda.anaconda.org/conda-forge/linux-64/tk-8.6.13-noxft_h4845f30_101.conda#d453b98d9c83e71da0741bb0ff4d76bc -https://conda.anaconda.org/conda-forge/linux-64/xorg-libsm-1.2.4-h7391055_0.conda#93ee23f12bc2e684548181256edd2cf6 -https://conda.anaconda.org/conda-forge/linux-64/zlib-1.2.13-hd590300_5.conda#68c34ec6149623be41a1933ab996a209 -https://conda.anaconda.org/conda-forge/linux-64/zstd-1.5.5-hfc55251_0.conda#04b88013080254850d6c01ed54810589 -https://conda.anaconda.org/conda-forge/linux-64/brotli-bin-1.1.0-hd590300_1.conda#39f910d205726805a958da408ca194ba -https://conda.anaconda.org/conda-forge/linux-64/freetype-2.12.1-h267a509_2.conda#9ae35c3d96db2c94ce0cef86efdfa2cb -https://conda.anaconda.org/conda-forge/linux-64/krb5-1.21.2-h659d440_0.conda#cd95826dbd331ed1be26bdf401432844 -https://conda.anaconda.org/conda-forge/linux-64/libgcrypt-1.10.3-hd590300_0.conda#32d16ad533c59bb0a3c5ffaf16110829 -https://conda.anaconda.org/conda-forge/linux-64/libglib-2.78.3-h783c2da_0.conda#9bd06b12bbfa6fd1740fd23af4b0f0c7 -https://conda.anaconda.org/conda-forge/linux-64/libhiredis-1.0.2-h2cc385e_0.tar.bz2#b34907d3a81a3cd8095ee83d174c074a -https://conda.anaconda.org/conda-forge/linux-64/libllvm15-15.0.7-hb3ce162_4.conda#8a35df3cbc0c8b12cc8af9473ae75eef -https://conda.anaconda.org/conda-forge/linux-64/libsndfile-1.2.2-hc60ed4a_1.conda#ef1910918dd895516a769ed36b5b3a4e 
-https://conda.anaconda.org/conda-forge/linux-64/libtiff-4.6.0-ha9c0a0a_2.conda#55ed21669b2015f77c180feb1dd41930 -https://conda.anaconda.org/conda-forge/linux-64/llvm-openmp-17.0.6-h4dfa4b3_0.conda#c1665f9c1c9f6c93d8b4e492a6a39056 -https://conda.anaconda.org/conda-forge/linux-64/mysql-libs-8.0.33-hca2cd23_6.conda#e87530d1b12dd7f4e0f856dc07358d60 -https://conda.anaconda.org/conda-forge/linux-64/nss-3.96-h1d7d5a4_0.conda#1c8f8b8eb041ecd54053fc4b6ad57957 -https://conda.anaconda.org/conda-forge/linux-64/python-3.12.1-hab00c5b_1_cpython.conda#0bab699354cbd66959550eb9b9866620 -https://conda.anaconda.org/conda-forge/linux-64/xcb-util-0.4.0-hd590300_1.conda#9bfac7ccd94d54fd21a0501296d60424 -https://conda.anaconda.org/conda-forge/linux-64/xcb-util-keysyms-0.4.0-h8ee46fc_1.conda#632413adcd8bc16b515cab87a2932913 -https://conda.anaconda.org/conda-forge/linux-64/xcb-util-renderutil-0.3.9-hd590300_1.conda#e995b155d938b6779da6ace6c6b13816 -https://conda.anaconda.org/conda-forge/linux-64/xcb-util-wm-0.4.1-h8ee46fc_1.conda#90108a432fb5c6150ccfee3f03388656 -https://conda.anaconda.org/conda-forge/linux-64/xorg-libx11-1.8.7-h8ee46fc_0.conda#49e482d882669206653b095f5206c05b -https://conda.anaconda.org/conda-forge/linux-64/brotli-1.1.0-hd590300_1.conda#f27a24d46e3ea7b70a1f98e50c62508f -https://conda.anaconda.org/conda-forge/linux-64/ccache-4.8.1-h1fcd64f_0.conda#fd37a0c47d8b3667b73af0549037ce83 -https://conda.anaconda.org/conda-forge/noarch/certifi-2023.11.17-pyhd8ed1ab_0.conda#2011bcf45376341dd1d690263fdbc789 -https://conda.anaconda.org/conda-forge/noarch/colorama-0.4.6-pyhd8ed1ab_0.tar.bz2#3faab06a954c2a04039983f2c4a50d99 -https://conda.anaconda.org/conda-forge/noarch/cycler-0.12.1-pyhd8ed1ab_0.conda#5cd86562580f274031ede6aa6aa24441 -https://conda.anaconda.org/conda-forge/linux-64/cython-3.0.7-py312h30efb56_0.conda#2b97b8193bd02c72ebd57c5bf88a0457 -https://conda.anaconda.org/conda-forge/linux-64/dbus-1.13.6-h5008d03_3.tar.bz2#ecfff944ba3960ecb334b9a2663d708d -https://conda.anaconda.org/conda-forge/noarch/exceptiongroup-1.2.0-pyhd8ed1ab_0.conda#f6c211fee3c98229652b60a9a42ef363 -https://conda.anaconda.org/conda-forge/noarch/execnet-2.0.2-pyhd8ed1ab_0.conda#67de0d8241e1060a479e3c37793e26f9 -https://conda.anaconda.org/conda-forge/linux-64/fontconfig-2.14.2-h14ed4e7_0.conda#0f69b688f52ff6da70bccb7ff7001d1d -https://conda.anaconda.org/conda-forge/linux-64/glib-tools-2.78.3-hfc55251_0.conda#41d2f46e0ac8372eeb959860713d9b21 -https://conda.anaconda.org/conda-forge/noarch/iniconfig-2.0.0-pyhd8ed1ab_0.conda#f800d2da156d08e289b14e87e43c1ae5 -https://conda.anaconda.org/conda-forge/linux-64/kiwisolver-1.4.5-py312h8572e83_1.conda#c1e71f2bc05d8e8e033aefac2c490d05 -https://conda.anaconda.org/conda-forge/linux-64/lcms2-2.16-hb7c19ff_0.conda#51bb7010fc86f70eee639b4bb7a894f5 -https://conda.anaconda.org/conda-forge/linux-64/libclang13-15.0.7-default_ha2b6cf4_4.conda#898e0dd993afbed0d871b60c2eb33b83 -https://conda.anaconda.org/conda-forge/linux-64/libcups-2.3.3-h4637d8d_4.conda#d4529f4dff3057982a7617c7ac58fde3 -https://conda.anaconda.org/conda-forge/linux-64/libpq-16.1-h33b98f1_7.conda#675317e46167caea24542d85c72f19a3 -https://conda.anaconda.org/conda-forge/linux-64/libsystemd0-255-h3516f8a_0.conda#24e2649ebd432e652aa72cfd05f23a8e -https://conda.anaconda.org/conda-forge/linux-64/mkl-2023.2.0-h84fe81f_50496.conda#81d4a1a57d618adf0152db973d93b2ad -https://conda.anaconda.org/conda-forge/noarch/munkres-1.1.4-pyh9f0ad1d_0.tar.bz2#2ba8498c1018c1e9c61eb99b973dfe19 
-https://conda.anaconda.org/conda-forge/linux-64/openjpeg-2.5.0-h488ebb8_3.conda#128c25b7fe6a25286a48f3a6a9b5b6f3 -https://conda.anaconda.org/conda-forge/noarch/packaging-23.2-pyhd8ed1ab_0.conda#79002079284aa895f883c6b7f3f88fd6 -https://conda.anaconda.org/conda-forge/noarch/pluggy-1.3.0-pyhd8ed1ab_0.conda#2390bd10bed1f3fdc7a537fb5a447d8d -https://conda.anaconda.org/conda-forge/noarch/ply-3.11-py_1.tar.bz2#7205635cd71531943440fbfe3b6b5727 -https://conda.anaconda.org/conda-forge/noarch/py-1.11.0-pyh6c4a22f_0.tar.bz2#b4613d7e7a493916d867842a6a148054 -https://conda.anaconda.org/conda-forge/noarch/pyparsing-3.1.1-pyhd8ed1ab_0.conda#176f7d56f0cfe9008bdf1bccd7de02fb -https://conda.anaconda.org/conda-forge/noarch/python-tzdata-2023.4-pyhd8ed1ab_0.conda#c79cacf8a06a51552fc651652f170208 -https://conda.anaconda.org/conda-forge/noarch/pytz-2023.3.post1-pyhd8ed1ab_0.conda#c93346b446cd08c169d843ae5fc0da97 -https://conda.anaconda.org/conda-forge/noarch/setuptools-68.2.2-pyhd8ed1ab_0.conda#fc2166155db840c634a1291a5c35a709 -https://conda.anaconda.org/conda-forge/noarch/six-1.16.0-pyh6c4a22f_0.tar.bz2#e5f25f8dbc060e9a8d912e432202afc2 -https://conda.anaconda.org/conda-forge/noarch/threadpoolctl-3.2.0-pyha21a80b_0.conda#978d03388b62173b8e6f79162cf52b86 -https://conda.anaconda.org/conda-forge/noarch/toml-0.10.2-pyhd8ed1ab_0.tar.bz2#f832c45a477c78bebd107098db465095 -https://conda.anaconda.org/conda-forge/noarch/tomli-2.0.1-pyhd8ed1ab_0.tar.bz2#5844808ffab9ebdb694585b50ba02a96 -https://conda.anaconda.org/conda-forge/linux-64/tornado-6.3.3-py312h98912ed_1.conda#5bd63a3bf512694536cee3e48463a47c -https://conda.anaconda.org/conda-forge/linux-64/xcb-util-image-0.4.0-h8ee46fc_1.conda#9d7bcddf49cbf727730af10e71022c73 -https://conda.anaconda.org/conda-forge/linux-64/xkeyboard-config-2.40-hd590300_0.conda#07c15d846a2e4d673da22cbd85fdb6d2 -https://conda.anaconda.org/conda-forge/linux-64/xorg-libxext-1.3.4-h0b41bf4_2.conda#82b6df12252e6f32402b96dacc656fec -https://conda.anaconda.org/conda-forge/linux-64/xorg-libxrender-0.9.11-hd590300_0.conda#ed67c36f215b310412b2af935bf3e530 -https://conda.anaconda.org/conda-forge/linux-64/cairo-1.18.0-h3faef2a_0.conda#f907bb958910dc404647326ca80c263e -https://conda.anaconda.org/conda-forge/linux-64/fonttools-4.47.0-py312h98912ed_0.conda#37998571aee0938fff9047691bda0b26 -https://conda.anaconda.org/conda-forge/linux-64/glib-2.78.3-hfc55251_0.conda#e08e51acc7d1ae8dbe13255e7b4c64ac -https://conda.anaconda.org/conda-forge/noarch/joblib-1.3.2-pyhd8ed1ab_0.conda#4da50d410f553db77e62ab62ffaa1abc -https://conda.anaconda.org/conda-forge/linux-64/libblas-3.9.0-20_linux64_mkl.conda#8bf521f6007b0b0eb91515a1165b5d85 -https://conda.anaconda.org/conda-forge/linux-64/libclang-15.0.7-default_hb11cfb5_4.conda#c90f4cbb57839c98fef8f830e4b9972f -https://conda.anaconda.org/conda-forge/linux-64/libxkbcommon-1.6.0-hd429924_1.conda#1dbcc04604fdf1e526e6d1b0b6938396 -https://conda.anaconda.org/conda-forge/linux-64/mkl-devel-2023.2.0-ha770c72_50496.conda#3b4c50e31ff098b18a450e4f5f860adf -https://conda.anaconda.org/conda-forge/linux-64/pillow-10.1.0-py312hf3581a9_0.conda#c04d3de9d831a69a5fdfab1413ec2fb6 -https://conda.anaconda.org/conda-forge/linux-64/pulseaudio-client-16.1-hb77b528_5.conda#ac902ff3c1c6d750dd0dfc93a974ab74 -https://conda.anaconda.org/conda-forge/noarch/pytest-7.4.4-pyhd8ed1ab_0.conda#a9d145de8c5f064b5fa68fb34725d9f4 -https://conda.anaconda.org/conda-forge/noarch/python-dateutil-2.8.2-pyhd8ed1ab_0.tar.bz2#dd999d1cc9f79e67dbb855c8924c7984 
-https://conda.anaconda.org/conda-forge/linux-64/sip-6.7.12-py312h30efb56_0.conda#32633871002ee9902f747d2236e0d122 -https://conda.anaconda.org/conda-forge/linux-64/gstreamer-1.22.8-h98fc4e7_0.conda#a068fe1588dda3d29f568d536eeebae7 -https://conda.anaconda.org/conda-forge/linux-64/harfbuzz-8.3.0-h3d44ed6_0.conda#5a6f6c00ef982a9bc83558d9ac8f64a0 -https://conda.anaconda.org/conda-forge/linux-64/libcblas-3.9.0-20_linux64_mkl.conda#7a2972758a03adc92d856072c71c9170 -https://conda.anaconda.org/conda-forge/linux-64/liblapack-3.9.0-20_linux64_mkl.conda#4db0cd03efcdab535f6f066aca4cddbb -https://conda.anaconda.org/conda-forge/linux-64/pyqt5-sip-12.12.2-py312h30efb56_5.conda#8a2a122dc4fe14d8cff38f1cf426381f -https://conda.anaconda.org/conda-forge/noarch/pytest-forked-1.6.0-pyhd8ed1ab_0.conda#a46947638b6e005b63d2d6271da529b0 -https://conda.anaconda.org/conda-forge/linux-64/gst-plugins-base-1.22.8-h8e1006c_0.conda#307cf29b6c19238c17182f30ddaf1a50 -https://conda.anaconda.org/conda-forge/linux-64/liblapacke-3.9.0-20_linux64_mkl.conda#3dea5e9be386b963d7f4368966e238b3 -https://conda.anaconda.org/conda-forge/linux-64/numpy-1.26.2-py312heda63a1_0.conda#6d7b0ae4472449b7893345c015f486d3 -https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-2.5.0-pyhd8ed1ab_0.tar.bz2#1fdd1f3baccf0deb647385c677a1a48e -https://conda.anaconda.org/conda-forge/linux-64/blas-devel-3.9.0-20_linux64_mkl.conda#079d50df2338a3d47522d7e84c3dfbf6 -https://conda.anaconda.org/conda-forge/linux-64/contourpy-1.2.0-py312h8572e83_0.conda#b6249daaaf4577e6f72d95fc4ab767c6 -https://conda.anaconda.org/conda-forge/linux-64/pandas-2.1.4-py312hfb8ada1_0.conda#d0745ae74c2b26571b692ddde112eebb -https://conda.anaconda.org/conda-forge/linux-64/qt-main-5.15.8-h450f30e_18.conda#ef0430f8df5dcdedcaaab340b228f30c -https://conda.anaconda.org/conda-forge/linux-64/scipy-1.11.4-py312heda63a1_0.conda#e1fac3255958529700de75951f060710 -https://conda.anaconda.org/conda-forge/linux-64/blas-2.120-mkl.conda#9444330235a4828878cbe9c897ba0aa3 -https://conda.anaconda.org/conda-forge/linux-64/matplotlib-base-3.8.2-py312he5832f3_0.conda#1bf345f8df6896b5a8016f16188946ba -https://conda.anaconda.org/conda-forge/linux-64/pyamg-5.0.1-py312hfb10629_1.conda#79ec33a3b3e9e6858e40e6f253b174ab -https://conda.anaconda.org/conda-forge/linux-64/pyqt-5.15.9-py312h949fe66_5.conda#f6548a564e2d01b2a42020259503945b -https://conda.anaconda.org/conda-forge/linux-64/matplotlib-3.8.2-py312h7900ff3_0.conda#b409beb1dc6ebb34b767b7fb8fc70b9d diff --git a/build_tools/update_environments_and_lock_files.py b/build_tools/update_environments_and_lock_files.py index a5b3068d3964b..b344785ad01ca 100644 --- a/build_tools/update_environments_and_lock_files.py +++ b/build_tools/update_environments_and_lock_files.py @@ -136,16 +136,6 @@ def remove_from(alist, to_remove): "numpy": "<1.25", }, }, - { - "build_name": "pylatest_conda_forge_mkl_no_coverage", - "folder": "build_tools/azure", - "platform": "linux-64", - "channel": "conda-forge", - "conda_dependencies": common_dependencies_without_coverage + ["ccache"], - "package_constraints": { - "blas": "[build=mkl]", - }, - }, { "build_name": "pymin_conda_defaults_openblas", "folder": "build_tools/azure", From 34061dafa67282e293b33d40f03b8ac60efeb2f1 Mon Sep 17 00:00:00 2001 From: Stefanie Senger <91849487+StefanieSenger@users.noreply.github.com> Date: Fri, 5 Jan 2024 14:25:53 +0100 Subject: [PATCH 014/554] ENH Improved error messages for `UnsetMetadataPassedError` (#28056) --- sklearn/exceptions.py | 2 +- sklearn/model_selection/_validation.py | 25 
+++++++++++++------
 .../model_selection/tests/test_validation.py |  2 +-
 3 files changed, 19 insertions(+), 10 deletions(-)

diff --git a/sklearn/exceptions.py b/sklearn/exceptions.py
index ad7ae08c1fec0..1466ce783ee00 100644
--- a/sklearn/exceptions.py
+++ b/sklearn/exceptions.py
@@ -19,7 +19,7 @@
 class UnsetMetadataPassedError(ValueError):
     """Exception class to raise if a metadata is passed which is not explicitly \
-    requested.
+    requested (metadata=True) or not requested (metadata=False).
 
     .. versionadded:: 1.3
 
diff --git a/sklearn/model_selection/_validation.py b/sklearn/model_selection/_validation.py
index d3110cb847b4c..07b229b57bf96 100644
--- a/sklearn/model_selection/_validation.py
+++ b/sklearn/model_selection/_validation.py
@@ -397,11 +397,16 @@ def cross_validate(
             # `process_routing` code, we pass `fit` as the caller. However,
             # the user is not calling `fit` directly, so we change the message
             # to make it more suitable for this case.
+            unrequested_params = sorted(e.unrequested_params)
             raise UnsetMetadataPassedError(
                 message=(
-                    f"{sorted(e.unrequested_params.keys())} are passed to cross"
-                    " validation but are not explicitly requested or unrequested. See"
-                    " the Metadata Routing User guide"
+                    f"{unrequested_params} are passed to cross validation but are not"
+                    " explicitly set as requested or not requested for cross_validate's"
+                    f" estimator: {estimator.__class__.__name__}. Call"
+                    " `.set_fit_request({{metadata}}=True)` on the estimator for"
+                    f" each metadata in {unrequested_params} that you"
+                    " want to use and `metadata=False` for not using it. See the"
+                    " Metadata Routing User guide"
                     " <https://scikit-learn.org/stable/metadata_routing.html> for more"
                     " information."
                 ),
@@ -1238,13 +1243,17 @@ def cross_val_predict(
             # `process_routing` code, we pass `fit` as the caller. However,
             # the user is not calling `fit` directly, so we change the message
             # to make it more suitable for this case.
+            unrequested_params = sorted(e.unrequested_params)
             raise UnsetMetadataPassedError(
                 message=(
-                    f"{sorted(e.unrequested_params.keys())} are passed to cross"
-                    " validation but are not explicitly requested or unrequested. See"
-                    " the Metadata Routing User guide"
-                    " <https://scikit-learn.org/stable/metadata_routing.html> for more"
-                    " information."
+                    f"{unrequested_params} are passed to `cross_val_predict` but are"
+                    " not explicitly set as requested or not requested for"
+                    f" cross_val_predict's estimator: {estimator.__class__.__name__}."
+                    " Call `.set_fit_request({{metadata}}=True)` on the estimator for"
+                    f" each metadata in {unrequested_params} that you want to use and"
+                    " `metadata=False` for not using it. See the Metadata Routing User"
+                    " guide <https://scikit-learn.org/stable/metadata_routing.html>"
+                    " for more information."
),
                unrequested_params=e.unrequested_params,
                routed_params=e.routed_params,
diff --git a/sklearn/model_selection/tests/test_validation.py b/sklearn/model_selection/tests/test_validation.py
index acf4d27e0180e..e1ecfd14f45a3 100644
--- a/sklearn/model_selection/tests/test_validation.py
+++ b/sklearn/model_selection/tests/test_validation.py
@@ -2517,7 +2517,7 @@ def test_groups_with_routing_validation(cv_method):
 def test_passed_unrequested_metadata(cv_method):
     """Check that we raise an error when passing metadata that is not requested."""
-    err_msg = re.escape("['metadata'] are passed to cross validation")
+    err_msg = re.escape("but are not explicitly set as requested or not requested")
     with pytest.raises(ValueError, match=err_msg):
         cv_method(
             estimator=ConsumingClassifier(),

From 056864d9c558131d2706b46f6ddf084671b428b6 Mon Sep 17 00:00:00 2001
From: Maren Westermann
Date: Sat, 6 Jan 2024 04:11:52 +0100
Subject: [PATCH 015/554] DOC Add links to KMeans examples in docstrings and
 the user guide (#27799)

---
 doc/modules/clustering.rst                  | 28 ++++++++++++++++-----
 examples/cluster/plot_cluster_iris.py       |  9 +++----
 examples/cluster/plot_color_quantization.py |  2 +-
 examples/text/plot_document_clustering.py   |  5 ++--
 sklearn/cluster/_kmeans.py                  | 18 +++++++++++++
 5 files changed, 48 insertions(+), 14 deletions(-)

diff --git a/doc/modules/clustering.rst b/doc/modules/clustering.rst
index b922b0640c083..4cd86a0bf70c1 100644
--- a/doc/modules/clustering.rst
+++ b/doc/modules/clustering.rst
@@ -182,6 +182,10 @@ It suffers from various drawbacks:
     :align: center
     :scale: 50
 
+For more detailed descriptions of the issues shown above and how to address them,
+refer to the examples :ref:`sphx_glr_auto_examples_cluster_plot_kmeans_assumptions.py`
+and :ref:`sphx_glr_auto_examples_cluster_plot_kmeans_silhouette_analysis.py`.
+
 K-means is often referred to as Lloyd's algorithm. In basic terms, the
 algorithm has three steps. The first step chooses the initial centroids, with
 the most basic method being to choose :math:`k` samples from the dataset
@@ -218,7 +222,9 @@ initializations of the centroids. One method to help address this issue is
 the k-means++ initialization scheme, which has been implemented in scikit-learn
 (use the ``init='k-means++'`` parameter). This initializes the centroids to be
 (generally) distant from each other, leading to probably better results than
-random initialization, as shown in the reference.
+random initialization, as shown in the reference. For a detailed example of
+comparing different initialization schemes, refer to
+:ref:`sphx_glr_auto_examples_cluster_plot_kmeans_digits.py`.
 
 K-means++ can also be called independently to select seeds for other
 clustering algorithms, see :func:`sklearn.cluster.kmeans_plusplus` for details
@@ -231,7 +237,17 @@ weight of 2 to a sample is equivalent to adding a duplicate of that sample
 to the dataset :math:`X`.
 
 K-means can be used for vector quantization. This is achieved using the
-transform method of a trained model of :class:`KMeans`.
+``transform`` method of a trained model of :class:`KMeans`. For an example of
+performing vector quantization on an image refer to
+:ref:`sphx_glr_auto_examples_cluster_plot_color_quantization.py`.
+
+.. 
topic:: Examples: + + * :ref:`sphx_glr_auto_examples_cluster_plot_cluster_iris.py`: Example usage of + :class:`KMeans` using the iris dataset + + * :ref:`sphx_glr_auto_examples_text_plot_document_clustering.py`: Document clustering + using :class:`KMeans` and :class:`MiniBatchKMeans` based on sparse data Low-level parallelism --------------------- @@ -291,11 +307,11 @@ small, as shown in the example and cited reference. .. topic:: Examples: - * :ref:`sphx_glr_auto_examples_cluster_plot_mini_batch_kmeans.py`: Comparison of KMeans and - MiniBatchKMeans + * :ref:`sphx_glr_auto_examples_cluster_plot_mini_batch_kmeans.py`: Comparison of + :class:`KMeans` and :class:`MiniBatchKMeans` - * :ref:`sphx_glr_auto_examples_text_plot_document_clustering.py`: Document clustering using sparse - MiniBatchKMeans + * :ref:`sphx_glr_auto_examples_text_plot_document_clustering.py`: Document clustering + using :class:`KMeans` and :class:`MiniBatchKMeans` based on sparse data * :ref:`sphx_glr_auto_examples_cluster_plot_dict_face_patches.py` diff --git a/examples/cluster/plot_cluster_iris.py b/examples/cluster/plot_cluster_iris.py index 79e885be35b86..ad85c0c9910a7 100644 --- a/examples/cluster/plot_cluster_iris.py +++ b/examples/cluster/plot_cluster_iris.py @@ -7,13 +7,13 @@ - top left: What a K-means algorithm would yield using 8 clusters. -- top right: What the effect of a bad initialization is +- top right: What using three clusters would deliver. + +- bottom left: What the effect of a bad initialization is on the classification process: By setting n_init to only 1 (default is 10), the amount of times that the algorithm will be run with different centroid seeds is reduced. -- bottom left: What using eight clusters would deliver. - - bottom right: The ground truth. """ @@ -73,8 +73,7 @@ horizontalalignment="center", bbox=dict(alpha=0.2, edgecolor="w", facecolor="w"), ) -# Reorder the labels to have colors matching the cluster results -y = np.choose(y, [1, 2, 0]).astype(float) + ax.scatter(X[:, 3], X[:, 0], X[:, 2], c=y, edgecolor="k") ax.xaxis.set_ticklabels([]) diff --git a/examples/cluster/plot_color_quantization.py b/examples/cluster/plot_color_quantization.py index eef66be21b104..ec21949466daf 100644 --- a/examples/cluster/plot_color_quantization.py +++ b/examples/cluster/plot_color_quantization.py @@ -41,7 +41,7 @@ china = load_sample_image("china.jpg") # Convert to floats instead of the default 8 bits integer coding. Dividing by -# 255 is important so that plt.imshow behaves works well on float data (need to +# 255 is important so that plt.imshow works well on float data (need to # be in the range [0-1]) china = np.array(china, dtype=np.float64) / 255 diff --git a/examples/text/plot_document_clustering.py b/examples/text/plot_document_clustering.py index fa68b8bd312ea..2c3506f4ec32e 100644 --- a/examples/text/plot_document_clustering.py +++ b/examples/text/plot_document_clustering.py @@ -99,8 +99,9 @@ # assignment have an ARI of 0.0 in expectation. # # If the ground truth labels are not known, evaluation can only be performed -# using the model results itself. In that case, the Silhouette Coefficient comes -# in handy. +# using the model results itself. In that case, the Silhouette Coefficient comes in +# handy. See :ref:`sphx_glr_auto_examples_cluster_plot_kmeans_silhouette_analysis.py` +# for an example on how to do it. # # For more reference, see :ref:`clustering_evaluation`. 
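
The cross-references added throughout this patch all point at the same two-step
workflow: fit :class:`KMeans`, then quantize with the fitted centroids. A minimal
sketch of that workflow on synthetic data (the linked examples use the iris
dataset and a sample image instead; ``n_clusters=8`` here is an arbitrary choice):

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.RandomState(0)
    X = rng.rand(1000, 3)  # synthetic stand-in for e.g. RGB pixel rows in [0, 1]

    # k-means++ (the default init) spreads the initial centroids apart, which
    # usually beats purely random seeding; n_init repeats the seeding and keeps
    # the run with the lowest inertia.
    kmeans = KMeans(n_clusters=8, init="k-means++", n_init=10, random_state=0).fit(X)

    # Vector quantization as described in the clustering docs above: map each
    # sample to its nearest centroid and substitute that centroid's coordinates.
    labels = kmeans.predict(X)
    X_quantized = kmeans.cluster_centers_[labels]
    print(X_quantized.shape)  # (1000, 3), but with only 8 distinct rows
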
diff --git a/sklearn/cluster/_kmeans.py b/sklearn/cluster/_kmeans.py index a7d6b5f7df050..59470aae6c13f 100644 --- a/sklearn/cluster/_kmeans.py +++ b/sklearn/cluster/_kmeans.py @@ -1208,6 +1208,9 @@ class KMeans(_BaseKMeans): The number of clusters to form as well as the number of centroids to generate. + For an example of how to choose an optimal value for `n_clusters` refer to + :ref:`sphx_glr_auto_examples_cluster_plot_kmeans_silhouette_analysis.py`. + init : {'k-means++', 'random'}, callable or array-like of shape \ (n_clusters, n_features), default='k-means++' Method for initialization: @@ -1364,6 +1367,21 @@ class KMeans(_BaseKMeans): >>> kmeans.cluster_centers_ array([[10., 2.], [ 1., 2.]]) + + For a more detailed example of K-Means using the iris dataset see + :ref:`sphx_glr_auto_examples_cluster_plot_cluster_iris.py`. + + For examples of common problems with K-Means and how to address them see + :ref:`sphx_glr_auto_examples_cluster_plot_kmeans_assumptions.py`. + + For an example of how to use K-Means to perform color quantization see + :ref:`sphx_glr_auto_examples_cluster_plot_color_quantization.py`. + + For a demonstration of how K-Means can be used to cluster text documents see + :ref:`sphx_glr_auto_examples_text_plot_document_clustering.py`. + + For a comparison between K-Means and MiniBatchKMeans refer to example + :ref:`sphx_glr_auto_examples_cluster_plot_mini_batch_kmeans.py`. """ _parameter_constraints: dict = { From 21789c05f93ae8dd94cc62945eba0b9637d8b97f Mon Sep 17 00:00:00 2001 From: scikit-learn-bot Date: Mon, 8 Jan 2024 10:22:40 +0100 Subject: [PATCH 016/554] :lock: :robot: CI Update lock files for main CI build(s) :lock: :robot: (#28081) Co-authored-by: Lock file bot --- ...latest_conda_forge_mkl_linux-64_conda.lock | 16 +++++----- ...pylatest_conda_forge_mkl_osx-64_conda.lock | 24 +++++++------- ...test_conda_mkl_no_openmp_osx-64_conda.lock | 6 ++-- ...st_pip_openblas_pandas_linux-64_conda.lock | 8 ++--- ...pylatest_pip_scipy_dev_linux-64_conda.lock | 12 +++---- ...onda_defaults_openblas_linux-64_conda.lock | 4 +-- .../pymin_conda_forge_mkl_win-64_conda.lock | 12 +++---- ...e_openblas_ubuntu_2204_linux-64_conda.lock | 14 ++++---- build_tools/circle/doc_linux-64_conda.lock | 32 +++++++++---------- .../doc_min_dependencies_linux-64_conda.lock | 12 +++---- 10 files changed, 70 insertions(+), 70 deletions(-) diff --git a/build_tools/azure/pylatest_conda_forge_mkl_linux-64_conda.lock b/build_tools/azure/pylatest_conda_forge_mkl_linux-64_conda.lock index 188936db093a6..422dc6f1f9626 100644 --- a/build_tools/azure/pylatest_conda_forge_mkl_linux-64_conda.lock +++ b/build_tools/azure/pylatest_conda_forge_mkl_linux-64_conda.lock @@ -1,6 +1,6 @@ # Generated by conda-lock. 
# platform: linux-64 -# input_hash: 06a1abd91fe199d0e020e5ac38efba4bc3d4a7752e01cf91e4b046c5d0ba8a93 +# input_hash: 7aa55d66dfbd0f6267a9aff8c750d1e9f42cd339726c8f9c4d1299341b064849 @EXPLICIT https://conda.anaconda.org/conda-forge/linux-64/_libgcc_mutex-0.1-conda_forge.tar.bz2#d7c89558ba9fa0495403155b64376d81 https://conda.anaconda.org/conda-forge/linux-64/ca-certificates-2023.11.17-hbcca054_0.conda#01ffc8d36f9eba0ce0b3c1955fa780ee @@ -52,7 +52,7 @@ https://conda.anaconda.org/conda-forge/linux-64/mpg123-1.32.3-h59595ed_0.conda#b https://conda.anaconda.org/conda-forge/linux-64/ncurses-6.4-h59595ed_2.conda#7dbaa197d7ba6032caf7ae7f32c1efa0 https://conda.anaconda.org/conda-forge/linux-64/nspr-4.35-h27087fc_0.conda#da0ec11a6454ae19bff5b02ed881a2b1 https://conda.anaconda.org/conda-forge/linux-64/openssl-3.2.0-hd590300_1.conda#603827b39ea2b835268adb8c821b8570 -https://conda.anaconda.org/conda-forge/linux-64/pixman-0.42.2-h59595ed_0.conda#700edd63ccd5fc66b70b1c028cea9a68 +https://conda.anaconda.org/conda-forge/linux-64/pixman-0.43.0-h59595ed_0.conda#6b4b43013628634b6cfdee6b74fd696b https://conda.anaconda.org/conda-forge/linux-64/pthread-stubs-0.4-h36c2ea0_1001.tar.bz2#22dad4df6e8630e8dff2428f6f6a7036 https://conda.anaconda.org/conda-forge/linux-64/rdma-core-28.9-h59595ed_1.conda#aeffb7c06b5f65e55e6c637408dc4100 https://conda.anaconda.org/conda-forge/linux-64/re2-2023.03.02-h8c504da_0.conda#206f8fa808748f6e90599c3368a1114e @@ -152,7 +152,7 @@ https://conda.anaconda.org/conda-forge/noarch/py-1.11.0-pyh6c4a22f_0.tar.bz2#b46 https://conda.anaconda.org/conda-forge/noarch/pyparsing-3.1.1-pyhd8ed1ab_0.conda#176f7d56f0cfe9008bdf1bccd7de02fb https://conda.anaconda.org/conda-forge/noarch/python-tzdata-2023.4-pyhd8ed1ab_0.conda#c79cacf8a06a51552fc651652f170208 https://conda.anaconda.org/conda-forge/noarch/pytz-2023.3.post1-pyhd8ed1ab_0.conda#c93346b446cd08c169d843ae5fc0da97 -https://conda.anaconda.org/conda-forge/noarch/setuptools-68.2.2-pyhd8ed1ab_0.conda#fc2166155db840c634a1291a5c35a709 +https://conda.anaconda.org/conda-forge/noarch/setuptools-69.0.3-pyhd8ed1ab_0.conda#40695fdfd15a92121ed2922900d0308b https://conda.anaconda.org/conda-forge/noarch/six-1.16.0-pyh6c4a22f_0.tar.bz2#e5f25f8dbc060e9a8d912e432202afc2 https://conda.anaconda.org/conda-forge/linux-64/tbb-2021.11.0-h00ab1b0_0.conda#fde515afbbe6e36eb4564965c20b1058 https://conda.anaconda.org/conda-forge/noarch/threadpoolctl-3.2.0-pyha21a80b_0.conda#978d03388b62173b8e6f79162cf52b86 @@ -175,31 +175,31 @@ https://conda.anaconda.org/conda-forge/linux-64/libclang-15.0.7-default_hb11cfb5 https://conda.anaconda.org/conda-forge/linux-64/libgoogle-cloud-2.12.0-hac9eb74_1.conda#0dee716254497604762957076ac76540 https://conda.anaconda.org/conda-forge/linux-64/libxkbcommon-1.6.0-h5d7e998_0.conda#d8edd0e29db6fb6b6988e1a28d35d994 https://conda.anaconda.org/conda-forge/linux-64/mkl-2022.2.1-h84fe81f_16997.conda#a7ce56d5757f5b57e7daabe703ade5bb -https://conda.anaconda.org/conda-forge/linux-64/pillow-10.1.0-py311ha6c5da5_0.conda#83a988daf5c49e57f7d2086fb6781fe8 +https://conda.anaconda.org/conda-forge/linux-64/pillow-10.2.0-py311ha6c5da5_0.conda#a5ccd7f2271f28b7d2de0b02b64e3796 https://conda.anaconda.org/conda-forge/linux-64/pulseaudio-client-16.1-hb77b528_5.conda#ac902ff3c1c6d750dd0dfc93a974ab74 https://conda.anaconda.org/conda-forge/noarch/pytest-7.4.4-pyhd8ed1ab_0.conda#a9d145de8c5f064b5fa68fb34725d9f4 https://conda.anaconda.org/conda-forge/noarch/python-dateutil-2.8.2-pyhd8ed1ab_0.tar.bz2#dd999d1cc9f79e67dbb855c8924c7984 
https://conda.anaconda.org/conda-forge/linux-64/sip-6.7.12-py311hb755f60_0.conda#02336abab4cb5dd794010ef53c54bd09 https://conda.anaconda.org/conda-forge/linux-64/aws-c-s3-0.3.14-hf3aad02_1.conda#a968ffa7e9fe0c257628033d393e512f https://conda.anaconda.org/conda-forge/linux-64/blas-1.0-mkl.tar.bz2#349aef876b1d8c9dccae01de20d5b385 -https://conda.anaconda.org/conda-forge/linux-64/gstreamer-1.22.8-h98fc4e7_0.conda#a068fe1588dda3d29f568d536eeebae7 +https://conda.anaconda.org/conda-forge/linux-64/gstreamer-1.22.8-h98fc4e7_1.conda#1b52a89485ab573a5bb83a5225ff706e https://conda.anaconda.org/conda-forge/linux-64/harfbuzz-8.3.0-h3d44ed6_0.conda#5a6f6c00ef982a9bc83558d9ac8f64a0 https://conda.anaconda.org/conda-forge/linux-64/libblas-3.9.0-16_linux64_mkl.tar.bz2#85f61af03fd291dae33150ffe89dc09a https://conda.anaconda.org/conda-forge/linux-64/pyqt5-sip-12.12.2-py311hb755f60_5.conda#e4d262cc3600e70b505a6761d29f6207 https://conda.anaconda.org/conda-forge/noarch/pytest-cov-4.1.0-pyhd8ed1ab_0.conda#06eb685a3a0b146347a58dda979485da https://conda.anaconda.org/conda-forge/noarch/pytest-forked-1.6.0-pyhd8ed1ab_0.conda#a46947638b6e005b63d2d6271da529b0 https://conda.anaconda.org/conda-forge/linux-64/aws-crt-cpp-0.21.0-hb942446_5.conda#07d92ed5403ad7b5c66ffd7d5b8f7e57 -https://conda.anaconda.org/conda-forge/linux-64/gst-plugins-base-1.22.8-h8e1006c_0.conda#307cf29b6c19238c17182f30ddaf1a50 +https://conda.anaconda.org/conda-forge/linux-64/gst-plugins-base-1.22.8-h8e1006c_1.conda#3926dab94fe06d88ade0e716d77b8cf8 https://conda.anaconda.org/conda-forge/linux-64/libcblas-3.9.0-16_linux64_mkl.tar.bz2#361bf757b95488de76c4f123805742d3 https://conda.anaconda.org/conda-forge/linux-64/liblapack-3.9.0-16_linux64_mkl.tar.bz2#a2f166748917d6d6e4707841ca1f519e https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-2.5.0-pyhd8ed1ab_0.tar.bz2#1fdd1f3baccf0deb647385c677a1a48e https://conda.anaconda.org/conda-forge/linux-64/aws-sdk-cpp-1.10.57-h85b1a90_19.conda#0605d3d60857fc07bd6a11e878fe0f08 -https://conda.anaconda.org/conda-forge/linux-64/numpy-1.26.2-py311h64a7726_0.conda#fd2f142dcd680413b5ede5d0fb799205 +https://conda.anaconda.org/conda-forge/linux-64/numpy-1.26.3-py311h64a7726_0.conda#231eef4f33640338f64ef9ab690ba08d https://conda.anaconda.org/conda-forge/linux-64/qt-main-5.15.8-h82b777d_17.conda#4f01e33dbb406085a16a2813ab067e95 https://conda.anaconda.org/conda-forge/linux-64/contourpy-1.2.0-py311h9547e67_0.conda#40828c5b36ef52433e21f89943e09f33 https://conda.anaconda.org/conda-forge/linux-64/libarrow-12.0.1-hb87d912_8_cpu.conda#3f3b11398fe79b578e3c44dd00a44e4a https://conda.anaconda.org/conda-forge/linux-64/pandas-2.1.4-py311h320fe9a_0.conda#e44ccb61b6621bf3f8053ae66eba7397 -https://conda.anaconda.org/conda-forge/linux-64/polars-0.20.2-py311hf926cbc_0.conda#18f12d27741769ae5432dacce21acc93 +https://conda.anaconda.org/conda-forge/linux-64/polars-0.20.3-py311h2bb2bab_1.conda#dfde94fef0b419cad560023fa277ef9e https://conda.anaconda.org/conda-forge/linux-64/pyqt-5.15.9-py311hf0fb5b6_5.conda#ec7e45bc76d9d0b69a74a2075932b8e8 https://conda.anaconda.org/conda-forge/linux-64/pytorch-1.13.1-cpu_py311h410fd25_1.conda#ddd2fadddf89e3dc3d541a2537fce010 https://conda.anaconda.org/conda-forge/linux-64/scipy-1.11.4-py311h64a7726_0.conda#9ac5334f1b5ed072d3dbc342503d7868 diff --git a/build_tools/azure/pylatest_conda_forge_mkl_osx-64_conda.lock b/build_tools/azure/pylatest_conda_forge_mkl_osx-64_conda.lock index 3f1ea3d25b2ce..d412beaf30789 100644 --- a/build_tools/azure/pylatest_conda_forge_mkl_osx-64_conda.lock +++ 
b/build_tools/azure/pylatest_conda_forge_mkl_osx-64_conda.lock @@ -1,6 +1,6 @@ # Generated by conda-lock. # platform: osx-64 -# input_hash: 1c061d421872c406aaefcd63aa475f5decae7806dd07d710dc5d742da72de61a +# input_hash: 02abef27514db5e5119c3cdc253e84a06374c1b308495298b46bdb14dcc52ae9 @EXPLICIT https://conda.anaconda.org/conda-forge/osx-64/bzip2-1.0.8-h10d778d_5.conda#6097a6ca9ada32699b5fc4312dd6ef18 https://conda.anaconda.org/conda-forge/osx-64/ca-certificates-2023.11.17-h8857fd0_0.conda#c687e9d14c49e3d3946d50a413cdbf16 @@ -55,7 +55,7 @@ https://conda.anaconda.org/conda-forge/osx-64/brotli-1.1.0-h0dc2134_1.conda#9272 https://conda.anaconda.org/conda-forge/osx-64/lcms2-2.16-ha2f27b4_0.conda#1442db8f03517834843666c422238c9b https://conda.anaconda.org/conda-forge/osx-64/ld64_osx-64-609-ha20a434_15.conda#4709e6e1ce59f92f822470e16253bae1 https://conda.anaconda.org/conda-forge/osx-64/libcblas-3.9.0-20_osx64_mkl.conda#51089a4865eb4aec2bc5c7468bd07f9f -https://conda.anaconda.org/conda-forge/osx-64/libclang-cpp16-16.0.6-default_h6b1ee41_3.conda#2fc3e465e5c10d3c11e4017cdd1ee5ae +https://conda.anaconda.org/conda-forge/osx-64/libclang-cpp16-16.0.6-default_h6b1ee41_4.conda#0eea849d8d0b489bae1b9ae8656b62fb https://conda.anaconda.org/conda-forge/osx-64/libhiredis-1.0.2-h2beb688_0.tar.bz2#524282b2c46c9dedf051b3bc2ae05494 https://conda.anaconda.org/conda-forge/osx-64/liblapack-3.9.0-20_osx64_mkl.conda#58f08e12ad487fac4a08f90ff0b87aec https://conda.anaconda.org/conda-forge/osx-64/llvm-tools-16.0.6-hbedff68_3.conda#e9356b0807462e8f84c1384a8da539a5 @@ -65,7 +65,7 @@ https://conda.anaconda.org/conda-forge/osx-64/python-3.12.1-h9f0c242_1_cpython.c https://conda.anaconda.org/conda-forge/osx-64/ccache-4.8.1-h28e096f_0.conda#dcc8cc97fdab7a5fad9e1a6bbad9ed0e https://conda.anaconda.org/conda-forge/osx-64/cctools_osx-64-973.0.1-ha1c5b94_15.conda#c9dbe505cd17a5a4a6a787dbceea2dba https://conda.anaconda.org/conda-forge/noarch/certifi-2023.11.17-pyhd8ed1ab_0.conda#2011bcf45376341dd1d690263fdbc789 -https://conda.anaconda.org/conda-forge/osx-64/clang-16-16.0.6-default_h6b1ee41_3.conda#07654411a331ea916e6f93ae0d8363b7 +https://conda.anaconda.org/conda-forge/osx-64/clang-16-16.0.6-default_h6b1ee41_4.conda#ac26df83ef19d580af4674d46ea68bd8 https://conda.anaconda.org/conda-forge/noarch/colorama-0.4.6-pyhd8ed1ab_0.tar.bz2#3faab06a954c2a04039983f2c4a50d99 https://conda.anaconda.org/conda-forge/noarch/cycler-0.12.1-pyhd8ed1ab_0.conda#5cd86562580f274031ede6aa6aa24441 https://conda.anaconda.org/conda-forge/osx-64/cython-3.0.7-py312hede676d_0.conda#89a76a23df8d704d26a3f27e0a1c372d @@ -77,15 +77,15 @@ https://conda.anaconda.org/conda-forge/osx-64/kiwisolver-1.4.5-py312h49ebfd2_1.c https://conda.anaconda.org/conda-forge/osx-64/ld64-609-ha02d983_15.conda#1bd5c0a940ecc8946dbe2a5b84290049 https://conda.anaconda.org/conda-forge/osx-64/liblapacke-3.9.0-20_osx64_mkl.conda#124ae8e384268a8da66f1d64114a1eda https://conda.anaconda.org/conda-forge/noarch/munkres-1.1.4-pyh9f0ad1d_0.tar.bz2#2ba8498c1018c1e9c61eb99b973dfe19 -https://conda.anaconda.org/conda-forge/osx-64/numpy-1.26.2-py312hfd3bce2_0.conda#aba72e40976485051b7567b567336319 +https://conda.anaconda.org/conda-forge/osx-64/numpy-1.26.3-py312he3a82b2_0.conda#cc7cfa90fc5c70a62b788daa71b782ef https://conda.anaconda.org/conda-forge/noarch/packaging-23.2-pyhd8ed1ab_0.conda#79002079284aa895f883c6b7f3f88fd6 -https://conda.anaconda.org/conda-forge/osx-64/pillow-10.1.0-py312h0c70c2f_0.conda#50fc3446a464ff986aa4496e1eebf60b 
+https://conda.anaconda.org/conda-forge/osx-64/pillow-10.2.0-py312h0c70c2f_0.conda#0cc3674239ad12c6836cb4174f106c92 https://conda.anaconda.org/conda-forge/noarch/pluggy-1.3.0-pyhd8ed1ab_0.conda#2390bd10bed1f3fdc7a537fb5a447d8d https://conda.anaconda.org/conda-forge/noarch/py-1.11.0-pyh6c4a22f_0.tar.bz2#b4613d7e7a493916d867842a6a148054 https://conda.anaconda.org/conda-forge/noarch/pyparsing-3.1.1-pyhd8ed1ab_0.conda#176f7d56f0cfe9008bdf1bccd7de02fb https://conda.anaconda.org/conda-forge/noarch/python-tzdata-2023.4-pyhd8ed1ab_0.conda#c79cacf8a06a51552fc651652f170208 https://conda.anaconda.org/conda-forge/noarch/pytz-2023.3.post1-pyhd8ed1ab_0.conda#c93346b446cd08c169d843ae5fc0da97 -https://conda.anaconda.org/conda-forge/noarch/setuptools-68.2.2-pyhd8ed1ab_0.conda#fc2166155db840c634a1291a5c35a709 +https://conda.anaconda.org/conda-forge/noarch/setuptools-69.0.3-pyhd8ed1ab_0.conda#40695fdfd15a92121ed2922900d0308b https://conda.anaconda.org/conda-forge/noarch/six-1.16.0-pyh6c4a22f_0.tar.bz2#e5f25f8dbc060e9a8d912e432202afc2 https://conda.anaconda.org/conda-forge/noarch/threadpoolctl-3.2.0-pyha21a80b_0.conda#978d03388b62173b8e6f79162cf52b86 https://conda.anaconda.org/conda-forge/noarch/toml-0.10.2-pyhd8ed1ab_0.tar.bz2#f832c45a477c78bebd107098db465095 @@ -93,7 +93,7 @@ https://conda.anaconda.org/conda-forge/noarch/tomli-2.0.1-pyhd8ed1ab_0.tar.bz2#5 https://conda.anaconda.org/conda-forge/osx-64/tornado-6.3.3-py312h104f124_1.conda#6835d4940d6fbd41e1a32d58dfae8f06 https://conda.anaconda.org/conda-forge/osx-64/blas-devel-3.9.0-20_osx64_mkl.conda#cc3260179093918b801e373c6e888e02 https://conda.anaconda.org/conda-forge/osx-64/cctools-973.0.1-h40f6528_15.conda#bc85aa6ab5eea61c47f39015dbe34a88 -https://conda.anaconda.org/conda-forge/osx-64/clang-16.0.6-hac416ee_3.conda#b143a7f213c0d25ced055089a2baef46 +https://conda.anaconda.org/conda-forge/osx-64/clang-16.0.6-hac416ee_4.conda#8c9109ae105a10984b9077899100167a https://conda.anaconda.org/conda-forge/osx-64/contourpy-1.2.0-py312hbf0bb39_0.conda#74190e06053cda7139a0cb71f3e618fd https://conda.anaconda.org/conda-forge/osx-64/coverage-7.4.0-py312h41838bb_0.conda#8fdd619940b64e33b0702cb46d701f6e https://conda.anaconda.org/conda-forge/osx-64/fonttools-4.47.0-py312h41838bb_0.conda#73605f0b5026ee8445b68fceafb53941 @@ -102,7 +102,7 @@ https://conda.anaconda.org/conda-forge/noarch/pytest-7.4.4-pyhd8ed1ab_0.conda#a9 https://conda.anaconda.org/conda-forge/noarch/python-dateutil-2.8.2-pyhd8ed1ab_0.tar.bz2#dd999d1cc9f79e67dbb855c8924c7984 https://conda.anaconda.org/conda-forge/osx-64/scipy-1.11.4-py312heccc6a5_0.conda#b7b422b49ae2e5c8276bffd05f3ba63c https://conda.anaconda.org/conda-forge/osx-64/blas-2.120-mkl.conda#b041a7677a412f3d925d8208936cb1e2 -https://conda.anaconda.org/conda-forge/osx-64/clangxx-16.0.6-default_h6b1ee41_3.conda#0cd1aaa751aa374141fa4c802b88674a +https://conda.anaconda.org/conda-forge/osx-64/clangxx-16.0.6-default_h6b1ee41_4.conda#c5ed5a7857f12a3b8117f743e081286f https://conda.anaconda.org/conda-forge/osx-64/matplotlib-base-3.8.2-py312h302682c_0.conda#6a3b7c29d663a9cda13afb8f2638cc46 https://conda.anaconda.org/conda-forge/osx-64/pandas-2.1.4-py312haf8ecfc_0.conda#cb889a75192ef98a17c3f431f6518dd2 https://conda.anaconda.org/conda-forge/osx-64/pyamg-5.0.1-py312h674694f_1.conda#e5b9c0f8b5c367467425ff34353ef761 @@ -112,12 +112,12 @@ https://conda.anaconda.org/conda-forge/noarch/compiler-rt_osx-64-16.0.6-ha38d28d https://conda.anaconda.org/conda-forge/osx-64/matplotlib-3.8.2-py312hb401068_0.conda#926f479dcab7d6d26bba7fe39f67e3b2 
https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-2.5.0-pyhd8ed1ab_0.tar.bz2#1fdd1f3baccf0deb647385c677a1a48e https://conda.anaconda.org/conda-forge/osx-64/compiler-rt-16.0.6-ha38d28d_2.conda#3b9e8c5c63b8e86234f499490acd85c2 -https://conda.anaconda.org/conda-forge/osx-64/clang_impl_osx-64-16.0.6-h8787910_7.conda#f93823bbbe0302466f65b8ae6094dfd7 -https://conda.anaconda.org/conda-forge/osx-64/clang_osx-64-16.0.6-hb91bd55_7.conda#fc6c3256ab948da5fa0b34b47bf90d27 +https://conda.anaconda.org/conda-forge/osx-64/clang_impl_osx-64-16.0.6-h8787910_8.conda#2e694b8880599d19aec8e489eb01580f +https://conda.anaconda.org/conda-forge/osx-64/clang_osx-64-16.0.6-hb91bd55_8.conda#831779e455d39ed7e8911be6e7d02814 https://conda.anaconda.org/conda-forge/osx-64/c-compiler-1.7.0-h282daa2_0.conda#4652f33fe8d895f61177e2783b289377 -https://conda.anaconda.org/conda-forge/osx-64/clangxx_impl_osx-64-16.0.6-h6d92fbe_7.conda#d142c2ab0739a3991585ae9615ba0f87 +https://conda.anaconda.org/conda-forge/osx-64/clangxx_impl_osx-64-16.0.6-h6d92fbe_8.conda#f2f85938b8d78c2380657efd92194490 https://conda.anaconda.org/conda-forge/osx-64/gfortran_osx-64-12.3.0-h18f7dce_1.conda#436af2384c47aedb94af78a128e174f1 -https://conda.anaconda.org/conda-forge/osx-64/clangxx_osx-64-16.0.6-hb91bd55_7.conda#1d7cf5384b8fc42ec6c19659fa8ec1f8 +https://conda.anaconda.org/conda-forge/osx-64/clangxx_osx-64-16.0.6-hb91bd55_8.conda#abc99f4ac92e65c4f829e4320ea200f8 https://conda.anaconda.org/conda-forge/osx-64/gfortran-12.3.0-h2c809b3_1.conda#c48adbaa8944234b80ef287c37e329b0 https://conda.anaconda.org/conda-forge/osx-64/cxx-compiler-1.7.0-h7728843_0.conda#8abaa2694c1fba2b6bd3753d00a60415 https://conda.anaconda.org/conda-forge/osx-64/fortran-compiler-1.7.0-h6c2ab21_0.conda#2c11db8b46df0a547997116f0fd54b8e diff --git a/build_tools/azure/pylatest_conda_mkl_no_openmp_osx-64_conda.lock b/build_tools/azure/pylatest_conda_mkl_no_openmp_osx-64_conda.lock index a89638ebbdd83..63ccdf725e7dc 100644 --- a/build_tools/azure/pylatest_conda_mkl_no_openmp_osx-64_conda.lock +++ b/build_tools/azure/pylatest_conda_mkl_no_openmp_osx-64_conda.lock @@ -1,6 +1,6 @@ # Generated by conda-lock. 
# platform: osx-64 -# input_hash: c8fdd08f1a9a3d91ec09f211e4444ef33921a111f684fa63428591be5ca1eb68 +# input_hash: 03f7604aefb9752d2367c457bdf4e4923158be96db35ac0dd1d5dc60a9981cd1 @EXPLICIT https://repo.anaconda.com/pkgs/main/osx-64/blas-1.0-mkl.conda#cb2c87e85ac8e0ceae776d26d4214c8a https://repo.anaconda.com/pkgs/main/osx-64/bzip2-1.0.8-h1de35cc_0.conda#19fcb113b170fe2a0be96b47801fed7d @@ -14,7 +14,7 @@ https://repo.anaconda.com/pkgs/main/osx-64/libffi-3.4.4-hecd8cb5_0.conda#c20b268 https://repo.anaconda.com/pkgs/main/osx-64/libwebp-base-1.3.2-h6c40b1e_0.conda#d8fd9f599dd4e012694e69d119016442 https://repo.anaconda.com/pkgs/main/osx-64/llvm-openmp-14.0.6-h0dcd299_0.conda#b5804d32b87dc61ca94561ade33d5f2d https://repo.anaconda.com/pkgs/main/osx-64/ncurses-6.4-hcec6c5f_0.conda#0214d1ee980e217fabc695f1e40662aa -https://repo.anaconda.com/pkgs/main/noarch/tzdata-2023c-h04d1e81_0.conda#29db02adf8808f7c64642cead3e28acd +https://repo.anaconda.com/pkgs/main/noarch/tzdata-2023d-h04d1e81_0.conda#fdb319536f351b2b828a350ffd1a35a1 https://repo.anaconda.com/pkgs/main/osx-64/xz-5.4.5-h6c40b1e_0.conda#351c5d33fe551018a2068e7a2ca8a6c1 https://repo.anaconda.com/pkgs/main/osx-64/zlib-1.2.13-h4dc903c_0.conda#d0202dd912bfb45d3422786531717882 https://repo.anaconda.com/pkgs/main/osx-64/ccache-3.7.9-hf120daa_0.conda#a01515a32e721c51d631283f991bc8ea @@ -37,7 +37,7 @@ https://repo.anaconda.com/pkgs/main/osx-64/sqlite-3.41.2-h6c40b1e_0.conda#6947a5 https://repo.anaconda.com/pkgs/main/osx-64/zstd-1.5.5-hc035e20_0.conda#5e0b7ddb1b7dc6b630e1f9a03499c19c https://repo.anaconda.com/pkgs/main/osx-64/brotli-1.0.9-hca72f7f_7.conda#68e54d12ec67591deb2ffd70348fb00f https://repo.anaconda.com/pkgs/main/osx-64/libtiff-4.5.1-hcec6c5f_0.conda#e127a800ffd9d300ed7d5e1b026944ec -https://repo.anaconda.com/pkgs/main/osx-64/python-3.11.5-hf27a42d_0.conda#f088169d190325a14aaa0dcb53a9864f +https://repo.anaconda.com/pkgs/main/osx-64/python-3.11.7-hf27a42d_0.conda#fe0cfacb8965d0a06f8098464d5a8402 https://repo.anaconda.com/pkgs/main/osx-64/coverage-7.2.2-py311h6c40b1e_0.conda#e15605553450156cf75c3ae38a920475 https://repo.anaconda.com/pkgs/main/noarch/cycler-0.11.0-pyhd3eb1b0_0.conda#f5e365d2cdb66d547eb8c3ab93843aab https://repo.anaconda.com/pkgs/main/osx-64/cython-3.0.6-py311h6c40b1e_0.conda#6c8a140209eb4814de054f52627f543c diff --git a/build_tools/azure/pylatest_pip_openblas_pandas_linux-64_conda.lock b/build_tools/azure/pylatest_pip_openblas_pandas_linux-64_conda.lock index 5a314f7a7df3b..4d5e662a2d0f5 100644 --- a/build_tools/azure/pylatest_pip_openblas_pandas_linux-64_conda.lock +++ b/build_tools/azure/pylatest_pip_openblas_pandas_linux-64_conda.lock @@ -1,11 +1,11 @@ # Generated by conda-lock. 
# platform: linux-64 -# input_hash: 51f374bd6034467b82c190398f401712163436d283f9536c2e5a1d07e9f7b1e2 +# input_hash: d01d23bd27bcd50d2b3643492f966c8e390822d72b69f31bf66c2fe98a265a4c @EXPLICIT https://repo.anaconda.com/pkgs/main/linux-64/_libgcc_mutex-0.1-main.conda#c3473ff8bdb3d124ed5ff11ec380d6f9 https://repo.anaconda.com/pkgs/main/linux-64/ca-certificates-2023.12.12-h06a4308_0.conda#12bf7315c3f5ca50300e8b48d1b4ef2e https://repo.anaconda.com/pkgs/main/linux-64/ld_impl_linux-64-2.38-h1181459_1.conda#68eedfd9c06f2b0e6888d8db345b7f5b -https://repo.anaconda.com/pkgs/main/noarch/tzdata-2023c-h04d1e81_0.conda#29db02adf8808f7c64642cead3e28acd +https://repo.anaconda.com/pkgs/main/noarch/tzdata-2023d-h04d1e81_0.conda#fdb319536f351b2b828a350ffd1a35a1 https://repo.anaconda.com/pkgs/main/linux-64/libgomp-11.2.0-h1234567_1.conda#b372c0eea9b60732fdae4b817a63c8cd https://repo.anaconda.com/pkgs/main/linux-64/libstdcxx-ng-11.2.0-h1234567_1.conda#57623d10a70e09e1d048c2b2b6f4e2dd https://repo.anaconda.com/pkgs/main/linux-64/_openmp_mutex-5.1-1_gnu.conda#71d281e9c2192cb3fa425655a8defb85 @@ -23,7 +23,7 @@ https://repo.anaconda.com/pkgs/main/linux-64/python-3.9.18-h955ad1f_0.conda#65fb https://repo.anaconda.com/pkgs/main/linux-64/setuptools-68.2.2-py39h06a4308_0.conda#5b42cae5548732ae5c167bb1066085de https://repo.anaconda.com/pkgs/main/linux-64/wheel-0.41.2-py39h06a4308_0.conda#ec1b8213c3585defaa6042ed2f95861d https://repo.anaconda.com/pkgs/main/linux-64/pip-23.3.1-py39h06a4308_0.conda#685007e3dae59d211620f19926577bd6 -# pip alabaster @ https://files.pythonhosted.org/packages/64/88/c7083fc61120ab661c5d0b82cb77079fc1429d3f913a456c1c82cf4658f7/alabaster-0.7.13-py3-none-any.whl#sha256=1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3 +# pip alabaster @ https://files.pythonhosted.org/packages/a8/11/a3159174442867ea12826e60a9f1d6f6299c2ae3f896d2a47566ab826686/alabaster-0.7.15-py3-none-any.whl#sha256=d99c6fd0f7a86fca68ecc5231c9de45227991c10ee6facfb894cf6afb953b142 # pip babel @ https://files.pythonhosted.org/packages/0d/35/4196b21041e29a42dc4f05866d0c94fa26c9da88ce12c38c2265e42c82fb/Babel-2.14.0-py3-none-any.whl#sha256=efb1a25b7118e67ce3a259bed20545c29cb68be8ad2c784c83689981b7a57287 # pip certifi @ https://files.pythonhosted.org/packages/64/62/428ef076be88fa93716b576e4a01f919d25968913e817077a386fcbe4f42/certifi-2023.11.17-py3-none-any.whl#sha256=e036ab49d5b79556f99cfc2d9320b34cfbe5be05c5871b51de9329f0603b0474 # pip charset-normalizer @ https://files.pythonhosted.org/packages/98/69/5d8751b4b670d623aa7a47bef061d69c279e9f922f6705147983aa76c3ce/charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=b261ccdec7821281dade748d088bb6e9b69e6d15b30652b74cbbac25e280b796 @@ -41,7 +41,7 @@ https://repo.anaconda.com/pkgs/main/linux-64/pip-23.3.1-py39h06a4308_0.conda#685 # pip lazy-loader @ https://files.pythonhosted.org/packages/a1/c3/65b3814e155836acacf720e5be3b5757130346670ac454fee29d3eda1381/lazy_loader-0.3-py3-none-any.whl#sha256=1e9e76ee8631e264c62ce10006718e80b2cfc74340d17d1031e0f84af7478554 # pip markupsafe @ https://files.pythonhosted.org/packages/de/63/cb7e71984e9159ec5f45b5e81e896c8bdd0e45fe3fc6ce02ab497f0d790e/MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=05fb21170423db021895e1ea1e1f3ab3adb85d1c2333cbc2310f2a26bc77272e # pip networkx @ 
https://files.pythonhosted.org/packages/d5/f0/8fbc882ca80cf077f1b246c0e3c3465f7f415439bdea6b899f6b19f61f70/networkx-3.2.1-py3-none-any.whl#sha256=f18c69adc97877c42332c170849c96cefa91881c99a7cb3e95b7c659ebdc1ec2 -# pip numpy @ https://files.pythonhosted.org/packages/2f/75/f007cc0e6a373207818bef17f463d3305e9dd380a70db0e523e7660bf21f/numpy-1.26.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=baf8aab04a2c0e859da118f0b38617e5ee65d75b83795055fb66c0d5e9e9b818 +# pip numpy @ https://files.pythonhosted.org/packages/ea/ee/7a93594b78d7834d14ff49e74ba79e3f26b85604a542a790db81b1dd2326/numpy-1.26.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=b4d362e17bcb0011738c2d83e0a65ea8ce627057b2fdda37678f4374a382a137 # pip packaging @ https://files.pythonhosted.org/packages/ec/1a/610693ac4ee14fcdf2d9bf3c493370e4f2ef7ae2e19217d7a237ff42367d/packaging-23.2-py3-none-any.whl#sha256=8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7 # pip pillow @ https://files.pythonhosted.org/packages/87/0d/8f5136a5481731c342a901ff155c587ce7804114db069345e1894ab4978a/pillow-10.2.0-cp39-cp39-manylinux_2_28_x86_64.whl#sha256=b6f491cdf80ae540738859d9766783e3b3c8e5bd37f5dfa0b76abdecc5081f13 # pip pluggy @ https://files.pythonhosted.org/packages/05/b8/42ed91898d4784546c5f06c60506400548db3f7a4b3fb441cba4e5c17952/pluggy-1.3.0-py3-none-any.whl#sha256=d89c696a773f8bd377d18e5ecda92b7a3793cbe66c87060a6fb58c7b6e1061f7 diff --git a/build_tools/azure/pylatest_pip_scipy_dev_linux-64_conda.lock b/build_tools/azure/pylatest_pip_scipy_dev_linux-64_conda.lock index 53dee069acde5..dd8c6560f66c7 100644 --- a/build_tools/azure/pylatest_pip_scipy_dev_linux-64_conda.lock +++ b/build_tools/azure/pylatest_pip_scipy_dev_linux-64_conda.lock @@ -3,9 +3,9 @@ # input_hash: 28ec764eefc982520846833c9ea571cf6ea5a0593dee76d7a7560b34e341e35b @EXPLICIT https://repo.anaconda.com/pkgs/main/linux-64/_libgcc_mutex-0.1-main.conda#c3473ff8bdb3d124ed5ff11ec380d6f9 -https://repo.anaconda.com/pkgs/main/linux-64/ca-certificates-2023.08.22-h06a4308_0.conda#243d5065a09a3e85ab888c05f5b6445a +https://repo.anaconda.com/pkgs/main/linux-64/ca-certificates-2023.12.12-h06a4308_0.conda#12bf7315c3f5ca50300e8b48d1b4ef2e https://repo.anaconda.com/pkgs/main/linux-64/ld_impl_linux-64-2.38-h1181459_1.conda#68eedfd9c06f2b0e6888d8db345b7f5b -https://repo.anaconda.com/pkgs/main/noarch/tzdata-2023c-h04d1e81_0.conda#29db02adf8808f7c64642cead3e28acd +https://repo.anaconda.com/pkgs/main/noarch/tzdata-2023d-h04d1e81_0.conda#fdb319536f351b2b828a350ffd1a35a1 https://repo.anaconda.com/pkgs/main/linux-64/libgomp-11.2.0-h1234567_1.conda#b372c0eea9b60732fdae4b817a63c8cd https://repo.anaconda.com/pkgs/main/linux-64/libstdcxx-ng-11.2.0-h1234567_1.conda#57623d10a70e09e1d048c2b2b6f4e2dd https://repo.anaconda.com/pkgs/main/linux-64/_openmp_mutex-5.1-1_gnu.conda#71d281e9c2192cb3fa425655a8defb85 @@ -21,15 +21,15 @@ https://repo.anaconda.com/pkgs/main/linux-64/ccache-3.7.9-hfe4627d_0.conda#bef6f https://repo.anaconda.com/pkgs/main/linux-64/readline-8.2-h5eee18b_0.conda#be42180685cce6e6b0329201d9f48efb https://repo.anaconda.com/pkgs/main/linux-64/tk-8.6.12-h1ccaba5_0.conda#fa10ff4aa631fa4aa090a6234d7770b9 https://repo.anaconda.com/pkgs/main/linux-64/sqlite-3.41.2-h5eee18b_0.conda#c7086c9ceb6cfe1c4c729a774a2d88a5 -https://repo.anaconda.com/pkgs/main/linux-64/python-3.11.5-h955ad1f_0.conda#3fd62f043c124c7aad747122e3a9edf2 +https://repo.anaconda.com/pkgs/main/linux-64/python-3.11.7-h955ad1f_0.conda#721e0e84035214979d06e677d5afa9f4 
https://repo.anaconda.com/pkgs/main/linux-64/setuptools-68.2.2-py311h06a4308_0.conda#264aaac990aa82ff86442ad8249787a3 https://repo.anaconda.com/pkgs/main/linux-64/wheel-0.41.2-py311h06a4308_0.conda#2d4ff85d3dfb7749ae0485ee148d4ea5 https://repo.anaconda.com/pkgs/main/linux-64/pip-23.3.1-py311h06a4308_0.conda#6fdb2a3c731f093b0014450a071c7f7f -# pip alabaster @ https://files.pythonhosted.org/packages/64/88/c7083fc61120ab661c5d0b82cb77079fc1429d3f913a456c1c82cf4658f7/alabaster-0.7.13-py3-none-any.whl#sha256=1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3 +# pip alabaster @ https://files.pythonhosted.org/packages/a8/11/a3159174442867ea12826e60a9f1d6f6299c2ae3f896d2a47566ab826686/alabaster-0.7.15-py3-none-any.whl#sha256=d99c6fd0f7a86fca68ecc5231c9de45227991c10ee6facfb894cf6afb953b142 # pip babel @ https://files.pythonhosted.org/packages/0d/35/4196b21041e29a42dc4f05866d0c94fa26c9da88ce12c38c2265e42c82fb/Babel-2.14.0-py3-none-any.whl#sha256=efb1a25b7118e67ce3a259bed20545c29cb68be8ad2c784c83689981b7a57287 # pip certifi @ https://files.pythonhosted.org/packages/64/62/428ef076be88fa93716b576e4a01f919d25968913e817077a386fcbe4f42/certifi-2023.11.17-py3-none-any.whl#sha256=e036ab49d5b79556f99cfc2d9320b34cfbe5be05c5871b51de9329f0603b0474 # pip charset-normalizer @ https://files.pythonhosted.org/packages/40/26/f35951c45070edc957ba40a5b1db3cf60a9dbb1b350c2d5bef03e01e61de/charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=753f10e867343b4511128c6ed8c82f7bec3bd026875576dfd88483c5c73b2fd8 -# pip coverage @ https://files.pythonhosted.org/packages/ce/9f/20406e0dc07f6bba211a0ae40bb7a716daebdb715ba03ce6f611d01cb79d/coverage-7.3.3-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=ff4800783d85bff132f2cc7d007426ec698cdce08c3062c8d501ad3f4ea3d16c +# pip coverage @ https://files.pythonhosted.org/packages/3b/35/c5aa0de6a3c40f42b7702298de7b0a67c96bfe0c44ed9d0a953d069b23dc/coverage-7.4.0-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=485e9f897cf4856a65a57c7f6ea3dc0d4e6c076c87311d4bc003f82cfe199d25 # pip docutils @ https://files.pythonhosted.org/packages/26/87/f238c0670b94533ac0353a4e2a1a771a0cc73277b88bff23d3ae35a256c1/docutils-0.20.1-py3-none-any.whl#sha256=96f387a2c5562db4476f09f13bbab2192e764cac08ebbf3a34a95d9b1e4a59d6 # pip execnet @ https://files.pythonhosted.org/packages/e8/9c/a079946da30fac4924d92dbc617e5367d454954494cf1e71567bcc4e00ee/execnet-2.0.2-py3-none-any.whl#sha256=88256416ae766bc9e8895c76a87928c0012183da3cc4fc18016e6f050e025f41 # pip idna @ https://files.pythonhosted.org/packages/c2/e7/a82b05cf63a603df6e68d59ae6a68bf5064484a0718ea5033660af4b54a9/idna-3.6-py3-none-any.whl#sha256=c05567e9c24a6b9faaa835c4821bad0590fbb9d5779e7caa6e1cc4978e7eb24f @@ -48,7 +48,7 @@ https://repo.anaconda.com/pkgs/main/linux-64/pip-23.3.1-py311h06a4308_0.conda#6f # pip threadpoolctl @ https://files.pythonhosted.org/packages/81/12/fd4dea011af9d69e1cad05c75f3f7202cdcbeac9b712eea58ca779a72865/threadpoolctl-3.2.0-py3-none-any.whl#sha256=2b7818516e423bdaebb97c723f86a7c6b0a83d3f3b0970328d66f4d9104dc032 # pip urllib3 @ https://files.pythonhosted.org/packages/96/94/c31f58c7a7f470d5665935262ebd7455c7e4c7782eb525658d3dbf4b9403/urllib3-2.1.0-py3-none-any.whl#sha256=55901e917a5896a349ff771be919f8bd99aff50b79fe58fec595eb37bbc56bb3 # pip jinja2 @ 
https://files.pythonhosted.org/packages/bc/c3/f068337a370801f372f2f8f6bad74a5c140f6fda3d9de154052708dd3c65/Jinja2-3.1.2-py3-none-any.whl#sha256=6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61 -# pip pytest @ https://files.pythonhosted.org/packages/f3/8c/f16efd81ca8e293b2cc78f111190a79ee539d0d5d36ccd49975cb3beac60/pytest-7.4.3-py3-none-any.whl#sha256=0d009c083ea859a71b76adf7c1d502e4bc170b80a8ef002da5806527b9591fac +# pip pytest @ https://files.pythonhosted.org/packages/51/ff/f6e8b8f39e08547faece4bd80f89d5a8de68a38b2d179cc1c4490ffa3286/pytest-7.4.4-py3-none-any.whl#sha256=b090cdf5ed60bf4c45261be03239c2c1c22df034fbffe691abe93cd80cea01d8 # pip python-dateutil @ https://files.pythonhosted.org/packages/36/7a/87837f39d0296e723bb9b62bbb257d0355c7f6128853c78955f57342a56d/python_dateutil-2.8.2-py2.py3-none-any.whl#sha256=961d03dc3453ebbc59dbdea9e4e11c5651520a876d0f4db161e8674aae935da9 # pip requests @ https://files.pythonhosted.org/packages/70/8e/0e2d847013cb52cd35b38c009bb167a1a26b2ce6cd6965bf26b47bc0bf44/requests-2.31.0-py3-none-any.whl#sha256=58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f # pip pooch @ https://files.pythonhosted.org/packages/1a/a5/5174dac3957ac412e80a00f30b6507031fcab7000afc9ea0ac413bddcff2/pooch-1.8.0-py3-none-any.whl#sha256=1bfba436d9e2ad5199ccad3583cca8c241b8736b5bb23fe67c213d52650dbb66 diff --git a/build_tools/azure/pymin_conda_defaults_openblas_linux-64_conda.lock b/build_tools/azure/pymin_conda_defaults_openblas_linux-64_conda.lock index 7cdaba97d29c6..159ab024cc0c1 100644 --- a/build_tools/azure/pymin_conda_defaults_openblas_linux-64_conda.lock +++ b/build_tools/azure/pymin_conda_defaults_openblas_linux-64_conda.lock @@ -1,13 +1,13 @@ # Generated by conda-lock. # platform: linux-64 -# input_hash: c63ec98efe67f85fd681c6634249719a3658c65049b5eeb017b5f0259990901a +# input_hash: b4bfe38c127d42c34beb5fbcbb6d7a983e7063f8a6ec415182acb410dfc68d8d @EXPLICIT https://repo.anaconda.com/pkgs/main/linux-64/_libgcc_mutex-0.1-main.conda#c3473ff8bdb3d124ed5ff11ec380d6f9 https://repo.anaconda.com/pkgs/main/linux-64/blas-1.0-openblas.conda#9ddfcaef10d79366c90128f5dc444be8 https://repo.anaconda.com/pkgs/main/linux-64/ca-certificates-2023.12.12-h06a4308_0.conda#12bf7315c3f5ca50300e8b48d1b4ef2e https://repo.anaconda.com/pkgs/main/linux-64/ld_impl_linux-64-2.38-h1181459_1.conda#68eedfd9c06f2b0e6888d8db345b7f5b https://repo.anaconda.com/pkgs/main/linux-64/libgfortran5-11.2.0-h1234567_1.conda#36a01a8c30e0cadf0d3e842c50b73f3b -https://repo.anaconda.com/pkgs/main/noarch/tzdata-2023c-h04d1e81_0.conda#29db02adf8808f7c64642cead3e28acd +https://repo.anaconda.com/pkgs/main/noarch/tzdata-2023d-h04d1e81_0.conda#fdb319536f351b2b828a350ffd1a35a1 https://repo.anaconda.com/pkgs/main/linux-64/libgfortran-ng-11.2.0-h00389a5_1.conda#7429b67ab7b1d7cb99b9d1f3ddaec6e3 https://repo.anaconda.com/pkgs/main/linux-64/libgomp-11.2.0-h1234567_1.conda#b372c0eea9b60732fdae4b817a63c8cd https://repo.anaconda.com/pkgs/main/linux-64/libstdcxx-ng-11.2.0-h1234567_1.conda#57623d10a70e09e1d048c2b2b6f4e2dd diff --git a/build_tools/azure/pymin_conda_forge_mkl_win-64_conda.lock b/build_tools/azure/pymin_conda_forge_mkl_win-64_conda.lock index f0f5a1834d75b..10b0d4ec2291f 100644 --- a/build_tools/azure/pymin_conda_forge_mkl_win-64_conda.lock +++ b/build_tools/azure/pymin_conda_forge_mkl_win-64_conda.lock @@ -1,6 +1,6 @@ # Generated by conda-lock. 
# platform: win-64 -# input_hash: 74fe5aa9801e09d66b9a87902cfa12e2e9343f9b8337d0126093f48d00544ab6 +# input_hash: af544b6135127d0b6abf1eedcc8ba32a4d5e2e1d2904d4592abc7f3dba338569 @EXPLICIT https://conda.anaconda.org/conda-forge/win-64/ca-certificates-2023.11.17-h56e8100_0.conda#1163114b483f26761f993c709e65271f https://conda.anaconda.org/conda-forge/win-64/intel-openmp-2023.2.0-h57928b3_50497.conda#a401f3cae152deb75bbed766a90a6312 @@ -65,7 +65,7 @@ https://conda.anaconda.org/conda-forge/noarch/ply-3.11-py_1.tar.bz2#7205635cd715 https://conda.anaconda.org/conda-forge/win-64/pthread-stubs-0.4-hcd874cb_1001.tar.bz2#a1f820480193ea83582b13249a7e7bd9 https://conda.anaconda.org/conda-forge/noarch/py-1.11.0-pyh6c4a22f_0.tar.bz2#b4613d7e7a493916d867842a6a148054 https://conda.anaconda.org/conda-forge/noarch/pyparsing-3.1.1-pyhd8ed1ab_0.conda#176f7d56f0cfe9008bdf1bccd7de02fb -https://conda.anaconda.org/conda-forge/noarch/setuptools-68.2.2-pyhd8ed1ab_0.conda#fc2166155db840c634a1291a5c35a709 +https://conda.anaconda.org/conda-forge/noarch/setuptools-69.0.3-pyhd8ed1ab_0.conda#40695fdfd15a92121ed2922900d0308b https://conda.anaconda.org/conda-forge/noarch/six-1.16.0-pyh6c4a22f_0.tar.bz2#e5f25f8dbc060e9a8d912e432202afc2 https://conda.anaconda.org/conda-forge/noarch/threadpoolctl-3.2.0-pyha21a80b_0.conda#978d03388b62173b8e6f79162cf52b86 https://conda.anaconda.org/conda-forge/noarch/toml-0.10.2-pyhd8ed1ab_0.tar.bz2#f832c45a477c78bebd107098db465095 @@ -93,19 +93,19 @@ https://conda.anaconda.org/conda-forge/win-64/fonttools-4.47.0-py39ha55989b_0.co https://conda.anaconda.org/conda-forge/win-64/glib-2.78.3-h12be248_0.conda#a14440f1d004a2ddccd9c1354dbeffdf https://conda.anaconda.org/conda-forge/noarch/importlib-resources-6.1.1-pyhd8ed1ab_0.conda#d04bd1b5bed9177dd7c3cef15e2b6710 https://conda.anaconda.org/conda-forge/win-64/mkl-2023.2.0-h6a75c08_50497.conda#064cea9f45531e7b53584acf4bd8b044 -https://conda.anaconda.org/conda-forge/win-64/pillow-10.1.0-py39h368b509_0.conda#131540ebb3d6b88d9a190ce39aeecc50 +https://conda.anaconda.org/conda-forge/win-64/pillow-10.2.0-py39h368b509_0.conda#706d6e5bbc4b5d2ac7b8a6077319294d https://conda.anaconda.org/conda-forge/win-64/pyqt5-sip-12.12.2-py39h99910a6_5.conda#dffbcea794c524c471772a5f697c2aea https://conda.anaconda.org/conda-forge/noarch/pytest-cov-4.1.0-pyhd8ed1ab_0.conda#06eb685a3a0b146347a58dda979485da https://conda.anaconda.org/conda-forge/noarch/pytest-forked-1.6.0-pyhd8ed1ab_0.conda#a46947638b6e005b63d2d6271da529b0 -https://conda.anaconda.org/conda-forge/win-64/gstreamer-1.22.8-hb4038d2_0.conda#498ec8375c067d237a6c85771f395138 +https://conda.anaconda.org/conda-forge/win-64/gstreamer-1.22.8-hb4038d2_1.conda#d24ef655de29ac3b1e14aae9cc2eb66b https://conda.anaconda.org/conda-forge/win-64/libblas-3.9.0-20_win64_mkl.conda#6cad6cd2fbdeef4d651b8f752a4da960 https://conda.anaconda.org/conda-forge/win-64/mkl-devel-2023.2.0-h57928b3_50497.conda#0d52cfab24361c77268b54920c11903c https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-2.5.0-pyhd8ed1ab_0.tar.bz2#1fdd1f3baccf0deb647385c677a1a48e -https://conda.anaconda.org/conda-forge/win-64/gst-plugins-base-1.22.8-h001b923_0.conda#4871a223a0b53452cbd34fd4c0c518e6 +https://conda.anaconda.org/conda-forge/win-64/gst-plugins-base-1.22.8-h001b923_1.conda#abe4d4f0820e367987d2ba73a84cf328 https://conda.anaconda.org/conda-forge/win-64/libcblas-3.9.0-20_win64_mkl.conda#e6d36cfcb2f2dff0f659d2aa0813eb2d https://conda.anaconda.org/conda-forge/win-64/liblapack-3.9.0-20_win64_mkl.conda#9510d07424d70fcac553d86b3e4a7c14 
https://conda.anaconda.org/conda-forge/win-64/liblapacke-3.9.0-20_win64_mkl.conda#960008cd6e9827a5c9b68e77fdf3d29f -https://conda.anaconda.org/conda-forge/win-64/numpy-1.26.2-py39hddb5d58_0.conda#59f29cc03dd8a2768749cf73e8b1ce58 +https://conda.anaconda.org/conda-forge/win-64/numpy-1.26.3-py39hddb5d58_0.conda#5cd2960dafe35dbaf816b7c79d6c8178 https://conda.anaconda.org/conda-forge/win-64/qt-main-5.15.8-h9e85ed6_18.conda#8427460072b90560c0675c37c30386ef https://conda.anaconda.org/conda-forge/win-64/blas-devel-3.9.0-20_win64_mkl.conda#40f21d1e894795983dec1036847e7460 https://conda.anaconda.org/conda-forge/win-64/contourpy-1.2.0-py39h1f6ef14_0.conda#9eeea323eacb6549cbb3df3d81181cb2 diff --git a/build_tools/azure/pymin_conda_forge_openblas_ubuntu_2204_linux-64_conda.lock b/build_tools/azure/pymin_conda_forge_openblas_ubuntu_2204_linux-64_conda.lock index c864e4e354f2e..a6893000d1871 100644 --- a/build_tools/azure/pymin_conda_forge_openblas_ubuntu_2204_linux-64_conda.lock +++ b/build_tools/azure/pymin_conda_forge_openblas_ubuntu_2204_linux-64_conda.lock @@ -1,6 +1,6 @@ # Generated by conda-lock. # platform: linux-64 -# input_hash: dfda5c3b73321eb2a8bdc6c50490846e4a7a71dc4c8229f1f1b7a175acd8de80 +# input_hash: d70964a380150a9fdd34471eab9c13547ec7744156a6719ec0e4b97fc7d298fa @EXPLICIT https://conda.anaconda.org/conda-forge/linux-64/_libgcc_mutex-0.1-conda_forge.tar.bz2#d7c89558ba9fa0495403155b64376d81 https://conda.anaconda.org/conda-forge/linux-64/ca-certificates-2023.11.17-hbcca054_0.conda#01ffc8d36f9eba0ce0b3c1955fa780ee @@ -44,7 +44,7 @@ https://conda.anaconda.org/conda-forge/linux-64/mpg123-1.32.3-h59595ed_0.conda#b https://conda.anaconda.org/conda-forge/linux-64/ncurses-6.4-h59595ed_2.conda#7dbaa197d7ba6032caf7ae7f32c1efa0 https://conda.anaconda.org/conda-forge/linux-64/nspr-4.35-h27087fc_0.conda#da0ec11a6454ae19bff5b02ed881a2b1 https://conda.anaconda.org/conda-forge/linux-64/openssl-3.2.0-hd590300_1.conda#603827b39ea2b835268adb8c821b8570 -https://conda.anaconda.org/conda-forge/linux-64/pixman-0.42.2-h59595ed_0.conda#700edd63ccd5fc66b70b1c028cea9a68 +https://conda.anaconda.org/conda-forge/linux-64/pixman-0.43.0-h59595ed_0.conda#6b4b43013628634b6cfdee6b74fd696b https://conda.anaconda.org/conda-forge/linux-64/pthread-stubs-0.4-h36c2ea0_1001.tar.bz2#22dad4df6e8630e8dff2428f6f6a7036 https://conda.anaconda.org/conda-forge/linux-64/xorg-kbproto-1.0.7-h7f98852_1002.tar.bz2#4b230e8381279d76131116660f5a241a https://conda.anaconda.org/conda-forge/linux-64/xorg-libice-1.1.1-hd590300_0.conda#b462a33c0be1421532f28bfe8f4a7514 @@ -133,7 +133,7 @@ https://conda.anaconda.org/conda-forge/noarch/pyparsing-3.1.1-pyhd8ed1ab_0.conda https://conda.anaconda.org/conda-forge/noarch/pysocks-1.7.1-pyha2e5f31_6.tar.bz2#2a7de29fb590ca14b5243c4c812c8025 https://conda.anaconda.org/conda-forge/noarch/python-tzdata-2023.4-pyhd8ed1ab_0.conda#c79cacf8a06a51552fc651652f170208 https://conda.anaconda.org/conda-forge/noarch/pytz-2023.3.post1-pyhd8ed1ab_0.conda#c93346b446cd08c169d843ae5fc0da97 -https://conda.anaconda.org/conda-forge/noarch/setuptools-68.2.2-pyhd8ed1ab_0.conda#fc2166155db840c634a1291a5c35a709 +https://conda.anaconda.org/conda-forge/noarch/setuptools-69.0.3-pyhd8ed1ab_0.conda#40695fdfd15a92121ed2922900d0308b https://conda.anaconda.org/conda-forge/noarch/six-1.16.0-pyh6c4a22f_0.tar.bz2#e5f25f8dbc060e9a8d912e432202afc2 https://conda.anaconda.org/conda-forge/noarch/snowballstemmer-2.2.0-pyhd8ed1ab_0.tar.bz2#4d22a9315e78c6827f806065957d566e 
https://conda.anaconda.org/conda-forge/noarch/sphinxcontrib-jsmath-1.0.1-pyhd8ed1ab_0.conda#da1d979339e2714c30a8e806a33ec087 @@ -160,23 +160,23 @@ https://conda.anaconda.org/conda-forge/linux-64/libcblas-3.9.0-20_linux64_openbl https://conda.anaconda.org/conda-forge/linux-64/libclang-15.0.7-default_hb11cfb5_4.conda#c90f4cbb57839c98fef8f830e4b9972f https://conda.anaconda.org/conda-forge/linux-64/liblapack-3.9.0-20_linux64_openblas.conda#6fabc51f5e647d09cc010c40061557e0 https://conda.anaconda.org/conda-forge/linux-64/libxkbcommon-1.6.0-hd429924_1.conda#1dbcc04604fdf1e526e6d1b0b6938396 -https://conda.anaconda.org/conda-forge/linux-64/pillow-10.1.0-py39had0adad_0.conda#eeaa413fddccecb2ab7f747bdb55b07f +https://conda.anaconda.org/conda-forge/linux-64/pillow-10.2.0-py39had0adad_0.conda#2972754dc054bb079d1d121918b5126f https://conda.anaconda.org/conda-forge/linux-64/pulseaudio-client-16.1-hb77b528_5.conda#ac902ff3c1c6d750dd0dfc93a974ab74 https://conda.anaconda.org/conda-forge/noarch/pytest-7.4.4-pyhd8ed1ab_0.conda#a9d145de8c5f064b5fa68fb34725d9f4 https://conda.anaconda.org/conda-forge/noarch/python-dateutil-2.8.2-pyhd8ed1ab_0.tar.bz2#dd999d1cc9f79e67dbb855c8924c7984 https://conda.anaconda.org/conda-forge/linux-64/sip-6.7.12-py39h3d6467e_0.conda#e667a3ab0df62c54e60e1843d2e6defb https://conda.anaconda.org/conda-forge/noarch/urllib3-2.1.0-pyhd8ed1ab_0.conda#f8ced8ee63830dec7ecc1be048d1470a -https://conda.anaconda.org/conda-forge/linux-64/gstreamer-1.22.8-h98fc4e7_0.conda#a068fe1588dda3d29f568d536eeebae7 +https://conda.anaconda.org/conda-forge/linux-64/gstreamer-1.22.8-h98fc4e7_1.conda#1b52a89485ab573a5bb83a5225ff706e https://conda.anaconda.org/conda-forge/linux-64/harfbuzz-8.3.0-h3d44ed6_0.conda#5a6f6c00ef982a9bc83558d9ac8f64a0 https://conda.anaconda.org/conda-forge/noarch/importlib-resources-6.1.1-pyhd8ed1ab_0.conda#d04bd1b5bed9177dd7c3cef15e2b6710 https://conda.anaconda.org/conda-forge/linux-64/liblapacke-3.9.0-20_linux64_openblas.conda#05c5862c7dc25e65ba6c471d96429dae -https://conda.anaconda.org/conda-forge/linux-64/numpy-1.26.2-py39h474f0d3_0.conda#459a58eda3e74dd5e3d596c618e7f20a +https://conda.anaconda.org/conda-forge/linux-64/numpy-1.26.3-py39h474f0d3_0.conda#a1f1ad2d8ebf63f13f45fb21b7f49dfb https://conda.anaconda.org/conda-forge/linux-64/pyqt5-sip-12.12.2-py39h3d6467e_5.conda#93aff412f3e49fdb43361c0215cbd72d https://conda.anaconda.org/conda-forge/noarch/pytest-forked-1.6.0-pyhd8ed1ab_0.conda#a46947638b6e005b63d2d6271da529b0 https://conda.anaconda.org/conda-forge/noarch/requests-2.31.0-pyhd8ed1ab_0.conda#a30144e4156cdbb236f99ebb49828f8b https://conda.anaconda.org/conda-forge/linux-64/blas-devel-3.9.0-20_linux64_openblas.conda#9932a1d4e9ecf2d35fb19475446e361e https://conda.anaconda.org/conda-forge/linux-64/contourpy-1.2.0-py39h7633fee_0.conda#ed71ad3e30eb03da363fb797419cce98 -https://conda.anaconda.org/conda-forge/linux-64/gst-plugins-base-1.22.8-h8e1006c_0.conda#307cf29b6c19238c17182f30ddaf1a50 +https://conda.anaconda.org/conda-forge/linux-64/gst-plugins-base-1.22.8-h8e1006c_1.conda#3926dab94fe06d88ade0e716d77b8cf8 https://conda.anaconda.org/conda-forge/linux-64/pandas-2.1.4-py39hddac248_0.conda#dcfd2f15c6f8f0bbf234412b18a2a5d0 https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-2.5.0-pyhd8ed1ab_0.tar.bz2#1fdd1f3baccf0deb647385c677a1a48e https://conda.anaconda.org/conda-forge/linux-64/scipy-1.11.4-py39h474f0d3_0.conda#4b401c1516417b4b14aa1249d2f7929d diff --git a/build_tools/circle/doc_linux-64_conda.lock b/build_tools/circle/doc_linux-64_conda.lock index ba98b2e05d26b..e72fe4615571b 
100644 --- a/build_tools/circle/doc_linux-64_conda.lock +++ b/build_tools/circle/doc_linux-64_conda.lock @@ -1,6 +1,6 @@ # Generated by conda-lock. # platform: linux-64 -# input_hash: 2220bb165ad69a88fcb9db817451817fe8405e7de686ad25121b4e4239916e10 +# input_hash: f0d8179a97f73d1dd3fcaabd7b81b8f4ee3eeb0b07c038be883b60160b96c3e9 @EXPLICIT https://conda.anaconda.org/conda-forge/linux-64/_libgcc_mutex-0.1-conda_forge.tar.bz2#d7c89558ba9fa0495403155b64376d81 https://conda.anaconda.org/conda-forge/linux-64/ca-certificates-2023.11.17-hbcca054_0.conda#01ffc8d36f9eba0ce0b3c1955fa780ee @@ -60,7 +60,7 @@ https://conda.anaconda.org/conda-forge/linux-64/mpg123-1.32.3-h59595ed_0.conda#b https://conda.anaconda.org/conda-forge/linux-64/ncurses-6.4-h59595ed_2.conda#7dbaa197d7ba6032caf7ae7f32c1efa0 https://conda.anaconda.org/conda-forge/linux-64/nspr-4.35-h27087fc_0.conda#da0ec11a6454ae19bff5b02ed881a2b1 https://conda.anaconda.org/conda-forge/linux-64/openssl-3.2.0-hd590300_1.conda#603827b39ea2b835268adb8c821b8570 -https://conda.anaconda.org/conda-forge/linux-64/pixman-0.42.2-h59595ed_0.conda#700edd63ccd5fc66b70b1c028cea9a68 +https://conda.anaconda.org/conda-forge/linux-64/pixman-0.43.0-h59595ed_0.conda#6b4b43013628634b6cfdee6b74fd696b https://conda.anaconda.org/conda-forge/linux-64/pthread-stubs-0.4-h36c2ea0_1001.tar.bz2#22dad4df6e8630e8dff2428f6f6a7036 https://conda.anaconda.org/conda-forge/linux-64/rav1e-0.6.6-he8a937b_2.conda#77d9955b4abddb811cb8ab1aa7d743e4 https://conda.anaconda.org/conda-forge/linux-64/snappy-1.1.10-h9fff704_0.conda#e6d228cd0bb74a51dd18f5bfce0b4115 @@ -169,7 +169,7 @@ https://conda.anaconda.org/conda-forge/noarch/pyparsing-3.1.1-pyhd8ed1ab_0.conda https://conda.anaconda.org/conda-forge/noarch/pysocks-1.7.1-pyha2e5f31_6.tar.bz2#2a7de29fb590ca14b5243c4c812c8025 https://conda.anaconda.org/conda-forge/noarch/python-tzdata-2023.4-pyhd8ed1ab_0.conda#c79cacf8a06a51552fc651652f170208 https://conda.anaconda.org/conda-forge/noarch/pytz-2023.3.post1-pyhd8ed1ab_0.conda#c93346b446cd08c169d843ae5fc0da97 -https://conda.anaconda.org/conda-forge/noarch/setuptools-68.2.2-pyhd8ed1ab_0.conda#fc2166155db840c634a1291a5c35a709 +https://conda.anaconda.org/conda-forge/noarch/setuptools-69.0.3-pyhd8ed1ab_0.conda#40695fdfd15a92121ed2922900d0308b https://conda.anaconda.org/conda-forge/noarch/six-1.16.0-pyh6c4a22f_0.tar.bz2#e5f25f8dbc060e9a8d912e432202afc2 https://conda.anaconda.org/conda-forge/noarch/snowballstemmer-2.2.0-pyhd8ed1ab_0.tar.bz2#4d22a9315e78c6827f806065957d566e https://conda.anaconda.org/conda-forge/noarch/sphinxcontrib-jsmath-1.0.1-pyhd8ed1ab_0.conda#da1d979339e2714c30a8e806a33ec087 @@ -203,7 +203,7 @@ https://conda.anaconda.org/conda-forge/linux-64/libclang-15.0.7-default_hb11cfb5 https://conda.anaconda.org/conda-forge/linux-64/liblapack-3.9.0-20_linux64_openblas.conda#6fabc51f5e647d09cc010c40061557e0 https://conda.anaconda.org/conda-forge/linux-64/libxkbcommon-1.6.0-hd429924_1.conda#1dbcc04604fdf1e526e6d1b0b6938396 https://conda.anaconda.org/conda-forge/noarch/memory_profiler-0.61.0-pyhd8ed1ab_0.tar.bz2#8b45f9f2b2f7a98b0ec179c8991a4a9b -https://conda.anaconda.org/conda-forge/linux-64/pillow-10.1.0-py39had0adad_0.conda#eeaa413fddccecb2ab7f747bdb55b07f +https://conda.anaconda.org/conda-forge/linux-64/pillow-10.2.0-py39had0adad_0.conda#2972754dc054bb079d1d121918b5126f https://conda.anaconda.org/conda-forge/noarch/pip-23.3.2-pyhd8ed1ab_0.conda#8591c748f98dcc02253003533bc2e4b1 https://conda.anaconda.org/conda-forge/noarch/plotly-5.18.0-pyhd8ed1ab_0.conda#9f6a8664f1fe752f79473eeb9bf33a60 
https://conda.anaconda.org/conda-forge/linux-64/pulseaudio-client-16.1-hb77b528_5.conda#ac902ff3c1c6d750dd0dfc93a974ab74 @@ -212,22 +212,22 @@ https://conda.anaconda.org/conda-forge/noarch/python-dateutil-2.8.2-pyhd8ed1ab_0 https://conda.anaconda.org/conda-forge/linux-64/sip-6.7.12-py39h3d6467e_0.conda#e667a3ab0df62c54e60e1843d2e6defb https://conda.anaconda.org/conda-forge/noarch/urllib3-2.1.0-pyhd8ed1ab_0.conda#f8ced8ee63830dec7ecc1be048d1470a https://conda.anaconda.org/conda-forge/linux-64/compilers-1.7.0-ha770c72_0.conda#81458b3aed8ab8711951ec3c0c04e097 -https://conda.anaconda.org/conda-forge/linux-64/gstreamer-1.22.8-h98fc4e7_0.conda#a068fe1588dda3d29f568d536eeebae7 +https://conda.anaconda.org/conda-forge/linux-64/gstreamer-1.22.8-h98fc4e7_1.conda#1b52a89485ab573a5bb83a5225ff706e https://conda.anaconda.org/conda-forge/linux-64/harfbuzz-8.3.0-h3d44ed6_0.conda#5a6f6c00ef982a9bc83558d9ac8f64a0 https://conda.anaconda.org/conda-forge/noarch/importlib-resources-6.1.1-pyhd8ed1ab_0.conda#d04bd1b5bed9177dd7c3cef15e2b6710 https://conda.anaconda.org/conda-forge/linux-64/liblapacke-3.9.0-20_linux64_openblas.conda#05c5862c7dc25e65ba6c471d96429dae -https://conda.anaconda.org/conda-forge/linux-64/numpy-1.26.2-py39h474f0d3_0.conda#459a58eda3e74dd5e3d596c618e7f20a +https://conda.anaconda.org/conda-forge/linux-64/numpy-1.26.3-py39h474f0d3_0.conda#a1f1ad2d8ebf63f13f45fb21b7f49dfb https://conda.anaconda.org/conda-forge/linux-64/pyqt5-sip-12.12.2-py39h3d6467e_5.conda#93aff412f3e49fdb43361c0215cbd72d https://conda.anaconda.org/conda-forge/noarch/pytest-forked-1.6.0-pyhd8ed1ab_0.conda#a46947638b6e005b63d2d6271da529b0 https://conda.anaconda.org/conda-forge/noarch/requests-2.31.0-pyhd8ed1ab_0.conda#a30144e4156cdbb236f99ebb49828f8b https://conda.anaconda.org/conda-forge/linux-64/blas-devel-3.9.0-20_linux64_openblas.conda#9932a1d4e9ecf2d35fb19475446e361e https://conda.anaconda.org/conda-forge/linux-64/contourpy-1.2.0-py39h7633fee_0.conda#ed71ad3e30eb03da363fb797419cce98 -https://conda.anaconda.org/conda-forge/linux-64/gst-plugins-base-1.22.8-h8e1006c_0.conda#307cf29b6c19238c17182f30ddaf1a50 +https://conda.anaconda.org/conda-forge/linux-64/gst-plugins-base-1.22.8-h8e1006c_1.conda#3926dab94fe06d88ade0e716d77b8cf8 https://conda.anaconda.org/conda-forge/linux-64/imagecodecs-2024.1.1-py39hf9b8f0e_0.conda#9ddd29852457d1152ca235eb87bc74fb https://conda.anaconda.org/conda-forge/noarch/imageio-2.33.1-pyh8c1a49c_0.conda#1c34d58ac469a34e7e96832861368bce https://conda.anaconda.org/conda-forge/linux-64/pandas-2.1.4-py39hddac248_0.conda#dcfd2f15c6f8f0bbf234412b18a2a5d0 https://conda.anaconda.org/conda-forge/noarch/patsy-0.5.4-pyhd8ed1ab_0.conda#1184267eddebb57e47f8e1419c225595 -https://conda.anaconda.org/conda-forge/linux-64/polars-0.20.2-py39h90d8ae4_0.conda#8e63cf0a9bfbdb45c794de1aa6ff6806 +https://conda.anaconda.org/conda-forge/linux-64/polars-0.20.3-py39h927a070_1.conda#9228d65338fc75b9f7040c30465cd84b https://conda.anaconda.org/conda-forge/noarch/pooch-1.8.0-pyhd8ed1ab_0.conda#134b2b57b7865d2316a7cce1915a51ed https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-2.5.0-pyhd8ed1ab_0.tar.bz2#1fdd1f3baccf0deb647385c677a1a48e https://conda.anaconda.org/conda-forge/linux-64/pywavelets-1.4.1-py39h44dd56e_1.conda#d037c20e3da2e85f03ebd20ad480c359 @@ -240,9 +240,9 @@ https://conda.anaconda.org/conda-forge/linux-64/statsmodels-0.14.1-py39h44dd56e_ https://conda.anaconda.org/conda-forge/noarch/tifffile-2023.12.9-pyhd8ed1ab_0.conda#454bc0aff84f35fa53ba9e0369737a9b 
https://conda.anaconda.org/conda-forge/linux-64/pyqt-5.15.9-py39h52134e7_5.conda#e1f148e57d071b09187719df86f513c1 https://conda.anaconda.org/conda-forge/linux-64/scikit-image-0.22.0-py39hddac248_2.conda#8d502a4d2cbe5a45ff35ca8af8cbec0a -https://conda.anaconda.org/conda-forge/noarch/seaborn-base-0.13.0-pyhd8ed1ab_0.conda#082666331726b2438986cfe33ae9a8ee +https://conda.anaconda.org/conda-forge/noarch/seaborn-base-0.13.1-pyhd8ed1ab_0.conda#c1c0e175f993a4677c3163b26652b96c https://conda.anaconda.org/conda-forge/linux-64/matplotlib-3.8.2-py39hf3d152e_0.conda#18d40a5ada9a801cabaf5d47c15c6282 -https://conda.anaconda.org/conda-forge/noarch/seaborn-0.13.0-hd8ed1ab_0.conda#ebd31a95a7008b7e164dad9dbbb5bb5a +https://conda.anaconda.org/conda-forge/noarch/seaborn-0.13.1-hd8ed1ab_0.conda#8d9b6f5e94b7840210b2b9ed235068c7 https://conda.anaconda.org/conda-forge/noarch/numpydoc-1.6.0-pyhd8ed1ab_0.conda#191b8a622191a403700d16a2008e4e29 https://conda.anaconda.org/conda-forge/noarch/sphinx-copybutton-0.5.2-pyhd8ed1ab_0.conda#ac832cc43adc79118cf6e23f1f9b8995 https://conda.anaconda.org/conda-forge/noarch/sphinx-gallery-0.15.0-pyhd8ed1ab_0.conda#1a49ca9515ef9a96edff2eea06143dc6 @@ -276,8 +276,8 @@ https://conda.anaconda.org/conda-forge/noarch/sphinxext-opengraph-0.9.1-pyhd8ed1 # pip send2trash @ https://files.pythonhosted.org/packages/a9/78/e4df1e080ed790acf3a704edf521006dd96b9841bd2e2a462c0d255e0565/Send2Trash-1.8.2-py3-none-any.whl#sha256=a384719d99c07ce1eefd6905d2decb6f8b7ed054025bb0e618919f945de4f679 # pip sniffio @ https://files.pythonhosted.org/packages/c3/a0/5dba8ed157b0136607c7f2151db695885606968d1fae123dc3391e0cfdbf/sniffio-1.3.0-py3-none-any.whl#sha256=eecefdce1e5bbfb7ad2eeaabf7c1eeb404d7757c379bd1f7e5cce9d8bf425384 # pip soupsieve @ https://files.pythonhosted.org/packages/4c/f3/038b302fdfbe3be7da016777069f26ceefe11a681055ea1f7817546508e3/soupsieve-2.5-py3-none-any.whl#sha256=eaa337ff55a1579b6549dc679565eac1e3d000563bcb1c8ab0d0fefbc0c2cdc7 -# pip traitlets @ https://files.pythonhosted.org/packages/a7/1d/7d07e1b152b419a8a9c7f812eeefd408a0610d869489ee2e86973486713f/traitlets-5.14.0-py3-none-any.whl#sha256=f14949d23829023013c47df20b4a76ccd1a85effb786dc060f34de7948361b33 -# pip types-python-dateutil @ https://files.pythonhosted.org/packages/1c/af/5af2e2a02bc464c1c7818c260606343020b96c0d5b64f637d9e91aee24fe/types_python_dateutil-2.8.19.14-py3-none-any.whl#sha256=f977b8de27787639986b4e28963263fd0e5158942b3ecef91b9335c130cb1ce9 +# pip traitlets @ https://files.pythonhosted.org/packages/45/34/5dc77fdc7bb4bd198317eea5679edf9cc0a186438b5b19dbb9062fb0f4d5/traitlets-5.14.1-py3-none-any.whl#sha256=2e5a030e6eff91737c643231bfcf04a65b0132078dad75e4936700b213652e74 +# pip types-python-dateutil @ https://files.pythonhosted.org/packages/28/50/8ed67814241e2684369f4b8b881c7d31a0816e76c8690ea8518017a35b7e/types_python_dateutil-2.8.19.20240106-py3-none-any.whl#sha256=efbbdc54590d0f16152fa103c9879c7d4a00e82078f6e2cf01769042165acaa2 # pip uri-template @ https://files.pythonhosted.org/packages/e7/00/3fca040d7cf8a32776d3d81a00c8ee7457e00f80c649f1e4a863c8321ae9/uri_template-1.3.0-py3-none-any.whl#sha256=a44a133ea12d44a0c0f06d7d42a52d71282e77e2f937d8abd5655b8d56fc1363 # pip webcolors @ https://files.pythonhosted.org/packages/d5/e1/3e9013159b4cbb71df9bd7611cbf90dc2c621c8aeeb677fc41dad72f2261/webcolors-1.13-py3-none-any.whl#sha256=29bc7e8752c0a1bd4a1f03c14d6e6a72e93d82193738fa860cbff59d0fcc11bf # pip webencodings @ 
https://files.pythonhosted.org/packages/f4/24/2a3e3df732393fed8b3ebf2ec078f05546de641fe1b667ee316ec1dcf3b7/webencodings-0.5.1-py2.py3-none-any.whl#sha256=a0af1213f3c2226497a97e2b3aa01a7e4bee4f403f95be16fc9acd2947514a78 @@ -288,8 +288,8 @@ https://conda.anaconda.org/conda-forge/noarch/sphinxext-opengraph-0.9.1-pyhd8ed1 # pip bleach @ https://files.pythonhosted.org/packages/ea/63/da7237f805089ecc28a3f36bca6a21c31fcbc2eb380f3b8f1be3312abd14/bleach-6.1.0-py3-none-any.whl#sha256=3225f354cfc436b9789c66c4ee030194bee0568fbf9cbdad3bc8b5c26c5f12b6 # pip cffi @ https://files.pythonhosted.org/packages/ea/ac/e9e77bc385729035143e54cc8c4785bd480eaca9df17565963556b0b7a93/cffi-1.16.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=8f8e709127c6c77446a8c0a8c8bf3c8ee706a06cd44b1e827c3e6a2ee6b8c098 # pip doit @ https://files.pythonhosted.org/packages/44/83/a2960d2c975836daa629a73995134fd86520c101412578c57da3d2aa71ee/doit-0.36.0-py3-none-any.whl#sha256=ebc285f6666871b5300091c26eafdff3de968a6bd60ea35dd1e3fc6f2e32479a -# pip jupyter-core @ https://files.pythonhosted.org/packages/9d/27/38fa0cac8acc54a202dd432f98553ddd1826da9633fe875e72b09a9e2b98/jupyter_core-5.6.1-py3-none-any.whl#sha256=3d16aec2e1ec84b69f7794e49c32830c1d950ad149526aec954c100047c5f3a7 -# pip referencing @ https://files.pythonhosted.org/packages/b4/11/d121780c173336c9bc3a5b8240ed31f518957cc22f6311c76259cb0fcf32/referencing-0.32.0-py3-none-any.whl#sha256=bdcd3efb936f82ff86f993093f6da7435c7de69a3b3a5a06678a6050184bee99 +# pip jupyter-core @ https://files.pythonhosted.org/packages/4f/64/c15b7ac8915f7cae6c64718a6ffbb5e75fd398cda05d0a8aca2f570f0ed5/jupyter_core-5.7.0-py3-none-any.whl#sha256=16eea462f7dad23ba9f86542bdf17f830804e2028eb48d609b6134d91681e983 +# pip referencing @ https://files.pythonhosted.org/packages/14/2a/0a9f649354cd2d40f6c4f16eadabd9727377e3b9bc2ccec6cb630d9a6765/referencing-0.32.1-py3-none-any.whl#sha256=7e4dc12271d8e15612bfe35792f5ea1c40970dadf8624602e33db2758f7ee554 # pip rfc3339-validator @ https://files.pythonhosted.org/packages/7b/44/4e421b96b67b2daff264473f7465db72fbdf36a07e05494f50300cc7b0c6/rfc3339_validator-0.1.4-py2.py3-none-any.whl#sha256=24f6ec1eda14ef823da9e36ec7113124b39c04d50a4d3d3a3c2859577e7791fa # pip terminado @ https://files.pythonhosted.org/packages/69/df/deebc9fb14a49062a3330f673e80b100e665b54d998163b3f62620b6240c/terminado-0.18.0-py3-none-any.whl#sha256=87b0d96642d0fe5f5abd7783857b9cab167f221a39ff98e3b9619a788a3c0f2e # pip tinycss2 @ https://files.pythonhosted.org/packages/da/99/fd23634d6962c2791fb8cb6ccae1f05dcbfc39bce36bba8b1c9a8d92eae8/tinycss2-1.2.1-py3-none-any.whl#sha256=2b80a96d41e7c3914b8cda8bc7f705a4d9c49275616e886103dd839dfc847847 @@ -297,7 +297,7 @@ https://conda.anaconda.org/conda-forge/noarch/sphinxext-opengraph-0.9.1-pyhd8ed1 # pip isoduration @ https://files.pythonhosted.org/packages/7b/55/e5326141505c5d5e34c5e0935d2908a74e4561eca44108fbfb9c13d2911a/isoduration-20.11.0-py3-none-any.whl#sha256=b2904c2a4228c3d44f409c8ae8e2370eb21a26f7ac2ec5446df141dde3452042 # pip jsonschema-specifications @ https://files.pythonhosted.org/packages/ee/07/44bd408781594c4d0a027666ef27fab1e441b109dc3b76b4f836f8fd04fe/jsonschema_specifications-2023.12.1-py3-none-any.whl#sha256=87e4fdf3a94858b8a2ba2778d9ba57d8a9cafca7c7489c46ba0d30a8bc6a9c3c # pip jupyter-server-terminals @ https://files.pythonhosted.org/packages/13/50/9e4688558eb1a20d16e99171af9026be27d31a8b212c241595241736811a/jupyter_server_terminals-0.5.1-py3-none-any.whl#sha256=5e63e947ddd97bb2832db5ef837a258d9ccd4192cd608c1270850ad947ae5dd7 -# pip 
jupyterlite-core @ https://files.pythonhosted.org/packages/2f/0b/58eb568cbce3bbaa8702c6ce297870402828b222598a1db10e23e7190f52/jupyterlite_core-0.2.1-py3-none-any.whl#sha256=3f6161c4ad609bca913a42598005ff577611daae8dce448292fbb2c15db6b393 +# pip jupyterlite-core @ https://files.pythonhosted.org/packages/93/62/4387ca1578447027560863e8a4ebabd5d919ac990c99dc124a45a45846b2/jupyterlite_core-0.2.2-py3-none-any.whl#sha256=1f1babdbe630d429f631a508f0e3b3ffb4dfa005aeb748831e854c24025e766f # pip pyzmq @ https://files.pythonhosted.org/packages/76/8b/6fca99e22c6316917de32b17be299dea431544209d619da16b6d9ec85c83/pyzmq-25.1.2-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl#sha256=c0b5ca88a8928147b7b1e2dfa09f3b6c256bc1135a1338536cbc9ea13d3b7add # pip argon2-cffi @ https://files.pythonhosted.org/packages/a4/6a/e8a041599e78b6b3752da48000b14c8d1e8a04ded09c88c714ba047f34f5/argon2_cffi-23.1.0-py3-none-any.whl#sha256=c670642b78ba29641818ab2e68bd4e6a78ba53b7eff7b4c3815ae16abf91c7ea # pip jsonschema @ https://files.pythonhosted.org/packages/0f/ed/0058234d8dd2b1fc6beeea8eab945191a05e9d391a63202f49fe23327586/jsonschema-4.20.0-py3-none-any.whl#sha256=ed6231f0429ecf966f5bc8dfef245998220549cbbcf140f913b7464c52c3b6b3 @@ -307,6 +307,6 @@ https://conda.anaconda.org/conda-forge/noarch/sphinxext-opengraph-0.9.1-pyhd8ed1 # pip nbformat @ https://files.pythonhosted.org/packages/f4/e7/ef30a90b70eba39e675689b9eaaa92530a71d7435ab8f9cae520814e0caf/nbformat-5.9.2-py3-none-any.whl#sha256=1c5172d786a41b82bcfd0c23f9e6b6f072e8fb49c39250219e4acfff1efe89e9 # pip nbclient @ https://files.pythonhosted.org/packages/6b/3a/607149974149f847125c38a62b9ea2b8267eb74823bbf8d8c54ae0212a00/nbclient-0.9.0-py3-none-any.whl#sha256=a3a1ddfb34d4a9d17fc744d655962714a866639acd30130e9be84191cd97cd15 # pip nbconvert @ https://files.pythonhosted.org/packages/7f/ba/3a8a9870a8b42e63e8f5e770adedd191d5adc2348f3097fc0e7c83a39439/nbconvert-7.14.0-py3-none-any.whl#sha256=483dde47facdaa4875903d651305ad53cd76e2255ae3c61efe412a95f2d22a24 -# pip jupyter-server @ https://files.pythonhosted.org/packages/ed/20/2437a3865083360103b0218e82a910c4c35f3bf7248c5cdae6934ba4d01c/jupyter_server-2.12.1-py3-none-any.whl#sha256=fd030dd7be1ca572e4598203f718df6630c12bd28a599d7f1791c4d7938e1010 +# pip jupyter-server @ https://files.pythonhosted.org/packages/0c/3b/24a511c81b580a038aca06c91fc89df0464815903044bae1c85145cdf03c/jupyter_server-2.12.2-py3-none-any.whl#sha256=abcfa33f98a959f908c8733aa2d9fa0101d26941cbd49b148f4cef4d3046fc61 # pip jupyterlab-server @ https://files.pythonhosted.org/packages/a2/97/abbbe35fc67b6f9423309988f2e411f7cb117b08321866d3d8b720f4c0d4/jupyterlab_server-2.25.2-py3-none-any.whl#sha256=5b1798c9cc6a44f65c757de9f97fc06fc3d42535afbf47d2ace5e964ab447aaf # pip jupyterlite-sphinx @ https://files.pythonhosted.org/packages/9c/bd/1695eebeb376315c9fc5cbd41c54fb84bb69c68e69651bfc6f03aa4fe659/jupyterlite_sphinx-0.11.0-py3-none-any.whl#sha256=2a0762167e89ec6acd267c73bb90b528728fdba5e30390ea4fe37ddcec277191 diff --git a/build_tools/circle/doc_min_dependencies_linux-64_conda.lock b/build_tools/circle/doc_min_dependencies_linux-64_conda.lock index b105d3629d947..4c0b70b6b260e 100644 --- a/build_tools/circle/doc_min_dependencies_linux-64_conda.lock +++ b/build_tools/circle/doc_min_dependencies_linux-64_conda.lock @@ -1,6 +1,6 @@ # Generated by conda-lock. 
# platform: linux-64 -# input_hash: 38f0008ad0777e0e6c0aed8337cd71641123af41d0a9025d70195fbb550b1f6f +# input_hash: 63e92fdc759dcf030bf7e6d4a5d86bec102c98562cfb7ebd4d3d4991c895678b @EXPLICIT https://conda.anaconda.org/conda-forge/linux-64/_libgcc_mutex-0.1-conda_forge.tar.bz2#d7c89558ba9fa0495403155b64376d81 https://conda.anaconda.org/conda-forge/linux-64/ca-certificates-2023.11.17-hbcca054_0.conda#01ffc8d36f9eba0ce0b3c1955fa780ee @@ -52,7 +52,7 @@ https://conda.anaconda.org/conda-forge/linux-64/mpg123-1.32.3-h59595ed_0.conda#b https://conda.anaconda.org/conda-forge/linux-64/ncurses-6.4-h59595ed_2.conda#7dbaa197d7ba6032caf7ae7f32c1efa0 https://conda.anaconda.org/conda-forge/linux-64/nspr-4.35-h27087fc_0.conda#da0ec11a6454ae19bff5b02ed881a2b1 https://conda.anaconda.org/conda-forge/linux-64/openssl-3.2.0-hd590300_1.conda#603827b39ea2b835268adb8c821b8570 -https://conda.anaconda.org/conda-forge/linux-64/pixman-0.42.2-h59595ed_0.conda#700edd63ccd5fc66b70b1c028cea9a68 +https://conda.anaconda.org/conda-forge/linux-64/pixman-0.43.0-h59595ed_0.conda#6b4b43013628634b6cfdee6b74fd696b https://conda.anaconda.org/conda-forge/linux-64/pthread-stubs-0.4-h36c2ea0_1001.tar.bz2#22dad4df6e8630e8dff2428f6f6a7036 https://conda.anaconda.org/conda-forge/linux-64/xorg-kbproto-1.0.7-h7f98852_1002.tar.bz2#4b230e8381279d76131116660f5a241a https://conda.anaconda.org/conda-forge/linux-64/xorg-libice-1.1.1-hd590300_0.conda#b462a33c0be1421532f28bfe8f4a7514 @@ -184,7 +184,7 @@ https://conda.anaconda.org/conda-forge/linux-64/liblapack-3.9.0-20_linux64_openb https://conda.anaconda.org/conda-forge/linux-64/libxkbcommon-1.6.0-hd429924_1.conda#1dbcc04604fdf1e526e6d1b0b6938396 https://conda.anaconda.org/conda-forge/noarch/memory_profiler-0.61.0-pyhd8ed1ab_0.tar.bz2#8b45f9f2b2f7a98b0ec179c8991a4a9b https://conda.anaconda.org/conda-forge/noarch/partd-1.4.1-pyhd8ed1ab_0.conda#acf4b7c0bcd5fa3b0e05801c4d2accd6 -https://conda.anaconda.org/conda-forge/linux-64/pillow-10.1.0-py39had0adad_0.conda#eeaa413fddccecb2ab7f747bdb55b07f +https://conda.anaconda.org/conda-forge/linux-64/pillow-10.2.0-py39had0adad_0.conda#2972754dc054bb079d1d121918b5126f https://conda.anaconda.org/conda-forge/noarch/pip-23.3.2-pyhd8ed1ab_0.conda#8591c748f98dcc02253003533bc2e4b1 https://conda.anaconda.org/conda-forge/noarch/plotly-5.14.0-pyhd8ed1ab_0.conda#6a7bcc42ef58dd6cf3da9333ea102433 https://conda.anaconda.org/conda-forge/linux-64/pulseaudio-client-16.1-hb77b528_5.conda#ac902ff3c1c6d750dd0dfc93a974ab74 @@ -193,7 +193,7 @@ https://conda.anaconda.org/conda-forge/noarch/python-dateutil-2.8.2-pyhd8ed1ab_0 https://conda.anaconda.org/conda-forge/linux-64/sip-6.7.12-py39h3d6467e_0.conda#e667a3ab0df62c54e60e1843d2e6defb https://conda.anaconda.org/conda-forge/noarch/urllib3-2.1.0-pyhd8ed1ab_0.conda#f8ced8ee63830dec7ecc1be048d1470a https://conda.anaconda.org/conda-forge/linux-64/compilers-1.7.0-ha770c72_0.conda#81458b3aed8ab8711951ec3c0c04e097 -https://conda.anaconda.org/conda-forge/linux-64/gstreamer-1.22.8-h98fc4e7_0.conda#a068fe1588dda3d29f568d536eeebae7 +https://conda.anaconda.org/conda-forge/linux-64/gstreamer-1.22.8-h98fc4e7_1.conda#1b52a89485ab573a5bb83a5225ff706e https://conda.anaconda.org/conda-forge/linux-64/harfbuzz-8.3.0-h3d44ed6_0.conda#5a6f6c00ef982a9bc83558d9ac8f64a0 https://conda.anaconda.org/conda-forge/noarch/importlib_metadata-7.0.1-hd8ed1ab_0.conda#4a2f43a20fa404b998859c6a470ba316 https://conda.anaconda.org/conda-forge/linux-64/liblapacke-3.9.0-20_linux64_openblas.conda#05c5862c7dc25e65ba6c471d96429dae @@ -203,12 +203,12 @@ 
https://conda.anaconda.org/conda-forge/noarch/pytest-forked-1.6.0-pyhd8ed1ab_0.c https://conda.anaconda.org/conda-forge/noarch/requests-2.31.0-pyhd8ed1ab_0.conda#a30144e4156cdbb236f99ebb49828f8b https://conda.anaconda.org/conda-forge/linux-64/blas-devel-3.9.0-20_linux64_openblas.conda#9932a1d4e9ecf2d35fb19475446e361e https://conda.anaconda.org/conda-forge/noarch/dask-core-2023.12.1-pyhd8ed1ab_0.conda#bf6ad72d882bc3f04e6a0fb50fd2cce8 -https://conda.anaconda.org/conda-forge/linux-64/gst-plugins-base-1.22.8-h8e1006c_0.conda#307cf29b6c19238c17182f30ddaf1a50 +https://conda.anaconda.org/conda-forge/linux-64/gst-plugins-base-1.22.8-h8e1006c_1.conda#3926dab94fe06d88ade0e716d77b8cf8 https://conda.anaconda.org/conda-forge/linux-64/imagecodecs-lite-2019.12.3-py39hd257fcd_5.tar.bz2#32dba66d6abc2b4b5b019c9e54307312 https://conda.anaconda.org/conda-forge/noarch/imageio-2.33.1-pyh8c1a49c_0.conda#1c34d58ac469a34e7e96832861368bce https://conda.anaconda.org/conda-forge/linux-64/matplotlib-base-3.3.4-py39h2fa2bec_0.tar.bz2#9ec0b2186fab9121c54f4844f93ee5b7 https://conda.anaconda.org/conda-forge/linux-64/pandas-1.1.5-py39hde0f152_0.tar.bz2#79fc4b5b3a865b90dd3701cecf1ad33c -https://conda.anaconda.org/conda-forge/noarch/patsy-0.5.5-pyhd8ed1ab_0.conda#f266f66ba1dcae0dbcc771a491acbea4 +https://conda.anaconda.org/conda-forge/noarch/patsy-0.5.6-pyhd8ed1ab_0.conda#a5b55d1cb110cdcedc748b5c3e16e687 https://conda.anaconda.org/conda-forge/linux-64/polars-0.19.12-py39h90d8ae4_0.conda#191828961c95f8d59fa2b86a590f9905 https://conda.anaconda.org/conda-forge/noarch/pooch-1.8.0-pyhd8ed1ab_0.conda#134b2b57b7865d2316a7cce1915a51ed https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-2.5.0-pyhd8ed1ab_0.tar.bz2#1fdd1f3baccf0deb647385c677a1a48e From f2a7d39159a85c9488eaa7c604e8f9dd0490191f Mon Sep 17 00:00:00 2001 From: scikit-learn-bot Date: Mon, 8 Jan 2024 10:23:11 +0100 Subject: [PATCH 017/554] :lock: :robot: CI Update lock files for pypy CI build(s) :lock: :robot: (#28080) Co-authored-by: Lock file bot --- build_tools/azure/pypy3_linux-64_conda.lock | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/build_tools/azure/pypy3_linux-64_conda.lock b/build_tools/azure/pypy3_linux-64_conda.lock index 5fd5f84fcf17e..d5f5f842033c0 100644 --- a/build_tools/azure/pypy3_linux-64_conda.lock +++ b/build_tools/azure/pypy3_linux-64_conda.lock @@ -68,14 +68,14 @@ https://conda.anaconda.org/conda-forge/noarch/iniconfig-2.0.0-pyhd8ed1ab_0.conda https://conda.anaconda.org/conda-forge/linux-64/kiwisolver-1.4.5-py39ha90811c_1.conda#25edffabcb0760fc1821597c4ce920db https://conda.anaconda.org/conda-forge/linux-64/liblapacke-3.9.0-20_linux64_openblas.conda#05c5862c7dc25e65ba6c471d96429dae https://conda.anaconda.org/conda-forge/noarch/munkres-1.1.4-pyh9f0ad1d_0.tar.bz2#2ba8498c1018c1e9c61eb99b973dfe19 -https://conda.anaconda.org/conda-forge/linux-64/numpy-1.26.2-py39h6dedee3_0.conda#be8411b206cee82a218fd8fc219d1ae9 +https://conda.anaconda.org/conda-forge/linux-64/numpy-1.26.3-py39h6dedee3_0.conda#fcab766baac334344078d0aaf0945ec4 https://conda.anaconda.org/conda-forge/noarch/packaging-23.2-pyhd8ed1ab_0.conda#79002079284aa895f883c6b7f3f88fd6 -https://conda.anaconda.org/conda-forge/linux-64/pillow-10.1.0-py39hcf8a34e_0.conda#2bcde78b6e284e4266eee50ed5d0897d +https://conda.anaconda.org/conda-forge/linux-64/pillow-10.2.0-py39hcf8a34e_0.conda#8a406ee5a979c2591f4c734d6fe4a958 https://conda.anaconda.org/conda-forge/noarch/pluggy-1.3.0-pyhd8ed1ab_0.conda#2390bd10bed1f3fdc7a537fb5a447d8d 
https://conda.anaconda.org/conda-forge/noarch/py-1.11.0-pyh6c4a22f_0.tar.bz2#b4613d7e7a493916d867842a6a148054 https://conda.anaconda.org/conda-forge/noarch/pyparsing-3.1.1-pyhd8ed1ab_0.conda#176f7d56f0cfe9008bdf1bccd7de02fb https://conda.anaconda.org/conda-forge/noarch/pypy-7.3.13-0_pypy39.conda#0973de0664d1bd004c1bc64a7aab8f2e -https://conda.anaconda.org/conda-forge/noarch/setuptools-68.2.2-pyhd8ed1ab_0.conda#fc2166155db840c634a1291a5c35a709 +https://conda.anaconda.org/conda-forge/noarch/setuptools-69.0.3-pyhd8ed1ab_0.conda#40695fdfd15a92121ed2922900d0308b https://conda.anaconda.org/conda-forge/noarch/six-1.16.0-pyh6c4a22f_0.tar.bz2#e5f25f8dbc060e9a8d912e432202afc2 https://conda.anaconda.org/conda-forge/noarch/threadpoolctl-3.2.0-pyha21a80b_0.conda#978d03388b62173b8e6f79162cf52b86 https://conda.anaconda.org/conda-forge/noarch/tomli-2.0.1-pyhd8ed1ab_0.tar.bz2#5844808ffab9ebdb694585b50ba02a96 From 3c1aecec8dd24948d490fcdbe424fb953f5efe5c Mon Sep 17 00:00:00 2001 From: scikit-learn-bot Date: Mon, 8 Jan 2024 10:23:41 +0100 Subject: [PATCH 018/554] :lock: :robot: CI Update lock files for cirrus-arm CI build(s) :lock: :robot: (#28079) Co-authored-by: Lock file bot --- .../cirrus/pymin_conda_forge_linux-aarch64_conda.lock | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/build_tools/cirrus/pymin_conda_forge_linux-aarch64_conda.lock b/build_tools/cirrus/pymin_conda_forge_linux-aarch64_conda.lock index d622189ad1ac3..c8e8d0baf5236 100644 --- a/build_tools/cirrus/pymin_conda_forge_linux-aarch64_conda.lock +++ b/build_tools/cirrus/pymin_conda_forge_linux-aarch64_conda.lock @@ -62,7 +62,7 @@ https://conda.anaconda.org/conda-forge/noarch/packaging-23.2-pyhd8ed1ab_0.conda# https://conda.anaconda.org/conda-forge/noarch/pluggy-1.3.0-pyhd8ed1ab_0.conda#2390bd10bed1f3fdc7a537fb5a447d8d https://conda.anaconda.org/conda-forge/noarch/py-1.11.0-pyh6c4a22f_0.tar.bz2#b4613d7e7a493916d867842a6a148054 https://conda.anaconda.org/conda-forge/noarch/pyparsing-3.1.1-pyhd8ed1ab_0.conda#176f7d56f0cfe9008bdf1bccd7de02fb -https://conda.anaconda.org/conda-forge/noarch/setuptools-68.2.2-pyhd8ed1ab_0.conda#fc2166155db840c634a1291a5c35a709 +https://conda.anaconda.org/conda-forge/noarch/setuptools-69.0.3-pyhd8ed1ab_0.conda#40695fdfd15a92121ed2922900d0308b https://conda.anaconda.org/conda-forge/noarch/six-1.16.0-pyh6c4a22f_0.tar.bz2#e5f25f8dbc060e9a8d912e432202afc2 https://conda.anaconda.org/conda-forge/noarch/threadpoolctl-3.2.0-pyha21a80b_0.conda#978d03388b62173b8e6f79162cf52b86 https://conda.anaconda.org/conda-forge/noarch/tomli-2.0.1-pyhd8ed1ab_0.tar.bz2#5844808ffab9ebdb694585b50ba02a96 @@ -75,13 +75,13 @@ https://conda.anaconda.org/conda-forge/noarch/importlib_resources-6.1.1-pyhd8ed1 https://conda.anaconda.org/conda-forge/noarch/joblib-1.3.2-pyhd8ed1ab_0.conda#4da50d410f553db77e62ab62ffaa1abc https://conda.anaconda.org/conda-forge/linux-aarch64/libcblas-3.9.0-20_linuxaarch64_openblas.conda#b41e55ae2cb9d3518da2cbe3677b3b3b https://conda.anaconda.org/conda-forge/linux-aarch64/liblapack-3.9.0-20_linuxaarch64_openblas.conda#e7412a592d9ee7c92026eb1189687271 -https://conda.anaconda.org/conda-forge/linux-aarch64/pillow-10.1.0-py39h8ce38d7_0.conda#afedc0abb518dac535cb861f24585160 +https://conda.anaconda.org/conda-forge/linux-aarch64/pillow-10.2.0-py39h8ce38d7_0.conda#cf4745fb7f7cb5d0b90c476116c7d8ac https://conda.anaconda.org/conda-forge/noarch/pip-23.3.2-pyhd8ed1ab_0.conda#8591c748f98dcc02253003533bc2e4b1 
https://conda.anaconda.org/conda-forge/noarch/pytest-7.4.4-pyhd8ed1ab_0.conda#a9d145de8c5f064b5fa68fb34725d9f4 https://conda.anaconda.org/conda-forge/noarch/python-dateutil-2.8.2-pyhd8ed1ab_0.tar.bz2#dd999d1cc9f79e67dbb855c8924c7984 https://conda.anaconda.org/conda-forge/noarch/importlib-resources-6.1.1-pyhd8ed1ab_0.conda#d04bd1b5bed9177dd7c3cef15e2b6710 https://conda.anaconda.org/conda-forge/linux-aarch64/liblapacke-3.9.0-20_linuxaarch64_openblas.conda#1b8192f036a2dc41fec67700bb8bacef -https://conda.anaconda.org/conda-forge/linux-aarch64/numpy-1.26.2-py39h91c28bb_0.conda#fc8077e28f5f86b80f6f0d86263ce72d +https://conda.anaconda.org/conda-forge/linux-aarch64/numpy-1.26.3-py39h91c28bb_0.conda#9e10c6f9e309c2ada0d41c945e0f9b56 https://conda.anaconda.org/conda-forge/noarch/pytest-forked-1.6.0-pyhd8ed1ab_0.conda#a46947638b6e005b63d2d6271da529b0 https://conda.anaconda.org/conda-forge/linux-aarch64/blas-devel-3.9.0-20_linuxaarch64_openblas.conda#211c74d7600d8d1dec226daf5e28e2dc https://conda.anaconda.org/conda-forge/linux-aarch64/contourpy-1.2.0-py39hd16970a_0.conda#dc11a4a2e020d1d71350baa7cb4980e4 From eecc66ecab9a1e3f43660f312bcef085df431582 Mon Sep 17 00:00:00 2001 From: scikit-learn-bot Date: Mon, 8 Jan 2024 11:27:57 +0100 Subject: [PATCH 019/554] :lock: :robot: CI Update lock files for scipy-dev CI build(s) :lock: :robot: (#28016) Co-authored-by: Lock file bot From 3b212bdbc324188c68a05c58d857b116aad3697b Mon Sep 17 00:00:00 2001 From: Franck Charras <29153872+fcharras@users.noreply.github.com> Date: Tue, 9 Jan 2024 11:52:25 +0100 Subject: [PATCH 020/554] Small cleaning of `_atol_for_dtype` and `get_namespace` usage for consistency (#28057) Co-authored-by: Olivier Grisel --- sklearn/discriminant_analysis.py | 2 +- sklearn/metrics/cluster/_unsupervised.py | 4 +++- sklearn/preprocessing/_data.py | 6 +++--- sklearn/utils/_array_api.py | 17 ++++++++++------- sklearn/utils/estimator_checks.py | 13 ++++++------- 5 files changed, 23 insertions(+), 19 deletions(-) diff --git a/sklearn/discriminant_analysis.py b/sklearn/discriminant_analysis.py index 29146ca857694..46cb96ddd2886 100644 --- a/sklearn/discriminant_analysis.py +++ b/sklearn/discriminant_analysis.py @@ -697,7 +697,7 @@ def predict_proba(self, X): xp, is_array_api_compliant = get_namespace(X) decision = self.decision_function(X) if size(self.classes_) == 2: - proba = _expit(decision) + proba = _expit(decision, xp) return xp.stack([1 - proba, proba], axis=1) else: return softmax(decision) diff --git a/sklearn/metrics/cluster/_unsupervised.py b/sklearn/metrics/cluster/_unsupervised.py index 10749c23dacbe..ccbe473a5f645 100644 --- a/sklearn/metrics/cluster/_unsupervised.py +++ b/sklearn/metrics/cluster/_unsupervised.py @@ -14,6 +14,7 @@ from ...preprocessing import LabelEncoder from ...utils import _safe_indexing, check_random_state, check_X_y +from ...utils._array_api import _atol_for_type from ...utils._param_validation import ( Interval, StrOptions, @@ -263,7 +264,8 @@ def silhouette_samples(X, labels, *, metric="euclidean", **kwds): "elements on the diagonal. Use np.fill_diagonal(X, 0)." 
         )
         if X.dtype.kind == "f":
-            atol = np.finfo(X.dtype).eps * 100
+            atol = _atol_for_type(X.dtype)
+
             if np.any(np.abs(X.diagonal()) > atol):
                 raise error_msg
         elif np.any(X.diagonal() != 0):  # integral dtype
diff --git a/sklearn/preprocessing/_data.py b/sklearn/preprocessing/_data.py
index 9120384588ef2..4cbae0e1d3591 100644
--- a/sklearn/preprocessing/_data.py
+++ b/sklearn/preprocessing/_data.py
@@ -483,8 +483,8 @@ def partial_fit(self, X, y=None):
             force_all_finite="allow-nan",
         )
 
-        data_min = _array_api._nanmin(X, axis=0)
-        data_max = _array_api._nanmax(X, axis=0)
+        data_min = _array_api._nanmin(X, axis=0, xp=xp)
+        data_max = _array_api._nanmax(X, axis=0, xp=xp)
 
         if first_pass:
             self.n_samples_seen_ = X.shape[0]
@@ -1234,7 +1234,7 @@ def partial_fit(self, X, y=None):
             mins, maxs = min_max_axis(X, axis=0, ignore_nan=True)
             max_abs = np.maximum(np.abs(mins), np.abs(maxs))
         else:
-            max_abs = _array_api._nanmax(xp.abs(X), axis=0)
+            max_abs = _array_api._nanmax(xp.abs(X), axis=0, xp=xp)
 
         if first_pass:
             self.n_samples_seen_ = X.shape[0]
diff --git a/sklearn/utils/_array_api.py b/sklearn/utils/_array_api.py
index 0c386a843bffb..1131cb3560287 100644
--- a/sklearn/utils/_array_api.py
+++ b/sklearn/utils/_array_api.py
@@ -396,8 +396,9 @@ def get_namespace(*arrays):
     return namespace, is_array_api_compliant
 
 
-def _expit(X):
-    xp, _ = get_namespace(X)
+def _expit(X, xp=None):
+    if xp is None:
+        xp, _ = get_namespace(X)
 
     if _is_numpy_namespace(xp):
         return xp.asarray(special.expit(numpy.asarray(X)))
@@ -464,10 +465,11 @@ def _weighted_sum(sample_score, sample_weight, normalize=False, xp=None):
     return float(xp.sum(sample_score))
 
 
-def _nanmin(X, axis=None):
+def _nanmin(X, axis=None, xp=None):
     # TODO: refactor once nan-aware reductions are standardized:
     # https://github.com/data-apis/array-api/issues/621
-    xp, _ = get_namespace(X)
+    if xp is None:
+        xp, _ = get_namespace(X)
     if _is_numpy_namespace(xp):
         return xp.asarray(numpy.nanmin(X, axis=axis))
 
@@ -481,10 +483,11 @@
     return X
 
 
-def _nanmax(X, axis=None):
+def _nanmax(X, axis=None, xp=None):
     # TODO: refactor once nan-aware reductions are standardized:
     # https://github.com/data-apis/array-api/issues/621
-    xp, _ = get_namespace(X)
+    if xp is None:
+        xp, _ = get_namespace(X)
     if _is_numpy_namespace(xp):
         return xp.asarray(numpy.nanmax(X, axis=axis))
 
@@ -571,5 +574,5 @@ def _estimator_with_converted_arrays(estimator, converter):
 
 
 def _atol_for_type(dtype):
-    """Return the absolute tolerance for a given dtype."""
+    """Return the absolute tolerance for a given numpy dtype."""
     return numpy.finfo(dtype).eps * 100
diff --git a/sklearn/utils/estimator_checks.py b/sklearn/utils/estimator_checks.py
index 4d87357d6882d..b3135d30b362a 100644
--- a/sklearn/utils/estimator_checks.py
+++ b/sklearn/utils/estimator_checks.py
@@ -51,13 +51,12 @@
 from ..random_projection import BaseRandomProjection
 from ..tree import DecisionTreeClassifier, DecisionTreeRegressor
 from ..utils._array_api import (
+    _atol_for_type,
     _convert_to_numpy,
     get_namespace,
     yield_namespace_device_dtype_combinations,
 )
-from ..utils._array_api import (
-    device as array_device,
-)
+from ..utils._array_api import device as array_device
 from ..utils._param_validation import (
     InvalidParameterError,
     generate_invalid_param_val,
@@ -922,7 +921,7 @@ def check_array_api_input(
                 attribute,
                 est_xp_param_np,
                 err_msg=f"{key} not the same",
-                atol=np.finfo(X.dtype).eps * 100,
+                atol=_atol_for_type(X.dtype),
             )
         else:
             assert attribute.shape == est_xp_param_np.shape
@@ -952,7 +951,7 @@ def check_array_api_input(
             assert isinstance(result, float)
             assert isinstance(result_xp, float)
             if check_values:
-                assert abs(result - result_xp) < np.finfo(X.dtype).eps * 100
+                assert abs(result - result_xp) < _atol_for_type(X.dtype)
             continue
         else:
             result = method(X)
@@ -974,7 +973,7 @@ def check_array_api_input(
                 result,
                 result_xp_np,
                 err_msg=f"{method} did not return the same result",
-                atol=np.finfo(X.dtype).eps * 100,
+                atol=_atol_for_type(X.dtype),
             )
         else:
             if hasattr(result, "shape"):
@@ -999,7 +998,7 @@ def check_array_api_input(
                 inverse_result,
                 invese_result_xp_np,
                 err_msg="inverse_transform did not return the same result",
-                atol=np.finfo(X.dtype).eps * 100,
+                atol=_atol_for_type(X.dtype),
             )
         else:
             assert inverse_result.shape == invese_result_xp_np.shape
From 71deeb83d31222b74f0446c32524b94d9f1d58dc Mon Sep 17 00:00:00 2001
From: Olivier Grisel
Date: Tue, 9 Jan 2024 17:17:42 +0100
Subject: [PATCH 021/554] FIX unstable test_pca_mle_array_api_compliance with
 PyTorch / CPU / float32 on macOS (#28067)

---
 sklearn/decomposition/tests/test_pca.py | 62 +++++++++++++++++++++++--
 1 file changed, 58 insertions(+), 4 deletions(-)

diff --git a/sklearn/decomposition/tests/test_pca.py b/sklearn/decomposition/tests/test_pca.py
index 83f71381c0ba7..44281b9038697 100644
--- a/sklearn/decomposition/tests/test_pca.py
+++ b/sklearn/decomposition/tests/test_pca.py
@@ -8,7 +8,7 @@
 
 from sklearn import config_context, datasets
 from sklearn.base import clone
-from sklearn.datasets import load_iris
+from sklearn.datasets import load_iris, make_classification
 from sklearn.decomposition import PCA
 from sklearn.decomposition._pca import _assess_dimension, _infer_dimension
 from sklearn.utils._array_api import (
@@ -16,10 +16,10 @@
     _convert_to_numpy,
     yield_namespace_device_dtype_combinations,
 )
+from sklearn.utils._array_api import device as array_device
 from sklearn.utils._testing import _array_api_for_tests, assert_allclose
 from sklearn.utils.estimator_checks import (
     _get_check_estimator_ids,
-    check_array_api_input,
     check_array_api_input_and_values,
 )
 from sklearn.utils.fixes import CSC_CONTAINERS, CSR_CONTAINERS
@@ -882,14 +882,17 @@
 )
 @pytest.mark.parametrize(
     "check",
-    [check_array_api_input, check_array_api_get_precision],
+    [check_array_api_get_precision],
     ids=_get_check_estimator_ids,
 )
 @pytest.mark.parametrize(
     "estimator",
     [
         # PCA with mle cannot use check_array_api_input_and_values because of
-        # rounding errors in the noisy (low variance) components.
+        # rounding errors in the noisy (low variance) components. Even checking
+        # the shape of the `components_` is problematic because the number of
+        # components depends on the trimming threshold of the mle algorithm,
+        # which can depend on device-specific rounding errors.
         PCA(n_components="mle", svd_solver="full"),
     ],
     ids=_get_check_estimator_ids,
@@ -900,6 +903,57 @@ def test_pca_mle_array_api_compliance(
     name = estimator.__class__.__name__
     check(name, estimator, array_namespace, device=device, dtype_name=dtype_name)
 
+    # Simpler variant of the generic check_array_api_input checker tailored for
+    # the specific case of PCA with mle-trimmed components.
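+    # The check below fits the estimator once on NumPy inputs and once on
+    # array API inputs, compares the explained variance of the components
+    # common to both fits, and only requires that any extra (trimmed)
+    # components carry negligible variance, since their exact number can
+    # differ across devices.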
+    xp = _array_api_for_tests(array_namespace, device)
+
+    X, y = make_classification(random_state=42)
+    X = X.astype(dtype_name, copy=False)
+    atol = _atol_for_type(X.dtype)
+
+    est = clone(estimator)
+
+    X_xp = xp.asarray(X, device=device)
+    y_xp = xp.asarray(y, device=device)
+
+    est.fit(X, y)
+
+    components_np = est.components_
+    explained_variance_np = est.explained_variance_
+
+    est_xp = clone(est)
+    with config_context(array_api_dispatch=True):
+        est_xp.fit(X_xp, y_xp)
+        components_xp = est_xp.components_
+        assert array_device(components_xp) == array_device(X_xp)
+        components_xp_np = _convert_to_numpy(components_xp, xp=xp)
+
+        explained_variance_xp = est_xp.explained_variance_
+        assert array_device(explained_variance_xp) == array_device(X_xp)
+        explained_variance_xp_np = _convert_to_numpy(explained_variance_xp, xp=xp)
+
+    assert components_xp_np.dtype == components_np.dtype
+    assert components_xp_np.shape[1] == components_np.shape[1]
+    assert explained_variance_xp_np.dtype == explained_variance_np.dtype
+
+    # Check that the explained variance values match for the
+    # common components:
+    min_components = min(components_xp_np.shape[0], components_np.shape[0])
+    assert_allclose(
+        explained_variance_xp_np[:min_components],
+        explained_variance_np[:min_components],
+        atol=atol,
+    )
+
+    # If the number of components differs, check that the explained variance of
+    # the trimmed components is very small.
+    if components_xp_np.shape[0] != components_np.shape[0]:
+        reference_variance = explained_variance_np[-1]
+        extra_variance_np = explained_variance_np[min_components:]
+        extra_variance_xp_np = explained_variance_xp_np[min_components:]
+        assert all(np.abs(extra_variance_np - reference_variance) < atol)
+        assert all(np.abs(extra_variance_xp_np - reference_variance) < atol)
+
 
 def test_array_api_error_and_warnings_on_unsupported_params():
From 5ad8e458e4cb14c68609390bc0293d7b5458d74a Mon Sep 17 00:00:00 2001
From: Christian Lorentzen
Date: Tue, 9 Jan 2024 19:57:59 +0100
Subject: [PATCH 022/554] FIX more precise log loss gradient and hessian
 (#28048)

---
 doc/whats_new/v1.4.rst           |  11 +++
 sklearn/_loss/_loss.pyx.tp       |  62 +++++++++-----
 sklearn/_loss/tests/test_loss.py | 140 ++++++++++++++++++++++++++-----
 3 files changed, 172 insertions(+), 41 deletions(-)

diff --git a/doc/whats_new/v1.4.rst b/doc/whats_new/v1.4.rst
index d2de5ee433f94..a932391b732cd 100644
--- a/doc/whats_new/v1.4.rst
+++ b/doc/whats_new/v1.4.rst
@@ -220,6 +220,17 @@ See :ref:`array_api` for more details.
 - :class:`preprocessing.MinMaxScaler` in :pr:`26243` by `Tim Head`_;
 - :class:`preprocessing.Normalizer` in :pr:`27558` by :user:`Edoardo Abati `.
 
+Private Loss Function Module
+----------------------------
+
+- |FIX| The gradient computation of the binomial log loss is now numerically
+  more stable for raw predictions that are very large in absolute value. Before,
+  it could result in `np.nan`. Among the models that benefit from this change are
+  :class:`ensemble.GradientBoostingClassifier`,
+  :class:`ensemble.HistGradientBoostingClassifier` and
+  :class:`linear_model.LogisticRegression`.
+  :pr:`28048` by :user:`Christian Lorentzen `.
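+
+  A minimal sketch of the stable form (an illustration only; the helper name
+  is made up here, while the shipped implementation lives in
+  ``sklearn/_loss/_loss.pyx.tp``)::
+
+      import numpy as np
+
+      def grad_half_binomial(y_true, raw_prediction):
+          # gradient = expit(raw_prediction) - y_true, evaluated so that the
+          # exponential can neither overflow nor produce np.nan
+          if raw_prediction > -37:
+              exp_tmp = np.exp(-raw_prediction)
+              return ((1 - y_true) - y_true * exp_tmp) / (1 + exp_tmp)
+          # expit(raw_prediction) == exp(raw_prediction) to double precision here
+          return np.exp(raw_prediction) - y_true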
+
 
 Changelog
 ---------
 
diff --git a/sklearn/_loss/_loss.pyx.tp b/sklearn/_loss/_loss.pyx.tp
index 0ce653de84310..da974a3c3f4fd 100644
--- a/sklearn/_loss/_loss.pyx.tp
+++ b/sklearn/_loss/_loss.pyx.tp
@@ -695,9 +695,8 @@ cdef inline double cgradient_half_binomial(
     double y_true,
     double raw_prediction
 ) noexcept nogil:
-    # y_pred - y_true = expit(raw_prediction) - y_true
-    # Numerically more stable, see
-    # http://fa.bianp.net/blog/2019/evaluate_logistic/
+    # gradient = y_pred - y_true = expit(raw_prediction) - y_true
+    # Numerically more stable, see http://fa.bianp.net/blog/2019/evaluate_logistic/
     # if raw_prediction < 0:
     #     exp_tmp = exp(raw_prediction)
     #     return ((1 - y_true) * exp_tmp - y_true) / (1 + exp_tmp)
@@ -708,12 +707,22 @@
     #     return expit(raw_prediction) - y_true
     # i.e. no "if else" and an own inline implementation of expit instead of
     # from scipy.special.cython_special cimport expit
-    # The case distinction raw_prediction < 0 in the stable implementation
-    # does not provide significantly better precision. Therefore we go without
-    # it.
+    # The case distinction raw_prediction < 0 in the stable implementation does not
+    # provide significantly better precision, apart from avoiding overflow in exp(..).
+    # The branch (if else), however, can incur runtime costs of up to 30%.
+    # Instead, we help branch prediction by almost always ending in the first if clause
+    # and making the second branch (else) a bit simpler. This has the exact same
+    # precision but is faster than the stable implementation.
+    # As branching criterion, we use the same cutoff as in log1pexp. Note that the
+    # maximal value to get gradient = -1 with y_true = 1 is -37.439198610162731
+    # (based on mpmath), and scipy.special.logit(np.finfo(float).eps) ~ -36.04365.
     cdef double exp_tmp
-    exp_tmp = exp(-raw_prediction)
-    return ((1 - y_true) - y_true * exp_tmp) / (1 + exp_tmp)
+    if raw_prediction > -37:
+        exp_tmp = exp(-raw_prediction)
+        return ((1 - y_true) - y_true * exp_tmp) / (1 + exp_tmp)
+    else:
+        # expit(raw_prediction) = exp(raw_prediction) for raw_prediction <= -37
+        return exp(raw_prediction) - y_true
 
 
 cdef inline double_pair closs_grad_half_binomial(
@@ -721,21 +730,24 @@
     double raw_prediction
 ) noexcept nogil:
     cdef double_pair lg
-    if raw_prediction <= 0:
+    # Same if else conditions as in log1pexp.
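+    # For raw_prediction <= -37, exp(raw_prediction) is used directly because
+    # log1p(exp(x)) equals exp(x) to double precision there; the remaining
+    # thresholds pick exp(raw_prediction) or exp(-raw_prediction) as the
+    # temporary so that the argument of exp(..) stays bounded and cannot
+    # overflow.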
+ if raw_prediction <= -37: lg.val2 = exp(raw_prediction) # used as temporary - if raw_prediction <= -37: - lg.val1 = lg.val2 - y_true * raw_prediction # loss - else: - lg.val1 = log1p(lg.val2) - y_true * raw_prediction # loss + lg.val1 = lg.val2 - y_true * raw_prediction # loss + lg.val2 -= y_true # gradient + elif raw_prediction <= -2: + lg.val2 = exp(raw_prediction) # used as temporary + lg.val1 = log1p(lg.val2) - y_true * raw_prediction # loss lg.val2 = ((1 - y_true) * lg.val2 - y_true) / (1 + lg.val2) # gradient + elif raw_prediction <= 18: + lg.val2 = exp(-raw_prediction) # used as temporary + # log1p(exp(x)) = log(1 + exp(x)) = x + log1p(exp(-x)) + lg.val1 = log1p(lg.val2) + (1 - y_true) * raw_prediction # loss + lg.val2 = ((1 - y_true) - y_true * lg.val2) / (1 + lg.val2) # gradient else: lg.val2 = exp(-raw_prediction) # used as temporary - if raw_prediction <= 18: - # log1p(exp(x)) = log(1 + exp(x)) = x + log1p(exp(-x)) - lg.val1 = log1p(lg.val2) + (1 - y_true) * raw_prediction # loss - else: - lg.val1 = lg.val2 + (1 - y_true) * raw_prediction # loss - lg.val2 = ((1 - y_true) - y_true * lg.val2) / (1 + lg.val2) # gradient + lg.val1 = lg.val2 + (1 - y_true) * raw_prediction # loss + lg.val2 = ((1 - y_true) - y_true * lg.val2) / (1 + lg.val2) # gradient return lg @@ -747,9 +759,15 @@ cdef inline double_pair cgrad_hess_half_binomial( # hessian = y_pred * (1 - y_pred) = exp( raw) / (1 + exp( raw))**2 # = exp(-raw) / (1 + exp(-raw))**2 cdef double_pair gh - gh.val2 = exp(-raw_prediction) # used as temporary - gh.val1 = ((1 - y_true) - y_true * gh.val2) / (1 + gh.val2) # gradient - gh.val2 = gh.val2 / (1 + gh.val2)**2 # hessian + # See comment in cgradient_half_binomial. + if raw_prediction > -37: + gh.val2 = exp(-raw_prediction) # used as temporary + gh.val1 = ((1 - y_true) - y_true * gh.val2) / (1 + gh.val2) # gradient + gh.val2 = gh.val2 / (1 + gh.val2)**2 # hessian + else: + gh.val2 = exp(raw_prediction) + gh.val1 = gh.val2 - y_true + gh.val2 *= (1 - gh.val2) return gh diff --git a/sklearn/_loss/tests/test_loss.py b/sklearn/_loss/tests/test_loss.py index c018bb7147ce9..9c8bba4d717d1 100644 --- a/sklearn/_loss/tests/test_loss.py +++ b/sklearn/_loss/tests/test_loss.py @@ -224,48 +224,150 @@ def test_loss_boundary_y_pred(loss, y_pred_success, y_pred_fail): @pytest.mark.parametrize( - "loss, y_true, raw_prediction, loss_true", + "loss, y_true, raw_prediction, loss_true, gradient_true, hessian_true", [ - (HalfSquaredError(), 1.0, 5.0, 8), - (AbsoluteError(), 1.0, 5.0, 4), - (PinballLoss(quantile=0.5), 1.0, 5.0, 2), - (PinballLoss(quantile=0.25), 1.0, 5.0, 4 * (1 - 0.25)), - (PinballLoss(quantile=0.25), 5.0, 1.0, 4 * 0.25), - (HuberLoss(quantile=0.5, delta=3), 1.0, 5.0, 3 * (4 - 3 / 2)), - (HuberLoss(quantile=0.5, delta=3), 1.0, 3.0, 0.5 * 2**2), - (HalfPoissonLoss(), 2.0, np.log(4), 4 - 2 * np.log(4)), - (HalfGammaLoss(), 2.0, np.log(4), np.log(4) + 2 / 4), - (HalfTweedieLoss(power=3), 2.0, np.log(4), -1 / 4 + 1 / 4**2), - (HalfTweedieLossIdentity(power=1), 2.0, 4.0, 2 - 2 * np.log(2)), - (HalfTweedieLossIdentity(power=2), 2.0, 4.0, np.log(2) - 1 / 2), - (HalfTweedieLossIdentity(power=3), 2.0, 4.0, -1 / 4 + 1 / 4**2 + 1 / 2 / 2), - (HalfBinomialLoss(), 0.25, np.log(4), np.log(5) - 0.25 * np.log(4)), + (HalfSquaredError(), 1.0, 5.0, 8, 4, 1), + (AbsoluteError(), 1.0, 5.0, 4.0, 1.0, None), + (PinballLoss(quantile=0.5), 1.0, 5.0, 2, 0.5, None), + (PinballLoss(quantile=0.25), 1.0, 5.0, 4 * (1 - 0.25), 1 - 0.25, None), + (PinballLoss(quantile=0.25), 5.0, 1.0, 4 * 0.25, -0.25, None), + 
(HuberLoss(quantile=0.5, delta=3), 1.0, 5.0, 3 * (4 - 3 / 2), None, None), + (HuberLoss(quantile=0.5, delta=3), 1.0, 3.0, 0.5 * 2**2, None, None), + (HalfPoissonLoss(), 2.0, np.log(4), 4 - 2 * np.log(4), 4 - 2, 4), + (HalfGammaLoss(), 2.0, np.log(4), np.log(4) + 2 / 4, 1 - 2 / 4, 2 / 4), + (HalfTweedieLoss(power=3), 2.0, np.log(4), -1 / 4 + 1 / 4**2, None, None), + (HalfTweedieLossIdentity(power=1), 2.0, 4.0, 2 - 2 * np.log(2), None, None), + (HalfTweedieLossIdentity(power=2), 2.0, 4.0, np.log(2) - 1 / 2, None, None), + ( + HalfTweedieLossIdentity(power=3), + 2.0, + 4.0, + -1 / 4 + 1 / 4**2 + 1 / 2 / 2, + None, + None, + ), + ( + HalfBinomialLoss(), + 0.25, + np.log(4), + np.log1p(4) - 0.25 * np.log(4), + None, + None, + ), + # Extreme log loss cases, checked with mpmath: + # import mpmath as mp + # + # # Stolen from scipy + # def mpf2float(x): + # return float(mp.nstr(x, 17, min_fixed=0, max_fixed=0)) + # + # def mp_logloss(y_true, raw): + # with mp.workdps(100): + # y_true, raw = mp.mpf(float(y_true)), mp.mpf(float(raw)) + # out = mp.log1p(mp.exp(raw)) - y_true * raw + # return mpf2float(out) + # + # def mp_gradient(y_true, raw): + # with mp.workdps(100): + # y_true, raw = mp.mpf(float(y_true)), mp.mpf(float(raw)) + # out = mp.mpf(1) / (mp.mpf(1) + mp.exp(-raw)) - y_true + # return mpf2float(out) + # + # def mp_hessian(y_true, raw): + # with mp.workdps(100): + # y_true, raw = mp.mpf(float(y_true)), mp.mpf(float(raw)) + # p = mp.mpf(1) / (mp.mpf(1) + mp.exp(-raw)) + # out = p * (mp.mpf(1) - p) + # return mpf2float(out) + # + # y, raw = 0.0, 37. + # mp_logloss(y, raw), mp_gradient(y, raw), mp_hessian(y, raw) + (HalfBinomialLoss(), 0.0, -1e20, 0, 0, 0), + (HalfBinomialLoss(), 1.0, -1e20, 1e20, -1, 0), + (HalfBinomialLoss(), 0.0, -1e3, 0, 0, 0), + (HalfBinomialLoss(), 1.0, -1e3, 1e3, -1, 0), + (HalfBinomialLoss(), 1.0, -37.5, 37.5, -1, 0), + (HalfBinomialLoss(), 1.0, -37.0, 37, 1e-16 - 1, 8.533047625744065e-17), + (HalfBinomialLoss(), 0.0, -37.0, *[8.533047625744065e-17] * 3), + (HalfBinomialLoss(), 1.0, -36.9, 36.9, 1e-16 - 1, 9.430476078526806e-17), + (HalfBinomialLoss(), 0.0, -36.9, *[9.430476078526806e-17] * 3), + (HalfBinomialLoss(), 0.0, 37.0, 37, 1 - 1e-16, 8.533047625744065e-17), + (HalfBinomialLoss(), 1.0, 37.0, *[8.533047625744066e-17] * 3), + (HalfBinomialLoss(), 0.0, 37.5, 37.5, 1, 5.175555005801868e-17), + (HalfBinomialLoss(), 0.0, 232.8, 232.8, 1, 1.4287342391028437e-101), + (HalfBinomialLoss(), 1.0, 1e20, 0, 0, 0), + (HalfBinomialLoss(), 0.0, 1e20, 1e20, 1, 0), + ( + HalfBinomialLoss(), + 1.0, + 232.8, + 0, + -1.4287342391028437e-101, + 1.4287342391028437e-101, + ), + (HalfBinomialLoss(), 1.0, 232.9, 0, 0, 0), + (HalfBinomialLoss(), 1.0, 1e3, 0, 0, 0), + (HalfBinomialLoss(), 0.0, 1e3, 1e3, 1, 0), ( HalfMultinomialLoss(n_classes=3), 0.0, [0.2, 0.5, 0.3], logsumexp([0.2, 0.5, 0.3]) - 0.2, + None, + None, ), ( HalfMultinomialLoss(n_classes=3), 1.0, [0.2, 0.5, 0.3], logsumexp([0.2, 0.5, 0.3]) - 0.5, + None, + None, ), ( HalfMultinomialLoss(n_classes=3), 2.0, [0.2, 0.5, 0.3], logsumexp([0.2, 0.5, 0.3]) - 0.3, + None, + None, + ), + ( + HalfMultinomialLoss(n_classes=3), + 2.0, + [1e4, 0, 7e-7], + logsumexp([1e4, 0, 7e-7]) - (7e-7), + None, + None, ), ], ids=loss_instance_name, ) -def test_loss_on_specific_values(loss, y_true, raw_prediction, loss_true): - """Test losses at specific values.""" - assert loss( +def test_loss_on_specific_values( + loss, y_true, raw_prediction, loss_true, gradient_true, hessian_true +): + """Test losses, gradients and hessians at specific values.""" + 
loss1 = loss(y_true=np.array([y_true]), raw_prediction=np.array([raw_prediction])) + grad1 = loss.gradient( + y_true=np.array([y_true]), raw_prediction=np.array([raw_prediction]) + ) + loss2, grad2 = loss.loss_gradient( + y_true=np.array([y_true]), raw_prediction=np.array([raw_prediction]) + ) + grad3, hess = loss.gradient_hessian( y_true=np.array([y_true]), raw_prediction=np.array([raw_prediction]) - ) == approx(loss_true, rel=1e-11, abs=1e-12) + ) + + assert loss1 == approx(loss_true, rel=1e-15, abs=1e-15) + assert loss2 == approx(loss_true, rel=1e-15, abs=1e-15) + + if gradient_true is not None: + assert grad1 == approx(gradient_true, rel=1e-15, abs=1e-15) + assert grad2 == approx(gradient_true, rel=1e-15, abs=1e-15) + assert grad3 == approx(gradient_true, rel=1e-15, abs=1e-15) + + if hessian_true is not None: + assert hess == approx(hessian_true, rel=1e-15, abs=1e-15) @pytest.mark.parametrize("loss", ALL_LOSSES) From 60f3882c3be3b060c69580fd0ae26080d4d06c48 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Lo=C3=AFc=20Est=C3=A8ve?= Date: Tue, 9 Jan 2024 21:05:11 +0100 Subject: [PATCH 023/554] MNT Remove some pins from lock files (#28082) --- ...latest_conda_forge_mkl_linux-64_conda.lock | 6 ++---- ...t_conda_forge_mkl_linux-64_environment.yml | 2 +- ...pylatest_conda_forge_mkl_osx-64_conda.lock | 6 ++---- ...est_conda_forge_mkl_osx-64_environment.yml | 2 +- ...latest_conda_mkl_no_openmp_environment.yml | 2 +- ...test_conda_mkl_no_openmp_osx-64_conda.lock | 6 ++---- ...latest_pip_openblas_pandas_environment.yml | 2 +- ...st_pip_openblas_pandas_linux-64_conda.lock | 6 ++---- .../pylatest_pip_scipy_dev_environment.yml | 2 +- ...pylatest_pip_scipy_dev_linux-64_conda.lock | 6 ++---- ...in_conda_defaults_openblas_environment.yml | 2 +- ...onda_defaults_openblas_linux-64_conda.lock | 6 ++---- .../pymin_conda_forge_mkl_environment.yml | 2 +- .../pymin_conda_forge_mkl_win-64_conda.lock | 6 ++---- ...forge_openblas_ubuntu_2204_environment.yml | 2 +- ...e_openblas_ubuntu_2204_linux-64_conda.lock | 6 ++---- build_tools/azure/pypy3_environment.yml | 2 +- build_tools/azure/pypy3_linux-64_conda.lock | 6 ++---- build_tools/azure/test_script.sh | 12 +++++------- build_tools/azure/ubuntu_atlas_lock.txt | 7 +------ .../azure/ubuntu_atlas_requirements.txt | 2 +- build_tools/circle/doc_environment.yml | 3 +-- build_tools/circle/doc_linux-64_conda.lock | 8 +++----- .../doc_min_dependencies_environment.yml | 2 +- .../doc_min_dependencies_linux-64_conda.lock | 6 ++---- .../cirrus/pymin_conda_forge_environment.yml | 2 +- ...pymin_conda_forge_linux-aarch64_conda.lock | 6 ++---- .../update_environments_and_lock_files.py | 19 ++++++------------- 28 files changed, 50 insertions(+), 89 deletions(-) diff --git a/build_tools/azure/pylatest_conda_forge_mkl_linux-64_conda.lock b/build_tools/azure/pylatest_conda_forge_mkl_linux-64_conda.lock index 422dc6f1f9626..4171e34d5b5d1 100644 --- a/build_tools/azure/pylatest_conda_forge_mkl_linux-64_conda.lock +++ b/build_tools/azure/pylatest_conda_forge_mkl_linux-64_conda.lock @@ -1,6 +1,6 @@ # Generated by conda-lock. 
# platform: linux-64 -# input_hash: 7aa55d66dfbd0f6267a9aff8c750d1e9f42cd339726c8f9c4d1299341b064849 +# input_hash: 0e751f4212c4e51710aad471314a8b385a5e12fe3536c2a766f949da61eabb88 @EXPLICIT https://conda.anaconda.org/conda-forge/linux-64/_libgcc_mutex-0.1-conda_forge.tar.bz2#d7c89558ba9fa0495403155b64376d81 https://conda.anaconda.org/conda-forge/linux-64/ca-certificates-2023.11.17-hbcca054_0.conda#01ffc8d36f9eba0ce0b3c1955fa780ee @@ -148,7 +148,6 @@ https://conda.anaconda.org/conda-forge/linux-64/openjpeg-2.5.0-h488ebb8_3.conda# https://conda.anaconda.org/conda-forge/noarch/packaging-23.2-pyhd8ed1ab_0.conda#79002079284aa895f883c6b7f3f88fd6 https://conda.anaconda.org/conda-forge/noarch/pluggy-1.3.0-pyhd8ed1ab_0.conda#2390bd10bed1f3fdc7a537fb5a447d8d https://conda.anaconda.org/conda-forge/noarch/ply-3.11-py_1.tar.bz2#7205635cd71531943440fbfe3b6b5727 -https://conda.anaconda.org/conda-forge/noarch/py-1.11.0-pyh6c4a22f_0.tar.bz2#b4613d7e7a493916d867842a6a148054 https://conda.anaconda.org/conda-forge/noarch/pyparsing-3.1.1-pyhd8ed1ab_0.conda#176f7d56f0cfe9008bdf1bccd7de02fb https://conda.anaconda.org/conda-forge/noarch/python-tzdata-2023.4-pyhd8ed1ab_0.conda#c79cacf8a06a51552fc651652f170208 https://conda.anaconda.org/conda-forge/noarch/pytz-2023.3.post1-pyhd8ed1ab_0.conda#c93346b446cd08c169d843ae5fc0da97 @@ -187,12 +186,11 @@ https://conda.anaconda.org/conda-forge/linux-64/harfbuzz-8.3.0-h3d44ed6_0.conda# https://conda.anaconda.org/conda-forge/linux-64/libblas-3.9.0-16_linux64_mkl.tar.bz2#85f61af03fd291dae33150ffe89dc09a https://conda.anaconda.org/conda-forge/linux-64/pyqt5-sip-12.12.2-py311hb755f60_5.conda#e4d262cc3600e70b505a6761d29f6207 https://conda.anaconda.org/conda-forge/noarch/pytest-cov-4.1.0-pyhd8ed1ab_0.conda#06eb685a3a0b146347a58dda979485da -https://conda.anaconda.org/conda-forge/noarch/pytest-forked-1.6.0-pyhd8ed1ab_0.conda#a46947638b6e005b63d2d6271da529b0 +https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-3.5.0-pyhd8ed1ab_0.conda#d5f595da2daead898ca958ac62f0307b https://conda.anaconda.org/conda-forge/linux-64/aws-crt-cpp-0.21.0-hb942446_5.conda#07d92ed5403ad7b5c66ffd7d5b8f7e57 https://conda.anaconda.org/conda-forge/linux-64/gst-plugins-base-1.22.8-h8e1006c_1.conda#3926dab94fe06d88ade0e716d77b8cf8 https://conda.anaconda.org/conda-forge/linux-64/libcblas-3.9.0-16_linux64_mkl.tar.bz2#361bf757b95488de76c4f123805742d3 https://conda.anaconda.org/conda-forge/linux-64/liblapack-3.9.0-16_linux64_mkl.tar.bz2#a2f166748917d6d6e4707841ca1f519e -https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-2.5.0-pyhd8ed1ab_0.tar.bz2#1fdd1f3baccf0deb647385c677a1a48e https://conda.anaconda.org/conda-forge/linux-64/aws-sdk-cpp-1.10.57-h85b1a90_19.conda#0605d3d60857fc07bd6a11e878fe0f08 https://conda.anaconda.org/conda-forge/linux-64/numpy-1.26.3-py311h64a7726_0.conda#231eef4f33640338f64ef9ab690ba08d https://conda.anaconda.org/conda-forge/linux-64/qt-main-5.15.8-h82b777d_17.conda#4f01e33dbb406085a16a2813ab067e95 diff --git a/build_tools/azure/pylatest_conda_forge_mkl_linux-64_environment.yml b/build_tools/azure/pylatest_conda_forge_mkl_linux-64_environment.yml index 07ec7bb7ff206..107ad5b3d6f8b 100644 --- a/build_tools/azure/pylatest_conda_forge_mkl_linux-64_environment.yml +++ b/build_tools/azure/pylatest_conda_forge_mkl_linux-64_environment.yml @@ -15,7 +15,7 @@ dependencies: - pandas - pyamg - pytest - - pytest-xdist=2.5.0 + - pytest-xdist - pillow - setuptools - pytest-cov diff --git a/build_tools/azure/pylatest_conda_forge_mkl_osx-64_conda.lock 
b/build_tools/azure/pylatest_conda_forge_mkl_osx-64_conda.lock index d412beaf30789..e7fda548f5985 100644 --- a/build_tools/azure/pylatest_conda_forge_mkl_osx-64_conda.lock +++ b/build_tools/azure/pylatest_conda_forge_mkl_osx-64_conda.lock @@ -1,6 +1,6 @@ # Generated by conda-lock. # platform: osx-64 -# input_hash: 02abef27514db5e5119c3cdc253e84a06374c1b308495298b46bdb14dcc52ae9 +# input_hash: 8d19b3cb048dd1e254e00f21d81841feddd52c98a15661153cb472e9903b5cb3 @EXPLICIT https://conda.anaconda.org/conda-forge/osx-64/bzip2-1.0.8-h10d778d_5.conda#6097a6ca9ada32699b5fc4312dd6ef18 https://conda.anaconda.org/conda-forge/osx-64/ca-certificates-2023.11.17-h8857fd0_0.conda#c687e9d14c49e3d3946d50a413cdbf16 @@ -81,7 +81,6 @@ https://conda.anaconda.org/conda-forge/osx-64/numpy-1.26.3-py312he3a82b2_0.conda https://conda.anaconda.org/conda-forge/noarch/packaging-23.2-pyhd8ed1ab_0.conda#79002079284aa895f883c6b7f3f88fd6 https://conda.anaconda.org/conda-forge/osx-64/pillow-10.2.0-py312h0c70c2f_0.conda#0cc3674239ad12c6836cb4174f106c92 https://conda.anaconda.org/conda-forge/noarch/pluggy-1.3.0-pyhd8ed1ab_0.conda#2390bd10bed1f3fdc7a537fb5a447d8d -https://conda.anaconda.org/conda-forge/noarch/py-1.11.0-pyh6c4a22f_0.tar.bz2#b4613d7e7a493916d867842a6a148054 https://conda.anaconda.org/conda-forge/noarch/pyparsing-3.1.1-pyhd8ed1ab_0.conda#176f7d56f0cfe9008bdf1bccd7de02fb https://conda.anaconda.org/conda-forge/noarch/python-tzdata-2023.4-pyhd8ed1ab_0.conda#c79cacf8a06a51552fc651652f170208 https://conda.anaconda.org/conda-forge/noarch/pytz-2023.3.post1-pyhd8ed1ab_0.conda#c93346b446cd08c169d843ae5fc0da97 @@ -107,10 +106,9 @@ https://conda.anaconda.org/conda-forge/osx-64/matplotlib-base-3.8.2-py312h302682 https://conda.anaconda.org/conda-forge/osx-64/pandas-2.1.4-py312haf8ecfc_0.conda#cb889a75192ef98a17c3f431f6518dd2 https://conda.anaconda.org/conda-forge/osx-64/pyamg-5.0.1-py312h674694f_1.conda#e5b9c0f8b5c367467425ff34353ef761 https://conda.anaconda.org/conda-forge/noarch/pytest-cov-4.1.0-pyhd8ed1ab_0.conda#06eb685a3a0b146347a58dda979485da -https://conda.anaconda.org/conda-forge/noarch/pytest-forked-1.6.0-pyhd8ed1ab_0.conda#a46947638b6e005b63d2d6271da529b0 +https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-3.5.0-pyhd8ed1ab_0.conda#d5f595da2daead898ca958ac62f0307b https://conda.anaconda.org/conda-forge/noarch/compiler-rt_osx-64-16.0.6-ha38d28d_2.conda#7a46507edc35c6c8818db0adaf8d787f https://conda.anaconda.org/conda-forge/osx-64/matplotlib-3.8.2-py312hb401068_0.conda#926f479dcab7d6d26bba7fe39f67e3b2 -https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-2.5.0-pyhd8ed1ab_0.tar.bz2#1fdd1f3baccf0deb647385c677a1a48e https://conda.anaconda.org/conda-forge/osx-64/compiler-rt-16.0.6-ha38d28d_2.conda#3b9e8c5c63b8e86234f499490acd85c2 https://conda.anaconda.org/conda-forge/osx-64/clang_impl_osx-64-16.0.6-h8787910_8.conda#2e694b8880599d19aec8e489eb01580f https://conda.anaconda.org/conda-forge/osx-64/clang_osx-64-16.0.6-hb91bd55_8.conda#831779e455d39ed7e8911be6e7d02814 diff --git a/build_tools/azure/pylatest_conda_forge_mkl_osx-64_environment.yml b/build_tools/azure/pylatest_conda_forge_mkl_osx-64_environment.yml index 4ddb80c7cae3d..8535baec11c4d 100644 --- a/build_tools/azure/pylatest_conda_forge_mkl_osx-64_environment.yml +++ b/build_tools/azure/pylatest_conda_forge_mkl_osx-64_environment.yml @@ -15,7 +15,7 @@ dependencies: - pandas - pyamg - pytest - - pytest-xdist=2.5.0 + - pytest-xdist - pillow - setuptools - pytest-cov diff --git a/build_tools/azure/pylatest_conda_mkl_no_openmp_environment.yml 
b/build_tools/azure/pylatest_conda_mkl_no_openmp_environment.yml index 64a33fe7d7522..6bc77eef6ed64 100644 --- a/build_tools/azure/pylatest_conda_mkl_no_openmp_environment.yml +++ b/build_tools/azure/pylatest_conda_mkl_no_openmp_environment.yml @@ -15,7 +15,7 @@ dependencies: - pandas - pyamg - pytest - - pytest-xdist=2.5.0 + - pytest-xdist - pillow - setuptools - pytest-cov diff --git a/build_tools/azure/pylatest_conda_mkl_no_openmp_osx-64_conda.lock b/build_tools/azure/pylatest_conda_mkl_no_openmp_osx-64_conda.lock index 63ccdf725e7dc..9bdd868dbf1f9 100644 --- a/build_tools/azure/pylatest_conda_mkl_no_openmp_osx-64_conda.lock +++ b/build_tools/azure/pylatest_conda_mkl_no_openmp_osx-64_conda.lock @@ -1,6 +1,6 @@ # Generated by conda-lock. # platform: osx-64 -# input_hash: 03f7604aefb9752d2367c457bdf4e4923158be96db35ac0dd1d5dc60a9981cd1 +# input_hash: 9eaf961c53a9a025d43e8f2e3c17586b0ff793daddfbde53625c4b098de328ff @EXPLICIT https://repo.anaconda.com/pkgs/main/osx-64/blas-1.0-mkl.conda#cb2c87e85ac8e0ceae776d26d4214c8a https://repo.anaconda.com/pkgs/main/osx-64/bzip2-1.0.8-h1de35cc_0.conda#19fcb113b170fe2a0be96b47801fed7d @@ -52,7 +52,6 @@ https://repo.anaconda.com/pkgs/main/noarch/munkres-1.1.4-py_0.conda#148362ba07f9 https://repo.anaconda.com/pkgs/main/osx-64/openjpeg-2.4.0-h66ea3da_0.conda#882833bd7befc5e60e6fba9c518c1b79 https://repo.anaconda.com/pkgs/main/osx-64/packaging-23.1-py311hecd8cb5_0.conda#4f5c491cd2de9d61f61c0ea3340ab46a https://repo.anaconda.com/pkgs/main/osx-64/pluggy-1.0.0-py311hecd8cb5_1.conda#98e4da64cd934965a0caf4136280ff35 -https://repo.anaconda.com/pkgs/main/noarch/py-1.11.0-pyhd3eb1b0_0.conda#7205a898ed2abbf6e9b903dff6abe08e https://repo.anaconda.com/pkgs/main/osx-64/pyparsing-3.0.9-py311hecd8cb5_0.conda#a4262f849ecc82af69f58da0cbcaaf04 https://repo.anaconda.com/pkgs/main/noarch/python-tzdata-2023.3-pyhd3eb1b0_0.conda#479c037de0186d114b9911158427624e https://repo.anaconda.com/pkgs/main/osx-64/pytz-2023.3.post1-py311hecd8cb5_0.conda#32d107281d133e3935dfb6935153e438 @@ -67,8 +66,7 @@ https://repo.anaconda.com/pkgs/main/osx-64/pillow-10.0.1-py311h7d39338_0.conda#0 https://repo.anaconda.com/pkgs/main/osx-64/pytest-7.4.0-py311hecd8cb5_0.conda#8c5496a4a1f36160ac5556495faa4a24 https://repo.anaconda.com/pkgs/main/noarch/python-dateutil-2.8.2-pyhd3eb1b0_0.conda#211ee00320b08a1ac9fea6677649f6c9 https://repo.anaconda.com/pkgs/main/osx-64/pytest-cov-4.1.0-py311hecd8cb5_1.conda#b1e41a8eda3f119b39b13f3a4d0c5bf5 -https://repo.anaconda.com/pkgs/main/osx-64/pytest-forked-1.6.0-py311hecd8cb5_0.conda#b1154a9887bee381b3405ec37f8b13f3 -https://repo.anaconda.com/pkgs/main/noarch/pytest-xdist-2.5.0-pyhd3eb1b0_0.conda#d15cdc4207bcf8ca920822597f1d138d +https://repo.anaconda.com/pkgs/main/osx-64/pytest-xdist-3.5.0-py311hecd8cb5_0.conda#e892e4359ea4f0987e8268f7e7869680 https://repo.anaconda.com/pkgs/main/osx-64/bottleneck-1.3.5-py311hb9e55a9_0.conda#5aa1b58b421d4608b16184f8468253ef https://repo.anaconda.com/pkgs/main/osx-64/contourpy-1.2.0-py311ha357a0b_0.conda#c9189b40e5b4be360aef22be336a4838 https://repo.anaconda.com/pkgs/main/osx-64/matplotlib-3.8.0-py311hecd8cb5_0.conda#f720f09a9d1bb976aa92a13180cf7133 diff --git a/build_tools/azure/pylatest_pip_openblas_pandas_environment.yml b/build_tools/azure/pylatest_pip_openblas_pandas_environment.yml index ddbc75c1d9110..6167ca6e63748 100644 --- a/build_tools/azure/pylatest_pip_openblas_pandas_environment.yml +++ b/build_tools/azure/pylatest_pip_openblas_pandas_environment.yml @@ -17,7 +17,7 @@ dependencies: - pandas - pyamg - pytest - - 
pytest-xdist==2.5.0 + - pytest-xdist - pillow - setuptools - pytest-cov diff --git a/build_tools/azure/pylatest_pip_openblas_pandas_linux-64_conda.lock b/build_tools/azure/pylatest_pip_openblas_pandas_linux-64_conda.lock index 4d5e662a2d0f5..593c5571ece8b 100644 --- a/build_tools/azure/pylatest_pip_openblas_pandas_linux-64_conda.lock +++ b/build_tools/azure/pylatest_pip_openblas_pandas_linux-64_conda.lock @@ -1,6 +1,6 @@ # Generated by conda-lock. # platform: linux-64 -# input_hash: d01d23bd27bcd50d2b3643492f966c8e390822d72b69f31bf66c2fe98a265a4c +# input_hash: 11d8952d04302b85207df163f6a5b20d8680e2eb067f9fb492d381a2b74c3a8f @EXPLICIT https://repo.anaconda.com/pkgs/main/linux-64/_libgcc_mutex-0.1-main.conda#c3473ff8bdb3d124ed5ff11ec380d6f9 https://repo.anaconda.com/pkgs/main/linux-64/ca-certificates-2023.12.12-h06a4308_0.conda#12bf7315c3f5ca50300e8b48d1b4ef2e @@ -45,7 +45,6 @@ https://repo.anaconda.com/pkgs/main/linux-64/pip-23.3.1-py39h06a4308_0.conda#685 # pip packaging @ https://files.pythonhosted.org/packages/ec/1a/610693ac4ee14fcdf2d9bf3c493370e4f2ef7ae2e19217d7a237ff42367d/packaging-23.2-py3-none-any.whl#sha256=8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7 # pip pillow @ https://files.pythonhosted.org/packages/87/0d/8f5136a5481731c342a901ff155c587ce7804114db069345e1894ab4978a/pillow-10.2.0-cp39-cp39-manylinux_2_28_x86_64.whl#sha256=b6f491cdf80ae540738859d9766783e3b3c8e5bd37f5dfa0b76abdecc5081f13 # pip pluggy @ https://files.pythonhosted.org/packages/05/b8/42ed91898d4784546c5f06c60506400548db3f7a4b3fb441cba4e5c17952/pluggy-1.3.0-py3-none-any.whl#sha256=d89c696a773f8bd377d18e5ecda92b7a3793cbe66c87060a6fb58c7b6e1061f7 -# pip py @ https://files.pythonhosted.org/packages/f6/f0/10642828a8dfb741e5f3fbaac830550a518a775c7fff6f04a007259b0548/py-1.11.0-py2.py3-none-any.whl#sha256=607c53218732647dff4acdfcd50cb62615cedf612e72d1724fb1a0cc6405b378 # pip pygments @ https://files.pythonhosted.org/packages/97/9c/372fef8377a6e340b1704768d20daaded98bf13282b5327beb2e2fe2c7ef/pygments-2.17.2-py3-none-any.whl#sha256=b27c2826c47d0f3219f29554824c30c5e8945175d888647acd804ddd04af846c # pip pyparsing @ https://files.pythonhosted.org/packages/39/92/8486ede85fcc088f1b3dba4ce92dd29d126fd96b0008ea213167940a2475/pyparsing-3.1.1-py3-none-any.whl#sha256=32c7c0b711493c72ff18a981d24f28aaf9c1fb7ed5e9667c9e84e3db623bdbfb # pip pytz @ https://files.pythonhosted.org/packages/32/4d/aaf7eff5deb402fd9a24a1449a8119f00d74ae9c2efa79f8ef9994261fc2/pytz-2023.3.post1-py2.py3-none-any.whl#sha256=ce42d816b81b68506614c11e8937d3aa9e41007ceb50bfdcb0749b921bf646c7 @@ -74,9 +73,8 @@ https://repo.anaconda.com/pkgs/main/linux-64/pip-23.3.1-py39h06a4308_0.conda#685 # pip pandas @ https://files.pythonhosted.org/packages/bc/f8/2aa75ae200bdb9dc6967712f26628a06bf45d3ad94cbbf6fb4962ada15a3/pandas-2.1.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=1ebfd771110b50055712b3b711b51bee5d50135429364d0498e1213a7adc2be8 # pip pyamg @ https://files.pythonhosted.org/packages/35/1c/8b2aa6fbb2bae258ab6cdb35b09635bf50865ac2bcdaf220db3d972cc0d8/pyamg-5.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=1332acec6d5ede9440c8ced0ef20952f5b766387116f254b79880ce29fdecee7 # pip pytest-cov @ https://files.pythonhosted.org/packages/a7/4b/8b78d126e275efa2379b1c2e09dc52cf70df16fc3b90613ef82531499d73/pytest_cov-4.1.0-py3-none-any.whl#sha256=6ba70b9e97e69fcc3fb45bfeab2d0a138fb65c4d0d6a41ef33983ad114be8c3a -# pip pytest-forked @ 
https://files.pythonhosted.org/packages/f4/af/9c0bda43e486a3c9bf1e0f876d0f241bc3f229d7d65d09331a0868db9629/pytest_forked-1.6.0-py3-none-any.whl#sha256=810958f66a91afb1a1e2ae83089d8dc1cd2437ac96b12963042fbb9fb4d16af0 +# pip pytest-xdist @ https://files.pythonhosted.org/packages/50/37/125fe5ec459321e2d48a0c38672cfc2419ad87d580196fd894e5f25230b0/pytest_xdist-3.5.0-py3-none-any.whl#sha256=d075629c7e00b611df89f490a5063944bee7a4362a5ff11c7cc7824a03dfce24 # pip scikit-image @ https://files.pythonhosted.org/packages/a3/7e/4cd853a855ac34b4ef3ef6a5c3d1c2e96eaca1154fc6be75db55ffa87393/scikit_image-0.22.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=3b7a6c89e8d6252332121b58f50e1625c35f7d6a85489c0b6b7ee4f5155d547a -# pip pytest-xdist @ https://files.pythonhosted.org/packages/21/08/b1945d4b4986eb1aa10cf84efc5293bba39da80a2f95db3573dd90678408/pytest_xdist-2.5.0-py3-none-any.whl#sha256=6fe5c74fec98906deb8f2d2b616b5c782022744978e7bd4695d39c8f42d0ce65 # pip numpydoc @ https://files.pythonhosted.org/packages/9c/94/09c437fd4a5fb5adf0468c0865c781dbc11d399544b55f1163d5d4414afb/numpydoc-1.6.0-py3-none-any.whl#sha256=b6ddaa654a52bdf967763c1e773be41f1c3ae3da39ee0de973f2680048acafaa # pip sphinxcontrib-applehelp @ https://files.pythonhosted.org/packages/c0/0c/261c0949083c0ac635853528bb0070c89e927841d4e533ba0b5563365c06/sphinxcontrib_applehelp-1.0.7-py3-none-any.whl#sha256=094c4d56209d1734e7d252f6e0b3ccc090bd52ee56807a5d9315b19c122ab15d # pip sphinxcontrib-devhelp @ https://files.pythonhosted.org/packages/c0/03/010ac733ec7b7f71c1dc88e7115743ee466560d6d85373b56fb9916e4586/sphinxcontrib_devhelp-1.0.5-py3-none-any.whl#sha256=fe8009aed765188f08fcaadbb3ea0d90ce8ae2d76710b7e29ea7d047177dae2f diff --git a/build_tools/azure/pylatest_pip_scipy_dev_environment.yml b/build_tools/azure/pylatest_pip_scipy_dev_environment.yml index 2d3de7b1e1ed4..63987809e6ddd 100644 --- a/build_tools/azure/pylatest_pip_scipy_dev_environment.yml +++ b/build_tools/azure/pylatest_pip_scipy_dev_environment.yml @@ -10,7 +10,7 @@ dependencies: - pip: - threadpoolctl - pytest - - pytest-xdist==2.5.0 + - pytest-xdist - setuptools - pytest-cov - coverage diff --git a/build_tools/azure/pylatest_pip_scipy_dev_linux-64_conda.lock b/build_tools/azure/pylatest_pip_scipy_dev_linux-64_conda.lock index dd8c6560f66c7..a3c3af5613906 100644 --- a/build_tools/azure/pylatest_pip_scipy_dev_linux-64_conda.lock +++ b/build_tools/azure/pylatest_pip_scipy_dev_linux-64_conda.lock @@ -1,6 +1,6 @@ # Generated by conda-lock. 
# platform: linux-64 -# input_hash: 28ec764eefc982520846833c9ea571cf6ea5a0593dee76d7a7560b34e341e35b +# input_hash: 4ef027bae3f3dd18c4b010f99e6cc898037a9e17722412580463a65b352072ea @EXPLICIT https://repo.anaconda.com/pkgs/main/linux-64/_libgcc_mutex-0.1-main.conda#c3473ff8bdb3d124ed5ff11ec380d6f9 https://repo.anaconda.com/pkgs/main/linux-64/ca-certificates-2023.12.12-h06a4308_0.conda#12bf7315c3f5ca50300e8b48d1b4ef2e @@ -39,7 +39,6 @@ https://repo.anaconda.com/pkgs/main/linux-64/pip-23.3.1-py311h06a4308_0.conda#6f # pip packaging @ https://files.pythonhosted.org/packages/ec/1a/610693ac4ee14fcdf2d9bf3c493370e4f2ef7ae2e19217d7a237ff42367d/packaging-23.2-py3-none-any.whl#sha256=8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7 # pip platformdirs @ https://files.pythonhosted.org/packages/be/53/42fe5eab4a09d251a76d0043e018172db324a23fcdac70f77a551c11f618/platformdirs-4.1.0-py3-none-any.whl#sha256=11c8f37bcca40db96d8144522d925583bdb7a31f7b0e37e3ed4318400a8e2380 # pip pluggy @ https://files.pythonhosted.org/packages/05/b8/42ed91898d4784546c5f06c60506400548db3f7a4b3fb441cba4e5c17952/pluggy-1.3.0-py3-none-any.whl#sha256=d89c696a773f8bd377d18e5ecda92b7a3793cbe66c87060a6fb58c7b6e1061f7 -# pip py @ https://files.pythonhosted.org/packages/f6/f0/10642828a8dfb741e5f3fbaac830550a518a775c7fff6f04a007259b0548/py-1.11.0-py2.py3-none-any.whl#sha256=607c53218732647dff4acdfcd50cb62615cedf612e72d1724fb1a0cc6405b378 # pip pygments @ https://files.pythonhosted.org/packages/97/9c/372fef8377a6e340b1704768d20daaded98bf13282b5327beb2e2fe2c7ef/pygments-2.17.2-py3-none-any.whl#sha256=b27c2826c47d0f3219f29554824c30c5e8945175d888647acd804ddd04af846c # pip six @ https://files.pythonhosted.org/packages/d9/5a/e7c31adbe875f2abbb91bd84cf2dc52d792b5a01506781dbcf25c91daf11/six-1.16.0-py2.py3-none-any.whl#sha256=8abb2f1d86890a2dfb989f9a77cfcfd3e47c2a354b01111771326f8aa26e0254 # pip snowballstemmer @ https://files.pythonhosted.org/packages/ed/dc/c02e01294f7265e63a7315fe086dd1df7dacb9f840a804da846b96d01b96/snowballstemmer-2.2.0-py2.py3-none-any.whl#sha256=c8e1716e83cc398ae16824e5572ae04e0d9fc2c6b985fb0f900f5f0c96ecba1a @@ -53,8 +52,7 @@ https://repo.anaconda.com/pkgs/main/linux-64/pip-23.3.1-py311h06a4308_0.conda#6f # pip requests @ https://files.pythonhosted.org/packages/70/8e/0e2d847013cb52cd35b38c009bb167a1a26b2ce6cd6965bf26b47bc0bf44/requests-2.31.0-py3-none-any.whl#sha256=58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f # pip pooch @ https://files.pythonhosted.org/packages/1a/a5/5174dac3957ac412e80a00f30b6507031fcab7000afc9ea0ac413bddcff2/pooch-1.8.0-py3-none-any.whl#sha256=1bfba436d9e2ad5199ccad3583cca8c241b8736b5bb23fe67c213d52650dbb66 # pip pytest-cov @ https://files.pythonhosted.org/packages/a7/4b/8b78d126e275efa2379b1c2e09dc52cf70df16fc3b90613ef82531499d73/pytest_cov-4.1.0-py3-none-any.whl#sha256=6ba70b9e97e69fcc3fb45bfeab2d0a138fb65c4d0d6a41ef33983ad114be8c3a -# pip pytest-forked @ https://files.pythonhosted.org/packages/f4/af/9c0bda43e486a3c9bf1e0f876d0f241bc3f229d7d65d09331a0868db9629/pytest_forked-1.6.0-py3-none-any.whl#sha256=810958f66a91afb1a1e2ae83089d8dc1cd2437ac96b12963042fbb9fb4d16af0 -# pip pytest-xdist @ https://files.pythonhosted.org/packages/21/08/b1945d4b4986eb1aa10cf84efc5293bba39da80a2f95db3573dd90678408/pytest_xdist-2.5.0-py3-none-any.whl#sha256=6fe5c74fec98906deb8f2d2b616b5c782022744978e7bd4695d39c8f42d0ce65 +# pip pytest-xdist @ 
https://files.pythonhosted.org/packages/50/37/125fe5ec459321e2d48a0c38672cfc2419ad87d580196fd894e5f25230b0/pytest_xdist-3.5.0-py3-none-any.whl#sha256=d075629c7e00b611df89f490a5063944bee7a4362a5ff11c7cc7824a03dfce24 # pip numpydoc @ https://files.pythonhosted.org/packages/9c/94/09c437fd4a5fb5adf0468c0865c781dbc11d399544b55f1163d5d4414afb/numpydoc-1.6.0-py3-none-any.whl#sha256=b6ddaa654a52bdf967763c1e773be41f1c3ae3da39ee0de973f2680048acafaa # pip sphinxcontrib-applehelp @ https://files.pythonhosted.org/packages/c0/0c/261c0949083c0ac635853528bb0070c89e927841d4e533ba0b5563365c06/sphinxcontrib_applehelp-1.0.7-py3-none-any.whl#sha256=094c4d56209d1734e7d252f6e0b3ccc090bd52ee56807a5d9315b19c122ab15d # pip sphinxcontrib-devhelp @ https://files.pythonhosted.org/packages/c0/03/010ac733ec7b7f71c1dc88e7115743ee466560d6d85373b56fb9916e4586/sphinxcontrib_devhelp-1.0.5-py3-none-any.whl#sha256=fe8009aed765188f08fcaadbb3ea0d90ce8ae2d76710b7e29ea7d047177dae2f diff --git a/build_tools/azure/pymin_conda_defaults_openblas_environment.yml b/build_tools/azure/pymin_conda_defaults_openblas_environment.yml index 69de346c4fff0..a93498d23e537 100644 --- a/build_tools/azure/pymin_conda_defaults_openblas_environment.yml +++ b/build_tools/azure/pymin_conda_defaults_openblas_environment.yml @@ -14,7 +14,7 @@ dependencies: - matplotlib=3.3.4 # min - pyamg - pytest - - pytest-xdist=2.5.0 + - pytest-xdist - pillow - setuptools - pytest-cov diff --git a/build_tools/azure/pymin_conda_defaults_openblas_linux-64_conda.lock b/build_tools/azure/pymin_conda_defaults_openblas_linux-64_conda.lock index 159ab024cc0c1..4543307280a3b 100644 --- a/build_tools/azure/pymin_conda_defaults_openblas_linux-64_conda.lock +++ b/build_tools/azure/pymin_conda_defaults_openblas_linux-64_conda.lock @@ -1,6 +1,6 @@ # Generated by conda-lock. 
# platform: linux-64 -# input_hash: b4bfe38c127d42c34beb5fbcbb6d7a983e7063f8a6ec415182acb410dfc68d8d +# input_hash: 82d3fc4a221c5788b1501ed52f4700a43ac387e29dba2eccc9f2fd6521c878ff @EXPLICIT https://repo.anaconda.com/pkgs/main/linux-64/_libgcc_mutex-0.1-main.conda#c3473ff8bdb3d124ed5ff11ec380d6f9 https://repo.anaconda.com/pkgs/main/linux-64/blas-1.0-openblas.conda#9ddfcaef10d79366c90128f5dc444be8 @@ -72,7 +72,6 @@ https://repo.anaconda.com/pkgs/main/linux-64/packaging-23.1-py39h06a4308_0.conda https://repo.anaconda.com/pkgs/main/linux-64/pillow-10.0.1-py39ha6cbd5a_0.conda#a16f050efc583049a46accd497525967 https://repo.anaconda.com/pkgs/main/linux-64/pluggy-1.0.0-py39h06a4308_1.conda#fb4fed11ed43cf727dbd51883cc1d9fa https://repo.anaconda.com/pkgs/main/linux-64/ply-3.11-py39h06a4308_0.conda#6c89bf6d2fdf6d24126e34cb83fd10f1 -https://repo.anaconda.com/pkgs/main/noarch/py-1.11.0-pyhd3eb1b0_0.conda#7205a898ed2abbf6e9b903dff6abe08e https://repo.anaconda.com/pkgs/main/linux-64/pyparsing-3.0.9-py39h06a4308_0.conda#3a0537468e59760404f63b4f04369828 https://repo.anaconda.com/pkgs/main/linux-64/pyqt5-sip-12.13.0-py39h5eee18b_0.conda#256840c3841b52346ea5743be8490ede https://repo.anaconda.com/pkgs/main/linux-64/setuptools-68.2.2-py39h06a4308_0.conda#5b42cae5548732ae5c167bb1066085de @@ -90,8 +89,7 @@ https://repo.anaconda.com/pkgs/main/linux-64/sip-6.7.12-py39h6a678d5_0.conda#698 https://repo.anaconda.com/pkgs/main/linux-64/matplotlib-base-3.3.4-py39h62a2d02_0.conda#dbab28222c740af8e21a3e5e2882c178 https://repo.anaconda.com/pkgs/main/linux-64/pyqt-5.15.10-py39h6a678d5_0.conda#52da5ff9b1144b078d2f41bab0b213f2 https://repo.anaconda.com/pkgs/main/linux-64/pytest-cov-4.1.0-py39h06a4308_1.conda#8f41fce21670b120bf7fa8a7883380d9 -https://repo.anaconda.com/pkgs/main/linux-64/pytest-forked-1.6.0-py39h06a4308_0.conda#f0a6e858c06dc4d2ae5c9644630a6a83 +https://repo.anaconda.com/pkgs/main/linux-64/pytest-xdist-3.5.0-py39h06a4308_0.conda#e1d7ffcb1ee2ed9a84800f5c4bbbd7ae https://repo.anaconda.com/pkgs/main/linux-64/scipy-1.7.3-py39hf838250_2.conda#0667ea5ac14d35e26da19a0f068739da https://repo.anaconda.com/pkgs/main/linux-64/matplotlib-3.3.4-py39h06a4308_0.conda#384fc5e01ebfcf30e7161119d3029b5a https://repo.anaconda.com/pkgs/main/linux-64/pyamg-4.2.3-py39h79cecc1_0.conda#afc634da8b81dc504179d53d334e6e55 -https://repo.anaconda.com/pkgs/main/noarch/pytest-xdist-2.5.0-pyhd3eb1b0_0.conda#d15cdc4207bcf8ca920822597f1d138d diff --git a/build_tools/azure/pymin_conda_forge_mkl_environment.yml b/build_tools/azure/pymin_conda_forge_mkl_environment.yml index 125c169ddc95f..a3b8b75363a46 100644 --- a/build_tools/azure/pymin_conda_forge_mkl_environment.yml +++ b/build_tools/azure/pymin_conda_forge_mkl_environment.yml @@ -13,7 +13,7 @@ dependencies: - threadpoolctl - matplotlib - pytest - - pytest-xdist=2.5.0 + - pytest-xdist - pillow - setuptools - pytest-cov diff --git a/build_tools/azure/pymin_conda_forge_mkl_win-64_conda.lock b/build_tools/azure/pymin_conda_forge_mkl_win-64_conda.lock index 10b0d4ec2291f..e709d540b60d6 100644 --- a/build_tools/azure/pymin_conda_forge_mkl_win-64_conda.lock +++ b/build_tools/azure/pymin_conda_forge_mkl_win-64_conda.lock @@ -1,6 +1,6 @@ # Generated by conda-lock. 
# platform: win-64 -# input_hash: af544b6135127d0b6abf1eedcc8ba32a4d5e2e1d2904d4592abc7f3dba338569 +# input_hash: 2f4b1d16d553e6307f97867b796d97131fd60899af1e29931840dbbc1b00d7b9 @EXPLICIT https://conda.anaconda.org/conda-forge/win-64/ca-certificates-2023.11.17-h56e8100_0.conda#1163114b483f26761f993c709e65271f https://conda.anaconda.org/conda-forge/win-64/intel-openmp-2023.2.0-h57928b3_50497.conda#a401f3cae152deb75bbed766a90a6312 @@ -63,7 +63,6 @@ https://conda.anaconda.org/conda-forge/noarch/packaging-23.2-pyhd8ed1ab_0.conda# https://conda.anaconda.org/conda-forge/noarch/pluggy-1.3.0-pyhd8ed1ab_0.conda#2390bd10bed1f3fdc7a537fb5a447d8d https://conda.anaconda.org/conda-forge/noarch/ply-3.11-py_1.tar.bz2#7205635cd71531943440fbfe3b6b5727 https://conda.anaconda.org/conda-forge/win-64/pthread-stubs-0.4-hcd874cb_1001.tar.bz2#a1f820480193ea83582b13249a7e7bd9 -https://conda.anaconda.org/conda-forge/noarch/py-1.11.0-pyh6c4a22f_0.tar.bz2#b4613d7e7a493916d867842a6a148054 https://conda.anaconda.org/conda-forge/noarch/pyparsing-3.1.1-pyhd8ed1ab_0.conda#176f7d56f0cfe9008bdf1bccd7de02fb https://conda.anaconda.org/conda-forge/noarch/setuptools-69.0.3-pyhd8ed1ab_0.conda#40695fdfd15a92121ed2922900d0308b https://conda.anaconda.org/conda-forge/noarch/six-1.16.0-pyh6c4a22f_0.tar.bz2#e5f25f8dbc060e9a8d912e432202afc2 @@ -96,11 +95,10 @@ https://conda.anaconda.org/conda-forge/win-64/mkl-2023.2.0-h6a75c08_50497.conda# https://conda.anaconda.org/conda-forge/win-64/pillow-10.2.0-py39h368b509_0.conda#706d6e5bbc4b5d2ac7b8a6077319294d https://conda.anaconda.org/conda-forge/win-64/pyqt5-sip-12.12.2-py39h99910a6_5.conda#dffbcea794c524c471772a5f697c2aea https://conda.anaconda.org/conda-forge/noarch/pytest-cov-4.1.0-pyhd8ed1ab_0.conda#06eb685a3a0b146347a58dda979485da -https://conda.anaconda.org/conda-forge/noarch/pytest-forked-1.6.0-pyhd8ed1ab_0.conda#a46947638b6e005b63d2d6271da529b0 +https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-3.5.0-pyhd8ed1ab_0.conda#d5f595da2daead898ca958ac62f0307b https://conda.anaconda.org/conda-forge/win-64/gstreamer-1.22.8-hb4038d2_1.conda#d24ef655de29ac3b1e14aae9cc2eb66b https://conda.anaconda.org/conda-forge/win-64/libblas-3.9.0-20_win64_mkl.conda#6cad6cd2fbdeef4d651b8f752a4da960 https://conda.anaconda.org/conda-forge/win-64/mkl-devel-2023.2.0-h57928b3_50497.conda#0d52cfab24361c77268b54920c11903c -https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-2.5.0-pyhd8ed1ab_0.tar.bz2#1fdd1f3baccf0deb647385c677a1a48e https://conda.anaconda.org/conda-forge/win-64/gst-plugins-base-1.22.8-h001b923_1.conda#abe4d4f0820e367987d2ba73a84cf328 https://conda.anaconda.org/conda-forge/win-64/libcblas-3.9.0-20_win64_mkl.conda#e6d36cfcb2f2dff0f659d2aa0813eb2d https://conda.anaconda.org/conda-forge/win-64/liblapack-3.9.0-20_win64_mkl.conda#9510d07424d70fcac553d86b3e4a7c14 diff --git a/build_tools/azure/pymin_conda_forge_openblas_ubuntu_2204_environment.yml b/build_tools/azure/pymin_conda_forge_openblas_ubuntu_2204_environment.yml index de366a19e740d..51fe4e3308868 100644 --- a/build_tools/azure/pymin_conda_forge_openblas_ubuntu_2204_environment.yml +++ b/build_tools/azure/pymin_conda_forge_openblas_ubuntu_2204_environment.yml @@ -15,7 +15,7 @@ dependencies: - pandas - pyamg - pytest - - pytest-xdist=2.5.0 + - pytest-xdist - pillow - setuptools - sphinx diff --git a/build_tools/azure/pymin_conda_forge_openblas_ubuntu_2204_linux-64_conda.lock b/build_tools/azure/pymin_conda_forge_openblas_ubuntu_2204_linux-64_conda.lock index a6893000d1871..55ed5154a3d12 100644 --- 
a/build_tools/azure/pymin_conda_forge_openblas_ubuntu_2204_linux-64_conda.lock +++ b/build_tools/azure/pymin_conda_forge_openblas_ubuntu_2204_linux-64_conda.lock @@ -1,6 +1,6 @@ # Generated by conda-lock. # platform: linux-64 -# input_hash: d70964a380150a9fdd34471eab9c13547ec7744156a6719ec0e4b97fc7d298fa +# input_hash: c5b0ca4d81a3951a78ce653cf958c09f523e7579537cf5f6f0c709eb3691bc3d @EXPLICIT https://conda.anaconda.org/conda-forge/linux-64/_libgcc_mutex-0.1-conda_forge.tar.bz2#d7c89558ba9fa0495403155b64376d81 https://conda.anaconda.org/conda-forge/linux-64/ca-certificates-2023.11.17-hbcca054_0.conda#01ffc8d36f9eba0ce0b3c1955fa780ee @@ -127,7 +127,6 @@ https://conda.anaconda.org/conda-forge/linux-64/openjpeg-2.5.0-h488ebb8_3.conda# https://conda.anaconda.org/conda-forge/noarch/packaging-23.2-pyhd8ed1ab_0.conda#79002079284aa895f883c6b7f3f88fd6 https://conda.anaconda.org/conda-forge/noarch/pluggy-1.3.0-pyhd8ed1ab_0.conda#2390bd10bed1f3fdc7a537fb5a447d8d https://conda.anaconda.org/conda-forge/noarch/ply-3.11-py_1.tar.bz2#7205635cd71531943440fbfe3b6b5727 -https://conda.anaconda.org/conda-forge/noarch/py-1.11.0-pyh6c4a22f_0.tar.bz2#b4613d7e7a493916d867842a6a148054 https://conda.anaconda.org/conda-forge/noarch/pygments-2.17.2-pyhd8ed1ab_0.conda#140a7f159396547e9799aa98f9f0742e https://conda.anaconda.org/conda-forge/noarch/pyparsing-3.1.1-pyhd8ed1ab_0.conda#176f7d56f0cfe9008bdf1bccd7de02fb https://conda.anaconda.org/conda-forge/noarch/pysocks-1.7.1-pyha2e5f31_6.tar.bz2#2a7de29fb590ca14b5243c4c812c8025 @@ -172,13 +171,12 @@ https://conda.anaconda.org/conda-forge/noarch/importlib-resources-6.1.1-pyhd8ed1 https://conda.anaconda.org/conda-forge/linux-64/liblapacke-3.9.0-20_linux64_openblas.conda#05c5862c7dc25e65ba6c471d96429dae https://conda.anaconda.org/conda-forge/linux-64/numpy-1.26.3-py39h474f0d3_0.conda#a1f1ad2d8ebf63f13f45fb21b7f49dfb https://conda.anaconda.org/conda-forge/linux-64/pyqt5-sip-12.12.2-py39h3d6467e_5.conda#93aff412f3e49fdb43361c0215cbd72d -https://conda.anaconda.org/conda-forge/noarch/pytest-forked-1.6.0-pyhd8ed1ab_0.conda#a46947638b6e005b63d2d6271da529b0 +https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-3.5.0-pyhd8ed1ab_0.conda#d5f595da2daead898ca958ac62f0307b https://conda.anaconda.org/conda-forge/noarch/requests-2.31.0-pyhd8ed1ab_0.conda#a30144e4156cdbb236f99ebb49828f8b https://conda.anaconda.org/conda-forge/linux-64/blas-devel-3.9.0-20_linux64_openblas.conda#9932a1d4e9ecf2d35fb19475446e361e https://conda.anaconda.org/conda-forge/linux-64/contourpy-1.2.0-py39h7633fee_0.conda#ed71ad3e30eb03da363fb797419cce98 https://conda.anaconda.org/conda-forge/linux-64/gst-plugins-base-1.22.8-h8e1006c_1.conda#3926dab94fe06d88ade0e716d77b8cf8 https://conda.anaconda.org/conda-forge/linux-64/pandas-2.1.4-py39hddac248_0.conda#dcfd2f15c6f8f0bbf234412b18a2a5d0 -https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-2.5.0-pyhd8ed1ab_0.tar.bz2#1fdd1f3baccf0deb647385c677a1a48e https://conda.anaconda.org/conda-forge/linux-64/scipy-1.11.4-py39h474f0d3_0.conda#4b401c1516417b4b14aa1249d2f7929d https://conda.anaconda.org/conda-forge/linux-64/blas-2.120-openblas.conda#c8f6916a81a340650078171b1d852574 https://conda.anaconda.org/conda-forge/linux-64/matplotlib-base-3.8.2-py39he9076e7_0.conda#6085411aa2f0b2b801d3b46e1d3b83c5 diff --git a/build_tools/azure/pypy3_environment.yml b/build_tools/azure/pypy3_environment.yml index d4f0d22e96042..45a0d0e8ffebb 100644 --- a/build_tools/azure/pypy3_environment.yml +++ b/build_tools/azure/pypy3_environment.yml @@ -15,6 +15,6 @@ dependencies: - matplotlib - 
pyamg - pytest - - pytest-xdist=2.5.0 + - pytest-xdist - setuptools - ccache diff --git a/build_tools/azure/pypy3_linux-64_conda.lock b/build_tools/azure/pypy3_linux-64_conda.lock index d5f5f842033c0..136b85b5395b8 100644 --- a/build_tools/azure/pypy3_linux-64_conda.lock +++ b/build_tools/azure/pypy3_linux-64_conda.lock @@ -1,6 +1,6 @@ # Generated by conda-lock. # platform: linux-64 -# input_hash: 296e0e62aa19cfbc6aa6d615c86db2d06be56b4b5f76bf148152aff936fcddf5 +# input_hash: 231e6765d0906ea65daa71dd10e672c1afde9ae87cba2e958a8744a6a38a4e7b @EXPLICIT https://conda.anaconda.org/conda-forge/linux-64/_libgcc_mutex-0.1-conda_forge.tar.bz2#d7c89558ba9fa0495403155b64376d81 https://conda.anaconda.org/conda-forge/linux-64/ca-certificates-2023.11.17-hbcca054_0.conda#01ffc8d36f9eba0ce0b3c1955fa780ee @@ -72,7 +72,6 @@ https://conda.anaconda.org/conda-forge/linux-64/numpy-1.26.3-py39h6dedee3_0.cond https://conda.anaconda.org/conda-forge/noarch/packaging-23.2-pyhd8ed1ab_0.conda#79002079284aa895f883c6b7f3f88fd6 https://conda.anaconda.org/conda-forge/linux-64/pillow-10.2.0-py39hcf8a34e_0.conda#8a406ee5a979c2591f4c734d6fe4a958 https://conda.anaconda.org/conda-forge/noarch/pluggy-1.3.0-pyhd8ed1ab_0.conda#2390bd10bed1f3fdc7a537fb5a447d8d -https://conda.anaconda.org/conda-forge/noarch/py-1.11.0-pyh6c4a22f_0.tar.bz2#b4613d7e7a493916d867842a6a148054 https://conda.anaconda.org/conda-forge/noarch/pyparsing-3.1.1-pyhd8ed1ab_0.conda#176f7d56f0cfe9008bdf1bccd7de02fb https://conda.anaconda.org/conda-forge/noarch/pypy-7.3.13-0_pypy39.conda#0973de0664d1bd004c1bc64a7aab8f2e https://conda.anaconda.org/conda-forge/noarch/setuptools-69.0.3-pyhd8ed1ab_0.conda#40695fdfd15a92121ed2922900d0308b @@ -93,7 +92,6 @@ https://conda.anaconda.org/conda-forge/linux-64/scipy-1.11.4-py39h6dedee3_0.cond https://conda.anaconda.org/conda-forge/linux-64/blas-2.120-openblas.conda#c8f6916a81a340650078171b1d852574 https://conda.anaconda.org/conda-forge/noarch/importlib-resources-6.1.1-pyhd8ed1ab_0.conda#d04bd1b5bed9177dd7c3cef15e2b6710 https://conda.anaconda.org/conda-forge/linux-64/pyamg-5.0.1-py39h5fd064f_1.conda#e364cfb3ffb590ccef24b5a92389e751 -https://conda.anaconda.org/conda-forge/noarch/pytest-forked-1.6.0-pyhd8ed1ab_0.conda#a46947638b6e005b63d2d6271da529b0 +https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-3.5.0-pyhd8ed1ab_0.conda#d5f595da2daead898ca958ac62f0307b https://conda.anaconda.org/conda-forge/linux-64/matplotlib-base-3.8.2-py39h4e7d633_0.conda#a60f8c577d2db485f0b92bef480d6277 -https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-2.5.0-pyhd8ed1ab_0.tar.bz2#1fdd1f3baccf0deb647385c677a1a48e https://conda.anaconda.org/conda-forge/linux-64/matplotlib-3.8.2-py39h4162558_0.conda#24444011be733e7bde8617eb8fe725e1 diff --git a/build_tools/azure/test_script.sh b/build_tools/azure/test_script.sh index a45fa3dd49842..0b378675eebde 100755 --- a/build_tools/azure/test_script.sh +++ b/build_tools/azure/test_script.sh @@ -51,17 +51,15 @@ fi if [[ -n "$CHECK_WARNINGS" ]]; then TEST_CMD="$TEST_CMD -Werror::DeprecationWarning -Werror::FutureWarning -Werror::sklearn.utils.fixes.VisibleDeprecationWarning" - # numpy's 1.19.0's tostring() deprecation is ignored until scipy and joblib - # removes its usage - TEST_CMD="$TEST_CMD -Wignore:tostring:DeprecationWarning" - - # Ignore distutils deprecation warning, used by joblib internally - TEST_CMD="$TEST_CMD -Wignore:distutils\ Version\ classes\ are\ deprecated:DeprecationWarning" - # Ignore pkg_resources deprecation warnings triggered by pyamg TEST_CMD="$TEST_CMD -W 'ignore:pkg_resources 
is deprecated as an API:DeprecationWarning'" TEST_CMD="$TEST_CMD -W 'ignore:Deprecated call to \`pkg_resources:DeprecationWarning'" + # pytest-cov issue https://github.com/pytest-dev/pytest-cov/issues/557 not + # fixed although it has been closed. https://github.com/pytest-dev/pytest-cov/pull/623 + # would probably fix it. + TEST_CMD="$TEST_CMD -W 'ignore:The --rsyncdir command line argument and rsyncdirs config variable are deprecated.:DeprecationWarning'" + # In some case, exceptions are raised (by bug) in tests, and captured by pytest, # but not raised again. This is for instance the case when Cython directives are # activated: IndexErrors (which aren't fatal) are raised on out-of-bound accesses. diff --git a/build_tools/azure/ubuntu_atlas_lock.txt b/build_tools/azure/ubuntu_atlas_lock.txt index 680f90207abe8..42e63264193af 100644 --- a/build_tools/azure/ubuntu_atlas_lock.txt +++ b/build_tools/azure/ubuntu_atlas_lock.txt @@ -18,16 +18,11 @@ packaging==23.2 # via pytest pluggy==1.3.0 # via pytest -py==1.11.0 - # via pytest-forked pytest==7.4.4 # via # -r build_tools/azure/ubuntu_atlas_requirements.txt - # pytest-forked # pytest-xdist -pytest-forked==1.6.0 - # via pytest-xdist -pytest-xdist==2.5.0 +pytest-xdist==3.5.0 # via -r build_tools/azure/ubuntu_atlas_requirements.txt threadpoolctl==2.0.0 # via -r build_tools/azure/ubuntu_atlas_requirements.txt diff --git a/build_tools/azure/ubuntu_atlas_requirements.txt b/build_tools/azure/ubuntu_atlas_requirements.txt index b4fad825466a7..7bca99cc63cf2 100644 --- a/build_tools/azure/ubuntu_atlas_requirements.txt +++ b/build_tools/azure/ubuntu_atlas_requirements.txt @@ -5,4 +5,4 @@ cython==0.29.33 # min joblib==1.2.0 # min threadpoolctl==2.0.0 # min pytest -pytest-xdist==2.5.0 +pytest-xdist diff --git a/build_tools/circle/doc_environment.yml b/build_tools/circle/doc_environment.yml index 5789d2dfeabd1..22400c45091bb 100644 --- a/build_tools/circle/doc_environment.yml +++ b/build_tools/circle/doc_environment.yml @@ -15,12 +15,11 @@ dependencies: - pandas - pyamg - pytest - - pytest-xdist=2.5.0 + - pytest-xdist - pillow - setuptools - scikit-image - seaborn - - patsy=0.5.4 - memory_profiler - compilers - sphinx diff --git a/build_tools/circle/doc_linux-64_conda.lock b/build_tools/circle/doc_linux-64_conda.lock index e72fe4615571b..77565ab07e476 100644 --- a/build_tools/circle/doc_linux-64_conda.lock +++ b/build_tools/circle/doc_linux-64_conda.lock @@ -1,6 +1,6 @@ # Generated by conda-lock. 
# platform: linux-64 -# input_hash: f0d8179a97f73d1dd3fcaabd7b81b8f4ee3eeb0b07c038be883b60160b96c3e9 +# input_hash: e9ce7b66471a75e2156a32c83078c9688bbda241cd62e3d881989eae546ee2e9 @EXPLICIT https://conda.anaconda.org/conda-forge/linux-64/_libgcc_mutex-0.1-conda_forge.tar.bz2#d7c89558ba9fa0495403155b64376d81 https://conda.anaconda.org/conda-forge/linux-64/ca-certificates-2023.11.17-hbcca054_0.conda#01ffc8d36f9eba0ce0b3c1955fa780ee @@ -163,7 +163,6 @@ https://conda.anaconda.org/conda-forge/noarch/platformdirs-4.1.0-pyhd8ed1ab_0.co https://conda.anaconda.org/conda-forge/noarch/pluggy-1.3.0-pyhd8ed1ab_0.conda#2390bd10bed1f3fdc7a537fb5a447d8d https://conda.anaconda.org/conda-forge/noarch/ply-3.11-py_1.tar.bz2#7205635cd71531943440fbfe3b6b5727 https://conda.anaconda.org/conda-forge/linux-64/psutil-5.9.7-py39hd1e30aa_0.conda#34d2731732bc7de6269657d5d9fd6e79 -https://conda.anaconda.org/conda-forge/noarch/py-1.11.0-pyh6c4a22f_0.tar.bz2#b4613d7e7a493916d867842a6a148054 https://conda.anaconda.org/conda-forge/noarch/pygments-2.17.2-pyhd8ed1ab_0.conda#140a7f159396547e9799aa98f9f0742e https://conda.anaconda.org/conda-forge/noarch/pyparsing-3.1.1-pyhd8ed1ab_0.conda#176f7d56f0cfe9008bdf1bccd7de02fb https://conda.anaconda.org/conda-forge/noarch/pysocks-1.7.1-pyha2e5f31_6.tar.bz2#2a7de29fb590ca14b5243c4c812c8025 @@ -218,7 +217,7 @@ https://conda.anaconda.org/conda-forge/noarch/importlib-resources-6.1.1-pyhd8ed1 https://conda.anaconda.org/conda-forge/linux-64/liblapacke-3.9.0-20_linux64_openblas.conda#05c5862c7dc25e65ba6c471d96429dae https://conda.anaconda.org/conda-forge/linux-64/numpy-1.26.3-py39h474f0d3_0.conda#a1f1ad2d8ebf63f13f45fb21b7f49dfb https://conda.anaconda.org/conda-forge/linux-64/pyqt5-sip-12.12.2-py39h3d6467e_5.conda#93aff412f3e49fdb43361c0215cbd72d -https://conda.anaconda.org/conda-forge/noarch/pytest-forked-1.6.0-pyhd8ed1ab_0.conda#a46947638b6e005b63d2d6271da529b0 +https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-3.5.0-pyhd8ed1ab_0.conda#d5f595da2daead898ca958ac62f0307b https://conda.anaconda.org/conda-forge/noarch/requests-2.31.0-pyhd8ed1ab_0.conda#a30144e4156cdbb236f99ebb49828f8b https://conda.anaconda.org/conda-forge/linux-64/blas-devel-3.9.0-20_linux64_openblas.conda#9932a1d4e9ecf2d35fb19475446e361e https://conda.anaconda.org/conda-forge/linux-64/contourpy-1.2.0-py39h7633fee_0.conda#ed71ad3e30eb03da363fb797419cce98 @@ -226,10 +225,9 @@ https://conda.anaconda.org/conda-forge/linux-64/gst-plugins-base-1.22.8-h8e1006c https://conda.anaconda.org/conda-forge/linux-64/imagecodecs-2024.1.1-py39hf9b8f0e_0.conda#9ddd29852457d1152ca235eb87bc74fb https://conda.anaconda.org/conda-forge/noarch/imageio-2.33.1-pyh8c1a49c_0.conda#1c34d58ac469a34e7e96832861368bce https://conda.anaconda.org/conda-forge/linux-64/pandas-2.1.4-py39hddac248_0.conda#dcfd2f15c6f8f0bbf234412b18a2a5d0 -https://conda.anaconda.org/conda-forge/noarch/patsy-0.5.4-pyhd8ed1ab_0.conda#1184267eddebb57e47f8e1419c225595 +https://conda.anaconda.org/conda-forge/noarch/patsy-0.5.6-pyhd8ed1ab_0.conda#a5b55d1cb110cdcedc748b5c3e16e687 https://conda.anaconda.org/conda-forge/linux-64/polars-0.20.3-py39h927a070_1.conda#9228d65338fc75b9f7040c30465cd84b https://conda.anaconda.org/conda-forge/noarch/pooch-1.8.0-pyhd8ed1ab_0.conda#134b2b57b7865d2316a7cce1915a51ed -https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-2.5.0-pyhd8ed1ab_0.tar.bz2#1fdd1f3baccf0deb647385c677a1a48e https://conda.anaconda.org/conda-forge/linux-64/pywavelets-1.4.1-py39h44dd56e_1.conda#d037c20e3da2e85f03ebd20ad480c359 
https://conda.anaconda.org/conda-forge/linux-64/scipy-1.11.4-py39h474f0d3_0.conda#4b401c1516417b4b14aa1249d2f7929d https://conda.anaconda.org/conda-forge/linux-64/blas-2.120-openblas.conda#c8f6916a81a340650078171b1d852574 diff --git a/build_tools/circle/doc_min_dependencies_environment.yml b/build_tools/circle/doc_min_dependencies_environment.yml index 539187723c01e..3a8320a7f8dd0 100644 --- a/build_tools/circle/doc_min_dependencies_environment.yml +++ b/build_tools/circle/doc_min_dependencies_environment.yml @@ -15,7 +15,7 @@ dependencies: - pandas=1.1.5 # min - pyamg - pytest - - pytest-xdist=2.5.0 + - pytest-xdist - pillow - setuptools - scikit-image=0.17.2 # min diff --git a/build_tools/circle/doc_min_dependencies_linux-64_conda.lock b/build_tools/circle/doc_min_dependencies_linux-64_conda.lock index 4c0b70b6b260e..b0848d8fbea6f 100644 --- a/build_tools/circle/doc_min_dependencies_linux-64_conda.lock +++ b/build_tools/circle/doc_min_dependencies_linux-64_conda.lock @@ -1,6 +1,6 @@ # Generated by conda-lock. # platform: linux-64 -# input_hash: 63e92fdc759dcf030bf7e6d4a5d86bec102c98562cfb7ebd4d3d4991c895678b +# input_hash: a58a98732e5815c15757bc1def8ddc0d87f20f11edcf6e7b408594bf948cbb3e @EXPLICIT https://conda.anaconda.org/conda-forge/linux-64/_libgcc_mutex-0.1-conda_forge.tar.bz2#d7c89558ba9fa0495403155b64376d81 https://conda.anaconda.org/conda-forge/linux-64/ca-certificates-2023.11.17-hbcca054_0.conda#01ffc8d36f9eba0ce0b3c1955fa780ee @@ -146,7 +146,6 @@ https://conda.anaconda.org/conda-forge/noarch/platformdirs-4.1.0-pyhd8ed1ab_0.co https://conda.anaconda.org/conda-forge/noarch/pluggy-1.3.0-pyhd8ed1ab_0.conda#2390bd10bed1f3fdc7a537fb5a447d8d https://conda.anaconda.org/conda-forge/noarch/ply-3.11-py_1.tar.bz2#7205635cd71531943440fbfe3b6b5727 https://conda.anaconda.org/conda-forge/linux-64/psutil-5.9.7-py39hd1e30aa_0.conda#34d2731732bc7de6269657d5d9fd6e79 -https://conda.anaconda.org/conda-forge/noarch/py-1.11.0-pyh6c4a22f_0.tar.bz2#b4613d7e7a493916d867842a6a148054 https://conda.anaconda.org/conda-forge/noarch/pygments-2.17.2-pyhd8ed1ab_0.conda#140a7f159396547e9799aa98f9f0742e https://conda.anaconda.org/conda-forge/noarch/pyparsing-3.1.1-pyhd8ed1ab_0.conda#176f7d56f0cfe9008bdf1bccd7de02fb https://conda.anaconda.org/conda-forge/noarch/pysocks-1.7.1-pyha2e5f31_6.tar.bz2#2a7de29fb590ca14b5243c4c812c8025 @@ -199,7 +198,7 @@ https://conda.anaconda.org/conda-forge/noarch/importlib_metadata-7.0.1-hd8ed1ab_ https://conda.anaconda.org/conda-forge/linux-64/liblapacke-3.9.0-20_linux64_openblas.conda#05c5862c7dc25e65ba6c471d96429dae https://conda.anaconda.org/conda-forge/linux-64/numpy-1.19.5-py39hd249d9e_3.tar.bz2#0cf333996ebdeeba8d1c8c1c0ee9eff9 https://conda.anaconda.org/conda-forge/linux-64/pyqt5-sip-12.12.2-py39h3d6467e_5.conda#93aff412f3e49fdb43361c0215cbd72d -https://conda.anaconda.org/conda-forge/noarch/pytest-forked-1.6.0-pyhd8ed1ab_0.conda#a46947638b6e005b63d2d6271da529b0 +https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-3.5.0-pyhd8ed1ab_0.conda#d5f595da2daead898ca958ac62f0307b https://conda.anaconda.org/conda-forge/noarch/requests-2.31.0-pyhd8ed1ab_0.conda#a30144e4156cdbb236f99ebb49828f8b https://conda.anaconda.org/conda-forge/linux-64/blas-devel-3.9.0-20_linux64_openblas.conda#9932a1d4e9ecf2d35fb19475446e361e https://conda.anaconda.org/conda-forge/noarch/dask-core-2023.12.1-pyhd8ed1ab_0.conda#bf6ad72d882bc3f04e6a0fb50fd2cce8 @@ -211,7 +210,6 @@ https://conda.anaconda.org/conda-forge/linux-64/pandas-1.1.5-py39hde0f152_0.tar. 
https://conda.anaconda.org/conda-forge/noarch/patsy-0.5.6-pyhd8ed1ab_0.conda#a5b55d1cb110cdcedc748b5c3e16e687 https://conda.anaconda.org/conda-forge/linux-64/polars-0.19.12-py39h90d8ae4_0.conda#191828961c95f8d59fa2b86a590f9905 https://conda.anaconda.org/conda-forge/noarch/pooch-1.8.0-pyhd8ed1ab_0.conda#134b2b57b7865d2316a7cce1915a51ed -https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-2.5.0-pyhd8ed1ab_0.tar.bz2#1fdd1f3baccf0deb647385c677a1a48e https://conda.anaconda.org/conda-forge/linux-64/pywavelets-1.3.0-py39hd257fcd_1.tar.bz2#c4b698994b2d8d2e659ae02202e6abe4 https://conda.anaconda.org/conda-forge/linux-64/scipy-1.6.0-py39hee8e79c_0.tar.bz2#3afcb78281836e61351a2924f3230060 https://conda.anaconda.org/conda-forge/linux-64/blas-2.120-openblas.conda#c8f6916a81a340650078171b1d852574 diff --git a/build_tools/cirrus/pymin_conda_forge_environment.yml b/build_tools/cirrus/pymin_conda_forge_environment.yml index 70aedd73bf883..67a163d2bd46b 100644 --- a/build_tools/cirrus/pymin_conda_forge_environment.yml +++ b/build_tools/cirrus/pymin_conda_forge_environment.yml @@ -13,7 +13,7 @@ dependencies: - threadpoolctl - matplotlib - pytest - - pytest-xdist=2.5.0 + - pytest-xdist - pillow - setuptools - pip diff --git a/build_tools/cirrus/pymin_conda_forge_linux-aarch64_conda.lock b/build_tools/cirrus/pymin_conda_forge_linux-aarch64_conda.lock index c8e8d0baf5236..fa842def2d8d2 100644 --- a/build_tools/cirrus/pymin_conda_forge_linux-aarch64_conda.lock +++ b/build_tools/cirrus/pymin_conda_forge_linux-aarch64_conda.lock @@ -1,6 +1,6 @@ # Generated by conda-lock. # platform: linux-aarch64 -# input_hash: 26cb8d771d4d1ecc00c0fc477f3a4b364e4bd7558f3d18ecd50c0d1b440ffe7f +# input_hash: dc7e28d3993d445e2d092c8e0962c7c7b4861c3413f40ab9e1f017be338abb90 @EXPLICIT https://conda.anaconda.org/conda-forge/linux-aarch64/ca-certificates-2023.11.17-hcefe29a_0.conda#695a28440b58e3ba920bcac4ac7c73c6 https://conda.anaconda.org/conda-forge/linux-aarch64/ld_impl_linux-aarch64-2.40-h2d8c526_0.conda#16246d69e945d0b1969a6099e7c5d457 @@ -60,7 +60,6 @@ https://conda.anaconda.org/conda-forge/linux-aarch64/openblas-0.3.25-pthreads_h3 https://conda.anaconda.org/conda-forge/linux-aarch64/openjpeg-2.5.0-h0d9d63b_3.conda#123f5df3bc7f0e23c6950fddb97d1f43 https://conda.anaconda.org/conda-forge/noarch/packaging-23.2-pyhd8ed1ab_0.conda#79002079284aa895f883c6b7f3f88fd6 https://conda.anaconda.org/conda-forge/noarch/pluggy-1.3.0-pyhd8ed1ab_0.conda#2390bd10bed1f3fdc7a537fb5a447d8d -https://conda.anaconda.org/conda-forge/noarch/py-1.11.0-pyh6c4a22f_0.tar.bz2#b4613d7e7a493916d867842a6a148054 https://conda.anaconda.org/conda-forge/noarch/pyparsing-3.1.1-pyhd8ed1ab_0.conda#176f7d56f0cfe9008bdf1bccd7de02fb https://conda.anaconda.org/conda-forge/noarch/setuptools-69.0.3-pyhd8ed1ab_0.conda#40695fdfd15a92121ed2922900d0308b https://conda.anaconda.org/conda-forge/noarch/six-1.16.0-pyh6c4a22f_0.tar.bz2#e5f25f8dbc060e9a8d912e432202afc2 @@ -82,10 +81,9 @@ https://conda.anaconda.org/conda-forge/noarch/python-dateutil-2.8.2-pyhd8ed1ab_0 https://conda.anaconda.org/conda-forge/noarch/importlib-resources-6.1.1-pyhd8ed1ab_0.conda#d04bd1b5bed9177dd7c3cef15e2b6710 https://conda.anaconda.org/conda-forge/linux-aarch64/liblapacke-3.9.0-20_linuxaarch64_openblas.conda#1b8192f036a2dc41fec67700bb8bacef https://conda.anaconda.org/conda-forge/linux-aarch64/numpy-1.26.3-py39h91c28bb_0.conda#9e10c6f9e309c2ada0d41c945e0f9b56 -https://conda.anaconda.org/conda-forge/noarch/pytest-forked-1.6.0-pyhd8ed1ab_0.conda#a46947638b6e005b63d2d6271da529b0 
+https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-3.5.0-pyhd8ed1ab_0.conda#d5f595da2daead898ca958ac62f0307b
 https://conda.anaconda.org/conda-forge/linux-aarch64/blas-devel-3.9.0-20_linuxaarch64_openblas.conda#211c74d7600d8d1dec226daf5e28e2dc
 https://conda.anaconda.org/conda-forge/linux-aarch64/contourpy-1.2.0-py39hd16970a_0.conda#dc11a4a2e020d1d71350baa7cb4980e4
-https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-2.5.0-pyhd8ed1ab_0.tar.bz2#1fdd1f3baccf0deb647385c677a1a48e
 https://conda.anaconda.org/conda-forge/linux-aarch64/scipy-1.11.3-py39h91c28bb_1.conda#216b118cdb919665ad7d9d2faff412df
 https://conda.anaconda.org/conda-forge/linux-aarch64/blas-2.120-openblas.conda#4354e2978d15f5b29b1557792e5c5c63
 https://conda.anaconda.org/conda-forge/linux-aarch64/matplotlib-base-3.8.2-py39h8e43113_0.conda#0dd681b8d2a93b799954714481761fe0
diff --git a/build_tools/update_environments_and_lock_files.py b/build_tools/update_environments_and_lock_files.py
index b344785ad01ca..6625c88affe29 100644
--- a/build_tools/update_environments_and_lock_files.py
+++ b/build_tools/update_environments_and_lock_files.py
@@ -5,8 +5,11 @@
 Two scenarios where this script can be useful:
 - make sure that the latest versions of all the dependencies are used in the CI.
-  We can run this script regularly and open a PR with the changes to the lock
-  files. This workflow will eventually be automated with a bot in the future.
+  There is a scheduled workflow that does this, see
+  .github/workflows/update-lock-files.yml. It is still useful to run this
+  script when the automated PR fails and, for example, some packages need to
+  be pinned. You can add the pins to this script, run it, and open a PR with
+  the changes.
 - bump minimum dependencies in sklearn/_min_dependencies.py. Running this
   script will update both the CI environment files and associated lock files.
   You can then open a PR with the changes.
@@ -78,11 +81,7 @@
 docstring_test_dependencies = ["sphinx", "numpydoc"]
 
-default_package_constraints = {
-    # XXX: pin pytest-xdist to workaround:
-    # https://github.com/pytest-dev/pytest-xdist/issues/840
-    "pytest-xdist": "2.5.0",
-}
+default_package_constraints = {}
 
 
 def remove_from(alist, to_remove):
@@ -296,8 +295,6 @@ def remove_from(alist, to_remove):
         "conda_dependencies": common_dependencies_without_coverage
         + [
             "scikit-image",
             "seaborn",
-            # TODO Remove when patsy pin is not needed anymore, see below
-            "patsy",
             "memory_profiler",
             "compilers",
             "sphinx",
@@ -313,10 +310,6 @@
         "pip_dependencies": ["jupyterlite-sphinx", "jupyterlite-pyodide-kernel"],
         "package_constraints": {
             "python": "3.9",
-            # TODO: Remove pin when issue is fixed in patsy, see
-            # https://github.com/pydata/patsy/issues/198. patsy 0.5.5
-            # introduced a DeprecationWarning at import-time.
-            "patsy": "0.5.4",
         },
     },
     {

From 91d273ae892851ec3bdc4a21cffe163fbaed40f0 Mon Sep 17 00:00:00 2001
From: Guillaume Lemaitre
Date: Wed, 10 Jan 2024 12:09:15 +0100
Subject: [PATCH 024/554] MAINT avoid commit containing doc/sg_execution_times.rst
 (#28088)

---
 .gitignore | 1 +
 1 file changed, 1 insertion(+)

diff --git a/.gitignore b/.gitignore
index 199c2bd85d997..770f0b84f074a 100644
--- a/.gitignore
+++ b/.gitignore
@@ -13,6 +13,7 @@ sklearn/**/*.html
 dist/
 MANIFEST
+doc/sg_execution_times.rst
 doc/_build/
 doc/auto_examples/
 doc/modules/generated/

From 8234a8cf85edcbf8a12717698cb0f108067f290a Mon Sep 17 00:00:00 2001
From: Guillaume Lemaitre
Date: Wed, 10 Jan 2024 18:02:37 +0100
Subject: [PATCH 025/554] DOC fix some docstrings for numpydoc compliance
 (#27983)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

Co-authored-by: Loïc Estève
---
 sklearn/base.py           |  3 +-
 sklearn/utils/_random.pyx | 81 +++++++++++++++++++++------------------
 2 files changed, 45 insertions(+), 39 deletions(-)

diff --git a/sklearn/base.py b/sklearn/base.py
index 56a0ad3233a73..e7361c331617a 100644
--- a/sklearn/base.py
+++ b/sklearn/base.py
@@ -1079,9 +1079,10 @@ def fit_predict(self, X, y=None, **kwargs):
 
 class MetaEstimatorMixin:
-    _required_parameters = ["estimator"]
     """Mixin class for all meta estimators in scikit-learn."""
 
+    _required_parameters = ["estimator"]
+
 
 class MultiOutputMixin:
     """Mixin to mark estimators that support multioutput."""
diff --git a/sklearn/utils/_random.pyx b/sklearn/utils/_random.pyx
index d063f4dfe7ff1..2ffe47d38bd33 100644
--- a/sklearn/utils/_random.pyx
+++ b/sklearn/utils/_random.pyx
@@ -228,6 +228,49 @@ cdef _sample_without_replacement(default_int n_population,
                                  random_state=None):
     """Sample integers without replacement.
 
+    Private function for the implementation, see sample_without_replacement
+    documentation for more details.
+    """
+    _sample_without_replacement_check_input(n_population, n_samples)
+
+    all_methods = ("auto", "tracking_selection", "reservoir_sampling", "pool")
+
+    ratio = n_samples / n_population if n_population != 0.0 else 1.0
+
+    # Check ratio and use permutation unless ratio < 0.01 or ratio > 0.99
+    if method == "auto" and ratio > 0.01 and ratio < 0.99:
+        rng = check_random_state(random_state)
+        return rng.permutation(n_population)[:n_samples]
+
+    if method == "auto" or method == "tracking_selection":
+        # TODO the pool based method can also be used.
+        # however, it requires special benchmark to take into account
+        # the memory requirement of the array vs the set.
+
+        # The value 0.2 has been determined through benchmarking.
+        if ratio < 0.2:
+            return _sample_without_replacement_with_tracking_selection(
+                n_population, n_samples, random_state)
+        else:
+            return _sample_without_replacement_with_reservoir_sampling(
+                n_population, n_samples, random_state)
+
+    elif method == "reservoir_sampling":
+        return _sample_without_replacement_with_reservoir_sampling(
+            n_population, n_samples, random_state)
+
+    elif method == "pool":
+        return _sample_without_replacement_with_pool(n_population, n_samples,
+                                                     random_state)
+    else:
+        raise ValueError('Expected a method name in %s, got %s. '
+                         % (all_methods, method))
+
+
+def sample_without_replacement(
+        object n_population, object n_samples, method="auto", random_state=None):
+    """Sample integers without replacement.
+
     Select n_samples integers from the set [0, n_population) without
     replacement.
@@ -276,44 +319,6 @@ cdef _sample_without_replacement(default_int n_population,
        The sampled subsets of integer. The subset of selected integer might
        not be randomized, see the method argument.
     """
-    _sample_without_replacement_check_input(n_population, n_samples)
-
-    all_methods = ("auto", "tracking_selection", "reservoir_sampling", "pool")
-
-    ratio = n_samples / n_population if n_population != 0.0 else 1.0
-
-    # Check ratio and use permutation unless ratio < 0.01 or ratio > 0.99
-    if method == "auto" and ratio > 0.01 and ratio < 0.99:
-        rng = check_random_state(random_state)
-        return rng.permutation(n_population)[:n_samples]
-
-    if method == "auto" or method == "tracking_selection":
-        # TODO the pool based method can also be used.
-        # however, it requires special benchmark to take into account
-        # the memory requirement of the array vs the set.
-
-        # The value 0.2 has been determined through benchmarking.
-        if ratio < 0.2:
-            return _sample_without_replacement_with_tracking_selection(
-                n_population, n_samples, random_state)
-        else:
-            return _sample_without_replacement_with_reservoir_sampling(
-                n_population, n_samples, random_state)
-
-    elif method == "reservoir_sampling":
-        return _sample_without_replacement_with_reservoir_sampling(
-            n_population, n_samples, random_state)
-
-    elif method == "pool":
-        return _sample_without_replacement_with_pool(n_population, n_samples,
-                                                     random_state)
-    else:
-        raise ValueError('Expected a method name in %s, got %s. '
-                         % (all_methods, method))
-
-
-def sample_without_replacement(
-        object n_population, object n_samples, method="auto", random_state=None):
     cdef:
         cnp.intp_t n_pop_intp, n_samples_intp
         long n_pop_long, n_samples_long

From 173e9d95594d366e5206361b0c5cab3d2933ed85 Mon Sep 17 00:00:00 2001
From: Connor Boyle
Date: Thu, 11 Jan 2024 01:50:46 -0800
Subject: [PATCH 026/554] DOC update the formula regarding the computation of
 the F1-score (#27936)

Co-authored-by: Joel Nothman
Co-authored-by: Guillaume Lemaitre
---
 doc/modules/model_evaluation.rst                 | 29 +++++++++++++---
 .../model_selection/plot_precision_recall.py     |  7 ++--
 sklearn/metrics/_classification.py               | 34 +++++++++++--------
 3 files changed, 47 insertions(+), 23 deletions(-)

diff --git a/doc/modules/model_evaluation.rst b/doc/modules/model_evaluation.rst
index a88a92604767e..271e5f6c1c661 100644
--- a/doc/modules/model_evaluation.rst
+++ b/doc/modules/model_evaluation.rst
@@ -886,22 +886,41 @@ following table:
 |                   | Missing result      | Correct absence of result|
 +-------------------+---------------------+--------------------------+
 
-In this context, we can define the notions of precision, recall and F-measure:
+In this context, we can define the notions of precision and recall:
 
 .. math::
 
-   \text{precision} = \frac{tp}{tp + fp},
+   \text{precision} = \frac{\text{tp}}{\text{tp} + \text{fp}},
 
 .. math::
 
-   \text{recall} = \frac{tp}{tp + fn},
+   \text{recall} = \frac{\text{tp}}{\text{tp} + \text{fn}},
 
+(Sometimes recall is also called ''sensitivity'')
+
+F-measure is the weighted harmonic mean of precision and recall, with precision's
+contribution to the mean weighted by some parameter :math:`\beta`:
 
 .. math::
 
-   F_\beta = (1 + \beta^2) \frac{\text{precision} \times \text{recall}}{\beta^2 \text{precision} + \text{recall}}.
+   F_\beta = (1 + \beta^2) \frac{\text{precision} \times \text{recall}}{\beta^2 \text{precision} + \text{recall}}
+
+To avoid division by zero when precision and recall are zero, we can define the
+F-measure with this otherwise-equivalent formula:
+
+.. math::
+
+   F_\beta = \frac{(1 + \beta^2) \text{tp}}{(1 + \beta^2) \text{tp} + \text{fp} + \beta^2 \text{fn}}.
 
-Sometimes recall is also called ''sensitivity''.
+Note that this formula is still undefined when there are no true positives, false
+positives, nor false negatives. By default, F-1 for a set of exclusively true negatives
+is calculated as 0, however this behavior can be changed using the `zero_division`
+parameter.
 
 Here are some small examples in binary classification::
 
     >>> from sklearn import metrics
diff --git a/examples/model_selection/plot_precision_recall.py b/examples/model_selection/plot_precision_recall.py
index 2e48495f96a16..03b273de66b7f 100644
--- a/examples/model_selection/plot_precision_recall.py
+++ b/examples/model_selection/plot_precision_recall.py
@@ -37,10 +37,11 @@
 :math:`R = \\frac{T_p}{T_p + F_n}`
 
-These quantities are also related to the (:math:`F_1`) score, which is defined
-as the harmonic mean of precision and recall.
+These quantities are also related to the :math:`F_1` score, which is the
+harmonic mean of precision and recall. Thus, we can compute the :math:`F_1`
+using the following formula:
 
-:math:`F1 = 2\\frac{P \\times R}{P+R}`
+:math:`F_1 = \\frac{2T_p}{2T_p + F_p + F_n}`
 
 Note that the precision may not decrease with recall. The definition of
 precision (:math:`\\frac{T_p}{T_p + F_p}`) shows that lowering
diff --git a/sklearn/metrics/_classification.py b/sklearn/metrics/_classification.py
index f0a13f8a04830..5b8a024e6e5fc 100644
--- a/sklearn/metrics/_classification.py
+++ b/sklearn/metrics/_classification.py
@@ -1116,7 +1116,12 @@ def f1_score(
     The relative contribution of precision and recall to the F1 score are
     equal. The formula for the F1 score is::
 
-        F1 = 2 * (precision * recall) / (precision + recall)
+        F1 = 2 * TP / (2 * TP + FN + FP)
+
+    Where "TP" is the number of true positives, "FN" is the number of false
+    negatives, and "FP" is the number of false positives. F1 is by default
+    calculated as 0.0 when there are no true positives, false negatives, nor
+    false positives.
 
     Support beyond term:`binary` targets is achieved by treating
     :term:`multiclass` and :term:`multilabel` data as a collection of binary
     problems, one for each
@@ -1211,12 +1216,11 @@ def f1_score(
     Notes
     -----
-    When ``true positive + false positive == 0``, precision is undefined.
-    When ``true positive + false negative == 0``, recall is undefined.
-    In such cases, by default the metric will be set to 0, as will f-score,
-    and ``UndefinedMetricWarning`` will be raised. This behavior can be
-    modified with ``zero_division``. Note that if `zero_division` is np.nan,
-    scores being `np.nan` will be ignored for averaging.
+    When ``true positive + false positive + false negative == 0`` (i.e. a class
+    is completely absent from both ``y_true`` or ``y_pred``), f-score is
+    undefined.
In such cases, by default f-score will be set to 0.0, and + ``UndefinedMetricWarning`` will be raised. This behavior can be modified by + setting the ``zero_division`` parameter. References ---------- @@ -1404,10 +1408,9 @@ def fbeta_score( Notes ----- - When ``true positive + false positive == 0`` or - ``true positive + false negative == 0``, f-score returns 0 and raises - ``UndefinedMetricWarning``. This behavior can be - modified with ``zero_division``. + When ``true positive + false positive + false negative == 0``, f-score + returns 0.0 and raises ``UndefinedMetricWarning``. This behavior can be + modified by setting ``zero_division``. References ---------- @@ -1699,10 +1702,11 @@ def precision_recall_fscore_support( Notes ----- When ``true positive + false positive == 0``, precision is undefined. - When ``true positive + false negative == 0``, recall is undefined. - In such cases, by default the metric will be set to 0, as will f-score, - and ``UndefinedMetricWarning`` will be raised. This behavior can be - modified with ``zero_division``. + When ``true positive + false negative == 0``, recall is undefined. When + ``true positive + false negative + false positive == 0``, f-score is + undefined. In such cases, by default the metric will be set to 0, and + ``UndefinedMetricWarning`` will be raised. This behavior can be modified + with ``zero_division``. References ---------- From 55f4a3ab94cf1e8480bbcdc1793d8675713e27ec Mon Sep 17 00:00:00 2001 From: Mohammed Hamdy <62081584+mmhamdy@users.noreply.github.com> Date: Thu, 11 Jan 2024 12:31:39 +0200 Subject: [PATCH 027/554] DOC Add links to decomposition examples in docstrings and user guide (#26932) Co-authored-by: Guillaume Lemaitre Co-authored-by: Adrin Jalali --- doc/conf.py | 3 + doc/modules/decomposition.rst | 1 + .../unsupervised_learning.rst | 76 +++++++------- examples/decomposition/plot_pca_3d.py | 99 ------------------- sklearn/decomposition/_incremental_pca.py | 3 + sklearn/decomposition/_kernel_pca.py | 3 + sklearn/decomposition/_pca.py | 3 + sklearn/decomposition/_sparse_pca.py | 3 + 8 files changed, 57 insertions(+), 134 deletions(-) delete mode 100644 examples/decomposition/plot_pca_3d.py diff --git a/doc/conf.py b/doc/conf.py index c5e87442abe1f..20181c0a84769 100644 --- a/doc/conf.py +++ b/doc/conf.py @@ -303,6 +303,9 @@ "auto_examples/ensemble/plot_adaboost_hastie_10_2": ( "auto_examples/ensemble/plot_adaboost_multiclass" ), + "auto_examples/decomposition/plot_pca_3d": ( + "auto_examples/decomposition/plot_pca_iris" + ), } html_context["redirects"] = redirects for old_link in redirects: diff --git a/doc/modules/decomposition.rst b/doc/modules/decomposition.rst index a151eda636e7b..223985c6579f0 100644 --- a/doc/modules/decomposition.rst +++ b/doc/modules/decomposition.rst @@ -53,6 +53,7 @@ data based on the amount of variance it explains. As such it implements a .. 
topic:: Examples: + * :ref:`sphx_glr_auto_examples_decomposition_plot_pca_iris.py` * :ref:`sphx_glr_auto_examples_decomposition_plot_pca_vs_lda.py` * :ref:`sphx_glr_auto_examples_decomposition_plot_pca_vs_fa_model_selection.py` diff --git a/doc/tutorial/statistical_inference/unsupervised_learning.rst b/doc/tutorial/statistical_inference/unsupervised_learning.rst index f96ac343a4882..e385eccaf592c 100644 --- a/doc/tutorial/statistical_inference/unsupervised_learning.rst +++ b/doc/tutorial/statistical_inference/unsupervised_learning.rst @@ -204,51 +204,57 @@ Decompositions: from a signal to components and loadings Principal component analysis: PCA ----------------------------------- -:ref:`PCA` selects the successive components that -explain the maximum variance in the signal. +:ref:`PCA` selects the successive components that explain the maximum variance in the +signal. Let's create a synthetic 3-dimensional dataset. -.. |pca_3d_axis| image:: /auto_examples/decomposition/images/sphx_glr_plot_pca_3d_001.png - :target: ../../auto_examples/decomposition/plot_pca_3d.html - :scale: 70 - -.. |pca_3d_aligned| image:: /auto_examples/decomposition/images/sphx_glr_plot_pca_3d_002.png - :target: ../../auto_examples/decomposition/plot_pca_3d.html - :scale: 70 +.. np.random.seed(0) -.. rst-class:: centered +:: - |pca_3d_axis| |pca_3d_aligned| + >>> # Create a signal with only 2 useful dimensions + >>> x1 = np.random.normal(size=(100, 1)) + >>> x2 = np.random.normal(size=(100, 1)) + >>> x3 = x1 + x2 + >>> X = np.concatenate([x1, x2, x3], axis=1) The point cloud spanned by the observations above is very flat in one -direction: one of the three univariate features can almost be exactly -computed using the other two. PCA finds the directions in which the data is -not *flat* +direction: one of the three univariate features (i.e. z-axis) can almost be exactly +computed using the other two. -When used to *transform* data, PCA can reduce the dimensionality of the -data by projecting on a principal subspace. +.. plot:: + :context: close-figs + :align: center -.. np.random.seed(0) + >>> import matplotlib.pyplot as plt + >>> fig = plt.figure() + >>> ax = fig.add_subplot(111, projection='3d') + >>> ax.scatter(X[:, 0], X[:, 1], X[:, 2]) + <...> + >>> _ = ax.set(xlabel="x", ylabel="y", zlabel="z") + + +PCA finds the directions in which the data is not *flat*. :: - >>> # Create a signal with only 2 useful dimensions - >>> x1 = np.random.normal(size=100) - >>> x2 = np.random.normal(size=100) - >>> x3 = x1 + x2 - >>> X = np.c_[x1, x2, x3] - - >>> from sklearn import decomposition - >>> pca = decomposition.PCA() - >>> pca.fit(X) - PCA() - >>> print(pca.explained_variance_) # doctest: +SKIP - [ 2.18565811e+00 1.19346747e+00 8.43026679e-32] - - >>> # As we can see, only the 2 first components are useful - >>> pca.n_components = 2 - >>> X_reduced = pca.fit_transform(X) - >>> X_reduced.shape - (100, 2) + >>> from sklearn import decomposition + >>> pca = decomposition.PCA() + >>> pca.fit(X) + PCA() + >>> print(pca.explained_variance_) # doctest: +SKIP + [ 2.18565811e+00 1.19346747e+00 8.43026679e-32] + +Looking at the explained variance, we see that only the first two components +are useful. PCA can be used to reduce dimensionality while preserving +most of the information. It will project the data on the principal subspace. + +:: + + >>> pca.set_params(n_components=2) + PCA(n_components=2) + >>> X_reduced = pca.fit_transform(X) + >>> X_reduced.shape + (100, 2) .. Eigenfaces here? 
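The rewritten tutorial above asserts that the third direction of the synthetic cloud is numerically flat; that claim can be checked by projecting onto the two-component principal subspace and back. A minimal sketch (an illustration added here, not part of the patch; it assumes only NumPy and scikit-learn, and the variable names are arbitrary)::

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.RandomState(0)
    x1 = rng.normal(size=(100, 1))
    x2 = rng.normal(size=(100, 1))
    X = np.concatenate([x1, x2, x1 + x2], axis=1)  # exactly rank 2

    pca = PCA(n_components=2).fit(X)
    X_reduced = pca.transform(X)                   # shape (100, 2)

    # Mapping back from the principal subspace loses (numerically) nothing,
    # because the discarded third component carries ~zero variance.
    X_back = pca.inverse_transform(X_reduced)
    print(np.allclose(X, X_back))                  # True, up to rounding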
diff --git a/examples/decomposition/plot_pca_3d.py b/examples/decomposition/plot_pca_3d.py deleted file mode 100644 index 61ce5dde75c89..0000000000000 --- a/examples/decomposition/plot_pca_3d.py +++ /dev/null @@ -1,99 +0,0 @@ -""" -========================================================= -Principal components analysis (PCA) -========================================================= - -These figures aid in illustrating how a point cloud -can be very flat in one direction--which is where PCA -comes in to choose a direction that is not flat. - -""" - -# Authors: Gael Varoquaux -# Jaques Grobler -# Kevin Hughes -# License: BSD 3 clause - -# %% -# Create the data -# --------------- - -import numpy as np -from scipy import stats - -e = np.exp(1) -np.random.seed(4) - - -def pdf(x): - return 0.5 * (stats.norm(scale=0.25 / e).pdf(x) + stats.norm(scale=4 / e).pdf(x)) - - -y = np.random.normal(scale=0.5, size=(30000)) -x = np.random.normal(scale=0.5, size=(30000)) -z = np.random.normal(scale=0.1, size=len(x)) - -density = pdf(x) * pdf(y) -pdf_z = pdf(5 * z) - -density *= pdf_z - -a = x + y -b = 2 * y -c = a - b + z - -norm = np.sqrt(a.var() + b.var()) -a /= norm -b /= norm - - -# %% -# Plot the figures -# ---------------- - -import matplotlib.pyplot as plt - -# unused but required import for doing 3d projections with matplotlib < 3.2 -import mpl_toolkits.mplot3d # noqa: F401 - -from sklearn.decomposition import PCA - - -def plot_figs(fig_num, elev, azim): - fig = plt.figure(fig_num, figsize=(4, 3)) - plt.clf() - ax = fig.add_subplot(111, projection="3d", elev=elev, azim=azim) - ax.set_position([0, 0, 0.95, 1]) - - ax.scatter(a[::10], b[::10], c[::10], c=density[::10], marker="+", alpha=0.4) - Y = np.c_[a, b, c] - - # Using SciPy's SVD, this would be: - # _, pca_score, Vt = scipy.linalg.svd(Y, full_matrices=False) - - pca = PCA(n_components=3) - pca.fit(Y) - V = pca.components_.T - - x_pca_axis, y_pca_axis, z_pca_axis = 3 * V - x_pca_plane = np.r_[x_pca_axis[:2], -x_pca_axis[1::-1]] - y_pca_plane = np.r_[y_pca_axis[:2], -y_pca_axis[1::-1]] - z_pca_plane = np.r_[z_pca_axis[:2], -z_pca_axis[1::-1]] - x_pca_plane.shape = (2, 2) - y_pca_plane.shape = (2, 2) - z_pca_plane.shape = (2, 2) - ax.plot_surface(x_pca_plane, y_pca_plane, z_pca_plane) - ax.xaxis.set_ticklabels([]) - ax.yaxis.set_ticklabels([]) - ax.zaxis.set_ticklabels([]) - - -elev = -40 -azim = -80 -plot_figs(1, elev, azim) - -elev = 30 -azim = 20 -plot_figs(2, elev, azim) - -plt.show() diff --git a/sklearn/decomposition/_incremental_pca.py b/sklearn/decomposition/_incremental_pca.py index f05e2dacc66b2..1089b2c54e086 100644 --- a/sklearn/decomposition/_incremental_pca.py +++ b/sklearn/decomposition/_incremental_pca.py @@ -39,6 +39,9 @@ class IncrementalPCA(_BasePCA): computations to get the principal components, versus 1 large SVD of complexity ``O(n_samples * n_features ** 2)`` for PCA. + For a usage example, see + :ref:`sphx_glr_auto_examples_decomposition_plot_incremental_pca.py`. + Read more in the :ref:`User Guide `. .. versionadded:: 0.16 diff --git a/sklearn/decomposition/_kernel_pca.py b/sklearn/decomposition/_kernel_pca.py index eb73ced3527c8..8fc4aa26a6dfb 100644 --- a/sklearn/decomposition/_kernel_pca.py +++ b/sklearn/decomposition/_kernel_pca.py @@ -41,6 +41,9 @@ class KernelPCA(ClassNamePrefixFeaturesOutMixin, TransformerMixin, BaseEstimator components to extract. It can also use a randomized truncated SVD by the method proposed in [3]_, see `eigen_solver`. 
+    For a usage example, see
+    :ref:`sphx_glr_auto_examples_decomposition_plot_kernel_pca.py`.
+
     Read more in the :ref:`User Guide <kernel_PCA>`.
 
     Parameters
diff --git a/sklearn/decomposition/_pca.py b/sklearn/decomposition/_pca.py
index 5fe8d666d8e0b..d121c5e5c186f 100644
--- a/sklearn/decomposition/_pca.py
+++ b/sklearn/decomposition/_pca.py
@@ -136,6 +136,9 @@ class PCA(_BasePCA):
     Notice that this class does not support sparse input. See
     :class:`TruncatedSVD` for an alternative with sparse data.
 
+    For a usage example, see
+    :ref:`sphx_glr_auto_examples_decomposition_plot_pca_iris.py`
+
     Read more in the :ref:`User Guide <PCA>`.
 
     Parameters
diff --git a/sklearn/decomposition/_sparse_pca.py b/sklearn/decomposition/_sparse_pca.py
index f544b710fd073..b14df8c5f4d22 100644
--- a/sklearn/decomposition/_sparse_pca.py
+++ b/sklearn/decomposition/_sparse_pca.py
@@ -342,6 +342,9 @@ class MiniBatchSparsePCA(_BaseSparsePCA):
     the data. The amount of sparseness is controllable by the coefficient
     of the L1 penalty, given by the parameter alpha.
 
+    For an example comparing sparse PCA to PCA, see
+    :ref:`sphx_glr_auto_examples_decomposition_plot_faces_decomposition.py`
+
     Read more in the :ref:`User Guide <SparsePCA>`.
 
     Parameters

From 22c2983cb516fe9ee3ed796aa0d9c84cfcd8c198 Mon Sep 17 00:00:00 2001
From: Christian Lorentzen
Date: Thu, 11 Jan 2024 22:07:09 +0100
Subject: [PATCH 028/554] MNT set to None for easier reading in HGBT (#28069)

---
 sklearn/ensemble/_hist_gradient_boosting/binning.py           | 2 +-
 sklearn/ensemble/_hist_gradient_boosting/gradient_boosting.py | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/sklearn/ensemble/_hist_gradient_boosting/binning.py b/sklearn/ensemble/_hist_gradient_boosting/binning.py
index 8786e866d7be3..3ab90aadcb6bb 100644
--- a/sklearn/ensemble/_hist_gradient_boosting/binning.py
+++ b/sklearn/ensemble/_hist_gradient_boosting/binning.py
@@ -144,7 +144,7 @@ class _BinMapper(TransformerMixin, BaseEstimator):
     missing_values_bin_idx_ : np.uint8
         The index of the bin where missing values are mapped. This is a
         constant across all features. This corresponds to the last bin, and
-        it is always equal to ``n_bins - 1``. Note that if ``n_bins_missing_``
+        it is always equal to ``n_bins - 1``. Note that if ``n_bins_non_missing_``
        is less than ``n_bins - 1`` for a given feature, then there are empty
        (and unused) bins.
     """
diff --git a/sklearn/ensemble/_hist_gradient_boosting/gradient_boosting.py b/sklearn/ensemble/_hist_gradient_boosting/gradient_boosting.py
index a83b1dbd0f4b9..0837d19407030 100644
--- a/sklearn/ensemble/_hist_gradient_boosting/gradient_boosting.py
+++ b/sklearn/ensemble/_hist_gradient_boosting/gradient_boosting.py
@@ -272,7 +272,7 @@ def _preprocess_X(self, X, *, reset):
 
         if self.is_categorical_ is None:
             self._preprocessor = None
-            self._is_categorical_remapped = self.is_categorical_
+            self._is_categorical_remapped = None
 
             X = self._validate_data(X, **check_X_kwargs)
             return X, None

From 28bdc932e88de6a3c7c4eb25230892b75bbe8d1e Mon Sep 17 00:00:00 2001
From: Loïc Estève
Date: Thu, 11 Jan 2024 22:09:47 +0100
Subject: [PATCH 029/554] MNT Add ability to select by build tag to the
 lock-file update script (#28068)

---
 .github/workflows/update-lock-files.yml           |   8 +-
 .../update_environments_and_lock_files.py         | 125 ++++++++++++------
 2 files changed, 85 insertions(+), 48 deletions(-)

diff --git a/.github/workflows/update-lock-files.yml b/.github/workflows/update-lock-files.yml
index 4d8e98c01442e..b259617494e9c 100644
--- a/.github/workflows/update-lock-files.yml
+++ b/.github/workflows/update-lock-files.yml
@@ -17,16 +17,16 @@ jobs:
       matrix:
         include:
           - name: main
-            update_script_args: "--skip-build 'scipy-dev|^pymin_conda_forge$|pypy'"
+            update_script_args: "--select-tag main-ci"
             additional_commit_message: "[doc build]"
           - name: scipy-dev
-            update_script_args: "--select-build scipy_dev"
+            update_script_args: "--select-tag scipy-dev"
             additional_commit_message: "[scipy-dev]"
           - name: cirrus-arm
-            update_script_args: "--select-build '^pymin_conda_forge$'"
+            update_script_args: "--select-tag arm"
             additional_commit_message: "[cirrus arm]"
           - name: pypy
-            update_script_args: "--select-build pypy"
+            update_script_args: "--select-tag pypy"
             additional_commit_message: "[pypy]"

     steps:
diff --git a/build_tools/update_environments_and_lock_files.py b/build_tools/update_environments_and_lock_files.py
index 6625c88affe29..1115e89408dd9 100644
--- a/build_tools/update_environments_and_lock_files.py
+++ b/build_tools/update_environments_and_lock_files.py
@@ -88,9 +88,11 @@ def remove_from(alist, to_remove):
     return [each for each in alist if each not in to_remove]
 
 
-conda_build_metadata_list = [
+build_metadata_list = [
     {
-        "build_name": "pylatest_conda_forge_mkl_linux-64",
+        "name": "pylatest_conda_forge_mkl_linux-64",
+        "type": "conda",
+        "tag": "main-ci",
         "folder": "build_tools/azure",
         "platform": "linux-64",
         "channel": "conda-forge",
@@ -108,7 +110,9 @@
         },
     },
     {
-        "build_name": "pylatest_conda_forge_mkl_osx-64",
+        "name": "pylatest_conda_forge_mkl_osx-64",
+        "type": "conda",
+        "tag": "main-ci",
         "folder": "build_tools/azure",
         "platform": "osx-64",
         "channel": "conda-forge",
@@ -122,7 +126,9 @@
         },
     },
     {
-        "build_name": "pylatest_conda_mkl_no_openmp",
+        "name": "pylatest_conda_mkl_no_openmp",
+        "type": "conda",
+        "tag": "main-ci",
         "folder": "build_tools/azure",
         "platform": "osx-64",
         "channel": "defaults",
@@ -136,7 +142,9 @@
         },
     },
     {
-        "build_name": "pymin_conda_defaults_openblas",
+        "name": "pymin_conda_defaults_openblas",
+        "type": "conda",
+        "tag": "main-ci",
         "folder": "build_tools/azure",
         "platform": "linux-64",
         "channel": "defaults",
@@ -152,7 +160,9 @@
         },
     },
     {
-        "build_name": "pymin_conda_forge_openblas_ubuntu_2204",
+        "name": 
"pymin_conda_forge_openblas_ubuntu_2204", + "type": "conda", + "tag": "main-ci", "folder": "build_tools/azure", "platform": "linux-64", "channel": "conda-forge", @@ -167,7 +177,9 @@ def remove_from(alist, to_remove): }, }, { - "build_name": "pylatest_pip_openblas_pandas", + "name": "pylatest_pip_openblas_pandas", + "type": "conda", + "tag": "main-ci", "folder": "build_tools/azure", "platform": "linux-64", "channel": "defaults", @@ -182,7 +194,9 @@ def remove_from(alist, to_remove): }, }, { - "build_name": "pylatest_pip_scipy_dev", + "name": "pylatest_pip_scipy_dev", + "type": "conda", + "tag": "scipy-dev", "folder": "build_tools/azure", "platform": "linux-64", "channel": "defaults", @@ -219,7 +233,9 @@ def remove_from(alist, to_remove): }, }, { - "build_name": "pypy3", + "name": "pypy3", + "type": "conda", + "tag": "pypy", "folder": "build_tools/azure", "platform": "linux-64", "channel": "conda-forge", @@ -236,7 +252,9 @@ def remove_from(alist, to_remove): }, }, { - "build_name": "pymin_conda_forge_mkl", + "name": "pymin_conda_forge_mkl", + "type": "conda", + "tag": "main-ci", "folder": "build_tools/azure", "platform": "win-64", "channel": "conda-forge", @@ -250,7 +268,9 @@ def remove_from(alist, to_remove): }, }, { - "build_name": "doc_min_dependencies", + "name": "doc_min_dependencies", + "type": "conda", + "tag": "main-ci", "folder": "build_tools/circle", "platform": "linux-64", "channel": "conda-forge", @@ -288,7 +308,9 @@ def remove_from(alist, to_remove): }, }, { - "build_name": "doc", + "name": "doc", + "type": "conda", + "tag": "main-ci", "folder": "build_tools/circle", "platform": "linux-64", "channel": "conda-forge", @@ -313,7 +335,9 @@ def remove_from(alist, to_remove): }, }, { - "build_name": "pymin_conda_forge", + "name": "pymin_conda_forge", + "type": "conda", + "tag": "arm", "folder": "build_tools/cirrus", "platform": "linux-aarch64", "channel": "conda-forge", @@ -324,12 +348,10 @@ def remove_from(alist, to_remove): "python": "3.9", }, }, -] - - -pip_build_metadata_list = [ { - "build_name": "debian_atlas_32bit", + "name": "debian_atlas_32bit", + "type": "pip", + "tag": "main-ci", "folder": "build_tools/azure", "pip_dependencies": [ "cython", @@ -350,7 +372,9 @@ def remove_from(alist, to_remove): "python_version": "3.9.2", }, { - "build_name": "ubuntu_atlas", + "name": "ubuntu_atlas", + "type": "pip", + "tag": "main-ci", "folder": "build_tools/azure", "pip_dependencies": [ "cython", @@ -444,7 +468,7 @@ def get_conda_environment_content(build_metadata): def write_conda_environment(build_metadata): content = get_conda_environment_content(build_metadata) - build_name = build_metadata["build_name"] + build_name = build_metadata["name"] folder_path = Path(build_metadata["folder"]) output_path = folder_path / f"{build_name}_environment.yml" logger.debug(output_path) @@ -465,7 +489,7 @@ def conda_lock(environment_path, lock_file_path, platform): def create_conda_lock_file(build_metadata): - build_name = build_metadata["build_name"] + build_name = build_metadata["name"] folder_path = Path(build_metadata["folder"]) environment_path = folder_path / f"{build_name}_environment.yml" platform = build_metadata["platform"] @@ -479,7 +503,7 @@ def create_conda_lock_file(build_metadata): def write_all_conda_lock_files(build_metadata_list): for build_metadata in build_metadata_list: - logger.info(f"# Locking dependencies for {build_metadata['build_name']}") + logger.info(f"# Locking dependencies for {build_metadata['name']}") create_conda_lock_file(build_metadata) @@ -495,7 +519,7 @@ def 
get_pip_requirements_content(build_metadata):
 
 
 def write_pip_requirements(build_metadata):
-    build_name = build_metadata["build_name"]
+    build_name = build_metadata["name"]
     content = get_pip_requirements_content(build_metadata)
     folder_path = Path(build_metadata["folder"])
     output_path = folder_path / f"{build_name}_requirements.txt"
@@ -514,7 +538,7 @@ def pip_compile(pip_compile_path, requirements_path, lock_file_path):
 
 def write_pip_lock_file(build_metadata):
-    build_name = build_metadata["build_name"]
+    build_name = build_metadata["name"]
     python_version = build_metadata["python_version"]
     environment_name = f"pip-tools-python{python_version}"
     # To make sure that the Python used to create the pip lock file is the same
@@ -544,7 +568,7 @@
 
 def write_all_pip_lock_files(build_metadata_list):
     for build_metadata in build_metadata_list:
-        logger.info(f"# Locking dependencies for {build_metadata['build_name']}")
+        logger.info(f"# Locking dependencies for {build_metadata['name']}")
         write_pip_lock_file(build_metadata)
 
@@ -583,7 +607,7 @@ def check_conda_version():
     "--select-build",
     default="",
     help=(
-        "Regex to restrict the builds we want to update environment and lock files. By"
+        "Regex to filter the builds we want to update environment and lock files. By"
         " default all the builds are selected."
     ),
 )
@@ -592,6 +616,14 @@
     default=None,
     help="Regex to skip some builds from the builds selected by --select-build",
 )
+@click.option(
+    "--select-tag",
+    default=None,
+    help=(
+        "Tag to filter the builds, e.g. 'main-ci' or 'scipy-dev'. "
+        "This is an additional filtering on top of --select-build."
+    ),
+)
@click.option(
     "-v",
     "--verbose",
@@ -604,7 +636,7 @@
     is_flag=True,
     help="Print output of commands executed by the script",
 )
-def main(verbose, very_verbose, select_build, skip_build):
+def main(select_build, skip_build, select_tag, verbose, very_verbose):
     if verbose:
         logger.setLevel(logging.DEBUG)
     if very_verbose:
@@ -613,18 +645,32 @@
     check_conda_lock_version()
     check_conda_version()
 
-    filtered_conda_build_metadata_list = [
-        each
-        for each in conda_build_metadata_list
-        if re.search(select_build, each["build_name"])
+    filtered_build_metadata_list = [
+        each for each in build_metadata_list if re.search(select_build, each["name"])
     ]
+    if select_tag is not None:
+        filtered_build_metadata_list = [
+            each for each in filtered_build_metadata_list if each["tag"] == select_tag
+        ]
     if skip_build is not None:
-        filtered_conda_build_metadata_list = [
+        filtered_build_metadata_list = [
             each
-            for each in filtered_conda_build_metadata_list
-            if not re.search(skip_build, each["build_name"])
+            for each in filtered_build_metadata_list
+            if not re.search(skip_build, each["name"])
         ]
+    selected_build_info = "\n".join(
+        f"  - {each['name']}, type: {each['type']}, tag: {each['tag']}"
+        for each in filtered_build_metadata_list
+    )
+    selected_build_message = (
+        f"# {len(filtered_build_metadata_list)} selected builds\n{selected_build_info}"
+    )
+    logger.info(selected_build_message)
+
+    filtered_conda_build_metadata_list = [
+        each for each in filtered_build_metadata_list if each["type"] == "conda"
+    ]
     if filtered_conda_build_metadata_list:
         logger.info("# Writing conda environments")
         write_all_conda_environments(filtered_conda_build_metadata_list)
         logger.info("# Writing conda lock files")
write_all_conda_lock_files(filtered_conda_build_metadata_list) filtered_pip_build_metadata_list = [ - each - for each in pip_build_metadata_list - if re.search(select_build, each["build_name"]) + each for each in filtered_build_metadata_list if each["type"] == "pip" ] - if skip_build is not None: - filtered_pip_build_metadata_list = [ - each - for each in filtered_pip_build_metadata_list - if not re.search(skip_build, each["build_name"]) - ] - if filtered_pip_build_metadata_list: logger.info("# Writing pip requirements") write_all_pip_requirements(filtered_pip_build_metadata_list) From aacca244a69cce5f3719c861e9bdc400c3945a6c Mon Sep 17 00:00:00 2001 From: Harmanan Kohli <17681934+Harmanankohli@users.noreply.github.com> Date: Fri, 12 Jan 2024 02:40:09 +0530 Subject: [PATCH 030/554] DOC add example in docstring of f_regression (#28104) Co-authored-by: Guillaume Lemaitre --- sklearn/feature_selection/_univariate_selection.py | 13 +++++++++++++ 1 file changed, 13 insertions(+) diff --git a/sklearn/feature_selection/_univariate_selection.py b/sklearn/feature_selection/_univariate_selection.py index f019d128e4d53..96b6b93332091 100644 --- a/sklearn/feature_selection/_univariate_selection.py +++ b/sklearn/feature_selection/_univariate_selection.py @@ -436,6 +436,19 @@ def f_regression(X, y, *, center=True, force_finite=True): SelectFwe: Select features based on family-wise error rate. SelectPercentile: Select features based on percentile of the highest scores. + + Examples + -------- + >>> from sklearn.datasets import make_regression + >>> from sklearn.feature_selection import f_regression + >>> X, y = make_regression( + ... n_samples=50, n_features=3, n_informative=1, noise=1e-4, random_state=42 + ... ) + >>> f_statistic, p_values = f_regression(X, y) + >>> f_statistic + array([1.2...+00, 2.6...+13, 2.6...+00]) + >>> p_values + array([2.7..., 1.5..., 1.0...]) """ correlation_coefficient = r_regression( X, y, center=center, force_finite=force_finite From fee76cc5405c01e283a3b079dcb865f3017d5007 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pierre=20de=20Fr=C3=A9minville?= <6165084+pidefrem@users.noreply.github.com> Date: Thu, 11 Jan 2024 22:25:38 +0100 Subject: [PATCH 031/554] EFF Optimize function utils.validation._check_pos_label_consistency (#28051) Co-authored-by: Thomas J. Fan --- doc/whats_new/v1.5.rst | 13 +++++++++++++ sklearn/utils/validation.py | 24 +++++++++++------------- 2 files changed, 24 insertions(+), 13 deletions(-) diff --git a/doc/whats_new/v1.5.rst b/doc/whats_new/v1.5.rst index f7a521ca4f0d0..0e3e37caeeb05 100644 --- a/doc/whats_new/v1.5.rst +++ b/doc/whats_new/v1.5.rst @@ -44,3 +44,16 @@ TODO: update at the time of the release. - |Feature| A fitted :class:`compose.ColumnTransformer` now implements `__getitem__` which returns the fitted transformers by name. :pr:`27990` by `Thomas Fan`_. + + +:mod:`sklearn.metrics` +...................... + +- |Efficiency| Improve efficiency of functions :func:`~metrics.brier_score_loss`, + :func:`~metrics.calibration_curve`, :func:`~metrics.det_curve`, :func:`~metrics.precision_recall_curve`, + :func:`~metrics.roc_curve` when `pos_label` argument is specified. + Also improve efficiency of methods `from_estimator` + and `from_predictions` in :class:`~metrics.RocCurveDisplay`, + :class:`~metrics.PrecisionRecallDisplay`, :class:`~metrics.DetCurveDisplay`, + :class:`~calibration.CalibrationDisplay`. 
+  :pr:`28051` by :user:`Pierre de Fréminville <pidefrem>`
diff --git a/sklearn/utils/validation.py b/sklearn/utils/validation.py
index ebaf11aa5f90a..283d24a431fbd 100644
--- a/sklearn/utils/validation.py
+++ b/sklearn/utils/validation.py
@@ -2298,24 +2298,22 @@ def _check_pos_label_consistency(pos_label, y_true):
     # classes.dtype.kind in ('O', 'U', 'S') is required to avoid
     # triggering a FutureWarning by calling np.array_equal(a, b)
     # when elements in the two arrays are not comparable.
-    classes = np.unique(y_true)
-    if pos_label is None and (
-        classes.dtype.kind in "OUS"
-        or not (
+    if pos_label is None:
+        # Compute classes only if pos_label is not specified:
+        classes = np.unique(y_true)
+        if classes.dtype.kind in "OUS" or not (
             np.array_equal(classes, [0, 1])
             or np.array_equal(classes, [-1, 1])
             or np.array_equal(classes, [0])
             or np.array_equal(classes, [-1])
             or np.array_equal(classes, [1])
-        )
-    ):
-        classes_repr = ", ".join([repr(c) for c in classes.tolist()])
-        raise ValueError(
-            f"y_true takes value in {{{classes_repr}}} and pos_label is not "
-            "specified: either make y_true take value in {0, 1} or "
-            "{-1, 1} or pass pos_label explicitly."
-        )
-    elif pos_label is None:
+        ):
+            classes_repr = ", ".join([repr(c) for c in classes.tolist()])
+            raise ValueError(
+                f"y_true takes value in {{{classes_repr}}} and pos_label is not "
+                "specified: either make y_true take value in {0, 1} or "
+                "{-1, 1} or pass pos_label explicitly."
+            )
         pos_label = 1
 
     return pos_label

From 06865e1c4090abd70ffbf78fe34bd456def45867 Mon Sep 17 00:00:00 2001
From: ldwy4 <59212418+ldwy4@users.noreply.github.com>
Date: Thu, 11 Jan 2024 14:39:53 -0800
Subject: [PATCH 032/554] DOC Added docstring examples for utils.sparsefuncs
 (#28035)

Co-authored-by: Guillaume Lemaitre
---
 sklearn/utils/sparsefuncs.py | 147 +++++++++++++++++++++++++++++++++++
 1 file changed, 147 insertions(+)

diff --git a/sklearn/utils/sparsefuncs.py b/sklearn/utils/sparsefuncs.py
index 9eccb8c07676f..a46e9e4d9ed93 100644
--- a/sklearn/utils/sparsefuncs.py
+++ b/sklearn/utils/sparsefuncs.py
@@ -53,6 +53,28 @@ def inplace_csr_column_scale(X, scale):

     scale : ndarray of shape (n_features,), dtype={np.float32, np.float64}
         Array of precomputed feature-wise values to use for scaling.
+
+    Examples
+    --------
+    >>> from sklearn.utils import sparsefuncs
+    >>> from scipy import sparse
+    >>> import numpy as np
+    >>> indptr = np.array([0, 3, 4, 4, 4])
+    >>> indices = np.array([0, 1, 2, 2])
+    >>> data = np.array([8, 1, 2, 5])
+    >>> scale = np.array([2, 3, 2])
+    >>> csr = sparse.csr_matrix((data, indices, indptr))
+    >>> csr.todense()
+    matrix([[8, 1, 2],
+            [0, 0, 5],
+            [0, 0, 0],
+            [0, 0, 0]])
+    >>> sparsefuncs.inplace_csr_column_scale(csr, scale)
+    >>> csr.todense()
+    matrix([[16,  3,  4],
+            [ 0,  0, 10],
+            [ 0,  0,  0],
+            [ 0,  0,  0]])
     """
     assert scale.shape[0] == X.shape[1]
     X.data *= scale.take(X.indices, mode="clip")
@@ -111,6 +133,24 @@ def mean_variance_axis(X, axis, weights=None, return_sum_weights=False):

     sum_weights : ndarray of shape (n_features,), dtype=floating
         Returned if `return_sum_weights` is `True`.
+ + Examples + -------- + >>> from sklearn.utils import sparsefuncs + >>> from scipy import sparse + >>> import numpy as np + >>> indptr = np.array([0, 3, 4, 4, 4]) + >>> indices = np.array([0, 1, 2, 2]) + >>> data = np.array([8, 1, 2, 5]) + >>> scale = np.array([2, 3, 2]) + >>> csr = sparse.csr_matrix((data, indices, indptr)) + >>> csr.todense() + matrix([[8, 1, 2], + [0, 0, 5], + [0, 0, 0], + [0, 0, 0]]) + >>> sparsefuncs.mean_variance_axis(csr, axis=0) + (array([2. , 0.25, 1.75]), array([12. , 0.1875, 4.1875])) """ _raise_error_wrong_axis(axis) @@ -195,6 +235,27 @@ def incr_mean_variance_axis(X, *, axis, last_mean, last_var, last_n, weights=Non Notes ----- NaNs are ignored in the algorithm. + + Examples + -------- + >>> from sklearn.utils import sparsefuncs + >>> from scipy import sparse + >>> import numpy as np + >>> indptr = np.array([0, 3, 4, 4, 4]) + >>> indices = np.array([0, 1, 2, 2]) + >>> data = np.array([8, 1, 2, 5]) + >>> scale = np.array([2, 3, 2]) + >>> csr = sparse.csr_matrix((data, indices, indptr)) + >>> csr.todense() + matrix([[8, 1, 2], + [0, 0, 5], + [0, 0, 0], + [0, 0, 0]]) + >>> sparsefuncs.incr_mean_variance_axis( + ... csr, axis=0, last_mean=np.zeros(3), last_var=np.zeros(3), last_n=2 + ... ) + (array([1.3..., 0.1..., 1.1...]), array([8.8..., 0.1..., 3.4...]), + array([6., 6., 6.])) """ _raise_error_wrong_axis(axis) @@ -244,6 +305,28 @@ def inplace_column_scale(X, scale): scale : ndarray of shape (n_features,), dtype={np.float32, np.float64} Array of precomputed feature-wise values to use for scaling. + + Examples + -------- + >>> from sklearn.utils import sparsefuncs + >>> from scipy import sparse + >>> import numpy as np + >>> indptr = np.array([0, 3, 4, 4, 4]) + >>> indices = np.array([0, 1, 2, 2]) + >>> data = np.array([8, 1, 2, 5]) + >>> scale = np.array([2, 3, 2]) + >>> csr = sparse.csr_matrix((data, indices, indptr)) + >>> csr.todense() + matrix([[8, 1, 2], + [0, 0, 5], + [0, 0, 0], + [0, 0, 0]]) + >>> sparsefuncs.inplace_column_scale(csr, scale) + >>> csr.todense() + matrix([[16, 3, 4], + [ 0, 0, 10], + [ 0, 0, 0], + [ 0, 0, 0]]) """ if sp.issparse(X) and X.format == "csc": inplace_csr_row_scale(X.T, scale) @@ -266,6 +349,28 @@ def inplace_row_scale(X, scale): scale : ndarray of shape (n_features,), dtype={np.float32, np.float64} Array of precomputed sample-wise values to use for scaling. + + Examples + -------- + >>> from sklearn.utils import sparsefuncs + >>> from scipy import sparse + >>> import numpy as np + >>> indptr = np.array([0, 2, 3, 4, 5]) + >>> indices = np.array([0, 1, 2, 3, 3]) + >>> data = np.array([8, 1, 2, 5, 6]) + >>> scale = np.array([2, 3, 4, 5]) + >>> csr = sparse.csr_matrix((data, indices, indptr)) + >>> csr.todense() + matrix([[8, 1, 0, 0], + [0, 0, 2, 0], + [0, 0, 0, 5], + [0, 0, 0, 6]]) + >>> sparsefuncs.inplace_row_scale(csr, scale) + >>> csr.todense() + matrix([[16, 2, 0, 0], + [ 0, 0, 6, 0], + [ 0, 0, 0, 20], + [ 0, 0, 0, 30]]) """ if sp.issparse(X) and X.format == "csc": inplace_csr_column_scale(X.T, scale) @@ -382,6 +487,27 @@ def inplace_swap_row(X, m, n): n : int Index of the row of X to be swapped. 
+ + Examples + -------- + >>> from sklearn.utils import sparsefuncs + >>> from scipy import sparse + >>> import numpy as np + >>> indptr = np.array([0, 2, 3, 3, 3]) + >>> indices = np.array([0, 2, 2]) + >>> data = np.array([8, 2, 5]) + >>> csr = sparse.csr_matrix((data, indices, indptr)) + >>> csr.todense() + matrix([[8, 0, 2], + [0, 0, 5], + [0, 0, 0], + [0, 0, 0]]) + >>> sparsefuncs.inplace_swap_row(csr, 0, 1) + >>> csr.todense() + matrix([[0, 0, 5], + [8, 0, 2], + [0, 0, 0], + [0, 0, 0]]) """ if sp.issparse(X) and X.format == "csc": inplace_swap_row_csc(X, m, n) @@ -406,6 +532,27 @@ def inplace_swap_column(X, m, n): n : int Index of the column of X to be swapped. + + Examples + -------- + >>> from sklearn.utils import sparsefuncs + >>> from scipy import sparse + >>> import numpy as np + >>> indptr = np.array([0, 2, 3, 3, 3]) + >>> indices = np.array([0, 2, 2]) + >>> data = np.array([8, 2, 5]) + >>> csr = sparse.csr_matrix((data, indices, indptr)) + >>> csr.todense() + matrix([[8, 0, 2], + [0, 0, 5], + [0, 0, 0], + [0, 0, 0]]) + >>> sparsefuncs.inplace_swap_column(csr, 0, 1) + >>> csr.todense() + matrix([[0, 8, 2], + [0, 0, 5], + [0, 0, 0], + [0, 0, 0]]) """ if m < 0: m += X.shape[1] From f1e89363f6777155a25b3574db9f0fc5c21a8c51 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Filip=20Karlo=20Do=C5=A1ilovi=C4=87?= Date: Fri, 12 Jan 2024 00:09:11 +0100 Subject: [PATCH 033/554] DOC Add approximate nearest neighbors example to Notes subsection to TSNE and KNeighborsTransformer (#28052) --- sklearn/manifold/_t_sne.py | 6 ++++++ sklearn/neighbors/_graph.py | 6 ++++++ 2 files changed, 12 insertions(+) diff --git a/sklearn/manifold/_t_sne.py b/sklearn/manifold/_t_sne.py index 6a90b1c43bbba..e280671ee2752 100644 --- a/sklearn/manifold/_t_sne.py +++ b/sklearn/manifold/_t_sne.py @@ -728,6 +728,12 @@ class TSNE(ClassNamePrefixFeaturesOutMixin, TransformerMixin, BaseEstimator): LocallyLinearEmbedding : Manifold learning using Locally Linear Embedding. SpectralEmbedding : Spectral embedding for non-linear dimensionality. + Notes + ----- + For an example of using :class:`~sklearn.manifold.TSNE` in combination with + :class:`~sklearn.neighbors.KNeighborsTransformer` see + :ref:`sphx_glr_auto_examples_neighbors_approximate_nearest_neighbors.py`. + References ---------- diff --git a/sklearn/neighbors/_graph.py b/sklearn/neighbors/_graph.py index 5fd9be4766c6e..2ff27d07514e0 100644 --- a/sklearn/neighbors/_graph.py +++ b/sklearn/neighbors/_graph.py @@ -362,6 +362,12 @@ class KNeighborsTransformer( RadiusNeighborsTransformer : Transform X into a weighted graph of neighbors nearer than a radius. + Notes + ----- + For an example of using :class:`~sklearn.neighbors.KNeighborsTransformer` + in combination with :class:`~sklearn.manifold.TSNE` see + :ref:`sphx_glr_auto_examples_neighbors_approximate_nearest_neighbors.py`. 
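+
+    A minimal sketch of that combination (the hyper-parameter values are
+    illustrative, not taken from the referenced example):
+
+    >>> from sklearn.manifold import TSNE
+    >>> from sklearn.neighbors import KNeighborsTransformer
+    >>> from sklearn.pipeline import make_pipeline
+    >>> est = make_pipeline(
+    ...     KNeighborsTransformer(n_neighbors=50, mode="distance"),
+    ...     TSNE(metric="precomputed", init="random", perplexity=5),
+    ... )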
+ Examples -------- >>> from sklearn.datasets import load_wine From 1b1f0eaa2e89817cc70736f7fdd135631c78768c Mon Sep 17 00:00:00 2001 From: Raj Pulapakura Date: Fri, 12 Jan 2024 22:06:51 +1100 Subject: [PATCH 034/554] DOC Add a docstring example for feature selection functions (#28033) Co-authored-by: Guillaume Lemaitre --- sklearn/feature_selection/_mutual_info.py | 22 +++++++++ .../_univariate_selection.py | 45 +++++++++++++++++++ 2 files changed, 67 insertions(+) diff --git a/sklearn/feature_selection/_mutual_info.py b/sklearn/feature_selection/_mutual_info.py index 1845447a24623..821ef889e7ed9 100644 --- a/sklearn/feature_selection/_mutual_info.py +++ b/sklearn/feature_selection/_mutual_info.py @@ -396,6 +396,16 @@ def mutual_info_regression( Data Sets". PLoS ONE 9(2), 2014. .. [4] L. F. Kozachenko, N. N. Leonenko, "Sample Estimate of the Entropy of a Random Vector", Probl. Peredachi Inf., 23:2 (1987), 9-16 + + Examples + -------- + >>> from sklearn.datasets import make_regression + >>> from sklearn.feature_selection import mutual_info_regression + >>> X, y = make_regression( + ... n_samples=50, n_features=3, n_informative=1, noise=1e-4, random_state=42 + ... ) + >>> mutual_info_regression(X, y) + array([0.1..., 2.6... , 0.0...]) """ return _estimate_mi(X, y, discrete_features, False, n_neighbors, copy, random_state) @@ -487,6 +497,18 @@ def mutual_info_classif( Data Sets". PLoS ONE 9(2), 2014. .. [4] L. F. Kozachenko, N. N. Leonenko, "Sample Estimate of the Entropy of a Random Vector:, Probl. Peredachi Inf., 23:2 (1987), 9-16 + + Examples + -------- + >>> from sklearn.datasets import make_classification + >>> from sklearn.feature_selection import mutual_info_classif + >>> X, y = make_classification( + ... n_samples=100, n_features=10, n_informative=2, n_clusters_per_class=1, + ... shuffle=False, random_state=42 + ... ) + >>> mutual_info_classif(X, y) + array([0.58..., 0.10..., 0.19..., 0.09... , 0. , + 0. , 0. , 0. , 0. , 0. ]) """ check_classification_targets(y) return _estimate_mi(X, y, discrete_features, True, n_neighbors, copy, random_state) diff --git a/sklearn/feature_selection/_univariate_selection.py b/sklearn/feature_selection/_univariate_selection.py index 96b6b93332091..df1b5072ce741 100644 --- a/sklearn/feature_selection/_univariate_selection.py +++ b/sklearn/feature_selection/_univariate_selection.py @@ -149,6 +149,24 @@ def f_classif(X, y): -------- chi2 : Chi-squared stats of non-negative features for classification tasks. f_regression : F-value between label/feature for regression tasks. + + Examples + -------- + >>> from sklearn.datasets import make_classification + >>> from sklearn.feature_selection import f_classif + >>> X, y = make_classification( + ... n_samples=100, n_features=10, n_informative=2, n_clusters_per_class=1, + ... shuffle=False, random_state=42 + ... ) + >>> f_statistic, p_values = f_classif(X, y) + >>> f_statistic + array([2.2...e+02, 7.0...e-01, 1.6...e+00, 9.3...e-01, + 5.4...e+00, 3.2...e-01, 4.7...e-02, 5.7...e-01, + 7.5...e-01, 8.9...e-02]) + >>> p_values + array([7.1...e-27, 4.0...e-01, 1.9...e-01, 3.3...e-01, + 2.2...e-02, 5.7...e-01, 8.2...e-01, 4.5...e-01, + 3.8...e-01, 7.6...e-01]) """ X, y = check_X_y(X, y, accept_sparse=["csr", "csc", "coo"]) args = [X[safe_mask(X, y == k)] for k in np.unique(y)] @@ -220,6 +238,23 @@ def chi2(X, y): Notes ----- Complexity of this algorithm is O(n_classes * n_features). + + Examples + -------- + >>> import numpy as np + >>> from sklearn.feature_selection import chi2 + >>> X = np.array([[1, 1, 3], + ... 
[0, 1, 5],
+    ...          [5, 4, 1],
+    ...          [6, 6, 2],
+    ...          [1, 4, 0],
+    ...          [0, 0, 0]])
+    >>> y = np.array([1, 1, 0, 0, 2, 2])
+    >>> chi2_stats, p_values = chi2(X, y)
+    >>> chi2_stats
+    array([15.3...,  6.5       ,  8.9...])
+    >>> p_values
+    array([0.0004..., 0.0387..., 0.0116... ])
     """

     # XXX: we might want to do some of the following in logspace instead for
@@ -314,6 +349,16 @@ def r_regression(X, y, *, center=True, force_finite=True):
     mutual_info_regression: Mutual information for a continuous target.
     f_classif: ANOVA F-value between label/feature for classification tasks.
     chi2: Chi-squared stats of non-negative features for classification tasks.
+
+    Examples
+    --------
+    >>> from sklearn.datasets import make_regression
+    >>> from sklearn.feature_selection import r_regression
+    >>> X, y = make_regression(
+    ...     n_samples=50, n_features=3, n_informative=1, noise=1e-4, random_state=42
+    ... )
+    >>> r_regression(X, y)
+    array([-0.15...,  1.   , -0.22...])
     """
     X, y = check_X_y(X, y, accept_sparse=["csr", "csc", "coo"], dtype=np.float64)
     n_samples = X.shape[0]

From 95d20178775548d06ffa55cf07a3619481bdd48c Mon Sep 17 00:00:00 2001
From: Michael Higgins <55243596+Higgs32584@users.noreply.github.com>
Date: Fri, 12 Jan 2024 07:05:43 -0500
Subject: [PATCH 035/554] DOC improve docstring of BaseEstimator,
 ClassifierMixin, and RegressorMixin (#28030)

Co-authored-by: Guillaume Lemaitre
---
 sklearn/base.py | 98 ++++++++++++++++++++++++++++++++++++++++++++++++-
 1 file changed, 96 insertions(+), 2 deletions(-)

diff --git a/sklearn/base.py b/sklearn/base.py
index e7361c331617a..c48a5f2d99628 100644
--- a/sklearn/base.py
+++ b/sklearn/base.py
@@ -137,11 +137,45 @@ def _clone_parametrized(estimator, *, safe=True):
 class BaseEstimator(_HTMLDocumentationLinkMixin, _MetadataRequester):
     """Base class for all estimators in scikit-learn.

+    Inheriting from this class provides default implementations of:
+
+    - setting and getting parameters used by `GridSearchCV` and friends;
+    - textual and HTML representation displayed in terminals and IDEs;
+    - estimator serialization;
+    - parameters validation;
+    - data validation;
+    - feature names validation.
+
+    Read more in the :ref:`User Guide <rolling_your_own_estimator>`.
+
+
     Notes
     -----
     All estimators should specify all the parameters that can be set
     at the class level in their ``__init__`` as explicit keyword
     arguments (no ``*args`` or ``**kwargs``).
+
+    Examples
+    --------
+    >>> import numpy as np
+    >>> from sklearn.base import BaseEstimator
+    >>> class MyEstimator(BaseEstimator):
+    ...     def __init__(self, *, param=1):
+    ...         self.param = param
+    ...     def fit(self, X, y=None):
+    ...         self.is_fitted_ = True
+    ...         return self
+    ...     def predict(self, X):
+    ...         return np.full(shape=X.shape[0], fill_value=self.param)
+    >>> estimator = MyEstimator(param=2)
+    >>> estimator.get_params()
+    {'param': 2}
+    >>> X = np.array([[1, 2], [2, 3], [3, 4]])
+    >>> y = np.array([1, 0, 1])
+    >>> estimator.fit(X, y).predict(X)
+    array([2, 2, 2])
+    >>> estimator.set_params(param=3).fit(X, y).predict(X)
+    array([3, 3, 3])
     """

     @classmethod
@@ -652,7 +686,37 @@ def _repr_mimebundle_(self, **kwargs):


 class ClassifierMixin:
-    """Mixin class for all classifiers in scikit-learn."""
+    """Mixin class for all classifiers in scikit-learn.
+
+    This mixin defines the following functionality:
+
+    - `_estimator_type` class attribute defaulting to `"classifier"`;
+    - `score` method that defaults to :func:`~sklearn.metrics.accuracy_score`;
+    - enforce that `fit` requires `y` to be passed through the `requires_y` tag.
+
+    Read more in the :ref:`User Guide <rolling_your_own_estimator>`.
+
+    Examples
+    --------
+    >>> import numpy as np
+    >>> from sklearn.base import BaseEstimator, ClassifierMixin
+    >>> # Mixin classes should always be on the left-hand side for a correct MRO
+    >>> class MyEstimator(ClassifierMixin, BaseEstimator):
+    ...     def __init__(self, *, param=1):
+    ...         self.param = param
+    ...     def fit(self, X, y=None):
+    ...         self.is_fitted_ = True
+    ...         return self
+    ...     def predict(self, X):
+    ...         return np.full(shape=X.shape[0], fill_value=self.param)
+    >>> estimator = MyEstimator(param=1)
+    >>> X = np.array([[1, 2], [2, 3], [3, 4]])
+    >>> y = np.array([1, 0, 1])
+    >>> estimator.fit(X, y).predict(X)
+    array([1, 1, 1])
+    >>> estimator.score(X, y)
+    0.66...
+    """

     _estimator_type = "classifier"

@@ -689,7 +753,37 @@ def _more_tags(self):


 class RegressorMixin:
-    """Mixin class for all regression estimators in scikit-learn."""
+    """Mixin class for all regression estimators in scikit-learn.
+
+    This mixin defines the following functionality:
+
+    - `_estimator_type` class attribute defaulting to `"regressor"`;
+    - `score` method that defaults to :func:`~sklearn.metrics.r2_score`;
+    - enforce that `fit` requires `y` to be passed through the `requires_y` tag.
+
+    Read more in the :ref:`User Guide <rolling_your_own_estimator>`.
+
+    Examples
+    --------
+    >>> import numpy as np
+    >>> from sklearn.base import BaseEstimator, RegressorMixin
+    >>> # Mixin classes should always be on the left-hand side for a correct MRO
+    >>> class MyEstimator(RegressorMixin, BaseEstimator):
+    ...     def __init__(self, *, param=1):
+    ...         self.param = param
+    ...     def fit(self, X, y=None):
+    ...         self.is_fitted_ = True
+    ...         return self
+    ...     def predict(self, X):
+    ...         return np.full(shape=X.shape[0], fill_value=self.param)
+    >>> estimator = MyEstimator(param=0)
+    >>> X = np.array([[1, 2], [2, 3], [3, 4]])
+    >>> y = np.array([-1, 0, 1])
+    >>> estimator.fit(X, y).predict(X)
+    array([0, 0, 0])
+    >>> estimator.score(X, y)
+    0.0
+    """

     _estimator_type = "regressor"

From 55c6679ada73344ad83b07166ed05aa0877dbff4 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Lo=C3=AFc=20Est=C3=A8ve?=
Date: Fri, 12 Jan 2024 15:44:41 +0100
Subject: [PATCH 036/554] TST Tweak tests to facilitate Meson usage (#28094)

Co-authored-by: Olivier Grisel
Co-authored-by: Guillaume Lemaitre
---
 .../test_enable_hist_gradient_boosting.py     |  5 ++--
 .../tests/test_enable_iterative_imputer.py    | 24 ++++++++++++-------
 .../tests/test_enable_successive_halving.py   | 24 ++++++++++++-------
 sklearn/tests/test_common.py                  |  9 ++++---
 sklearn/tests/test_docstring_parameters.py    |  3 ++-
 sklearn/tests/test_min_dependencies_readme.py |  4 ++--
 sklearn/utils/_testing.py                     | 23 +++++++++++++-----
 sklearn/utils/tests/test_testing.py           | 24 +++++++++++++++++++
 8 files changed, 86 insertions(+), 30 deletions(-)

diff --git a/sklearn/experimental/tests/test_enable_hist_gradient_boosting.py b/sklearn/experimental/tests/test_enable_hist_gradient_boosting.py
index 6e0b50c18e0ae..0a90d63fcb37c 100644
--- a/sklearn/experimental/tests/test_enable_hist_gradient_boosting.py
+++ b/sklearn/experimental/tests/test_enable_hist_gradient_boosting.py
@@ -5,7 +5,7 @@
 import pytest

 from sklearn.utils import _IS_WASM
-from sklearn.utils._testing import assert_run_python_script
+from sklearn.utils._testing import assert_run_python_script_without_output


 @pytest.mark.xfail(_IS_WASM, reason="cannot start subprocess")
@@ -15,4 +15,5 @@ def test_import_raises_warning():
     with pytest.warns(UserWarning, match="it is not needed to import"):
         from sklearn.experimental import enable_hist_gradient_boosting  # noqa
     """
-    
assert_run_python_script(textwrap.dedent(code)) + pattern = "it is not needed to import enable_hist_gradient_boosting anymore" + assert_run_python_script_without_output(textwrap.dedent(code), pattern=pattern) diff --git a/sklearn/experimental/tests/test_enable_iterative_imputer.py b/sklearn/experimental/tests/test_enable_iterative_imputer.py index 3044a52daf0ce..617d921eb8f88 100644 --- a/sklearn/experimental/tests/test_enable_iterative_imputer.py +++ b/sklearn/experimental/tests/test_enable_iterative_imputer.py @@ -5,7 +5,7 @@ import pytest from sklearn.utils import _IS_WASM -from sklearn.utils._testing import assert_run_python_script +from sklearn.utils._testing import assert_run_python_script_without_output @pytest.mark.xfail(_IS_WASM, reason="cannot start subprocess") @@ -16,28 +16,36 @@ def test_imports_strategies(): # for every test case. Else, the tests would not be independent # (manually removing the imports from the cache (sys.modules) is not # recommended and can lead to many complications). - + pattern = "IterativeImputer is experimental" good_import = """ from sklearn.experimental import enable_iterative_imputer from sklearn.impute import IterativeImputer """ - assert_run_python_script(textwrap.dedent(good_import)) + assert_run_python_script_without_output( + textwrap.dedent(good_import), pattern=pattern + ) good_import_with_ensemble_first = """ import sklearn.ensemble from sklearn.experimental import enable_iterative_imputer from sklearn.impute import IterativeImputer """ - assert_run_python_script(textwrap.dedent(good_import_with_ensemble_first)) + assert_run_python_script_without_output( + textwrap.dedent(good_import_with_ensemble_first), + pattern=pattern, + ) - bad_imports = """ + bad_imports = f""" import pytest - with pytest.raises(ImportError, match='IterativeImputer is experimental'): + with pytest.raises(ImportError, match={pattern!r}): from sklearn.impute import IterativeImputer import sklearn.experimental - with pytest.raises(ImportError, match='IterativeImputer is experimental'): + with pytest.raises(ImportError, match={pattern!r}): from sklearn.impute import IterativeImputer """ - assert_run_python_script(textwrap.dedent(bad_imports)) + assert_run_python_script_without_output( + textwrap.dedent(bad_imports), + pattern=pattern, + ) diff --git a/sklearn/experimental/tests/test_enable_successive_halving.py b/sklearn/experimental/tests/test_enable_successive_halving.py index 8c0d5ef869680..0abbf07eced00 100644 --- a/sklearn/experimental/tests/test_enable_successive_halving.py +++ b/sklearn/experimental/tests/test_enable_successive_halving.py @@ -5,7 +5,7 @@ import pytest from sklearn.utils import _IS_WASM -from sklearn.utils._testing import assert_run_python_script +from sklearn.utils._testing import assert_run_python_script_without_output @pytest.mark.xfail(_IS_WASM, reason="cannot start subprocess") @@ -16,13 +16,15 @@ def test_imports_strategies(): # for every test case. Else, the tests would not be independent # (manually removing the imports from the cache (sys.modules) is not # recommended and can lead to many complications). 
- + pattern = "Halving(Grid|Random)SearchCV is experimental" good_import = """ from sklearn.experimental import enable_halving_search_cv from sklearn.model_selection import HalvingGridSearchCV from sklearn.model_selection import HalvingRandomSearchCV """ - assert_run_python_script(textwrap.dedent(good_import)) + assert_run_python_script_without_output( + textwrap.dedent(good_import), pattern=pattern + ) good_import_with_model_selection_first = """ import sklearn.model_selection @@ -30,16 +32,22 @@ def test_imports_strategies(): from sklearn.model_selection import HalvingGridSearchCV from sklearn.model_selection import HalvingRandomSearchCV """ - assert_run_python_script(textwrap.dedent(good_import_with_model_selection_first)) + assert_run_python_script_without_output( + textwrap.dedent(good_import_with_model_selection_first), + pattern=pattern, + ) - bad_imports = """ + bad_imports = f""" import pytest - with pytest.raises(ImportError, match='HalvingGridSearchCV is experimental'): + with pytest.raises(ImportError, match={pattern!r}): from sklearn.model_selection import HalvingGridSearchCV import sklearn.experimental - with pytest.raises(ImportError, match='HalvingRandomSearchCV is experimental'): + with pytest.raises(ImportError, match={pattern!r}): from sklearn.model_selection import HalvingRandomSearchCV """ - assert_run_python_script(textwrap.dedent(bad_imports)) + assert_run_python_script_without_output( + textwrap.dedent(bad_imports), + pattern=pattern, + ) diff --git a/sklearn/tests/test_common.py b/sklearn/tests/test_common.py index 256cb7e209381..6dbf54b203e4c 100644 --- a/sklearn/tests/test_common.py +++ b/sklearn/tests/test_common.py @@ -14,6 +14,7 @@ from functools import partial from inspect import isgenerator, signature from itertools import chain, product +from pathlib import Path import numpy as np import pytest @@ -167,7 +168,7 @@ def test_configure(): # is installed in editable mode by pip build isolation enabled. pytest.importorskip("Cython") cwd = os.getcwd() - setup_path = os.path.abspath(os.path.join(sklearn.__path__[0], "..")) + setup_path = Path(sklearn.__file__).parent.parent setup_filename = os.path.join(setup_path, "setup.py") if not os.path.exists(setup_filename): pytest.skip("setup.py not available") @@ -211,10 +212,11 @@ def test_class_weight_balanced_linear_classifiers(name, Classifier): @pytest.mark.xfail(_IS_WASM, reason="importlib not supported for Pyodide packages") @ignore_warnings def test_import_all_consistency(): + sklearn_path = [os.path.dirname(sklearn.__file__)] # Smoke test to check that any name in a __all__ list is actually defined # in the namespace of the module or package. pkgs = pkgutil.walk_packages( - path=sklearn.__path__, prefix="sklearn.", onerror=lambda _: None + path=sklearn_path, prefix="sklearn.", onerror=lambda _: None ) submods = [modname for _, modname, _ in pkgs] for modname in submods + ["sklearn"]: @@ -236,9 +238,10 @@ def test_import_all_consistency(): def test_root_import_all_completeness(): + sklearn_path = [os.path.dirname(sklearn.__file__)] EXCEPTIONS = ("utils", "tests", "base", "setup", "conftest") for _, modname, _ in pkgutil.walk_packages( - path=sklearn.__path__, onerror=lambda _: None + path=sklearn_path, onerror=lambda _: None ): if "." 
in modname or modname.startswith("_") or modname in EXCEPTIONS:
             continue

diff --git a/sklearn/tests/test_docstring_parameters.py b/sklearn/tests/test_docstring_parameters.py
index e97646f4a701c..52a383e4ca602 100644
--- a/sklearn/tests/test_docstring_parameters.py
+++ b/sklearn/tests/test_docstring_parameters.py
@@ -4,6 +4,7 @@

 import importlib
 import inspect
+import os
 import warnings
 from inspect import signature
 from pkgutil import walk_packages
@@ -40,7 +41,7 @@
 with warnings.catch_warnings():
     warnings.simplefilter("ignore", FutureWarning)
     # mypy error: Module has no attribute "__path__"
-    sklearn_path = sklearn.__path__  # type: ignore # mypy issue #1422
+    sklearn_path = [os.path.dirname(sklearn.__file__)]
 PUBLIC_MODULES = set(
     [
         pckg[1]
diff --git a/sklearn/tests/test_min_dependencies_readme.py b/sklearn/tests/test_min_dependencies_readme.py
index 9f9718d292699..2cc4d25d25a12 100644
--- a/sklearn/tests/test_min_dependencies_readme.py
+++ b/sklearn/tests/test_min_dependencies_readme.py
@@ -28,7 +28,7 @@ def test_min_dependencies_readme():
         + r"( [0-9]+\.[0-9]+(\.[0-9]+)?)"
     )

-    readme_path = Path(sklearn.__path__[0]).parents[0]
+    readme_path = Path(sklearn.__file__).parent.parent
     readme_file = readme_path / "README.rst"

     if not os.path.exists(readme_file):
@@ -58,7 +58,7 @@ def test_min_dependencies_pyproject_toml():
     # tomllib is available in Python 3.11
     tomllib = pytest.importorskip("tomllib")

-    root_directory = Path(sklearn.__path__[0]).parent
+    root_directory = Path(sklearn.__file__).parent.parent
     pyproject_toml_path = root_directory / "pyproject.toml"

     if not pyproject_toml_path.exists():
diff --git a/sklearn/utils/_testing.py b/sklearn/utils/_testing.py
index 5411c4dacf766..b49622627c7ae 100644
--- a/sklearn/utils/_testing.py
+++ b/sklearn/utils/_testing.py
@@ -66,7 +66,7 @@
     "assert_array_less",
     "assert_approx_equal",
     "assert_allclose",
-    "assert_run_python_script",
+    "assert_run_python_script_without_output",
     "assert_no_warnings",
     "SkipTest",
 ]
@@ -669,11 +669,11 @@ def check_docstring_parameters(func, doc=None, ignore=None):
     return incorrect


-def assert_run_python_script(source_code, timeout=60):
+def assert_run_python_script_without_output(source_code, pattern=".+", timeout=60):
     """Utility to check assertions in an independent Python subprocess.

-    The script provided in the source code should return 0 and not print
-    anything on stderr or stdout.
+    The script provided in the source code should return 0 and the stdout
+    + stderr should not match the pattern `pattern`.

     This is a port from cloudpickle https://github.com/cloudpipe/cloudpickle

     Parameters
     ----------
     source_code : str
         The Python source code to execute.
+    pattern : str
+        Pattern that the stdout + stderr should not match. By default, unless
+        stdout + stderr are both empty, an error will be raised.
     timeout : int, default=60
         Time in seconds before timeout.
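+
+    A minimal usage sketch (assuming the helper is imported from
+    `sklearn.utils._testing`): a silent script passes, while a script whose
+    combined stdout and stderr matches `pattern` raises an `AssertionError`:
+
+        assert_run_python_script_without_output("x = 1")
+        assert_run_python_script_without_output("print('boom')", pattern="boom")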
""" @@ -710,8 +713,16 @@ def assert_run_python_script(source_code, timeout=60): raise RuntimeError( "script errored with output:\n%s" % e.output.decode("utf-8") ) - if out != b"": - raise AssertionError(out.decode("utf-8")) + + out = out.decode("utf-8") + if re.search(pattern, out): + if pattern == ".+": + expectation = "Expected no output" + else: + expectation = f"The output was not supposed to match {pattern!r}" + + message = f"{expectation}, got the following output instead: {out!r}" + raise AssertionError(message) except TimeoutExpired as e: raise RuntimeError( "script timeout, output so far:\n%s" % e.output.decode("utf-8") diff --git a/sklearn/utils/tests/test_testing.py b/sklearn/utils/tests/test_testing.py index f25bdc54be4d8..7a4b02aeec224 100644 --- a/sklearn/utils/tests/test_testing.py +++ b/sklearn/utils/tests/test_testing.py @@ -20,6 +20,7 @@ assert_raise_message, assert_raises, assert_raises_regex, + assert_run_python_script_without_output, check_docstring_parameters, create_memmap_backed_data, ignore_warnings, @@ -820,3 +821,26 @@ def test_float32_aware_assert_allclose(): with pytest.raises(AssertionError): assert_allclose(np.array([1e-5], dtype=np.float32), 0.0) assert_allclose(np.array([1e-5], dtype=np.float32), 0.0, atol=2e-5) + + +def test_assert_run_python_script_without_output(): + code = "x = 1" + assert_run_python_script_without_output(code) + + code = "print('something to stdout')" + with pytest.raises(AssertionError, match="Expected no output"): + assert_run_python_script_without_output(code) + + code = "print('something to stdout')" + with pytest.raises( + AssertionError, + match="output was not supposed to match.+got.+something to stdout", + ): + assert_run_python_script_without_output(code, pattern="to.+stdout") + + code = "\n".join(["import sys", "print('something to stderr', file=sys.stderr)"]) + with pytest.raises( + AssertionError, + match="output was not supposed to match.+got.+something to stderr", + ): + assert_run_python_script_without_output(code, pattern="to.+stderr") From e1559a8c9946e7440989c03a5c7e59818ff1af1a Mon Sep 17 00:00:00 2001 From: Cindy Liang <67083541+cindy-x-liang@users.noreply.github.com> Date: Fri, 12 Jan 2024 09:46:36 -0500 Subject: [PATCH 037/554] DOC added examples in validation functions (#28023) Co-authored-by: Guillaume Lemaitre --- sklearn/utils/validation.py | 21 +++++++++++++++++++++ 1 file changed, 21 insertions(+) diff --git a/sklearn/utils/validation.py b/sklearn/utils/validation.py index 283d24a431fbd..a7553993f7ded 100644 --- a/sklearn/utils/validation.py +++ b/sklearn/utils/validation.py @@ -1241,6 +1241,12 @@ def column_or_1d(y, *, dtype=None, warn=False): ------ ValueError If `y` is not a 1D array or a 2D array with a single row or column. + + Examples + -------- + >>> from sklearn.utils.validation import column_or_1d + >>> column_or_1d([1, 1]) + array([1, 1]) """ xp, _ = get_namespace(y) y = check_array( @@ -1355,6 +1361,21 @@ def check_symmetric(array, *, tol=1e-10, raise_warning=True, raise_exception=Fal Symmetrized version of the input array, i.e. the average of array and array.transpose(). If sparse, then duplicate entries are first summed and zeros are eliminated. 
+
+    Examples
+    --------
+    >>> from sklearn.utils.validation import column_or_1d
+    >>> column_or_1d([1, 1])
+    array([1, 1])
     """
     xp, _ = get_namespace(y)
     y = check_array(
@@ -1355,6 +1361,21 @@ def check_symmetric(array, *, tol=1e-10, raise_warning=True, raise_exception=Fal
         Symmetrized version of the input array, i.e. the average of array
         and array.transpose(). If sparse, then duplicate entries are first
         summed and zeros are eliminated.
+
+    Examples
+    --------
+    >>> import numpy as np
+    >>> from sklearn.utils.validation import check_symmetric
+    >>> symmetric_array = np.array([[0, 1, 2], [1, 0, 1], [2, 1, 0]])
+    >>> check_symmetric(symmetric_array)
+    array([[0, 1, 2],
+           [1, 0, 1],
+           [2, 1, 0]])
+    >>> from scipy.sparse import csr_matrix
+    >>> sparse_symmetric_array = csr_matrix(symmetric_array)
+    >>> check_symmetric(sparse_symmetric_array)
+    <3x3 sparse matrix of type '<class 'numpy.int64'>'
+        with 6 stored elements in Compressed Sparse Row format>
     """
     if (array.ndim != 2) or (array.shape[0] != array.shape[1]):
         raise ValueError(

From 9b0446844c2c99fb880bba538269fff1deba79f4 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Filip=20Karlo=20Do=C5=A1ilovi=C4=87?=
Date: Fri, 12 Jan 2024 16:00:58 +0100
Subject: [PATCH 038/554] DOC Add docstring examples to some functions from
 metrics package (#28022)

Co-authored-by: Guillaume Lemaitre
---
 sklearn/metrics/_classification.py     |  8 ++++++++
 sklearn/metrics/_scorer.py             | 23 +++++++++++++++++++++++
 sklearn/metrics/cluster/_supervised.py |  8 ++++++++
 3 files changed, 39 insertions(+)

diff --git a/sklearn/metrics/_classification.py b/sklearn/metrics/_classification.py
index 5b8a024e6e5fc..a92e165e150cc 100644
--- a/sklearn/metrics/_classification.py
+++ b/sklearn/metrics/_classification.py
@@ -679,6 +679,14 @@ class labels [2]_.
        `_.
     .. [3] `Wikipedia entry for the Cohen's kappa
            <https://en.wikipedia.org/wiki/Cohen%27s_kappa>`_.
+
+    Examples
+    --------
+    >>> from sklearn.metrics import cohen_kappa_score
+    >>> y1 = ["negative", "positive", "negative", "neutral", "positive"]
+    >>> y2 = ["negative", "positive", "negative", "neutral", "negative"]
+    >>> cohen_kappa_score(y1, y2)
+    0.6875
     """
     confusion = confusion_matrix(y1, y2, labels=labels, sample_weight=sample_weight)
     n_classes = confusion.shape[0]
diff --git a/sklearn/metrics/_scorer.py b/sklearn/metrics/_scorer.py
index 37f0fa044455c..3e55b627ee08a 100644
--- a/sklearn/metrics/_scorer.py
+++ b/sklearn/metrics/_scorer.py
@@ -379,6 +379,18 @@ def get_scorer(scoring):
     When passed a string, this function always returns a copy of the scorer
     object. Calling `get_scorer` twice for the same scorer results in two
     separate scorer objects.
+
+    Examples
+    --------
+    >>> import numpy as np
+    >>> from sklearn.dummy import DummyClassifier
+    >>> from sklearn.metrics import get_scorer
+    >>> X = np.reshape([0, 1, -1, -0.5, 2], (-1, 1))
+    >>> y = np.array([0, 1, 1, 0, 1])
+    >>> classifier = DummyClassifier(strategy="constant", constant=0).fit(X, y)
+    >>> accuracy = get_scorer("accuracy")
+    >>> accuracy(classifier, X, y)
+    0.4
     """
     if isinstance(scoring, str):
         try:
@@ -839,6 +851,17 @@ def get_scorer_names():
     -------
     list of str
         Names of all available scorers.
+
+    Examples
+    --------
+    >>> from sklearn.metrics import get_scorer_names
+    >>> all_scorers = get_scorer_names()
+    >>> type(all_scorers)
+    <class 'list'>
+    >>> all_scorers[:3]
+    ['accuracy', 'adjusted_mutual_info_score', 'adjusted_rand_score']
+    >>> "roc_auc" in all_scorers
+    True
     """
     return sorted(_SCORERS.keys())

diff --git a/sklearn/metrics/cluster/_supervised.py b/sklearn/metrics/cluster/_supervised.py
index b8c80e6292b31..4e2b05e9d1946 100644
--- a/sklearn/metrics/cluster/_supervised.py
+++ b/sklearn/metrics/cluster/_supervised.py
@@ -867,6 +867,14 @@ def mutual_info_score(labels_true, labels_pred, *, contingency=None):
     Notes
     -----
     The logarithm used is the natural logarithm (base-e).
+
+    Examples
+    --------
+    >>> from sklearn.metrics import mutual_info_score
+    >>> labels_true = [0, 1, 1, 0, 1, 0]
+    >>> labels_pred = [0, 1, 0, 0, 1, 1]
+    >>> mutual_info_score(labels_true, labels_pred)
+    0.056...
     """
     if contingency is None:
         labels_true, labels_pred = check_clusterings(labels_true, labels_pred)

From b05b5090005dfee1a4311d89ce8eed47f4160d9d Mon Sep 17 00:00:00 2001
From: Sandip Dutta
Date: Fri, 12 Jan 2024 20:45:34 +0530
Subject: [PATCH 039/554] DOC add docstring example for clear_data_home and
 fetch_covtype (#28027)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

Co-authored-by: Guillaume Lemaitre
Co-authored-by: Loïc Estève
---
 sklearn/conftest.py          | 12 +++++++++---
 sklearn/datasets/_base.py    |  5 +++++
 sklearn/datasets/_covtype.py | 12 ++++++++++++
 3 files changed, 26 insertions(+), 3 deletions(-)

diff --git a/sklearn/conftest.py b/sklearn/conftest.py
index d2f44f6912b62..d14afddc3773d 100644
--- a/sklearn/conftest.py
+++ b/sklearn/conftest.py
@@ -134,10 +134,16 @@ def pytest_collection_modifyitems(config, items):

     datasets_to_download = set()
     for item in items:
-        if not hasattr(item, "fixturenames"):
+        if isinstance(item, DoctestItem) and "fetch_" in item.name:
+            fetcher_function_name = item.name.split(".")[-1]
+            dataset_fetchers_key = f"{fetcher_function_name}_fxt"
+            dataset_to_fetch = set([dataset_fetchers_key]) & dataset_features_set
+        elif not hasattr(item, "fixturenames"):
             continue
-        item_fixtures = set(item.fixturenames)
-        dataset_to_fetch = item_fixtures & dataset_features_set
+        else:
+            item_fixtures = set(item.fixturenames)
+            dataset_to_fetch = item_fixtures & dataset_features_set
+
         if not dataset_to_fetch:
             continue

diff --git a/sklearn/datasets/_base.py b/sklearn/datasets/_base.py
index e062bf381b393..ab2b8bd3f5110 100644
--- a/sklearn/datasets/_base.py
+++ b/sklearn/datasets/_base.py
@@ -85,6 +85,11 @@ def clear_data_home(data_home=None):
     data_home : str or path-like, default=None
         The path to scikit-learn data directory. If `None`, the default path
         is `~/scikit_learn_data`.
+
+    Examples
+    --------
+    >>> from sklearn.datasets import clear_data_home
+    >>> clear_data_home()  # doctest: +SKIP
     """
     data_home = get_data_home(data_home)
     shutil.rmtree(data_home)

diff --git a/sklearn/datasets/_covtype.py b/sklearn/datasets/_covtype.py
index 7620e08c5ec92..4e1b1d7961f2e 100644
--- a/sklearn/datasets/_covtype.py
+++ b/sklearn/datasets/_covtype.py
@@ -156,6 +156,18 @@ def fetch_covtype(
         ndarray of shape (n_samples,) containing the target samples.

        ..
versionadded:: 0.20
+
+    Examples
+    --------
+    >>> from sklearn.datasets import fetch_covtype
+    >>> cov_type = fetch_covtype()
+    >>> cov_type.data.shape
+    (581012, 54)
+    >>> cov_type.target.shape
+    (581012,)
+    >>> # Let's check the 4 first feature names
+    >>> cov_type.feature_names[:4]
+    ['Elevation', 'Aspect', 'Slope', 'Horizontal_Distance_To_Hydrology']
     """
     data_home = get_data_home(data_home=data_home)
     covtype_dir = join(data_home, "covertype")

From c56d74a603859fc1b1cc7766a6c35121b9a17b16 Mon Sep 17 00:00:00 2001
From: Advik Sinha
Date: Fri, 12 Jan 2024 20:47:05 +0530
Subject: [PATCH 040/554] DOC Add examples to docstring for sklearn.isotonic
 functions (#28020)

Co-authored-by: Guillaume Lemaitre
---
 sklearn/isotonic.py | 17 +++++++++++++++++
 1 file changed, 17 insertions(+)

diff --git a/sklearn/isotonic.py b/sklearn/isotonic.py
index 2ed99c0532b58..04456b1763791 100644
--- a/sklearn/isotonic.py
+++ b/sklearn/isotonic.py
@@ -58,6 +58,16 @@ def check_increasing(x, y):
     ----------
     Fisher transformation. Wikipedia.
     https://en.wikipedia.org/wiki/Fisher_transformation
+
+    Examples
+    --------
+    >>> from sklearn.isotonic import check_increasing
+    >>> x, y = [1, 2, 3, 4, 5], [2, 4, 6, 8, 10]
+    >>> check_increasing(x, y)
+    True
+    >>> y = [10, 8, 6, 4, 2]
+    >>> check_increasing(x, y)
+    False
     """

     # Calculate Spearman rho estimate and set return accordingly.
@@ -133,6 +143,13 @@ def isotonic_regression(
     ----------
     "Active set algorithms for isotonic regression; A unifying framework"
     by Michael J. Best and Nilotpal Chakravarti, section 3.
+
+    Examples
+    --------
+    >>> from sklearn.isotonic import isotonic_regression
+    >>> isotonic_regression([5, 3, 1, 2, 8, 10, 7, 9, 6, 4])
+    array([2.75  , 2.75  , 2.75  , 2.75  , 7.33...,
+           7.33..., 7.33..., 7.33..., 7.33..., 7.33...])
     """
     order = np.s_[:] if increasing else np.s_[::-1]
     y = check_array(y, ensure_2d=False, input_name="y", dtype=[np.float64, np.float32])

From a7b2dc36be3ab3dde649f13ead6533d38bde3873 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Filip=20Karlo=20Do=C5=A1ilovi=C4=87?=
Date: Fri, 12 Jan 2024 16:46:29 +0100
Subject: [PATCH 041/554] DOC Add examples to docstring to functions from the
 preprocessing package (#28019)

Co-authored-by: Guillaume Lemaitre
---
 sklearn/preprocessing/_data.py | 65 +++++++++++++++++++++++++++++++++-
 1 file changed, 64 insertions(+), 1 deletion(-)

diff --git a/sklearn/preprocessing/_data.py b/sklearn/preprocessing/_data.py
index 4cbae0e1d3591..7fc03ccd7ab36 100644
--- a/sklearn/preprocessing/_data.py
+++ b/sklearn/preprocessing/_data.py
@@ -205,7 +205,18 @@ def scale(X, *, axis=0, with_mean=True, with_std=True, copy=True):
     :class:`~sklearn.preprocessing.StandardScaler` within a
     :ref:`Pipeline <pipeline>` in order to prevent most risks of data leaking:
     `pipe = make_pipeline(StandardScaler(), LogisticRegression())`.
-    """  # noqa
+
+    Examples
+    --------
+    >>> from sklearn.preprocessing import scale
+    >>> X = [[-2, 1, 2], [-1, 0, 1]]
+    >>> scale(X, axis=0)  # scaling each column independently
+    array([[-1.,  1.,  1.],
+           [ 1., -1., -1.]])
+    >>> scale(X, axis=1)  # scaling each row independently
+    array([[-1.37...,  0.39...,  0.98...],
+           [-1.22...,  0.        ,  1.22...]])
+    """
     X = check_array(
         X,
         accept_sparse="csc",
@@ -646,6 +657,17 @@ def minmax_scale(X, feature_range=(0, 1), *, axis=0, copy=True):
     -----
     For a comparison of the different scalers, transformers, and normalizers,
     see: :ref:`sphx_glr_auto_examples_preprocessing_plot_all_scaling.py`.
+
+    Examples
+    --------
+    >>> from sklearn.preprocessing import minmax_scale
+    >>> X = [[-2, 1, 2], [-1, 0, 1]]
+    >>> minmax_scale(X, axis=0)  # scale each column independently
+    array([[0., 1., 1.],
+           [1., 0., 0.]])
+    >>> minmax_scale(X, axis=1)  # scale each row independently
+    array([[0.  , 0.75, 1.  ],
+           [0.  , 0.5 , 1.  ]])
     """
     # Unlike the scaler object, this function allows 1d input.
     # If copy is required, it will be done inside the scaler object.
@@ -1374,6 +1396,17 @@ def maxabs_scale(X, *, axis=0, copy=True):
     For a comparison of the different scalers, transformers, and normalizers,
     see: :ref:`sphx_glr_auto_examples_preprocessing_plot_all_scaling.py`.
+
+    Examples
+    --------
+    >>> from sklearn.preprocessing import maxabs_scale
+    >>> X = [[-2, 1, 2], [-1, 0, 1]]
+    >>> maxabs_scale(X, axis=0)  # scale each column independently
+    array([[-1. ,  1. ,  1. ],
+           [-0.5,  0. ,  0.5]])
+    >>> maxabs_scale(X, axis=1)  # scale each row independently
+    array([[-1. ,  0.5,  1. ],
+           [-1. ,  0. ,  1. ]])
     """

     # Unlike the scaler object, this function allows 1d input.
@@ -1769,7 +1802,17 @@ def robust_scale(
     :class:`~sklearn.preprocessing.RobustScaler` within a
     :ref:`Pipeline <pipeline>` in order to prevent most risks of data leaking:
     `pipe = make_pipeline(RobustScaler(), LogisticRegression())`.
+
+    Examples
+    --------
+    >>> from sklearn.preprocessing import robust_scale
+    >>> X = [[-2, 1, 2], [-1, 0, 1]]
+    >>> robust_scale(X, axis=0)  # scale each column independently
+    array([[-1.,  1.,  1.],
+           [ 1., -1., -1.]])
+    >>> robust_scale(X, axis=1)  # scale each row independently
+    array([[-1.5,  0. ,  0.5],
+           [-1. ,  0. ,  1. ]])
     """
     X = check_array(
         X,
@@ -1859,6 +1903,17 @@ def normalize(X, norm="l2", *, axis=1, copy=True, return_norm=False):
     -----
     For a comparison of the different scalers, transformers, and normalizers,
     see: :ref:`sphx_glr_auto_examples_preprocessing_plot_all_scaling.py`.
+
+    Examples
+    --------
+    >>> from sklearn.preprocessing import normalize
+    >>> X = [[-2, 1, 2], [-1, 0, 1]]
+    >>> normalize(X, norm="l1")  # L1 normalization each row independently
+    array([[-0.4,  0.2,  0.4],
+           [-0.5,  0. ,  0.5]])
+    >>> normalize(X, norm="l2")  # L2 normalization each row independently
+    array([[-0.66...,  0.33...,  0.66...],
+           [-0.70...,  0.        ,  0.70...]])
     """
     if axis == 0:
         sparse_format = "csc"
@@ -2082,6 +2137,14 @@ def binarize(X, *, threshold=0.0, copy=True):
     --------
     Binarizer : Performs binarization using the Transformer API
         (e.g. as part of a preprocessing :class:`~sklearn.pipeline.Pipeline`).
+
+    Examples
+    --------
+    >>> from sklearn.preprocessing import binarize
+    >>> X = [[0.4, 0.6, 0.5], [0.6, 0.1, 0.2]]
+    >>> binarize(X, threshold=0.5)
+    array([[0., 1., 0.],
+           [1., 0., 0.]])
     """
     X = check_array(X, accept_sparse=["csr", "csc"], copy=copy)
     if sparse.issparse(X):

From e485c39928d8b923b5e4a59743d7af7e104b8368 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Filip=20Karlo=20Do=C5=A1ilovi=C4=87?=
Date: Fri, 12 Jan 2024 17:30:02 +0100
Subject: [PATCH 042/554] DOC Add examples to docstring to functions of
 class_weight module (#28014)

Co-authored-by: Guillaume Lemaitre
---
 sklearn/utils/class_weight.py | 15 +++++++++++++++
 1 file changed, 15 insertions(+)

diff --git a/sklearn/utils/class_weight.py b/sklearn/utils/class_weight.py
index d049a2eac569a..55802f780ed41 100644
--- a/sklearn/utils/class_weight.py
+++ b/sklearn/utils/class_weight.py
@@ -49,6 +49,14 @@ def compute_class_weight(class_weight, *, classes, y):
     ----------
     The "balanced" heuristic is inspired by
     Logistic Regression in Rare Events Data, King, Zen, 2001.
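+
+    With ``class_weight="balanced"`` each class is weighted by
+    ``n_samples / (n_classes * np.bincount(y))``; for the six labels used in
+    the example below this gives ``6 / (2 * 2) = 1.5`` for class ``0`` and
+    ``6 / (2 * 4) = 0.75`` for class ``1``.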
+ + Examples + -------- + >>> import numpy as np + >>> from sklearn.utils.class_weight import compute_class_weight + >>> y = [1, 1, 1, 1, 0, 0] + >>> compute_class_weight(class_weight="balanced", classes=np.unique(y), y=y) + array([1.5 , 0.75]) """ # Import error caused by circular imports. from ..preprocessing import LabelEncoder @@ -133,6 +141,13 @@ def compute_sample_weight(class_weight, y, *, indices=None): ------- sample_weight_vect : ndarray of shape (n_samples,) Array with sample weights as applied to the original `y`. + + Examples + -------- + >>> from sklearn.utils.class_weight import compute_sample_weight + >>> y = [1, 1, 1, 1, 0, 0] + >>> compute_sample_weight(class_weight="balanced", y=y) + array([0.75, 0.75, 0.75, 0.75, 1.5 , 1.5 ]) """ # Ensure y is 2D. Sparse matrices are already 2D. From 2b07c87e5b748d98322ae6ea4023594681a7ddae Mon Sep 17 00:00:00 2001 From: DUONG <47552931+duongb@users.noreply.github.com> Date: Sat, 13 Jan 2024 00:28:59 +0700 Subject: [PATCH 043/554] DOC add examples for sklearn.model_selection (#28013) Co-authored-by: Guillaume Lemaitre --- sklearn/model_selection/_split.py | 8 ++++++ sklearn/model_selection/_validation.py | 37 ++++++++++++++++++++++++++ 2 files changed, 45 insertions(+) diff --git a/sklearn/model_selection/_split.py b/sklearn/model_selection/_split.py index 8bb06356e245e..1f89832daba22 100644 --- a/sklearn/model_selection/_split.py +++ b/sklearn/model_selection/_split.py @@ -2508,6 +2508,14 @@ def check_cv(cv=5, y=None, *, classifier=False): checked_cv : a cross-validator instance. The return value is a cross-validator which generates the train/test splits via the ``split`` method. + + Examples + -------- + >>> from sklearn.model_selection import check_cv + >>> check_cv(cv=5, y=None, classifier=False) + KFold(...) + >>> check_cv(cv=5, y=[1, 1, 0, 0, 0, 0], classifier=True) + StratifiedKFold(...) """ cv = 5 if cv is None else cv if isinstance(cv, numbers.Integral): diff --git a/sklearn/model_selection/_validation.py b/sklearn/model_selection/_validation.py index 07b229b57bf96..75c956f2d38a7 100644 --- a/sklearn/model_selection/_validation.py +++ b/sklearn/model_selection/_validation.py @@ -1636,6 +1636,26 @@ def permutation_test_score( Performance `_. The Journal of Machine Learning Research (2010) vol. 11 + + Examples + -------- + >>> from sklearn.datasets import make_classification + >>> from sklearn.linear_model import LogisticRegression + >>> from sklearn.model_selection import permutation_test_score + >>> X, y = make_classification(random_state=0) + >>> estimator = LogisticRegression() + >>> score, permutation_scores, pvalue = permutation_test_score( + ... estimator, X, y, random_state=0 + ... ) + >>> print(f"Original Score: {score:.3f}") + Original Score: 0.810 + >>> print( + ... f"Permutation Scores: {permutation_scores.mean():.3f} +/- " + ... f"{permutation_scores.std():.3f}" + ... 
) + Permutation Scores: 0.505 +/- 0.057 + >>> print(f"P-value: {pvalue:.3f}") + P-value: 0.010 """ X, y, groups = indexable(X, y, groups) @@ -2254,6 +2274,23 @@ def validation_curve( Notes ----- See :ref:`sphx_glr_auto_examples_model_selection_plot_validation_curve.py` + + Examples + -------- + >>> import numpy as np + >>> from sklearn.datasets import make_classification + >>> from sklearn.model_selection import validation_curve + >>> from sklearn.linear_model import LogisticRegression + >>> X, y = make_classification(n_samples=1_000, random_state=0) + >>> logistic_regression = LogisticRegression() + >>> param_name, param_range = "C", np.logspace(-8, 3, 10) + >>> train_scores, test_scores = validation_curve( + ... logistic_regression, X, y, param_name=param_name, param_range=param_range + ... ) + >>> print(f"The average train accuracy is {train_scores.mean():.2f}") + The average train accuracy is 0.81 + >>> print(f"The average test accuracy is {test_scores.mean():.2f}") + The average test accuracy is 0.81 """ X, y, groups = indexable(X, y, groups) From 30fcb45f14e761ad41ce1e8874ee7e33b8fa7f79 Mon Sep 17 00:00:00 2001 From: Adarsh Wase <68223910+AdarshWase@users.noreply.github.com> Date: Fri, 12 Jan 2024 23:56:29 +0530 Subject: [PATCH 044/554] DOC added examples in metrics.pairwise functions (#28005) Co-authored-by: Guillaume Lemaitre --- sklearn/metrics/pairwise.py | 156 +++++++++++++++++++++++++++++++++--- 1 file changed, 145 insertions(+), 11 deletions(-) diff --git a/sklearn/metrics/pairwise.py b/sklearn/metrics/pairwise.py index 4e5c37dff0091..740c08fe7a2f6 100644 --- a/sklearn/metrics/pairwise.py +++ b/sklearn/metrics/pairwise.py @@ -740,6 +740,17 @@ def pairwise_distances_argmin_min( pairwise_distances : Distances between every pair of samples of X and Y. pairwise_distances_argmin : Same as `pairwise_distances_argmin_min` but only returns the argmins. + + Examples + -------- + >>> from sklearn.metrics.pairwise import pairwise_distances_argmin_min + >>> X = [[0, 0, 0], [1, 1, 1]] + >>> Y = [[1, 0, 0], [1, 1, 0]] + >>> argmin, distances = pairwise_distances_argmin_min(X, Y) + >>> argmin + array([0, 1]) + >>> distances + array([1., 1.]) """ X, Y = check_pairwise_arrays(X, Y) @@ -872,6 +883,14 @@ def pairwise_distances_argmin(X, Y, *, axis=1, metric="euclidean", metric_kwargs pairwise_distances : Distances between every pair of samples of X and Y. pairwise_distances_argmin_min : Same as `pairwise_distances_argmin` but also returns the distances. + + Examples + -------- + >>> from sklearn.metrics.pairwise import pairwise_distances_argmin + >>> X = [[0, 0, 0], [1, 1, 1]] + >>> Y = [[1, 0, 0], [1, 1, 0]] + >>> pairwise_distances_argmin(X, Y) + array([0, 1]) """ if metric_kwargs is None: metric_kwargs = {} @@ -952,7 +971,7 @@ def haversine_distances(X, Y=None): Returns ------- - distance : ndarray of shape (n_samples_X, n_samples_Y) + distances : ndarray of shape (n_samples_X, n_samples_Y) The distance matrix. Notes @@ -1006,7 +1025,7 @@ def manhattan_distances(X, Y=None): Returns ------- - D : ndarray of shape (n_samples_X, n_samples_Y) + distances : ndarray of shape (n_samples_X, n_samples_Y) Pairwise L1 distances. Notes @@ -1068,13 +1087,22 @@ def cosine_distances(X, Y=None): Returns ------- - distance matrix : ndarray of shape (n_samples_X, n_samples_Y) + distances : ndarray of shape (n_samples_X, n_samples_Y) Returns the cosine distance between samples in X and Y. See Also -------- cosine_similarity : Compute cosine similarity between samples in X and Y. 
scipy.spatial.distance.cosine : Dense matrices only. + + Examples + -------- + >>> from sklearn.metrics.pairwise import cosine_distances + >>> X = [[0, 0, 0], [1, 1, 1]] + >>> Y = [[1, 0, 0], [1, 1, 0]] + >>> cosine_distances(X, Y) + array([[1. , 1. ], + [0.42..., 0.18...]]) """ # 1.0 - cosine_similarity(X, Y) without copy S = cosine_similarity(X, Y) @@ -1111,6 +1139,14 @@ def paired_euclidean_distances(X, Y): distances : ndarray of shape (n_samples,) Output array/matrix containing the calculated paired euclidean distances. + + Examples + -------- + >>> from sklearn.metrics.pairwise import paired_euclidean_distances + >>> X = [[0, 0, 0], [1, 1, 1]] + >>> Y = [[1, 0, 0], [1, 1, 0]] + >>> paired_euclidean_distances(X, Y) + array([1., 1.]) """ X, Y = check_paired_arrays(X, Y) return row_norms(X - Y) @@ -1189,6 +1225,14 @@ def paired_cosine_distances(X, Y): ----- The cosine distance is equivalent to the half the squared euclidean distance if each sample is normalized to unit norm. + + Examples + -------- + >>> from sklearn.metrics.pairwise import paired_cosine_distances + >>> X = [[0, 0, 0], [1, 1, 1]] + >>> Y = [[1, 0, 0], [1, 1, 0]] + >>> paired_cosine_distances(X, Y) + array([0.5 , 0.18...]) """ X, Y = check_paired_arrays(X, Y) return 0.5 * row_norms(normalize(X) - normalize(Y), squared=True) @@ -1304,8 +1348,17 @@ def linear_kernel(X, Y=None, dense_output=True): Returns ------- - Gram matrix : ndarray of shape (n_samples_X, n_samples_Y) + kernel : ndarray of shape (n_samples_X, n_samples_Y) The Gram matrix of the linear kernel, i.e. `X @ Y.T`. + + Examples + -------- + >>> from sklearn.metrics.pairwise import linear_kernel + >>> X = [[0, 0, 0], [1, 1, 1]] + >>> Y = [[1, 0, 0], [1, 1, 0]] + >>> linear_kernel(X, Y) + array([[0., 0.], + [1., 2.]]) """ X, Y = check_pairwise_arrays(X, Y) return safe_sparse_dot(X, Y.T, dense_output=dense_output) @@ -1352,8 +1405,17 @@ def polynomial_kernel(X, Y=None, degree=3, gamma=None, coef0=1): Returns ------- - Gram matrix : ndarray of shape (n_samples_X, n_samples_Y) + kernel : ndarray of shape (n_samples_X, n_samples_Y) The polynomial kernel. + + Examples + -------- + >>> from sklearn.metrics.pairwise import polynomial_kernel + >>> X = [[0, 0, 0], [1, 1, 1]] + >>> Y = [[1, 0, 0], [1, 1, 0]] + >>> polynomial_kernel(X, Y, degree=2) + array([[1. , 1. ], + [1.77..., 2.77...]]) """ X, Y = check_pairwise_arrays(X, Y) if gamma is None: @@ -1402,8 +1464,17 @@ def sigmoid_kernel(X, Y=None, gamma=None, coef0=1): Returns ------- - Gram matrix : ndarray of shape (n_samples_X, n_samples_Y) + kernel : ndarray of shape (n_samples_X, n_samples_Y) Sigmoid kernel between two arrays. + + Examples + -------- + >>> from sklearn.metrics.pairwise import sigmoid_kernel + >>> X = [[0, 0, 0], [1, 1, 1]] + >>> Y = [[1, 0, 0], [1, 1, 0]] + >>> sigmoid_kernel(X, Y) + array([[0.76..., 0.76...], + [0.87..., 0.93...]]) """ X, Y = check_pairwise_arrays(X, Y) if gamma is None: @@ -1450,8 +1521,17 @@ def rbf_kernel(X, Y=None, gamma=None): Returns ------- - kernel_matrix : ndarray of shape (n_samples_X, n_samples_Y) + kernel : ndarray of shape (n_samples_X, n_samples_Y) The RBF kernel. 
+ + Examples + -------- + >>> from sklearn.metrics.pairwise import rbf_kernel + >>> X = [[0, 0, 0], [1, 1, 1]] + >>> Y = [[1, 0, 0], [1, 1, 0]] + >>> rbf_kernel(X, Y) + array([[0.71..., 0.51...], + [0.51..., 0.71...]]) """ X, Y = check_pairwise_arrays(X, Y) if gamma is None: @@ -1500,8 +1580,17 @@ def laplacian_kernel(X, Y=None, gamma=None): Returns ------- - kernel_matrix : ndarray of shape (n_samples_X, n_samples_Y) + kernel : ndarray of shape (n_samples_X, n_samples_Y) The kernel matrix. + + Examples + -------- + >>> from sklearn.metrics.pairwise import laplacian_kernel + >>> X = [[0, 0, 0], [1, 1, 1]] + >>> Y = [[1, 0, 0], [1, 1, 0]] + >>> laplacian_kernel(X, Y) + array([[0.71..., 0.51...], + [0.51..., 0.71...]]) """ X, Y = check_pairwise_arrays(X, Y) if gamma is None: @@ -1551,8 +1640,17 @@ def cosine_similarity(X, Y=None, dense_output=True): Returns ------- - kernel matrix : ndarray of shape (n_samples_X, n_samples_Y) + similarities : ndarray of shape (n_samples_X, n_samples_Y) Returns the cosine similarity between samples in X and Y. + + Examples + -------- + >>> from sklearn.metrics.pairwise import cosine_similarity + >>> X = [[0, 0, 0], [1, 1, 1]] + >>> Y = [[1, 0, 0], [1, 1, 0]] + >>> cosine_similarity(X, Y) + array([[0. , 0. ], + [0.57..., 0.81...]]) """ # to avoid recursive import @@ -1598,7 +1696,7 @@ def additive_chi2_kernel(X, Y=None): Returns ------- - kernel_matrix : ndarray of shape (n_samples_X, n_samples_Y) + kernel : ndarray of shape (n_samples_X, n_samples_Y) The kernel matrix. See Also @@ -1620,6 +1718,15 @@ def additive_chi2_kernel(X, Y=None): categories: A comprehensive study International Journal of Computer Vision 2007 https://hal.archives-ouvertes.fr/hal-00171412/document + + Examples + -------- + >>> from sklearn.metrics.pairwise import additive_chi2_kernel + >>> X = [[0, 0, 0], [1, 1, 1]] + >>> Y = [[1, 0, 0], [1, 1, 0]] + >>> additive_chi2_kernel(X, Y) + array([[-1., -2.], + [-2., -1.]]) """ X, Y = check_pairwise_arrays(X, Y, accept_sparse=False) if (X < 0).any(): @@ -1668,7 +1775,7 @@ def chi2_kernel(X, Y=None, gamma=1.0): Returns ------- - kernel_matrix : ndarray of shape (n_samples_X, n_samples_Y) + kernel : ndarray of shape (n_samples_X, n_samples_Y) The kernel matrix. See Also @@ -1684,6 +1791,15 @@ def chi2_kernel(X, Y=None, gamma=1.0): categories: A comprehensive study International Journal of Computer Vision 2007 https://hal.archives-ouvertes.fr/hal-00171412/document + + Examples + -------- + >>> from sklearn.metrics.pairwise import chi2_kernel + >>> X = [[0, 0, 0], [1, 1, 1]] + >>> Y = [[1, 0, 0], [1, 1, 0]] + >>> chi2_kernel(X, Y) + array([[0.36..., 0.13...], + [0.13..., 0.36...]]) """ K = additive_chi2_kernel(X, Y) K *= gamma @@ -2163,6 +2279,15 @@ def pairwise_distances( order to limit memory usage. sklearn.metrics.pairwise.paired_distances : Computes the distances between corresponding elements of two arrays. + + Examples + -------- + >>> from sklearn.metrics.pairwise import pairwise_distances + >>> X = [[0, 0, 0], [1, 1, 1]] + >>> Y = [[1, 0, 0], [1, 1, 0]] + >>> pairwise_distances(X, Y, metric='sqeuclidean') + array([[1., 2.], + [2., 1.]]) """ if metric == "precomputed": X, _ = check_pairwise_arrays( @@ -2369,6 +2494,15 @@ def pairwise_kernels( Notes ----- If metric is 'precomputed', Y is ignored and X is returned. 
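
A minimal sketch of the ``'precomputed'`` short-circuit described in the
notes above (``K`` stands in for a kernel matrix obtained elsewhere; the
function only validates it and hands it back):

    >>> import numpy as np
    >>> from sklearn.metrics.pairwise import pairwise_kernels
    >>> K = np.array([[1. , 0.5],
    ...               [0.5, 1. ]])
    >>> pairwise_kernels(K, metric='precomputed')
    array([[1. , 0.5],
           [0.5, 1. ]])
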
+ + Examples + -------- + >>> from sklearn.metrics.pairwise import pairwise_kernels + >>> X = [[0, 0, 0], [1, 1, 1]] + >>> Y = [[1, 0, 0], [1, 1, 0]] + >>> pairwise_kernels(X, Y, metric='linear') + array([[0., 0.], + [1., 2.]]) """ # import GPKernel locally to prevent circular imports from ..gaussian_process.kernels import Kernel as GPKernel From 9d6384dcbdfd76895c4c62c2d064d9418ac8c711 Mon Sep 17 00:00:00 2001 From: Raj Pulapakura Date: Sat, 13 Jan 2024 06:02:40 +1100 Subject: [PATCH 045/554] DOC Add a docstring example for clustering functions (#27989) Co-authored-by: Guillaume Lemaitre --- sklearn/cluster/_affinity_propagation.py | 14 ++++ sklearn/cluster/_agglomerative.py | 18 +++++ sklearn/cluster/_kmeans.py | 17 +++++ sklearn/cluster/_mean_shift.py | 22 ++++++ sklearn/cluster/_optics.py | 86 ++++++++++++++++++++++++ sklearn/cluster/_spectral.py | 13 ++++ 6 files changed, 170 insertions(+) diff --git a/sklearn/cluster/_affinity_propagation.py b/sklearn/cluster/_affinity_propagation.py index b3b5869687c22..5587a7fd5aa1f 100644 --- a/sklearn/cluster/_affinity_propagation.py +++ b/sklearn/cluster/_affinity_propagation.py @@ -278,6 +278,20 @@ def affinity_propagation( ---------- Brendan J. Frey and Delbert Dueck, "Clustering by Passing Messages Between Data Points", Science Feb. 2007 + + Examples + -------- + >>> import numpy as np + >>> from sklearn.cluster import affinity_propagation + >>> from sklearn.metrics.pairwise import euclidean_distances + >>> X = np.array([[1, 2], [1, 4], [1, 0], + ... [4, 2], [4, 4], [4, 0]]) + >>> S = -euclidean_distances(X, squared=True) + >>> cluster_centers_indices, labels = affinity_propagation(S, random_state=0) + >>> cluster_centers_indices + array([0, 3]) + >>> labels + array([0, 0, 0, 1, 1, 1]) """ estimator = AffinityPropagation( damping=damping, diff --git a/sklearn/cluster/_agglomerative.py b/sklearn/cluster/_agglomerative.py index 5c803a2ae82bb..884d1605e70c3 100644 --- a/sklearn/cluster/_agglomerative.py +++ b/sklearn/cluster/_agglomerative.py @@ -270,6 +270,24 @@ def ward_tree(X, *, connectivity=None, n_clusters=None, return_distance=False): cluster in the forest, :math:`T=|v|+|s|+|t|`, and :math:`|*|` is the cardinality of its argument. This is also known as the incremental algorithm. + + Examples + -------- + >>> import numpy as np + >>> from sklearn.cluster import ward_tree + >>> X = np.array([[1, 2], [1, 4], [1, 0], + ... [4, 2], [4, 4], [4, 0]]) + >>> children, n_connected_components, n_leaves, parents = ward_tree(X) + >>> children + array([[0, 1], + [3, 5], + [2, 6], + [4, 7], + [8, 9]]) + >>> n_connected_components + 1 + >>> n_leaves + 6 """ X = np.asarray(X) if X.ndim == 1: diff --git a/sklearn/cluster/_kmeans.py b/sklearn/cluster/_kmeans.py index 59470aae6c13f..0732b75f982b8 100644 --- a/sklearn/cluster/_kmeans.py +++ b/sklearn/cluster/_kmeans.py @@ -425,6 +425,23 @@ def k_means( best_n_iter : int Number of iterations corresponding to the best results. Returned only if `return_n_iter` is set to True. + + Examples + -------- + >>> import numpy as np + >>> from sklearn.cluster import k_means + >>> X = np.array([[1, 2], [1, 4], [1, 0], + ... [10, 2], [10, 4], [10, 0]]) + >>> centroid, label, inertia = k_means( + ... X, n_clusters=2, n_init="auto", random_state=0 + ... 
) + >>> centroid + array([[10., 2.], + [ 1., 2.]]) + >>> label + array([1, 1, 1, 0, 0, 0], dtype=int32) + >>> inertia + 16.0 """ est = KMeans( n_clusters=n_clusters, diff --git a/sklearn/cluster/_mean_shift.py b/sklearn/cluster/_mean_shift.py index a3ca7efba8743..fae11cca7df23 100644 --- a/sklearn/cluster/_mean_shift.py +++ b/sklearn/cluster/_mean_shift.py @@ -76,6 +76,15 @@ def estimate_bandwidth(X, *, quantile=0.3, n_samples=None, random_state=0, n_job ------- bandwidth : float The bandwidth parameter. + + Examples + -------- + >>> import numpy as np + >>> from sklearn.cluster import estimate_bandwidth + >>> X = np.array([[1, 1], [2, 1], [1, 0], + ... [4, 7], [3, 5], [3, 6]]) + >>> estimate_bandwidth(X, quantile=0.5) + 1.61... """ X = check_array(X) @@ -211,6 +220,19 @@ def mean_shift( ----- For an example, see :ref:`examples/cluster/plot_mean_shift.py `. + + Examples + -------- + >>> import numpy as np + >>> from sklearn.cluster import mean_shift + >>> X = np.array([[1, 1], [2, 1], [1, 0], + ... [4, 7], [3, 5], [3, 6]]) + >>> cluster_centers, labels = mean_shift(X, bandwidth=2) + >>> cluster_centers + array([[3.33..., 6. ], + [1.33..., 0.66...]]) + >>> labels + array([1, 1, 1, 0, 0, 0]) """ model = MeanShift( bandwidth=bandwidth, diff --git a/sklearn/cluster/_optics.py b/sklearn/cluster/_optics.py index 87cecfd8a93a6..493b7f40389cb 100755 --- a/sklearn/cluster/_optics.py +++ b/sklearn/cluster/_optics.py @@ -562,6 +562,34 @@ def compute_optics_graph( .. [1] Ankerst, Mihael, Markus M. Breunig, Hans-Peter Kriegel, and Jörg Sander. "OPTICS: ordering points to identify the clustering structure." ACM SIGMOD Record 28, no. 2 (1999): 49-60. + + Examples + -------- + >>> import numpy as np + >>> from sklearn.cluster import compute_optics_graph + >>> X = np.array([[1, 2], [2, 5], [3, 6], + ... [8, 7], [8, 8], [7, 3]]) + >>> ordering, core_distances, reachability, predecessor = compute_optics_graph( + ... X, + ... min_samples=2, + ... max_eps=np.inf, + ... metric="minkowski", + ... p=2, + ... metric_params=None, + ... algorithm="auto", + ... leaf_size=30, + ... n_jobs=None, + ... ) + >>> ordering + array([0, 1, 2, 5, 3, 4]) + >>> core_distances + array([3.16..., 1.41..., 1.41..., 1. , 1. , + 4.12...]) + >>> reachability + array([ inf, 3.16..., 1.41..., 4.12..., 1. , + 5. ]) + >>> predecessor + array([-1, 0, 1, 5, 3, 2]) """ n_samples = X.shape[0] _validate_size(min_samples, n_samples, "min_samples") @@ -720,6 +748,33 @@ def cluster_optics_dbscan(*, reachability, core_distances, ordering, eps): ------- labels_ : array of shape (n_samples,) The estimated labels. + + Examples + -------- + >>> import numpy as np + >>> from sklearn.cluster import cluster_optics_dbscan, compute_optics_graph + >>> X = np.array([[1, 2], [2, 5], [3, 6], + ... [8, 7], [8, 8], [7, 3]]) + >>> ordering, core_distances, reachability, predecessor = compute_optics_graph( + ... X, + ... min_samples=2, + ... max_eps=np.inf, + ... metric="minkowski", + ... p=2, + ... metric_params=None, + ... algorithm="auto", + ... leaf_size=30, + ... n_jobs=None, + ... ) + >>> eps = 4.5 + >>> labels = cluster_optics_dbscan( + ... reachability=reachability, + ... core_distances=core_distances, + ... ordering=ordering, + ... eps=eps, + ... ) + >>> labels + array([0, 0, 0, 1, 1, 1]) """ n_samples = len(core_distances) labels = np.zeros(n_samples, dtype=int) @@ -806,6 +861,37 @@ def cluster_optics_xi( clusters come after such nested smaller clusters. Since ``labels`` does not reflect the hierarchy, usually ``len(clusters) > np.unique(labels)``. 
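
In practice these low-level helpers are usually driven through the
:class:`~sklearn.cluster.OPTICS` estimator, which chains
:func:`compute_optics_graph` with one of the extraction functions
(``cluster_method='xi'`` by default, ``'dbscan'`` as the alternative). A
minimal sketch on the same toy data as the examples::

    import numpy as np
    from sklearn.cluster import OPTICS

    X = np.array([[1, 2], [2, 5], [3, 6],
                  [8, 7], [8, 8], [7, 3]])

    # Equivalent to compute_optics_graph followed by cluster_optics_xi.
    clust = OPTICS(min_samples=2).fit(X)
    print(clust.labels_)        # per-sample labels, -1 marks noise
    print(clust.reachability_)  # same reachability values as the helper
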
+ + Examples + -------- + >>> import numpy as np + >>> from sklearn.cluster import cluster_optics_xi, compute_optics_graph + >>> X = np.array([[1, 2], [2, 5], [3, 6], + ... [8, 7], [8, 8], [7, 3]]) + >>> ordering, core_distances, reachability, predecessor = compute_optics_graph( + ... X, + ... min_samples=2, + ... max_eps=np.inf, + ... metric="minkowski", + ... p=2, + ... metric_params=None, + ... algorithm="auto", + ... leaf_size=30, + ... n_jobs=None + ... ) + >>> min_samples = 2 + >>> labels, clusters = cluster_optics_xi( + ... reachability=reachability, + ... predecessor=predecessor, + ... ordering=ordering, + ... min_samples=min_samples, + ... ) + >>> labels + array([0, 0, 0, 1, 1, 1]) + >>> clusters + array([[0, 2], + [3, 5], + [0, 5]]) """ n_samples = len(reachability) _validate_size(min_samples, n_samples, "min_samples") diff --git a/sklearn/cluster/_spectral.py b/sklearn/cluster/_spectral.py index d42b5526f0122..d925a2ff56bc4 100644 --- a/sklearn/cluster/_spectral.py +++ b/sklearn/cluster/_spectral.py @@ -346,6 +346,19 @@ def spectral_clustering( streaming graph challenge (Preliminary version at arXiv.) David Zhuzhunashvili, Andrew Knyazev <10.1109/HPEC.2017.8091045>` + + Examples + -------- + >>> import numpy as np + >>> from sklearn.metrics.pairwise import pairwise_kernels + >>> from sklearn.cluster import spectral_clustering + >>> X = np.array([[1, 1], [2, 1], [1, 0], + ... [4, 7], [3, 5], [3, 6]]) + >>> affinity = pairwise_kernels(X, metric='rbf') + >>> spectral_clustering( + ... affinity=affinity, n_clusters=2, assign_labels="discretize", random_state=0 + ... ) + array([1, 1, 1, 0, 0, 0]) """ clusterer = SpectralClustering( From 71e5271d8b717b77d720f205d4bfdcb1dcc51402 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Lo=C3=AFc=20Est=C3=A8ve?= Date: Sat, 13 Jan 2024 09:56:00 +0100 Subject: [PATCH 046/554] TST Skip test using subprocess in Pyodide (#28116) --- sklearn/utils/tests/test_testing.py | 1 + 1 file changed, 1 insertion(+) diff --git a/sklearn/utils/tests/test_testing.py b/sklearn/utils/tests/test_testing.py index 7a4b02aeec224..f24b4de928201 100644 --- a/sklearn/utils/tests/test_testing.py +++ b/sklearn/utils/tests/test_testing.py @@ -823,6 +823,7 @@ def test_float32_aware_assert_allclose(): assert_allclose(np.array([1e-5], dtype=np.float32), 0.0, atol=2e-5) +@pytest.mark.xfail(_IS_WASM, reason="cannot start subprocess") def test_assert_run_python_script_without_output(): code = "x = 1" assert_run_python_script_without_output(code) From f48ee39832537424a7b5379b4826e8f0a0eb21d1 Mon Sep 17 00:00:00 2001 From: Yao Xiao <108576690+Charlie-XIAO@users.noreply.github.com> Date: Sat, 13 Jan 2024 22:36:42 +0800 Subject: [PATCH 047/554] FIX dump svmlight when data is read-only (#28111) Co-authored-by: Guillaume Lemaitre --- doc/whats_new/v1.4.rst | 18 ++++++++++++++++++ sklearn/datasets/_svmlight_format_fast.pyx | 2 +- sklearn/datasets/tests/test_svmlight_format.py | 18 ++++++++++++++++++ 3 files changed, 37 insertions(+), 1 deletion(-) diff --git a/doc/whats_new/v1.4.rst b/doc/whats_new/v1.4.rst index a932391b732cd..25a3f600c5446 100644 --- a/doc/whats_new/v1.4.rst +++ b/doc/whats_new/v1.4.rst @@ -2,6 +2,24 @@ .. currentmodule:: sklearn +.. _changes_1_4_1: + +Version 1.4.1 +============= + +**In Development** + +Changelog +--------- + +:mod:`sklearn.datasets` +....................... + +- |Fix| :func:`datasets.dump_svmlight_file` now does not raise `ValueError` when `X` + is read-only, e.g., a `numpy.memmap` instance. + :pr:`28111` by :user:`Yao Xiao `. + + .. 
_changes_1_4: Version 1.4.0 diff --git a/sklearn/datasets/_svmlight_format_fast.pyx b/sklearn/datasets/_svmlight_format_fast.pyx index 31530ed55d251..103d43bf88965 100644 --- a/sklearn/datasets/_svmlight_format_fast.pyx +++ b/sklearn/datasets/_svmlight_format_fast.pyx @@ -131,7 +131,7 @@ ctypedef fused int_or_longlong: def get_dense_row_string( - int_or_float[:, :] X, + const int_or_float[:, :] X, Py_ssize_t[:] x_inds, double_or_longlong[:] x_vals, Py_ssize_t row, diff --git a/sklearn/datasets/tests/test_svmlight_format.py b/sklearn/datasets/tests/test_svmlight_format.py index 10b0e29810ef7..5c641dd79cc63 100644 --- a/sklearn/datasets/tests/test_svmlight_format.py +++ b/sklearn/datasets/tests/test_svmlight_format.py @@ -16,6 +16,7 @@ assert_allclose, assert_array_almost_equal, assert_array_equal, + create_memmap_backed_data, fails_if_pypy, ) from sklearn.utils.fixes import CSR_CONTAINERS @@ -596,3 +597,20 @@ def test_multilabel_y_explicit_zeros(tmp_path, csr_container): _, y_load = load_svmlight_file(save_path, multilabel=True) y_true = [(2.0,), (2.0,), (0.0, 1.0)] assert y_load == y_true + + +def test_dump_read_only(tmp_path): + """Ensure that there is no ValueError when dumping a read-only `X`. + + Non-regression test for: + https://github.com/scikit-learn/scikit-learn/issues/28026 + """ + rng = np.random.RandomState(42) + X = rng.randn(5, 2) + y = rng.randn(5) + + # Convert to memmap-backed which are read-only + X, y = create_memmap_backed_data([X, y]) + + save_path = str(tmp_path / "svm_read_only") + dump_svmlight_file(X, y, save_path) From b3a54c085b6319458da33868957b9a783f7714d2 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Lo=C3=AFc=20Est=C3=A8ve?= Date: Sat, 13 Jan 2024 16:07:19 +0100 Subject: [PATCH 048/554] TST Tweak one more test to facilitate Meson usage (#28112) --- sklearn/tests/test_common.py | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/sklearn/tests/test_common.py b/sklearn/tests/test_common.py index 6dbf54b203e4c..cbd658f25ed28 100644 --- a/sklearn/tests/test_common.py +++ b/sklearn/tests/test_common.py @@ -262,9 +262,10 @@ def test_all_tests_are_importable(): "sklearn.datasets.descr", "sklearn.datasets.images", } + sklearn_path = [os.path.dirname(sklearn.__file__)] lookup = { name: ispkg - for _, name, ispkg in pkgutil.walk_packages(sklearn.__path__, prefix="sklearn.") + for _, name, ispkg in pkgutil.walk_packages(sklearn_path, prefix="sklearn.") } missing_tests = [ name From 198908beb9dce5800a3ef78763a8b48afbc5cff4 Mon Sep 17 00:00:00 2001 From: Yao Xiao <108576690+Charlie-XIAO@users.noreply.github.com> Date: Sun, 14 Jan 2024 00:26:08 +0800 Subject: [PATCH 049/554] DOC fix wrong indentations in the documentation that lead to undesired blockquotes (#28107) Co-authored-by: Guillaume Lemaitre --- doc/about.rst | 68 ++-- doc/computing/computational_performance.rst | 28 +- doc/computing/parallelism.rst | 20 +- doc/computing/scaling_strategies.rst | 52 +-- doc/developers/bug_triaging.rst | 18 +- doc/developers/contributing.rst | 100 ++--- doc/developers/cython.rst | 28 +- doc/developers/maintainer.rst | 98 ++--- doc/developers/minimal_reproducer.rst | 80 ++-- doc/developers/performance.rst | 54 +-- doc/developers/tips.rst | 168 ++++---- doc/model_persistence.rst | 2 +- doc/modules/clustering.rst | 62 +-- doc/modules/cross_validation.rst | 8 +- doc/modules/decomposition.rst | 56 +-- doc/modules/ensemble.rst | 134 +++---- doc/modules/feature_extraction.rst | 124 +++--- doc/modules/feature_selection.rst | 22 +- doc/modules/gaussian_process.rst | 28 +- 
doc/modules/grid_search.rst | 16 +- doc/modules/isotonic.rst | 6 +- doc/modules/kernel_approximation.rst | 12 +- doc/modules/linear_model.rst | 84 ++-- doc/modules/metrics.rst | 21 +- doc/modules/mixture.rst | 12 +- doc/modules/multiclass.rst | 58 +-- doc/modules/neighbors.rst | 10 +- doc/modules/neural_networks_supervised.rst | 76 ++-- doc/modules/outlier_detection.rst | 8 +- doc/modules/preprocessing.rst | 40 +- doc/modules/semi_supervised.rst | 8 +- doc/modules/sgd.rst | 142 +++---- doc/modules/svm.rst | 230 +++++------ doc/modules/tree.rst | 348 ++++++++--------- doc/presentations.rst | 26 +- doc/support.rst | 6 +- doc/tutorial/basic/tutorial.rst | 70 ++-- .../statistical_inference/model_selection.rst | 6 +- .../putting_together.rst | 2 +- .../supervised_learning.rst | 8 +- .../unsupervised_learning.rst | 27 +- .../text_analytics/working_with_text_data.rst | 30 +- doc/whats_new/older_versions.rst | 365 +++++++++--------- doc/whats_new/v0.13.rst | 163 ++++---- doc/whats_new/v0.14.rst | 176 ++++----- doc/whats_new/v0.20.rst | 2 +- sklearn/datasets/descr/breast_cancer.rst | 154 ++++---- sklearn/datasets/descr/california_housing.rst | 24 +- sklearn/datasets/descr/covtype.rst | 12 +- sklearn/datasets/descr/diabetes.rst | 34 +- sklearn/datasets/descr/digits.rst | 14 +- sklearn/datasets/descr/iris.rst | 54 +-- sklearn/datasets/descr/kddcup99.rst | 88 ++--- sklearn/datasets/descr/lfw.rst | 14 +- sklearn/datasets/descr/linnerud.rst | 8 +- sklearn/datasets/descr/olivetti_faces.rst | 20 +- sklearn/datasets/descr/rcv1.rst | 38 +- sklearn/datasets/descr/twenty_newsgroups.rst | 12 +- sklearn/datasets/descr/wine_data.rst | 133 ++++--- 59 files changed, 1865 insertions(+), 1842 deletions(-) diff --git a/doc/about.rst b/doc/about.rst index e462963135b58..2ef0718b92f7e 100644 --- a/doc/about.rst +++ b/doc/about.rst @@ -96,44 +96,44 @@ Citing scikit-learn If you use scikit-learn in a scientific publication, we would appreciate citations to the following paper: - `Scikit-learn: Machine Learning in Python - `_, Pedregosa - *et al.*, JMLR 12, pp. 2825-2830, 2011. - - Bibtex entry:: - - @article{scikit-learn, - title={Scikit-learn: Machine Learning in {P}ython}, - author={Pedregosa, F. and Varoquaux, G. and Gramfort, A. and Michel, V. - and Thirion, B. and Grisel, O. and Blondel, M. and Prettenhofer, P. - and Weiss, R. and Dubourg, V. and Vanderplas, J. and Passos, A. and - Cournapeau, D. and Brucher, M. and Perrot, M. and Duchesnay, E.}, - journal={Journal of Machine Learning Research}, - volume={12}, - pages={2825--2830}, - year={2011} - } +`Scikit-learn: Machine Learning in Python +`_, Pedregosa +*et al.*, JMLR 12, pp. 2825-2830, 2011. + +Bibtex entry:: + + @article{scikit-learn, + title={Scikit-learn: Machine Learning in {P}ython}, + author={Pedregosa, F. and Varoquaux, G. and Gramfort, A. and Michel, V. + and Thirion, B. and Grisel, O. and Blondel, M. and Prettenhofer, P. + and Weiss, R. and Dubourg, V. and Vanderplas, J. and Passos, A. and + Cournapeau, D. and Brucher, M. and Perrot, M. and Duchesnay, E.}, + journal={Journal of Machine Learning Research}, + volume={12}, + pages={2825--2830}, + year={2011} + } If you want to cite scikit-learn for its API or design, you may also want to consider the following paper: - :arxiv:`API design for machine learning software: experiences from the scikit-learn - project <1309.0238>`, Buitinck *et al.*, 2013. 
- - Bibtex entry:: - - @inproceedings{sklearn_api, - author = {Lars Buitinck and Gilles Louppe and Mathieu Blondel and - Fabian Pedregosa and Andreas Mueller and Olivier Grisel and - Vlad Niculae and Peter Prettenhofer and Alexandre Gramfort - and Jaques Grobler and Robert Layton and Jake VanderPlas and - Arnaud Joly and Brian Holt and Ga{\"{e}}l Varoquaux}, - title = {{API} design for machine learning software: experiences from the scikit-learn - project}, - booktitle = {ECML PKDD Workshop: Languages for Data Mining and Machine Learning}, - year = {2013}, - pages = {108--122}, - } +:arxiv:`API design for machine learning software: experiences from the scikit-learn +project <1309.0238>`, Buitinck *et al.*, 2013. + +Bibtex entry:: + + @inproceedings{sklearn_api, + author = {Lars Buitinck and Gilles Louppe and Mathieu Blondel and + Fabian Pedregosa and Andreas Mueller and Olivier Grisel and + Vlad Niculae and Peter Prettenhofer and Alexandre Gramfort + and Jaques Grobler and Robert Layton and Jake VanderPlas and + Arnaud Joly and Brian Holt and Ga{\"{e}}l Varoquaux}, + title = {{API} design for machine learning software: experiences from the scikit-learn + project}, + booktitle = {ECML PKDD Workshop: Languages for Data Mining and Machine Learning}, + year = {2013}, + pages = {108--122}, + } Artwork ------- diff --git a/doc/computing/computational_performance.rst b/doc/computing/computational_performance.rst index dd5720630c377..d6864689502c2 100644 --- a/doc/computing/computational_performance.rst +++ b/doc/computing/computational_performance.rst @@ -39,10 +39,11 @@ machine learning toolkit is the latency at which predictions can be made in a production environment. The main factors that influence the prediction latency are - 1. Number of features - 2. Input data representation and sparsity - 3. Model complexity - 4. Feature extraction + +1. Number of features +2. Input data representation and sparsity +3. Model complexity +4. Feature extraction A last major parameter is also the possibility to do predictions in bulk or one-at-a-time mode. @@ -224,9 +225,9 @@ files, tokenizing the text and hashing it into a common vector space) is taking 100 to 500 times more time than the actual prediction code, depending on the chosen model. - .. |prediction_time| image:: ../auto_examples/applications/images/sphx_glr_plot_out_of_core_classification_004.png - :target: ../auto_examples/applications/plot_out_of_core_classification.html - :scale: 80 +.. |prediction_time| image:: ../auto_examples/applications/images/sphx_glr_plot_out_of_core_classification_004.png + :target: ../auto_examples/applications/plot_out_of_core_classification.html + :scale: 80 .. centered:: |prediction_time| @@ -283,10 +284,11 @@ scikit-learn install with the following command:: python -c "import sklearn; sklearn.show_versions()" Optimized BLAS / LAPACK implementations include: - - Atlas (need hardware specific tuning by rebuilding on the target machine) - - OpenBLAS - - MKL - - Apple Accelerate and vecLib frameworks (OSX only) + +- Atlas (need hardware specific tuning by rebuilding on the target machine) +- OpenBLAS +- MKL +- Apple Accelerate and vecLib frameworks (OSX only) More information can be found on the `NumPy install page `_ and in this @@ -364,5 +366,5 @@ sufficient to not generate the relevant features, leaving their columns empty. Links ...... 
- - :ref:`scikit-learn developer performance documentation ` - - `Scipy sparse matrix formats documentation `_ +- :ref:`scikit-learn developer performance documentation ` +- `Scipy sparse matrix formats documentation `_ diff --git a/doc/computing/parallelism.rst b/doc/computing/parallelism.rst index 0cd02ab5a0449..0fcbf00cd6c04 100644 --- a/doc/computing/parallelism.rst +++ b/doc/computing/parallelism.rst @@ -87,15 +87,15 @@ will use as many threads as possible, i.e. as many threads as logical cores. You can control the exact number of threads that are used either: - - via the ``OMP_NUM_THREADS`` environment variable, for instance when: - running a python script: +- via the ``OMP_NUM_THREADS`` environment variable, for instance when: + running a python script: - .. prompt:: bash $ + .. prompt:: bash $ - OMP_NUM_THREADS=4 python my_script.py + OMP_NUM_THREADS=4 python my_script.py - - or via `threadpoolctl` as explained by `this piece of documentation - `_. +- or via `threadpoolctl` as explained by `this piece of documentation + `_. Parallel NumPy and SciPy routines from numerical libraries .......................................................... @@ -107,15 +107,15 @@ such as MKL, OpenBLAS or BLIS. You can control the exact number of threads used by BLAS for each library using environment variables, namely: - - ``MKL_NUM_THREADS`` sets the number of thread MKL uses, - - ``OPENBLAS_NUM_THREADS`` sets the number of threads OpenBLAS uses - - ``BLIS_NUM_THREADS`` sets the number of threads BLIS uses +- ``MKL_NUM_THREADS`` sets the number of thread MKL uses, +- ``OPENBLAS_NUM_THREADS`` sets the number of threads OpenBLAS uses +- ``BLIS_NUM_THREADS`` sets the number of threads BLIS uses Note that BLAS & LAPACK implementations can also be impacted by `OMP_NUM_THREADS`. To check whether this is the case in your environment, you can inspect how the number of threads effectively used by those libraries is affected when running the following command in a bash or zsh terminal -for different values of `OMP_NUM_THREADS`:: +for different values of `OMP_NUM_THREADS`: .. prompt:: bash $ diff --git a/doc/computing/scaling_strategies.rst b/doc/computing/scaling_strategies.rst index 277d499f4cc13..143643131b0e8 100644 --- a/doc/computing/scaling_strategies.rst +++ b/doc/computing/scaling_strategies.rst @@ -20,9 +20,9 @@ data that cannot fit in a computer's main memory (RAM). Here is a sketch of a system designed to achieve this goal: - 1. a way to stream instances - 2. a way to extract features from instances - 3. an incremental algorithm +1. a way to stream instances +2. a way to extract features from instances +3. an incremental algorithm Streaming instances .................... @@ -62,29 +62,29 @@ balances relevancy and memory footprint could involve some tuning [1]_. 
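
A minimal sketch of that stateless hashing approach with
:class:`~sklearn.feature_extraction.FeatureHasher` (the token lists and
``n_features`` value are only placeholders)::

    from sklearn.feature_extraction import FeatureHasher

    # No fit step: unseen tokens at prediction time hash into the same
    # fixed-size space, at the cost of possible collisions.
    hasher = FeatureHasher(n_features=2**8, input_type="string")
    X = hasher.transform([["cat", "dog"], ["cat", "bird"]])
    print(X.shape)  # (2, 256), a scipy sparse matrix
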
Here is a list of incremental estimators for different tasks: - - Classification - + :class:`sklearn.naive_bayes.MultinomialNB` - + :class:`sklearn.naive_bayes.BernoulliNB` - + :class:`sklearn.linear_model.Perceptron` - + :class:`sklearn.linear_model.SGDClassifier` - + :class:`sklearn.linear_model.PassiveAggressiveClassifier` - + :class:`sklearn.neural_network.MLPClassifier` - - Regression - + :class:`sklearn.linear_model.SGDRegressor` - + :class:`sklearn.linear_model.PassiveAggressiveRegressor` - + :class:`sklearn.neural_network.MLPRegressor` - - Clustering - + :class:`sklearn.cluster.MiniBatchKMeans` - + :class:`sklearn.cluster.Birch` - - Decomposition / feature Extraction - + :class:`sklearn.decomposition.MiniBatchDictionaryLearning` - + :class:`sklearn.decomposition.IncrementalPCA` - + :class:`sklearn.decomposition.LatentDirichletAllocation` - + :class:`sklearn.decomposition.MiniBatchNMF` - - Preprocessing - + :class:`sklearn.preprocessing.StandardScaler` - + :class:`sklearn.preprocessing.MinMaxScaler` - + :class:`sklearn.preprocessing.MaxAbsScaler` +- Classification + + :class:`sklearn.naive_bayes.MultinomialNB` + + :class:`sklearn.naive_bayes.BernoulliNB` + + :class:`sklearn.linear_model.Perceptron` + + :class:`sklearn.linear_model.SGDClassifier` + + :class:`sklearn.linear_model.PassiveAggressiveClassifier` + + :class:`sklearn.neural_network.MLPClassifier` +- Regression + + :class:`sklearn.linear_model.SGDRegressor` + + :class:`sklearn.linear_model.PassiveAggressiveRegressor` + + :class:`sklearn.neural_network.MLPRegressor` +- Clustering + + :class:`sklearn.cluster.MiniBatchKMeans` + + :class:`sklearn.cluster.Birch` +- Decomposition / feature Extraction + + :class:`sklearn.decomposition.MiniBatchDictionaryLearning` + + :class:`sklearn.decomposition.IncrementalPCA` + + :class:`sklearn.decomposition.LatentDirichletAllocation` + + :class:`sklearn.decomposition.MiniBatchNMF` +- Preprocessing + + :class:`sklearn.preprocessing.StandardScaler` + + :class:`sklearn.preprocessing.MinMaxScaler` + + :class:`sklearn.preprocessing.MaxAbsScaler` For classification, a somewhat important thing to note is that although a stateless feature extraction routine may be able to cope with new/unseen diff --git a/doc/developers/bug_triaging.rst b/doc/developers/bug_triaging.rst index 3ec628f7e5867..915ea0a9a22b7 100644 --- a/doc/developers/bug_triaging.rst +++ b/doc/developers/bug_triaging.rst @@ -19,18 +19,18 @@ A third party can give useful feedback or even add comments on the issue. The following actions are typically useful: - - documenting issues that are missing elements to reproduce the problem - such as code samples +- documenting issues that are missing elements to reproduce the problem + such as code samples - - suggesting better use of code formatting +- suggesting better use of code formatting - - suggesting to reformulate the title and description to make them more - explicit about the problem to be solved +- suggesting to reformulate the title and description to make them more + explicit about the problem to be solved - - linking to related issues or discussions while briefly describing how - they are related, for instance "See also #xyz for a similar attempt - at this" or "See also #xyz where the same thing happened in - SomeEstimator" provides context and helps the discussion. 
+- linking to related issues or discussions while briefly describing how + they are related, for instance "See also #xyz for a similar attempt + at this" or "See also #xyz where the same thing happened in + SomeEstimator" provides context and helps the discussion. .. topic:: Fruitful discussions diff --git a/doc/developers/contributing.rst b/doc/developers/contributing.rst index 02e02eb485e8a..26f952b543a03 100644 --- a/doc/developers/contributing.rst +++ b/doc/developers/contributing.rst @@ -291,7 +291,7 @@ The next steps now describe the process of modifying code and submitting a PR: 9. Create a feature branch to hold your development changes: - .. prompt:: bash $ + .. prompt:: bash $ git checkout -b my_feature @@ -529,25 +529,25 @@ Continuous Integration (CI) Please note that if one of the following markers appear in the latest commit message, the following actions are taken. - ====================== =================== - Commit Message Marker Action Taken by CI - ---------------------- ------------------- - [ci skip] CI is skipped completely - [cd build] CD is run (wheels and source distribution are built) - [cd build gh] CD is run only for GitHub Actions - [cd build cirrus] CD is run only for Cirrus CI - [lint skip] Azure pipeline skips linting - [scipy-dev] Build & test with our dependencies (numpy, scipy, etc.) development builds - [nogil] Build & test with the nogil experimental branches of CPython, Cython, NumPy, SciPy, ... - [pypy] Build & test with PyPy - [pyodide] Build & test with Pyodide - [azure parallel] Run Azure CI jobs in parallel - [cirrus arm] Run Cirrus CI ARM test - [float32] Run float32 tests by setting `SKLEARN_RUN_FLOAT32_TESTS=1`. See :ref:`environment_variable` for more details - [doc skip] Docs are not built - [doc quick] Docs built, but excludes example gallery plots - [doc build] Docs built including example gallery plots (very long) - ====================== =================== +====================== =================== +Commit Message Marker Action Taken by CI +---------------------- ------------------- +[ci skip] CI is skipped completely +[cd build] CD is run (wheels and source distribution are built) +[cd build gh] CD is run only for GitHub Actions +[cd build cirrus] CD is run only for Cirrus CI +[lint skip] Azure pipeline skips linting +[scipy-dev] Build & test with our dependencies (numpy, scipy, etc.) development builds +[nogil] Build & test with the nogil experimental branches of CPython, Cython, NumPy, SciPy, ... +[pypy] Build & test with PyPy +[pyodide] Build & test with Pyodide +[azure parallel] Run Azure CI jobs in parallel +[cirrus arm] Run Cirrus CI ARM test +[float32] Run float32 tests by setting `SKLEARN_RUN_FLOAT32_TESTS=1`. See :ref:`environment_variable` for more details +[doc skip] Docs are not built +[doc quick] Docs built, but excludes example gallery plots +[doc build] Docs built including example gallery plots (very long) +====================== =================== Note that, by default, the documentation is built but only the examples that are directly modified by the pull request are executed. @@ -713,30 +713,30 @@ We are glad to accept any sort of documentation: In general have the following in mind: - * Use Python basic types. 
(``bool`` instead of ``boolean``) - * Use parenthesis for defining shapes: ``array-like of shape (n_samples,)`` - or ``array-like of shape (n_samples, n_features)`` - * For strings with multiple options, use brackets: ``input: {'log', - 'squared', 'multinomial'}`` - * 1D or 2D data can be a subset of ``{array-like, ndarray, sparse matrix, - dataframe}``. Note that ``array-like`` can also be a ``list``, while - ``ndarray`` is explicitly only a ``numpy.ndarray``. - * Specify ``dataframe`` when "frame-like" features are being used, such as - the column names. - * When specifying the data type of a list, use ``of`` as a delimiter: ``list - of int``. When the parameter supports arrays giving details about the - shape and/or data type and a list of such arrays, you can use one of - ``array-like of shape (n_samples,) or list of such arrays``. - * When specifying the dtype of an ndarray, use e.g. ``dtype=np.int32`` after - defining the shape: ``ndarray of shape (n_samples,), dtype=np.int32``. You - can specify multiple dtype as a set: ``array-like of shape (n_samples,), - dtype={np.float64, np.float32}``. If one wants to mention arbitrary - precision, use `integral` and `floating` rather than the Python dtype - `int` and `float`. When both `int` and `floating` are supported, there is - no need to specify the dtype. - * When the default is ``None``, ``None`` only needs to be specified at the - end with ``default=None``. Be sure to include in the docstring, what it - means for the parameter or attribute to be ``None``. + * Use Python basic types. (``bool`` instead of ``boolean``) + * Use parenthesis for defining shapes: ``array-like of shape (n_samples,)`` + or ``array-like of shape (n_samples, n_features)`` + * For strings with multiple options, use brackets: ``input: {'log', + 'squared', 'multinomial'}`` + * 1D or 2D data can be a subset of ``{array-like, ndarray, sparse matrix, + dataframe}``. Note that ``array-like`` can also be a ``list``, while + ``ndarray`` is explicitly only a ``numpy.ndarray``. + * Specify ``dataframe`` when "frame-like" features are being used, such as + the column names. + * When specifying the data type of a list, use ``of`` as a delimiter: ``list + of int``. When the parameter supports arrays giving details about the + shape and/or data type and a list of such arrays, you can use one of + ``array-like of shape (n_samples,) or list of such arrays``. + * When specifying the dtype of an ndarray, use e.g. ``dtype=np.int32`` after + defining the shape: ``ndarray of shape (n_samples,), dtype=np.int32``. You + can specify multiple dtype as a set: ``array-like of shape (n_samples,), + dtype={np.float64, np.float32}``. If one wants to mention arbitrary + precision, use `integral` and `floating` rather than the Python dtype + `int` and `float`. When both `int` and `floating` are supported, there is + no need to specify the dtype. + * When the default is ``None``, ``None`` only needs to be specified at the + end with ``default=None``. Be sure to include in the docstring, what it + means for the parameter or attribute to be ``None``. * Add "See Also" in docstrings for related classes/functions. @@ -809,15 +809,15 @@ details, and give intuition to the reader on what the algorithm does. * Information that can be hidden by default using dropdowns is: - * low hierarchy sections such as `References`, `Properties`, etc. (see for - instance the subsections in :ref:`det_curve`); + * low hierarchy sections such as `References`, `Properties`, etc. 
(see for + instance the subsections in :ref:`det_curve`); - * in-depth mathematical details; + * in-depth mathematical details; - * narrative that is use-case specific; + * narrative that is use-case specific; - * in general, narrative that may only interest users that want to go beyond - the pragmatics of a given tool. + * in general, narrative that may only interest users that want to go beyond + the pragmatics of a given tool. * Do not use dropdowns for the low level section `Examples`, as it should stay visible to all users. Make sure that the `Examples` section comes right after diff --git a/doc/developers/cython.rst b/doc/developers/cython.rst index 8558169848052..e98501879d50e 100644 --- a/doc/developers/cython.rst +++ b/doc/developers/cython.rst @@ -58,13 +58,13 @@ Tips to ease development * You might find this alias to compile individual Cython extension handy: - .. code-block:: + .. code-block:: - # You might want to add this alias to your shell script config. - alias cythonX="cython -X language_level=3 -X boundscheck=False -X wraparound=False -X initializedcheck=False -X nonecheck=False -X cdivision=True" + # You might want to add this alias to your shell script config. + alias cythonX="cython -X language_level=3 -X boundscheck=False -X wraparound=False -X initializedcheck=False -X nonecheck=False -X cdivision=True" - # This generates `source.c` as if you had recompiled scikit-learn entirely. - cythonX --annotate source.pyx + # This generates `source.c` as if you had recompiled scikit-learn entirely. + cythonX --annotate source.pyx * Using the ``--annotate`` option with this flag allows generating a HTML report of code annotation. This report indicates interactions with the CPython interpreter on a line-by-line basis. @@ -72,10 +72,10 @@ Tips to ease development the computationally intensive sections of the algorithms. For more information, please refer to `this section of Cython's tutorial `_ - .. code-block:: + .. code-block:: - # This generates a HTML report (`source.html`) for `source.c`. - cythonX --annotate source.pyx + # This generates a HTML report (`source.html`) for `source.c`. + cythonX --annotate source.pyx Tips for performance ^^^^^^^^^^^^^^^^^^^^ @@ -107,16 +107,16 @@ Tips for performance the GIL when entering them. You have to do that yourself either by passing ``nogil=True`` to ``cython.parallel.prange`` explicitly, or by using an explicit context manager: - .. code-block:: cython + .. code-block:: cython - cdef inline void my_func(self) nogil: + cdef inline void my_func(self) nogil: - # Some logic interacting with CPython, e.g. allocating arrays via NumPy. + # Some logic interacting with CPython, e.g. allocating arrays via NumPy. - with nogil: - # The code here is run as is it were written in C. + with nogil: + # The code here is run as is it were written in C. - return 0 + return 0 This item is based on `this comment from Stéfan's Benhel `_ diff --git a/doc/developers/maintainer.rst b/doc/developers/maintainer.rst index d2a1d21523f78..048ad5d9906a1 100644 --- a/doc/developers/maintainer.rst +++ b/doc/developers/maintainer.rst @@ -81,16 +81,16 @@ tag under that branch. This is done only once, as the major and minor releases happen on the same branch: - .. prompt:: bash $ +.. 
prompt:: bash $ - # Assuming upstream is an alias for the main scikit-learn repo: - git fetch upstream main - git checkout upstream/main - git checkout -b 0.99.X - git push --set-upstream upstream 0.99.X + # Assuming upstream is an alias for the main scikit-learn repo: + git fetch upstream main + git checkout upstream/main + git checkout -b 0.99.X + git push --set-upstream upstream 0.99.X - Again, `X` is literal here, and `99` is replaced by the release number. - The branches are called ``0.19.X``, ``0.20.X``, etc. +Again, `X` is literal here, and `99` is replaced by the release number. +The branches are called ``0.19.X``, ``0.20.X``, etc. In terms of including changes, the first RC ideally counts as a *feature freeze*. Each coming release candidate and the final release afterwards will @@ -121,67 +121,67 @@ The minor releases should include bug fixes and some relevant documentation changes only. Any PR resulting in a behavior change which is not a bug fix should be excluded. As an example, instructions are given for the `1.2.2` release. - - Create a branch, **on your own fork** (here referred to as `fork`) for the release - from `upstream/main`. +- Create a branch, **on your own fork** (here referred to as `fork`) for the release + from `upstream/main`. - .. prompt:: bash $ + .. prompt:: bash $ - git fetch upstream/main - git checkout -b release-1.2.2 upstream/main - git push -u fork release-1.2.2:release-1.2.2 + git fetch upstream/main + git checkout -b release-1.2.2 upstream/main + git push -u fork release-1.2.2:release-1.2.2 - - Create a **draft** PR to the `upstream/1.2.X` branch (not to `upstream/main`) - with all the desired changes. +- Create a **draft** PR to the `upstream/1.2.X` branch (not to `upstream/main`) + with all the desired changes. - - Do not push anything on that branch yet. +- Do not push anything on that branch yet. - - Locally rebase `release-1.2.2` from the `upstream/1.2.X` branch using: +- Locally rebase `release-1.2.2` from the `upstream/1.2.X` branch using: - .. prompt:: bash $ + .. prompt:: bash $ - git rebase -i upstream/1.2.X + git rebase -i upstream/1.2.X - This will open an interactive rebase with the `git-rebase-todo` containing all - the latest commit on `main`. At this stage, you have to perform - this interactive rebase with at least someone else (being three people rebasing - is better not to forget something and to avoid any doubt). + This will open an interactive rebase with the `git-rebase-todo` containing all + the latest commit on `main`. At this stage, you have to perform + this interactive rebase with at least someone else (being three people rebasing + is better not to forget something and to avoid any doubt). - - **Do not remove lines but drop commit by replace** ``pick`` **with** ``drop`` + - **Do not remove lines but drop commit by replace** ``pick`` **with** ``drop`` - - Commits to pick for bug-fix release *generally* are prefixed with: `FIX`, `CI`, - `DOC`. They should at least include all the commits of the merged PRs - that were milestoned for this release on GitHub and/or documented as such in - the changelog. It's likely that some bugfixes were documented in the - changelog of the main major release instead of the next bugfix release, - in which case, the matching changelog entries will need to be moved, - first in the `main` branch then backported in the release PR. + - Commits to pick for bug-fix release *generally* are prefixed with: `FIX`, `CI`, + `DOC`. 
They should at least include all the commits of the merged PRs + that were milestoned for this release on GitHub and/or documented as such in + the changelog. It's likely that some bugfixes were documented in the + changelog of the main major release instead of the next bugfix release, + in which case, the matching changelog entries will need to be moved, + first in the `main` branch then backported in the release PR. - - Commits to drop for bug-fix release *generally* are prefixed with: `FEAT`, - `MAINT`, `ENH`, `API`. Reasons for not including them is to prevent change of - behavior (which only must feature in breaking or major releases). + - Commits to drop for bug-fix release *generally* are prefixed with: `FEAT`, + `MAINT`, `ENH`, `API`. Reasons for not including them is to prevent change of + behavior (which only must feature in breaking or major releases). - - After having dropped or picked commit, **do no exit** but paste the content - of the `git-rebase-todo` message in the PR. - This file is located at `.git/rebase-merge/git-rebase-todo`. + - After having dropped or picked commit, **do no exit** but paste the content + of the `git-rebase-todo` message in the PR. + This file is located at `.git/rebase-merge/git-rebase-todo`. - - Save and exit, starting the interactive rebase. + - Save and exit, starting the interactive rebase. - - Resolve merge conflicts when they happen. + - Resolve merge conflicts when they happen. - - Force push the result of the rebase and the extra release commits to the release PR: +- Force push the result of the rebase and the extra release commits to the release PR: - .. prompt:: bash $ + .. prompt:: bash $ - git push -f fork release-1.2.2:release-1.2.2 + git push -f fork release-1.2.2:release-1.2.2 - - Copy the :ref:`release_checklist` template and paste it in the description of the - Pull Request to track progress. +- Copy the :ref:`release_checklist` template and paste it in the description of the + Pull Request to track progress. - - Review all the commits included in the release to make sure that they do not - introduce any new feature. We should not blindly trust the commit message prefixes. +- Review all the commits included in the release to make sure that they do not + introduce any new feature. We should not blindly trust the commit message prefixes. - - Remove the draft status of the release PR and invite other maintainers to review the - list of included commits. +- Remove the draft status of the release PR and invite other maintainers to review the + list of included commits. .. _making_a_release: diff --git a/doc/developers/minimal_reproducer.rst b/doc/developers/minimal_reproducer.rst index 2cc82d083aaf1..b100bccbaa6b4 100644 --- a/doc/developers/minimal_reproducer.rst +++ b/doc/developers/minimal_reproducer.rst @@ -88,9 +88,9 @@ The following code, while **still not minimal**, is already **much better** because it can be copy-pasted in a Python terminal to reproduce the problem in one step. In particular: - - it contains **all necessary imports statements**; - - it can fetch the public dataset without having to manually download a - file and put it in the expected location on the disk. +- it contains **all necessary imports statements**; +- it can fetch the public dataset without having to manually download a + file and put it in the expected location on the disk. **Improved example** @@ -199,21 +199,21 @@ As already mentioned, the key to communication is the readability of the code and good formatting can really be a plus. 
Notice that in the previous snippet we: - - try to limit all lines to a maximum of 79 characters to avoid horizontal - scrollbars in the code snippets blocks rendered on the GitHub issue; - - use blank lines to separate groups of related functions; - - place all the imports in their own group at the beginning. +- try to limit all lines to a maximum of 79 characters to avoid horizontal + scrollbars in the code snippets blocks rendered on the GitHub issue; +- use blank lines to separate groups of related functions; +- place all the imports in their own group at the beginning. The simplification steps presented in this guide can be implemented in a different order than the progression we have shown here. The important points are: - - a minimal reproducer should be runnable by a simple copy-and-paste in a - python terminal; - - it should be simplified as much as possible by removing any code steps - that are not strictly needed to reproducing the original problem; - - it should ideally only rely on a minimal dataset generated on-the-fly by - running the code instead of relying on external data, if possible. +- a minimal reproducer should be runnable by a simple copy-and-paste in a + python terminal; +- it should be simplified as much as possible by removing any code steps + that are not strictly needed to reproducing the original problem; +- it should ideally only rely on a minimal dataset generated on-the-fly by + running the code instead of relying on external data, if possible. Use markdown formatting @@ -305,50 +305,50 @@ can be used to create dummy numeric data. - regression - Regressions take continuous numeric data as features and target. + Regressions take continuous numeric data as features and target. - .. code-block:: python + .. code-block:: python - import numpy as np + import numpy as np - rng = np.random.RandomState(0) - n_samples, n_features = 5, 5 - X = rng.randn(n_samples, n_features) - y = rng.randn(n_samples) + rng = np.random.RandomState(0) + n_samples, n_features = 5, 5 + X = rng.randn(n_samples, n_features) + y = rng.randn(n_samples) A similar snippet can be used as synthetic data when testing scaling tools such as :class:`sklearn.preprocessing.StandardScaler`. - classification - If the bug is not raised during when encoding a categorical variable, you can - feed numeric data to a classifier. Just remember to ensure that the target - is indeed an integer. + If the bug is not raised during when encoding a categorical variable, you can + feed numeric data to a classifier. Just remember to ensure that the target + is indeed an integer. - .. code-block:: python + .. code-block:: python - import numpy as np + import numpy as np - rng = np.random.RandomState(0) - n_samples, n_features = 5, 5 - X = rng.randn(n_samples, n_features) - y = rng.randint(0, 2, n_samples) # binary target with values in {0, 1} + rng = np.random.RandomState(0) + n_samples, n_features = 5, 5 + X = rng.randn(n_samples, n_features) + y = rng.randint(0, 2, n_samples) # binary target with values in {0, 1} - If the bug only happens with non-numeric class labels, you might want to - generate a random target with `numpy.random.choice - `_. + If the bug only happens with non-numeric class labels, you might want to + generate a random target with `numpy.random.choice + `_. - .. code-block:: python + .. 
code-block:: python - import numpy as np + import numpy as np - rng = np.random.RandomState(0) - n_samples, n_features = 50, 5 - X = rng.randn(n_samples, n_features) - y = np.random.choice( - ["male", "female", "other"], size=n_samples, p=[0.49, 0.49, 0.02] - ) + rng = np.random.RandomState(0) + n_samples, n_features = 50, 5 + X = rng.randn(n_samples, n_features) + y = np.random.choice( + ["male", "female", "other"], size=n_samples, p=[0.49, 0.49, 0.02] + ) Pandas ------ diff --git a/doc/developers/performance.rst b/doc/developers/performance.rst index 287262255535f..42687945a2bba 100644 --- a/doc/developers/performance.rst +++ b/doc/developers/performance.rst @@ -46,31 +46,31 @@ Sometimes however an algorithm cannot be expressed efficiently in simple vectorized Numpy code. In this case, the recommended strategy is the following: - 1. **Profile** the Python implementation to find the main bottleneck and - isolate it in a **dedicated module level function**. This function - will be reimplemented as a compiled extension module. - - 2. If there exists a well maintained BSD or MIT **C/C++** implementation - of the same algorithm that is not too big, you can write a - **Cython wrapper** for it and include a copy of the source code - of the library in the scikit-learn source tree: this strategy is - used for the classes :class:`svm.LinearSVC`, :class:`svm.SVC` and - :class:`linear_model.LogisticRegression` (wrappers for liblinear - and libsvm). - - 3. Otherwise, write an optimized version of your Python function using - **Cython** directly. This strategy is used - for the :class:`linear_model.ElasticNet` and - :class:`linear_model.SGDClassifier` classes for instance. - - 4. **Move the Python version of the function in the tests** and use - it to check that the results of the compiled extension are consistent - with the gold standard, easy to debug Python version. - - 5. Once the code is optimized (not simple bottleneck spottable by - profiling), check whether it is possible to have **coarse grained - parallelism** that is amenable to **multi-processing** by using the - ``joblib.Parallel`` class. +1. **Profile** the Python implementation to find the main bottleneck and + isolate it in a **dedicated module level function**. This function + will be reimplemented as a compiled extension module. + +2. If there exists a well maintained BSD or MIT **C/C++** implementation + of the same algorithm that is not too big, you can write a + **Cython wrapper** for it and include a copy of the source code + of the library in the scikit-learn source tree: this strategy is + used for the classes :class:`svm.LinearSVC`, :class:`svm.SVC` and + :class:`linear_model.LogisticRegression` (wrappers for liblinear + and libsvm). + +3. Otherwise, write an optimized version of your Python function using + **Cython** directly. This strategy is used + for the :class:`linear_model.ElasticNet` and + :class:`linear_model.SGDClassifier` classes for instance. + +4. **Move the Python version of the function in the tests** and use + it to check that the results of the compiled extension are consistent + with the gold standard, easy to debug Python version. + +5. Once the code is optimized (not simple bottleneck spottable by + profiling), check whether it is possible to have **coarse grained + parallelism** that is amenable to **multi-processing** by using the + ``joblib.Parallel`` class. 
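
The coarse-grained pattern of step 5, sketched with joblib's public API
(the squaring task is only a stand-in for a real per-split or
per-estimator computation)::

    >>> from joblib import Parallel, delayed
    >>> Parallel(n_jobs=2)(delayed(pow)(i, 2) for i in range(5))
    [0, 1, 4, 9, 16]
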
When using Cython, use either @@ -187,7 +187,7 @@ us install ``line_profiler`` and wire it to IPython: pip install line_profiler -- **Under IPython 0.13+**, first create a configuration profile: +**Under IPython 0.13+**, first create a configuration profile: .. prompt:: bash $ @@ -265,7 +265,7 @@ install the latest version: Then, setup the magics in a manner similar to ``line_profiler``. -- **Under IPython 0.11+**, first create a configuration profile: +**Under IPython 0.11+**, first create a configuration profile: .. prompt:: bash $ diff --git a/doc/developers/tips.rst b/doc/developers/tips.rst index 3d42626126f8a..f8537236c32d8 100644 --- a/doc/developers/tips.rst +++ b/doc/developers/tips.rst @@ -73,27 +73,25 @@ will run all :term:`common tests` for the ``LogisticRegression`` estimator. When a unit test fails, the following tricks can make debugging easier: - 1. The command line argument ``pytest -l`` instructs pytest to print the local - variables when a failure occurs. +1. The command line argument ``pytest -l`` instructs pytest to print the local + variables when a failure occurs. - 2. The argument ``pytest --pdb`` drops into the Python debugger on failure. To - instead drop into the rich IPython debugger ``ipdb``, you may set up a - shell alias to: +2. The argument ``pytest --pdb`` drops into the Python debugger on failure. To + instead drop into the rich IPython debugger ``ipdb``, you may set up a + shell alias to: -.. prompt:: bash $ + .. prompt:: bash $ - pytest --pdbcls=IPython.terminal.debugger:TerminalPdb --capture no + pytest --pdbcls=IPython.terminal.debugger:TerminalPdb --capture no Other `pytest` options that may become useful include: - - ``-x`` which exits on the first failed test - - ``--lf`` to rerun the tests that failed on the previous run - - ``--ff`` to rerun all previous tests, running the ones that failed first - - ``-s`` so that pytest does not capture the output of ``print()`` - statements - - ``--tb=short`` or ``--tb=line`` to control the length of the logs - - ``--runxfail`` also run tests marked as a known failure (XFAIL) and report - errors. +- ``-x`` which exits on the first failed test, +- ``--lf`` to rerun the tests that failed on the previous run, +- ``--ff`` to rerun all previous tests, running the ones that failed first, +- ``-s`` so that pytest does not capture the output of ``print()`` statements, +- ``--tb=short`` or ``--tb=line`` to control the length of the logs, +- ``--runxfail`` also run tests marked as a known failure (XFAIL) and report errors. Since our continuous integration tests will error if ``FutureWarning`` isn't properly caught, @@ -114,113 +112,135 @@ replies `_ for reviewing: Note that putting this content on a single line in a literal is the easiest way to make it copyable and wrapped on screen. Issue: Usage questions - :: - You are asking a usage question. The issue tracker is for bugs and new features. For usage questions, it is recommended to try [Stack Overflow](https://stackoverflow.com/questions/tagged/scikit-learn) or [the Mailing List](https://mail.python.org/mailman/listinfo/scikit-learn). +:: + + You are asking a usage question. The issue tracker is for bugs and new features. For usage questions, it is recommended to try [Stack Overflow](https://stackoverflow.com/questions/tagged/scikit-learn) or [the Mailing List](https://mail.python.org/mailman/listinfo/scikit-learn). - Unfortunately, we need to close this issue as this issue tracker is a communication tool used for the development of scikit-learn. 
The additional activity created by usage questions crowds it too much and impedes this development. The conversation can continue here, however there is no guarantee that is will receive attention from core developers. + Unfortunately, we need to close this issue as this issue tracker is a communication tool used for the development of scikit-learn. The additional activity created by usage questions crowds it too much and impedes this development. The conversation can continue here, however there is no guarantee that is will receive attention from core developers. Issue: You're welcome to update the docs - :: - Please feel free to offer a pull request updating the documentation if you feel it could be improved. +:: + + Please feel free to offer a pull request updating the documentation if you feel it could be improved. Issue: Self-contained example for bug - :: - Please provide [self-contained example code](https://scikit-learn.org/dev/developers/minimal_reproducer.html), including imports and data (if possible), so that other contributors can just run it and reproduce your issue. Ideally your example code should be minimal. +:: + + Please provide [self-contained example code](https://scikit-learn.org/dev/developers/minimal_reproducer.html), including imports and data (if possible), so that other contributors can just run it and reproduce your issue. Ideally your example code should be minimal. Issue: Software versions - :: - To help diagnose your issue, please paste the output of: - ```py - import sklearn; sklearn.show_versions() - ``` - Thanks. +:: + + To help diagnose your issue, please paste the output of: + ```py + import sklearn; sklearn.show_versions() + ``` + Thanks. Issue: Code blocks - :: - Readability can be greatly improved if you [format](https://help.github.com/articles/creating-and-highlighting-code-blocks/) your code snippets and complete error messages appropriately. For example: +:: + + Readability can be greatly improved if you [format](https://help.github.com/articles/creating-and-highlighting-code-blocks/) your code snippets and complete error messages appropriately. For example: - ```python - print(something) - ``` - generates: ```python print(something) ``` - And: - - ```pytb - Traceback (most recent call last): - File "", line 1, in - ImportError: No module named 'hello' - ``` - generates: + + generates: + + ```python + print(something) + ``` + + And: + ```pytb Traceback (most recent call last): - File "", line 1, in + File "", line 1, in ImportError: No module named 'hello' ``` - You can edit your issue descriptions and comments at any time to improve readability. This helps maintainers a lot. Thanks! + + generates: + + ```pytb + Traceback (most recent call last): + File "", line 1, in + ImportError: No module named 'hello' + ``` + + You can edit your issue descriptions and comments at any time to improve readability. This helps maintainers a lot. Thanks! Issue/Comment: Linking to code - :: - Friendly advice: for clarity's sake, you can link to code like [this](https://help.github.com/articles/creating-a-permanent-link-to-a-code-snippet/). +:: + + Friendly advice: for clarity's sake, you can link to code like [this](https://help.github.com/articles/creating-a-permanent-link-to-a-code-snippet/). Issue/Comment: Linking to comments - :: - Please use links to comments, which make it a lot easier to see what you are referring to, rather than just linking to the issue. 
See [this](https://stackoverflow.com/questions/25163598/how-do-i-reference-a-specific-issue-comment-on-github) for more details. +:: + + Please use links to comments, which make it a lot easier to see what you are referring to, rather than just linking to the issue. See [this](https://stackoverflow.com/questions/25163598/how-do-i-reference-a-specific-issue-comment-on-github) for more details. PR-NEW: Better description and title - :: - Thanks for the pull request! Please make the title of the PR more descriptive. The title will become the commit message when this is merged. You should state what issue (or PR) it fixes/resolves in the description using the syntax described [here](https://scikit-learn.org/dev/developers/contributing.html#contributing-pull-requests). +:: + + Thanks for the pull request! Please make the title of the PR more descriptive. The title will become the commit message when this is merged. You should state what issue (or PR) it fixes/resolves in the description using the syntax described [here](https://scikit-learn.org/dev/developers/contributing.html#contributing-pull-requests). PR-NEW: Fix # - :: - Please use "Fix #issueNumber" in your PR description (and you can do it more than once). This way the associated issue gets closed automatically when the PR is merged. For more details, look at [this](https://github.com/blog/1506-closing-issues-via-pull-requests). +:: + + Please use "Fix #issueNumber" in your PR description (and you can do it more than once). This way the associated issue gets closed automatically when the PR is merged. For more details, look at [this](https://github.com/blog/1506-closing-issues-via-pull-requests). PR-NEW or Issue: Maintenance cost - :: - Every feature we include has a [maintenance cost](https://scikit-learn.org/dev/faq.html#why-are-you-so-selective-on-what-algorithms-you-include-in-scikit-learn). Our maintainers are mostly volunteers. For a new feature to be included, we need evidence that it is often useful and, ideally, [well-established](https://scikit-learn.org/dev/faq.html#what-are-the-inclusion-criteria-for-new-algorithms) in the literature or in practice. Also, we expect PR authors to take part in the maintenance for the code they submit, at least initially. That doesn't stop you implementing it for yourself and publishing it in a separate repository, or even [scikit-learn-contrib](https://scikit-learn-contrib.github.io). +:: + + Every feature we include has a [maintenance cost](https://scikit-learn.org/dev/faq.html#why-are-you-so-selective-on-what-algorithms-you-include-in-scikit-learn). Our maintainers are mostly volunteers. For a new feature to be included, we need evidence that it is often useful and, ideally, [well-established](https://scikit-learn.org/dev/faq.html#what-are-the-inclusion-criteria-for-new-algorithms) in the literature or in practice. Also, we expect PR authors to take part in the maintenance for the code they submit, at least initially. That doesn't stop you implementing it for yourself and publishing it in a separate repository, or even [scikit-learn-contrib](https://scikit-learn-contrib.github.io). PR-WIP: What's needed before merge? - :: - Please clarify (perhaps as a TODO list in the PR description) what work you believe still needs to be done before it can be reviewed for merge. When it is ready, please prefix the PR title with `[MRG]`. +:: + + Please clarify (perhaps as a TODO list in the PR description) what work you believe still needs to be done before it can be reviewed for merge. 
When it is ready, please prefix the PR title with `[MRG]`.
 
 PR-WIP: Regression test needed
- ::
- Please add a [non-regression test](https://en.wikipedia.org/wiki/Non-regression_testing) that would fail at main but pass in this PR.
+::
+
+    Please add a [non-regression test](https://en.wikipedia.org/wiki/Non-regression_testing) that would fail at main but pass in this PR.
 
 PR-WIP: PEP8
- ::
- You have some [PEP8](https://www.python.org/dev/peps/pep-0008/) violations, whose details you can see in the Circle CI `lint` job. It might be worth configuring your code editor to check for such errors on the fly, so you can catch them before committing.
+::
+
+    You have some [PEP8](https://www.python.org/dev/peps/pep-0008/) violations, whose details you can see in the Circle CI `lint` job. It might be worth configuring your code editor to check for such errors on the fly, so you can catch them before committing.
 
 PR-MRG: Patience
- ::
- Before merging, we generally require two core developers to agree that your pull request is desirable and ready. [Please be patient](https://scikit-learn.org/dev/faq.html#why-is-my-pull-request-not-getting-any-attention), as we mostly rely on volunteered time from busy core developers. (You are also welcome to help us out with [reviewing other PRs](https://scikit-learn.org/dev/developers/contributing.html#code-review-guidelines).)
+::
+
+    Before merging, we generally require two core developers to agree that your pull request is desirable and ready. [Please be patient](https://scikit-learn.org/dev/faq.html#why-is-my-pull-request-not-getting-any-attention), as we mostly rely on volunteered time from busy core developers. (You are also welcome to help us out with [reviewing other PRs](https://scikit-learn.org/dev/developers/contributing.html#code-review-guidelines).)
 
 PR-MRG: Add to what's new
- ::
- Please add an entry to the change log at `doc/whats_new/v*.rst`. Like the other entries there, please reference this pull request with `:pr:` and credit yourself (and other contributors if applicable) with `:user:`.
+::
+
+    Please add an entry to the change log at `doc/whats_new/v*.rst`. Like the other entries there, please reference this pull request with `:pr:` and credit yourself (and other contributors if applicable) with `:user:`.
 
 PR: Don't change unrelated
- ::
- Please do not change unrelated lines. It makes your contribution harder to review and may introduce merge conflicts to other pull requests.
+::
+
+    Please do not change unrelated lines. It makes your contribution harder to review and may introduce merge conflicts to other pull requests.
 
 .. highlight:: default
 
@@ -244,19 +264,19 @@ valgrind_.
 
 Valgrind is a command-line tool that can trace memory errors in a variety
 of code. Follow these steps:
 
-  1. Install `valgrind`_ on your system.
+1. Install `valgrind`_ on your system.
 
-  2. Download the python valgrind suppression file: `valgrind-python.supp`_.
+2. Download the python valgrind suppression file: `valgrind-python.supp`_.
 
-  3. Follow the directions in the `README.valgrind`_ file to customize your
-     python suppressions. If you don't, you will have spurious output coming
-     related to the python interpreter instead of your own code.
+3. Follow the directions in the `README.valgrind`_ file to customize your
+   python suppressions. If you don't, you will have spurious output
+   related to the python interpreter instead of your own code.
 
-  4. Run valgrind as follows:
+4. Run valgrind as follows:
 
-.. prompt:: bash $
+   .. 
prompt:: bash $ - valgrind -v --suppressions=valgrind-python.supp python my_test_script.py + valgrind -v --suppressions=valgrind-python.supp python my_test_script.py .. _valgrind: https://valgrind.org .. _`README.valgrind`: https://github.com/python/cpython/blob/master/Misc/README.valgrind diff --git a/doc/model_persistence.rst b/doc/model_persistence.rst index 53f01fd019d79..b8da5c8a3961f 100644 --- a/doc/model_persistence.rst +++ b/doc/model_persistence.rst @@ -58,7 +58,7 @@ with:: When an estimator is unpickled with a scikit-learn version that is inconsistent with the version the estimator was pickled with, a :class:`~sklearn.exceptions.InconsistentVersionWarning` is raised. This warning -can be caught to obtain the original version the estimator was pickled with: +can be caught to obtain the original version the estimator was pickled with:: from sklearn.exceptions import InconsistentVersionWarning warnings.simplefilter("error", InconsistentVersionWarning) diff --git a/doc/modules/clustering.rst b/doc/modules/clustering.rst index 4cd86a0bf70c1..c64b3d9d646c9 100644 --- a/doc/modules/clustering.rst +++ b/doc/modules/clustering.rst @@ -1042,16 +1042,16 @@ efficiently, HDBSCAN first extracts a minimum spanning tree (MST) from the fully -connected mutual reachability graph, then greedily cuts the edges with highest weight. An outline of the HDBSCAN algorithm is as follows: - 1. Extract the MST of :math:`G_{ms}` - 2. Extend the MST by adding a "self edge" for each vertex, with weight equal - to the core distance of the underlying sample. - 3. Initialize a single cluster and label for the MST. - 4. Remove the edge with the greatest weight from the MST (ties are - removed simultaneously). - 5. Assign cluster labels to the connected components which contain the - end points of the now-removed edge. If the component does not have at least - one edge it is instead assigned a "null" label marking it as noise. - 6. Repeat 4-5 until there are no more connected components. +1. Extract the MST of :math:`G_{ms}`. +2. Extend the MST by adding a "self edge" for each vertex, with weight equal + to the core distance of the underlying sample. +3. Initialize a single cluster and label for the MST. +4. Remove the edge with the greatest weight from the MST (ties are + removed simultaneously). +5. Assign cluster labels to the connected components which contain the + end points of the now-removed edge. If the component does not have at least + one edge it is instead assigned a "null" label marking it as noise. +6. Repeat 4-5 until there are no more connected components. HDBSCAN is therefore able to obtain all possible partitions achievable by DBSCAN* for a fixed choice of `min_samples` in a hierarchical fashion. @@ -1233,11 +1233,11 @@ clusters (labels) and the samples are mapped to the global label of the nearest **BIRCH or MiniBatchKMeans?** - - BIRCH does not scale very well to high dimensional data. As a rule of thumb if - ``n_features`` is greater than twenty, it is generally better to use MiniBatchKMeans. - - If the number of instances of data needs to be reduced, or if one wants a - large number of subclusters either as a preprocessing step or otherwise, - BIRCH is more useful than MiniBatchKMeans. +- BIRCH does not scale very well to high dimensional data. As a rule of thumb if + ``n_features`` is greater than twenty, it is generally better to use MiniBatchKMeans. 
+- If the number of instances of data needs to be reduced, or if one wants a + large number of subclusters either as a preprocessing step or otherwise, + BIRCH is more useful than MiniBatchKMeans. **How to use partial_fit?** @@ -1245,12 +1245,12 @@ clusters (labels) and the samples are mapped to the global label of the nearest To avoid the computation of global clustering, for every call of ``partial_fit`` the user is advised - 1. To set ``n_clusters=None`` initially - 2. Train all data by multiple calls to partial_fit. - 3. Set ``n_clusters`` to a required value using - ``brc.set_params(n_clusters=n_clusters)``. - 4. Call ``partial_fit`` finally with no arguments, i.e. ``brc.partial_fit()`` - which performs the global clustering. +1. To set ``n_clusters=None`` initially +2. Train all data by multiple calls to partial_fit. +3. Set ``n_clusters`` to a required value using + ``brc.set_params(n_clusters=n_clusters)``. +4. Call ``partial_fit`` finally with no arguments, i.e. ``brc.partial_fit()`` + which performs the global clustering. .. image:: ../auto_examples/cluster/images/sphx_glr_plot_birch_vs_minibatchkmeans_001.png :target: ../auto_examples/cluster/plot_birch_vs_minibatchkmeans.html @@ -2196,19 +2196,19 @@ under the true and predicted clusterings. It has the following entries: - :math:`C_{00}` : number of pairs with both clusterings having the samples - not clustered together +:math:`C_{00}` : number of pairs with both clusterings having the samples +not clustered together - :math:`C_{10}` : number of pairs with the true label clustering having the - samples clustered together but the other clustering not having the samples - clustered together +:math:`C_{10}` : number of pairs with the true label clustering having the +samples clustered together but the other clustering not having the samples +clustered together - :math:`C_{01}` : number of pairs with the true label clustering not having - the samples clustered together but the other clustering having the samples - clustered together +:math:`C_{01}` : number of pairs with the true label clustering not having +the samples clustered together but the other clustering having the samples +clustered together - :math:`C_{11}` : number of pairs with both clusterings having the samples - clustered together +:math:`C_{11}` : number of pairs with both clusterings having the samples +clustered together Considering a pair of samples that is clustered together a positive pair, then as in binary classification the count of true negatives is diff --git a/doc/modules/cross_validation.rst b/doc/modules/cross_validation.rst index 53206bce28c8f..24a8e2f2d2acd 100644 --- a/doc/modules/cross_validation.rst +++ b/doc/modules/cross_validation.rst @@ -86,10 +86,10 @@ the training set is split into *k* smaller sets but generally follow the same principles). The following procedure is followed for each of the *k* "folds": - * A model is trained using :math:`k-1` of the folds as training data; - * the resulting model is validated on the remaining part of the data - (i.e., it is used as a test set to compute a performance measure - such as accuracy). +* A model is trained using :math:`k-1` of the folds as training data; +* the resulting model is validated on the remaining part of the data + (i.e., it is used as a test set to compute a performance measure + such as accuracy). The performance measure reported by *k*-fold cross-validation is then the average of the values computed in the loop. 
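
A minimal sketch of this procedure with
:func:`~sklearn.model_selection.cross_val_score` (the estimator and the
dataset are arbitrary illustrative choices)::

    from sklearn.datasets import load_iris
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)
    # With cv=5 the model is fit five times, each time on 4/5 of the data
    # and scored on the held-out fold; the reported measure is the mean.
    scores = cross_val_score(SVC(kernel="linear"), X, y, cv=5)
    print(scores, scores.mean())
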
diff --git a/doc/modules/decomposition.rst b/doc/modules/decomposition.rst index 223985c6579f0..e8241a92cfc3b 100644 --- a/doc/modules/decomposition.rst +++ b/doc/modules/decomposition.rst @@ -72,11 +72,11 @@ exactly match the results of :class:`PCA` while processing the data in a minibatch fashion. :class:`IncrementalPCA` makes it possible to implement out-of-core Principal Component Analysis either by: - * Using its ``partial_fit`` method on chunks of data fetched sequentially - from the local hard drive or a network database. +* Using its ``partial_fit`` method on chunks of data fetched sequentially + from the local hard drive or a network database. - * Calling its fit method on a memory mapped file using - ``numpy.memmap``. +* Calling its fit method on a memory mapped file using + ``numpy.memmap``. :class:`IncrementalPCA` only stores estimates of component and noise variances, in order update ``explained_variance_ratio_`` incrementally. This is why @@ -358,14 +358,14 @@ components is less than 10 (strict) and the number of samples is more than 200 * *randomized* solver: - * Algorithm 4.3 in - :arxiv:`"Finding structure with randomness: Stochastic - algorithms for constructing approximate matrix decompositions" <0909.4061>` - Halko, et al. (2009) + * Algorithm 4.3 in + :arxiv:`"Finding structure with randomness: Stochastic + algorithms for constructing approximate matrix decompositions" <0909.4061>` + Halko, et al. (2009) - * :arxiv:`"An implementation of a randomized algorithm - for principal component analysis" <1412.3510>` - A. Szlam et al. (2014) + * :arxiv:`"An implementation of a randomized algorithm + for principal component analysis" <1412.3510>` + A. Szlam et al. (2014) * *arpack* solver: `scipy.sparse.linalg.eigsh documentation @@ -636,7 +636,7 @@ does not fit into the memory. computationally efficient and implements on-line learning with a ``partial_fit`` method. - Example: :ref:`sphx_glr_auto_examples_cluster_plot_dict_face_patches.py` + Example: :ref:`sphx_glr_auto_examples_cluster_plot_dict_face_patches.py` .. currentmodule:: sklearn.decomposition @@ -1008,10 +1008,10 @@ The graphical model of LDA is a three-level generative model: Note on notations presented in the graphical model above, which can be found in Hoffman et al. (2013): - * The corpus is a collection of :math:`D` documents. - * A document is a sequence of :math:`N` words. - * There are :math:`K` topics in the corpus. - * The boxes represent repeated sampling. +* The corpus is a collection of :math:`D` documents. +* A document is a sequence of :math:`N` words. +* There are :math:`K` topics in the corpus. +* The boxes represent repeated sampling. In the graphical model, each node is a random variable and has a role in the generative process. A shaded node indicates an observed variable and an unshaded @@ -1029,21 +1029,21 @@ When modeling text corpora, the model assumes the following generative process for a corpus with :math:`D` documents and :math:`K` topics, with :math:`K` corresponding to `n_components` in the API: - 1. For each topic :math:`k \in K`, draw :math:`\beta_k \sim - \mathrm{Dirichlet}(\eta)`. This provides a distribution over the words, - i.e. the probability of a word appearing in topic :math:`k`. - :math:`\eta` corresponds to `topic_word_prior`. +1. For each topic :math:`k \in K`, draw :math:`\beta_k \sim + \mathrm{Dirichlet}(\eta)`. This provides a distribution over the words, + i.e. the probability of a word appearing in topic :math:`k`. + :math:`\eta` corresponds to `topic_word_prior`. - 2. 
For each document :math:`d \in D`, draw the topic proportions - :math:`\theta_d \sim \mathrm{Dirichlet}(\alpha)`. :math:`\alpha` - corresponds to `doc_topic_prior`. +2. For each document :math:`d \in D`, draw the topic proportions + :math:`\theta_d \sim \mathrm{Dirichlet}(\alpha)`. :math:`\alpha` + corresponds to `doc_topic_prior`. - 3. For each word :math:`i` in document :math:`d`: +3. For each word :math:`i` in document :math:`d`: - a. Draw the topic assignment :math:`z_{di} \sim \mathrm{Multinomial} - (\theta_d)` - b. Draw the observed word :math:`w_{ij} \sim \mathrm{Multinomial} - (\beta_{z_{di}})` + a. Draw the topic assignment :math:`z_{di} \sim \mathrm{Multinomial} + (\theta_d)` + b. Draw the observed word :math:`w_{ij} \sim \mathrm{Multinomial} + (\beta_{z_{di}})` For parameter estimation, the posterior distribution is: diff --git a/doc/modules/ensemble.rst b/doc/modules/ensemble.rst index 73b4420960717..334e00e35a848 100644 --- a/doc/modules/ensemble.rst +++ b/doc/modules/ensemble.rst @@ -285,13 +285,13 @@ model. For a predictor :math:`F` with two features: - - a **monotonic increase constraint** is a constraint of the form: - .. math:: - x_1 \leq x_1' \implies F(x_1, x_2) \leq F(x_1', x_2) +- a **monotonic increase constraint** is a constraint of the form: + .. math:: + x_1 \leq x_1' \implies F(x_1, x_2) \leq F(x_1', x_2) - - a **monotonic decrease constraint** is a constraint of the form: - .. math:: - x_1 \leq x_1' \implies F(x_1, x_2) \geq F(x_1', x_2) +- a **monotonic decrease constraint** is a constraint of the form: + .. math:: + x_1 \leq x_1' \implies F(x_1, x_2) \geq F(x_1', x_2) You can specify a monotonic constraint on each feature using the `monotonic_cst` parameter. For each feature, a value of 0 indicates no @@ -311,8 +311,8 @@ Nevertheless, monotonic constraints only marginally constrain feature effects on For instance, monotonic increase and decrease constraints cannot be used to enforce the following modelling constraint: - .. math:: - x_1 \leq x_1' \implies F(x_1, x_2) \leq F(x_1', x_2') +.. math:: + x_1 \leq x_1' \implies F(x_1, x_2) \leq F(x_1', x_2') Also, monotonic constraints are not supported for multiclass classification. @@ -584,9 +584,9 @@ Regression GBRT regressors are additive models whose prediction :math:`\hat{y}_i` for a given input :math:`x_i` is of the following form: - .. math:: +.. math:: - \hat{y}_i = F_M(x_i) = \sum_{m=1}^{M} h_m(x_i) + \hat{y}_i = F_M(x_i) = \sum_{m=1}^{M} h_m(x_i) where the :math:`h_m` are estimators called *weak learners* in the context of boosting. Gradient Tree Boosting uses :ref:`decision tree regressors @@ -595,17 +595,17 @@ of boosting. Gradient Tree Boosting uses :ref:`decision tree regressors Similar to other boosting algorithms, a GBRT is built in a greedy fashion: - .. math:: +.. math:: - F_m(x) = F_{m-1}(x) + h_m(x), + F_m(x) = F_{m-1}(x) + h_m(x), where the newly added tree :math:`h_m` is fitted in order to minimize a sum of losses :math:`L_m`, given the previous ensemble :math:`F_{m-1}`: - .. math:: +.. math:: - h_m = \arg\min_{h} L_m = \arg\min_{h} \sum_{i=1}^{n} - l(y_i, F_{m-1}(x_i) + h(x_i)), + h_m = \arg\min_{h} L_m = \arg\min_{h} \sum_{i=1}^{n} + l(y_i, F_{m-1}(x_i) + h(x_i)), where :math:`l(y_i, F(x_i))` is defined by the `loss` parameter, detailed in the next section. @@ -618,12 +618,12 @@ argument. Using a first-order Taylor approximation, the value of :math:`l` can be approximated as follows: - .. math:: +.. 
math:: - l(y_i, F_{m-1}(x_i) + h_m(x_i)) \approx - l(y_i, F_{m-1}(x_i)) - + h_m(x_i) - \left[ \frac{\partial l(y_i, F(x_i))}{\partial F(x_i)} \right]_{F=F_{m - 1}}. + l(y_i, F_{m-1}(x_i) + h_m(x_i)) \approx + l(y_i, F_{m-1}(x_i)) + + h_m(x_i) + \left[ \frac{\partial l(y_i, F(x_i))}{\partial F(x_i)} \right]_{F=F_{m - 1}}. .. note:: @@ -640,9 +640,9 @@ differentiable. We will denote it by :math:`g_i`. Removing the constant terms, we have: - .. math:: +.. math:: - h_m \approx \arg\min_{h} \sum_{i=1}^{n} h(x_i) g_i + h_m \approx \arg\min_{h} \sum_{i=1}^{n} h(x_i) g_i This is minimized if :math:`h(x_i)` is fitted to predict a value that is proportional to the negative gradient :math:`-g_i`. Therefore, at each @@ -691,40 +691,40 @@ Loss Functions The following loss functions are supported and can be specified using the parameter ``loss``: - * Regression - - * Squared error (``'squared_error'``): The natural choice for regression - due to its superior computational properties. The initial model is - given by the mean of the target values. - * Absolute error (``'absolute_error'``): A robust loss function for - regression. The initial model is given by the median of the - target values. - * Huber (``'huber'``): Another robust loss function that combines - least squares and least absolute deviation; use ``alpha`` to - control the sensitivity with regards to outliers (see [Friedman2001]_ for - more details). - * Quantile (``'quantile'``): A loss function for quantile regression. - Use ``0 < alpha < 1`` to specify the quantile. This loss function - can be used to create prediction intervals - (see :ref:`sphx_glr_auto_examples_ensemble_plot_gradient_boosting_quantile.py`). - - * Classification - - * Binary log-loss (``'log-loss'``): The binomial - negative log-likelihood loss function for binary classification. It provides - probability estimates. The initial model is given by the - log odds-ratio. - * Multi-class log-loss (``'log-loss'``): The multinomial - negative log-likelihood loss function for multi-class classification with - ``n_classes`` mutually exclusive classes. It provides - probability estimates. The initial model is given by the - prior probability of each class. At each iteration ``n_classes`` - regression trees have to be constructed which makes GBRT rather - inefficient for data sets with a large number of classes. - * Exponential loss (``'exponential'``): The same loss function - as :class:`AdaBoostClassifier`. Less robust to mislabeled - examples than ``'log-loss'``; can only be used for binary - classification. +* Regression + + * Squared error (``'squared_error'``): The natural choice for regression + due to its superior computational properties. The initial model is + given by the mean of the target values. + * Absolute error (``'absolute_error'``): A robust loss function for + regression. The initial model is given by the median of the + target values. + * Huber (``'huber'``): Another robust loss function that combines + least squares and least absolute deviation; use ``alpha`` to + control the sensitivity with regards to outliers (see [Friedman2001]_ for + more details). + * Quantile (``'quantile'``): A loss function for quantile regression. + Use ``0 < alpha < 1`` to specify the quantile. This loss function + can be used to create prediction intervals + (see :ref:`sphx_glr_auto_examples_ensemble_plot_gradient_boosting_quantile.py`). + +* Classification + + * Binary log-loss (``'log-loss'``): The binomial + negative log-likelihood loss function for binary classification. 
It provides + probability estimates. The initial model is given by the + log odds-ratio. + * Multi-class log-loss (``'log-loss'``): The multinomial + negative log-likelihood loss function for multi-class classification with + ``n_classes`` mutually exclusive classes. It provides + probability estimates. The initial model is given by the + prior probability of each class. At each iteration ``n_classes`` + regression trees have to be constructed which makes GBRT rather + inefficient for data sets with a large number of classes. + * Exponential loss (``'exponential'``): The same loss function + as :class:`AdaBoostClassifier`. Less robust to mislabeled + examples than ``'log-loss'``; can only be used for binary + classification. .. _gradient_boosting_shrinkage: @@ -1171,17 +1171,17 @@ shallow decision trees). Bagging methods come in many flavours but mostly differ from each other by the way they draw random subsets of the training set: - * When random subsets of the dataset are drawn as random subsets of the - samples, then this algorithm is known as Pasting [B1999]_. +* When random subsets of the dataset are drawn as random subsets of the + samples, then this algorithm is known as Pasting [B1999]_. - * When samples are drawn with replacement, then the method is known as - Bagging [B1996]_. +* When samples are drawn with replacement, then the method is known as + Bagging [B1996]_. - * When random subsets of the dataset are drawn as random subsets of - the features, then the method is known as Random Subspaces [H1998]_. +* When random subsets of the dataset are drawn as random subsets of + the features, then the method is known as Random Subspaces [H1998]_. - * Finally, when base estimators are built on subsets of both samples and - features, then the method is known as Random Patches [LG2012]_. +* Finally, when base estimators are built on subsets of both samples and + features, then the method is known as Random Patches [LG2012]_. In scikit-learn, bagging methods are offered as a unified :class:`BaggingClassifier` meta-estimator (resp. :class:`BaggingRegressor`), @@ -1591,10 +1591,10 @@ concentrate on the examples that are missed by the previous ones in the sequence AdaBoost can be used both for classification and regression problems: - - For multi-class classification, :class:`AdaBoostClassifier` implements - AdaBoost.SAMME [ZZRH2009]_. +- For multi-class classification, :class:`AdaBoostClassifier` implements + AdaBoost.SAMME [ZZRH2009]_. - - For regression, :class:`AdaBoostRegressor` implements AdaBoost.R2 [D1997]_. +- For regression, :class:`AdaBoostRegressor` implements AdaBoost.R2 [D1997]_. 
Usage ----- diff --git a/doc/modules/feature_extraction.rst b/doc/modules/feature_extraction.rst index 9653ba9d7b646..7ac538a89849b 100644 --- a/doc/modules/feature_extraction.rst +++ b/doc/modules/feature_extraction.rst @@ -615,7 +615,7 @@ As usual the best way to adjust the feature extraction parameters is to use a cross-validated grid search, for instance by pipelining the feature extractor with a classifier: - * :ref:`sphx_glr_auto_examples_model_selection_plot_grid_search_text_feature_extraction.py` +* :ref:`sphx_glr_auto_examples_model_selection_plot_grid_search_text_feature_extraction.py` |details-end| @@ -715,18 +715,18 @@ In particular in a **supervised setting** it can be successfully combined with fast and scalable linear models to train **document classifiers**, for instance: - * :ref:`sphx_glr_auto_examples_text_plot_document_classification_20newsgroups.py` +* :ref:`sphx_glr_auto_examples_text_plot_document_classification_20newsgroups.py` In an **unsupervised setting** it can be used to group similar documents together by applying clustering algorithms such as :ref:`k_means`: - * :ref:`sphx_glr_auto_examples_text_plot_document_clustering.py` +* :ref:`sphx_glr_auto_examples_text_plot_document_clustering.py` Finally it is possible to discover the main topics of a corpus by relaxing the hard assignment constraint of clustering, for instance by using :ref:`NMF`: - * :ref:`sphx_glr_auto_examples_applications_plot_topics_extraction_with_nmf_lda.py` +* :ref:`sphx_glr_auto_examples_applications_plot_topics_extraction_with_nmf_lda.py` Limitations of the Bag of Words representation @@ -923,19 +923,19 @@ to the vectorizer constructor:: In particular we name: - * ``preprocessor``: a callable that takes an entire document as input (as a - single string), and returns a possibly transformed version of the document, - still as an entire string. This can be used to remove HTML tags, lowercase - the entire document, etc. +* ``preprocessor``: a callable that takes an entire document as input (as a + single string), and returns a possibly transformed version of the document, + still as an entire string. This can be used to remove HTML tags, lowercase + the entire document, etc. - * ``tokenizer``: a callable that takes the output from the preprocessor - and splits it into tokens, then returns a list of these. +* ``tokenizer``: a callable that takes the output from the preprocessor + and splits it into tokens, then returns a list of these. - * ``analyzer``: a callable that replaces the preprocessor and tokenizer. - The default analyzers all call the preprocessor and tokenizer, but custom - analyzers will skip this. N-gram extraction and stop word filtering take - place at the analyzer level, so a custom analyzer may have to reproduce - these steps. +* ``analyzer``: a callable that replaces the preprocessor and tokenizer. + The default analyzers all call the preprocessor and tokenizer, but custom + analyzers will skip this. N-gram extraction and stop word filtering take + place at the analyzer level, so a custom analyzer may have to reproduce + these steps. (Lucene users might recognize these names, but be aware that scikit-learn concepts may not map one-to-one onto Lucene concepts.) @@ -951,53 +951,53 @@ factory methods instead of passing custom functions. 
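
For instance, a minimal sketch of passing a custom ``preprocessor``
callable (the tag-stripping regex is an illustrative assumption, not a
robust HTML parser)::

    import re

    from sklearn.feature_extraction.text import CountVectorizer

    def strip_tags(doc):
        # Drop anything that looks like an HTML/XML tag, then lowercase.
        return re.sub(r"<[^>]+>", " ", doc).lower()

    vect = CountVectorizer(preprocessor=strip_tags)
    X = vect.fit_transform(["<b>Hello</b> world", "hello <i>world</i> again"])
    print(vect.get_feature_names_out())  # ['again' 'hello' 'world']
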
Some tips and tricks: - * If documents are pre-tokenized by an external package, then store them in - files (or strings) with the tokens separated by whitespace and pass - ``analyzer=str.split`` - * Fancy token-level analysis such as stemming, lemmatizing, compound - splitting, filtering based on part-of-speech, etc. are not included in the - scikit-learn codebase, but can be added by customizing either the - tokenizer or the analyzer. - Here's a ``CountVectorizer`` with a tokenizer and lemmatizer using - `NLTK `_:: - - >>> from nltk import word_tokenize # doctest: +SKIP - >>> from nltk.stem import WordNetLemmatizer # doctest: +SKIP - >>> class LemmaTokenizer: - ... def __init__(self): - ... self.wnl = WordNetLemmatizer() - ... def __call__(self, doc): - ... return [self.wnl.lemmatize(t) for t in word_tokenize(doc)] - ... - >>> vect = CountVectorizer(tokenizer=LemmaTokenizer()) # doctest: +SKIP - - (Note that this will not filter out punctuation.) - - - The following example will, for instance, transform some British spelling - to American spelling:: - - >>> import re - >>> def to_british(tokens): - ... for t in tokens: - ... t = re.sub(r"(...)our$", r"\1or", t) - ... t = re.sub(r"([bt])re$", r"\1er", t) - ... t = re.sub(r"([iy])s(e$|ing|ation)", r"\1z\2", t) - ... t = re.sub(r"ogue$", "og", t) - ... yield t - ... - >>> class CustomVectorizer(CountVectorizer): - ... def build_tokenizer(self): - ... tokenize = super().build_tokenizer() - ... return lambda doc: list(to_british(tokenize(doc))) - ... - >>> print(CustomVectorizer().build_analyzer()(u"color colour")) - [...'color', ...'color'] - - for other styles of preprocessing; examples include stemming, lemmatization, - or normalizing numerical tokens, with the latter illustrated in: - - * :ref:`sphx_glr_auto_examples_bicluster_plot_bicluster_newsgroups.py` +* If documents are pre-tokenized by an external package, then store them in + files (or strings) with the tokens separated by whitespace and pass + ``analyzer=str.split`` +* Fancy token-level analysis such as stemming, lemmatizing, compound + splitting, filtering based on part-of-speech, etc. are not included in the + scikit-learn codebase, but can be added by customizing either the + tokenizer or the analyzer. + Here's a ``CountVectorizer`` with a tokenizer and lemmatizer using + `NLTK `_:: + + >>> from nltk import word_tokenize # doctest: +SKIP + >>> from nltk.stem import WordNetLemmatizer # doctest: +SKIP + >>> class LemmaTokenizer: + ... def __init__(self): + ... self.wnl = WordNetLemmatizer() + ... def __call__(self, doc): + ... return [self.wnl.lemmatize(t) for t in word_tokenize(doc)] + ... + >>> vect = CountVectorizer(tokenizer=LemmaTokenizer()) # doctest: +SKIP + + (Note that this will not filter out punctuation.) + + + The following example will, for instance, transform some British spelling + to American spelling:: + + >>> import re + >>> def to_british(tokens): + ... for t in tokens: + ... t = re.sub(r"(...)our$", r"\1or", t) + ... t = re.sub(r"([bt])re$", r"\1er", t) + ... t = re.sub(r"([iy])s(e$|ing|ation)", r"\1z\2", t) + ... t = re.sub(r"ogue$", "og", t) + ... yield t + ... + >>> class CustomVectorizer(CountVectorizer): + ... def build_tokenizer(self): + ... tokenize = super().build_tokenizer() + ... return lambda doc: list(to_british(tokenize(doc))) + ... 
+ >>> print(CustomVectorizer().build_analyzer()(u"color colour")) + [...'color', ...'color'] + + for other styles of preprocessing; examples include stemming, lemmatization, + or normalizing numerical tokens, with the latter illustrated in: + + * :ref:`sphx_glr_auto_examples_bicluster_plot_bicluster_newsgroups.py` Customizing the vectorizer can also be useful when handling Asian languages diff --git a/doc/modules/feature_selection.rst b/doc/modules/feature_selection.rst index 7fcec524e7168..1ae950acdfbb6 100644 --- a/doc/modules/feature_selection.rst +++ b/doc/modules/feature_selection.rst @@ -57,18 +57,18 @@ univariate statistical tests. It can be seen as a preprocessing step to an estimator. Scikit-learn exposes feature selection routines as objects that implement the ``transform`` method: - * :class:`SelectKBest` removes all but the :math:`k` highest scoring features +* :class:`SelectKBest` removes all but the :math:`k` highest scoring features - * :class:`SelectPercentile` removes all but a user-specified highest scoring - percentage of features +* :class:`SelectPercentile` removes all but a user-specified highest scoring + percentage of features - * using common univariate statistical tests for each feature: - false positive rate :class:`SelectFpr`, false discovery rate - :class:`SelectFdr`, or family wise error :class:`SelectFwe`. +* using common univariate statistical tests for each feature: + false positive rate :class:`SelectFpr`, false discovery rate + :class:`SelectFdr`, or family wise error :class:`SelectFwe`. - * :class:`GenericUnivariateSelect` allows to perform univariate feature - selection with a configurable strategy. This allows to select the best - univariate selection strategy with hyper-parameter search estimator. +* :class:`GenericUnivariateSelect` allows to perform univariate feature + selection with a configurable strategy. This allows to select the best + univariate selection strategy with hyper-parameter search estimator. For instance, we can use a F-test to retrieve the two best features for a dataset as follows: @@ -87,9 +87,9 @@ These objects take as input a scoring function that returns univariate scores and p-values (or only scores for :class:`SelectKBest` and :class:`SelectPercentile`): - * For regression: :func:`r_regression`, :func:`f_regression`, :func:`mutual_info_regression` +* For regression: :func:`r_regression`, :func:`f_regression`, :func:`mutual_info_regression` - * For classification: :func:`chi2`, :func:`f_classif`, :func:`mutual_info_classif` +* For classification: :func:`chi2`, :func:`f_classif`, :func:`mutual_info_classif` The methods based on F-test estimate the degree of linear dependency between two random variables. On the other hand, mutual information methods can capture diff --git a/doc/modules/gaussian_process.rst b/doc/modules/gaussian_process.rst index 55960e901b166..58e56a557ed73 100644 --- a/doc/modules/gaussian_process.rst +++ b/doc/modules/gaussian_process.rst @@ -11,25 +11,25 @@ to solve *regression* and *probabilistic classification* problems. The advantages of Gaussian processes are: - - The prediction interpolates the observations (at least for regular - kernels). +- The prediction interpolates the observations (at least for regular + kernels). - - The prediction is probabilistic (Gaussian) so that one can compute - empirical confidence intervals and decide based on those if one should - refit (online fitting, adaptive fitting) the prediction in some - region of interest. 
+- The prediction is probabilistic (Gaussian) so that one can compute + empirical confidence intervals and decide based on those if one should + refit (online fitting, adaptive fitting) the prediction in some + region of interest. - - Versatile: different :ref:`kernels - ` can be specified. Common kernels are provided, but - it is also possible to specify custom kernels. +- Versatile: different :ref:`kernels + ` can be specified. Common kernels are provided, but + it is also possible to specify custom kernels. The disadvantages of Gaussian processes include: - - Our implementation is not sparse, i.e., they use the whole samples/features - information to perform the prediction. +- Our implementation is not sparse, i.e., they use the whole samples/features + information to perform the prediction. - - They lose efficiency in high dimensional spaces -- namely when the number - of features exceeds a few dozens. +- They lose efficiency in high dimensional spaces -- namely when the number + of features exceeds a few dozens. .. _gpr: @@ -386,7 +386,7 @@ Matérn kernel ------------- The :class:`Matern` kernel is a stationary kernel and a generalization of the :class:`RBF` kernel. It has an additional parameter :math:`\nu` which controls -the smoothness of the resulting function. It is parameterized by a length-scale parameter :math:`l>0`, which can either be a scalar (isotropic variant of the kernel) or a vector with the same number of dimensions as the inputs :math:`x` (anisotropic variant of the kernel). +the smoothness of the resulting function. It is parameterized by a length-scale parameter :math:`l>0`, which can either be a scalar (isotropic variant of the kernel) or a vector with the same number of dimensions as the inputs :math:`x` (anisotropic variant of the kernel). |details-start| **Mathematical implementation of Matérn kernel** diff --git a/doc/modules/grid_search.rst b/doc/modules/grid_search.rst index efdde897e841b..01c5a5c72ee52 100644 --- a/doc/modules/grid_search.rst +++ b/doc/modules/grid_search.rst @@ -135,14 +135,14 @@ variate sample) method to sample a value. A call to the ``rvs`` function should provide independent random samples from possible parameter values on consecutive calls. - .. warning:: - - The distributions in ``scipy.stats`` prior to version scipy 0.16 - do not allow specifying a random state. Instead, they use the global - numpy random state, that can be seeded via ``np.random.seed`` or set - using ``np.random.set_state``. However, beginning scikit-learn 0.18, - the :mod:`sklearn.model_selection` module sets the random state provided - by the user if scipy >= 0.16 is also available. +.. warning:: + + The distributions in ``scipy.stats`` prior to version scipy 0.16 + do not allow specifying a random state. Instead, they use the global + numpy random state, that can be seeded via ``np.random.seed`` or set + using ``np.random.set_state``. However, beginning scikit-learn 0.18, + the :mod:`sklearn.model_selection` module sets the random state provided + by the user if scipy >= 0.16 is also available. For continuous parameters, such as ``C`` above, it is important to specify a continuous distribution to take full advantage of the randomization. This way, diff --git a/doc/modules/isotonic.rst b/doc/modules/isotonic.rst index 8967ef18afcb3..c30ee83b74241 100644 --- a/doc/modules/isotonic.rst +++ b/doc/modules/isotonic.rst @@ -9,10 +9,10 @@ Isotonic regression The class :class:`IsotonicRegression` fits a non-decreasing real function to 1-dimensional data. 
It solves the following problem: - minimize :math:`\sum_i w_i (y_i - \hat{y}_i)^2` - - subject to :math:`\hat{y}_i \le \hat{y}_j` whenever :math:`X_i \le X_j`, +.. math:: + \min \sum_i w_i (y_i - \hat{y}_i)^2 +subject to :math:`\hat{y}_i \le \hat{y}_j` whenever :math:`X_i \le X_j`, where the weights :math:`w_i` are strictly positive, and both `X` and `y` are arbitrary real quantities. diff --git a/doc/modules/kernel_approximation.rst b/doc/modules/kernel_approximation.rst index 30c5a71b1417d..0c67c36178e3b 100644 --- a/doc/modules/kernel_approximation.rst +++ b/doc/modules/kernel_approximation.rst @@ -57,10 +57,10 @@ points. where: - * :math:`U` is orthonormal - * :math:`Ʌ` is diagonal matrix of eigenvalues - * :math:`U_1` is orthonormal matrix of samples that were chosen - * :math:`U_2` is orthonormal matrix of samples that were not chosen +* :math:`U` is orthonormal +* :math:`\Lambda` is diagonal matrix of eigenvalues +* :math:`U_1` is orthonormal matrix of samples that were chosen +* :math:`U_2` is orthonormal matrix of samples that were not chosen Given that :math:`U_1 \Lambda U_1^T` can be obtained by orthonormalization of the matrix :math:`K_{11}`, and :math:`U_2 \Lambda U_1^T` can be evaluated (as @@ -215,8 +215,8 @@ function given by: where: - * ``x``, ``y`` are the input vectors - * ``d`` is the kernel degree +* ``x``, ``y`` are the input vectors +* ``d`` is the kernel degree Intuitively, the feature space of the polynomial kernel of degree `d` consists of all possible degree-`d` products among input features, which enables diff --git a/doc/modules/linear_model.rst b/doc/modules/linear_model.rst index 13fafaf48c953..e538dde2ed6d5 100644 --- a/doc/modules/linear_model.rst +++ b/doc/modules/linear_model.rst @@ -215,10 +215,10 @@ Cross-Validation. **References** |details-split| - * "Notes on Regularized Least Squares", Rifkin & Lippert (`technical report - `_, - `course slides - `_). +* "Notes on Regularized Least Squares", Rifkin & Lippert (`technical report + `_, + `course slides + `_). |details-end| @@ -587,30 +587,30 @@ between the features. The advantages of LARS are: - - It is numerically efficient in contexts where the number of features - is significantly greater than the number of samples. +- It is numerically efficient in contexts where the number of features + is significantly greater than the number of samples. - - It is computationally just as fast as forward selection and has - the same order of complexity as ordinary least squares. +- It is computationally just as fast as forward selection and has + the same order of complexity as ordinary least squares. - - It produces a full piecewise linear solution path, which is - useful in cross-validation or similar attempts to tune the model. +- It produces a full piecewise linear solution path, which is + useful in cross-validation or similar attempts to tune the model. - - If two features are almost equally correlated with the target, - then their coefficients should increase at approximately the same - rate. The algorithm thus behaves as intuition would expect, and - also is more stable. +- If two features are almost equally correlated with the target, + then their coefficients should increase at approximately the same + rate. The algorithm thus behaves as intuition would expect, and + also is more stable. - - It is easily modified to produce solutions for other estimators, - like the Lasso. +- It is easily modified to produce solutions for other estimators, + like the Lasso. 
The disadvantages of the LARS method include: - - Because LARS is based upon an iterative refitting of the - residuals, it would appear to be especially sensitive to the - effects of noise. This problem is discussed in detail by Weisberg - in the discussion section of the Efron et al. (2004) Annals of - Statistics article. +- Because LARS is based upon an iterative refitting of the + residuals, it would appear to be especially sensitive to the + effects of noise. This problem is discussed in detail by Weisberg + in the discussion section of the Efron et al. (2004) Annals of + Statistics article. The LARS model can be used via the estimator :class:`Lars`, or its low-level implementation :func:`lars_path` or :func:`lars_path_gram`. @@ -707,11 +707,11 @@ previously chosen dictionary elements. **References** |details-split| - * https://www.cs.technion.ac.il/~ronrubin/Publications/KSVD-OMP-v2.pdf +* https://www.cs.technion.ac.il/~ronrubin/Publications/KSVD-OMP-v2.pdf - * `Matching pursuits with time-frequency dictionaries - `_, - S. G. Mallat, Z. Zhang, +* `Matching pursuits with time-frequency dictionaries + `_, + S. G. Mallat, Z. Zhang, |details-end| @@ -743,24 +743,24 @@ estimated from the data. The advantages of Bayesian Regression are: - - It adapts to the data at hand. +- It adapts to the data at hand. - - It can be used to include regularization parameters in the - estimation procedure. +- It can be used to include regularization parameters in the + estimation procedure. The disadvantages of Bayesian regression include: - - Inference of the model can be time consuming. +- Inference of the model can be time consuming. |details-start| **References** |details-split| - * A good introduction to Bayesian methods is given in C. Bishop: Pattern - Recognition and Machine learning +* A good introduction to Bayesian methods is given in C. Bishop: Pattern + Recognition and Machine learning - * Original Algorithm is detailed in the book `Bayesian learning for neural - networks` by Radford M. Neal +* Original Algorithm is detailed in the book `Bayesian learning for neural + networks` by Radford M. Neal |details-end| @@ -827,11 +827,11 @@ is more robust to ill-posed problems. **References** |details-split| - * Section 3.3 in Christopher M. Bishop: Pattern Recognition and Machine Learning, 2006 +* Section 3.3 in Christopher M. Bishop: Pattern Recognition and Machine Learning, 2006 - * David J. C. MacKay, `Bayesian Interpolation `_, 1992. +* David J. C. MacKay, `Bayesian Interpolation `_, 1992. - * Michael E. Tipping, `Sparse Bayesian Learning and the Relevance Vector Machine `_, 2001. +* Michael E. Tipping, `Sparse Bayesian Learning and the Relevance Vector Machine `_, 2001. |details-end| @@ -1372,11 +1372,11 @@ Perceptron The :class:`Perceptron` is another simple classification algorithm suitable for large scale learning. By default: - - It does not require a learning rate. +- It does not require a learning rate. - - It is not regularized (penalized). +- It is not regularized (penalized). - - It updates its model only on mistakes. +- It updates its model only on mistakes. The last characteristic implies that the Perceptron is slightly faster to train than SGD with the hinge loss and that the resulting models are @@ -1407,9 +1407,9 @@ For classification, :class:`PassiveAggressiveClassifier` can be used with **References** |details-split| - * `"Online Passive-Aggressive Algorithms" - `_ - K. Crammer, O. Dekel, J. Keshat, S. Shalev-Shwartz, Y. 
Singer - JMLR 7 (2006) +* `"Online Passive-Aggressive Algorithms" + `_ + K. Crammer, O. Dekel, J. Keshat, S. Shalev-Shwartz, Y. Singer - JMLR 7 (2006) |details-end| diff --git a/doc/modules/metrics.rst b/doc/modules/metrics.rst index 71e914afad192..caea39319e869 100644 --- a/doc/modules/metrics.rst +++ b/doc/modules/metrics.rst @@ -28,9 +28,9 @@ There are a number of ways to convert between a distance metric and a similarity measure, such as a kernel. Let ``D`` be the distance, and ``S`` be the kernel: - 1. ``S = np.exp(-D * gamma)``, where one heuristic for choosing - ``gamma`` is ``1 / num_features`` - 2. ``S = 1. / (D / np.max(D))`` +1. ``S = np.exp(-D * gamma)``, where one heuristic for choosing + ``gamma`` is ``1 / num_features`` +2. ``S = 1. / (D / np.max(D))`` .. currentmodule:: sklearn.metrics @@ -123,8 +123,8 @@ The polynomial kernel is defined as: where: - * ``x``, ``y`` are the input vectors - * ``d`` is the kernel degree +* ``x``, ``y`` are the input vectors +* ``d`` is the kernel degree If :math:`c_0 = 0` the kernel is said to be homogeneous. @@ -143,9 +143,9 @@ activation function). It is defined as: where: - * ``x``, ``y`` are the input vectors - * :math:`\gamma` is known as slope - * :math:`c_0` is known as intercept +* ``x``, ``y`` are the input vectors +* :math:`\gamma` is known as slope +* :math:`c_0` is known as intercept .. _rbf_kernel: @@ -165,14 +165,14 @@ the kernel is known as the Gaussian kernel of variance :math:`\sigma^2`. Laplacian kernel ---------------- -The function :func:`laplacian_kernel` is a variant on the radial basis +The function :func:`laplacian_kernel` is a variant on the radial basis function kernel defined as: .. math:: k(x, y) = \exp( -\gamma \| x-y \|_1) -where ``x`` and ``y`` are the input vectors and :math:`\|x-y\|_1` is the +where ``x`` and ``y`` are the input vectors and :math:`\|x-y\|_1` is the Manhattan distance between the input vectors. It has proven useful in ML applied to noiseless data. @@ -229,4 +229,3 @@ The chi squared kernel is most commonly used on histograms (bags) of visual word categories: A comprehensive study International Journal of Computer Vision 2007 https://hal.archives-ouvertes.fr/hal-00171412/document - diff --git a/doc/modules/mixture.rst b/doc/modules/mixture.rst index e9cc94b1d493d..df5d8020a1369 100644 --- a/doc/modules/mixture.rst +++ b/doc/modules/mixture.rst @@ -14,13 +14,13 @@ matrices supported), sample them, and estimate them from data. Facilities to help determine the appropriate number of components are also provided. - .. figure:: ../auto_examples/mixture/images/sphx_glr_plot_gmm_pdf_001.png - :target: ../auto_examples/mixture/plot_gmm_pdf.html - :align: center - :scale: 50% +.. 
figure:: ../auto_examples/mixture/images/sphx_glr_plot_gmm_pdf_001.png + :target: ../auto_examples/mixture/plot_gmm_pdf.html + :align: center + :scale: 50% - **Two-component Gaussian mixture model:** *data points, and equi-probability - surfaces of the model.* + **Two-component Gaussian mixture model:** *data points, and equi-probability + surfaces of the model.* A Gaussian mixture model is a probabilistic model that assumes all the data points are generated from a mixture of a finite number of diff --git a/doc/modules/multiclass.rst b/doc/modules/multiclass.rst index beee41e2aea0b..d3a83997c2dd9 100644 --- a/doc/modules/multiclass.rst +++ b/doc/modules/multiclass.rst @@ -147,35 +147,35 @@ Target format Valid :term:`multiclass` representations for :func:`~sklearn.utils.multiclass.type_of_target` (`y`) are: - - 1d or column vector containing more than two discrete values. An - example of a vector ``y`` for 4 samples: - - >>> import numpy as np - >>> y = np.array(['apple', 'pear', 'apple', 'orange']) - >>> print(y) - ['apple' 'pear' 'apple' 'orange'] - - - Dense or sparse :term:`binary` matrix of shape ``(n_samples, n_classes)`` - with a single sample per row, where each column represents one class. An - example of both a dense and sparse :term:`binary` matrix ``y`` for 4 - samples, where the columns, in order, are apple, orange, and pear: - - >>> import numpy as np - >>> from sklearn.preprocessing import LabelBinarizer - >>> y = np.array(['apple', 'pear', 'apple', 'orange']) - >>> y_dense = LabelBinarizer().fit_transform(y) - >>> print(y_dense) - [[1 0 0] - [0 0 1] - [1 0 0] - [0 1 0]] - >>> from scipy import sparse - >>> y_sparse = sparse.csr_matrix(y_dense) - >>> print(y_sparse) - (0, 0) 1 - (1, 2) 1 - (2, 0) 1 - (3, 1) 1 +- 1d or column vector containing more than two discrete values. An + example of a vector ``y`` for 4 samples: + + >>> import numpy as np + >>> y = np.array(['apple', 'pear', 'apple', 'orange']) + >>> print(y) + ['apple' 'pear' 'apple' 'orange'] + +- Dense or sparse :term:`binary` matrix of shape ``(n_samples, n_classes)`` + with a single sample per row, where each column represents one class. An + example of both a dense and sparse :term:`binary` matrix ``y`` for 4 + samples, where the columns, in order, are apple, orange, and pear: + + >>> import numpy as np + >>> from sklearn.preprocessing import LabelBinarizer + >>> y = np.array(['apple', 'pear', 'apple', 'orange']) + >>> y_dense = LabelBinarizer().fit_transform(y) + >>> print(y_dense) + [[1 0 0] + [0 0 1] + [1 0 0] + [0 1 0]] + >>> from scipy import sparse + >>> y_sparse = sparse.csr_matrix(y_dense) + >>> print(y_sparse) + (0, 0) 1 + (1, 2) 1 + (2, 0) 1 + (3, 1) 1 For more information about :class:`~sklearn.preprocessing.LabelBinarizer`, refer to :ref:`preprocessing_targets`. diff --git a/doc/modules/neighbors.rst b/doc/modules/neighbors.rst index 81543be3b494e..b77f1952bece8 100644 --- a/doc/modules/neighbors.rst +++ b/doc/modules/neighbors.rst @@ -59,12 +59,12 @@ The choice of neighbors search algorithm is controlled through the keyword from the training data. For a discussion of the strengths and weaknesses of each option, see `Nearest Neighbor Algorithms`_. - .. warning:: +.. warning:: - Regarding the Nearest Neighbors algorithms, if two - neighbors :math:`k+1` and :math:`k` have identical distances - but different labels, the result will depend on the ordering of the - training data. 
+ Regarding the Nearest Neighbors algorithms, if two + neighbors :math:`k+1` and :math:`k` have identical distances + but different labels, the result will depend on the ordering of the + training data. Finding the Nearest Neighbors ----------------------------- diff --git a/doc/modules/neural_networks_supervised.rst b/doc/modules/neural_networks_supervised.rst index 388f32e7c6925..64b394b2db7c5 100644 --- a/doc/modules/neural_networks_supervised.rst +++ b/doc/modules/neural_networks_supervised.rst @@ -51,22 +51,22 @@ at index :math:`i` represents the bias values added to layer :math:`i+1`. The advantages of Multi-layer Perceptron are: - + Capability to learn non-linear models. ++ Capability to learn non-linear models. - + Capability to learn models in real-time (on-line learning) - using ``partial_fit``. ++ Capability to learn models in real-time (on-line learning) + using ``partial_fit``. The disadvantages of Multi-layer Perceptron (MLP) include: - + MLP with hidden layers have a non-convex loss function where there exists - more than one local minimum. Therefore different random weight - initializations can lead to different validation accuracy. ++ MLP with hidden layers have a non-convex loss function where there exists + more than one local minimum. Therefore different random weight + initializations can lead to different validation accuracy. - + MLP requires tuning a number of hyperparameters such as the number of - hidden neurons, layers, and iterations. ++ MLP requires tuning a number of hyperparameters such as the number of + hidden neurons, layers, and iterations. - + MLP is sensitive to feature scaling. ++ MLP is sensitive to feature scaling. Please see :ref:`Tips on Practical Use ` section that addresses some of these disadvantages. @@ -311,35 +311,35 @@ when the improvement in loss is below a certain, small number. Tips on Practical Use ===================== - * Multi-layer Perceptron is sensitive to feature scaling, so it - is highly recommended to scale your data. For example, scale each - attribute on the input vector X to [0, 1] or [-1, +1], or standardize - it to have mean 0 and variance 1. Note that you must apply the *same* - scaling to the test set for meaningful results. - You can use :class:`~sklearn.preprocessing.StandardScaler` for standardization. - - >>> from sklearn.preprocessing import StandardScaler # doctest: +SKIP - >>> scaler = StandardScaler() # doctest: +SKIP - >>> # Don't cheat - fit only on training data - >>> scaler.fit(X_train) # doctest: +SKIP - >>> X_train = scaler.transform(X_train) # doctest: +SKIP - >>> # apply same transformation to test data - >>> X_test = scaler.transform(X_test) # doctest: +SKIP - - An alternative and recommended approach is to use - :class:`~sklearn.preprocessing.StandardScaler` in a - :class:`~sklearn.pipeline.Pipeline` - - * Finding a reasonable regularization parameter :math:`\alpha` is best done - using :class:`~sklearn.model_selection.GridSearchCV`, usually in the range - ``10.0 ** -np.arange(1, 7)``. - - * Empirically, we observed that `L-BFGS` converges faster and - with better solutions on small datasets. For relatively large - datasets, however, `Adam` is very robust. It usually converges - quickly and gives pretty good performance. `SGD` with momentum or - nesterov's momentum, on the other hand, can perform better than - those two algorithms if learning rate is correctly tuned. +* Multi-layer Perceptron is sensitive to feature scaling, so it + is highly recommended to scale your data. 
For example, scale each + attribute on the input vector X to [0, 1] or [-1, +1], or standardize + it to have mean 0 and variance 1. Note that you must apply the *same* + scaling to the test set for meaningful results. + You can use :class:`~sklearn.preprocessing.StandardScaler` for standardization. + + >>> from sklearn.preprocessing import StandardScaler # doctest: +SKIP + >>> scaler = StandardScaler() # doctest: +SKIP + >>> # Don't cheat - fit only on training data + >>> scaler.fit(X_train) # doctest: +SKIP + >>> X_train = scaler.transform(X_train) # doctest: +SKIP + >>> # apply same transformation to test data + >>> X_test = scaler.transform(X_test) # doctest: +SKIP + + An alternative and recommended approach is to use + :class:`~sklearn.preprocessing.StandardScaler` in a + :class:`~sklearn.pipeline.Pipeline` + +* Finding a reasonable regularization parameter :math:`\alpha` is best done + using :class:`~sklearn.model_selection.GridSearchCV`, usually in the range + ``10.0 ** -np.arange(1, 7)``. + +* Empirically, we observed that `L-BFGS` converges faster and + with better solutions on small datasets. For relatively large + datasets, however, `Adam` is very robust. It usually converges + quickly and gives pretty good performance. `SGD` with momentum or + nesterov's momentum, on the other hand, can perform better than + those two algorithms if learning rate is correctly tuned. More control with warm_start ============================ diff --git a/doc/modules/outlier_detection.rst b/doc/modules/outlier_detection.rst index 572674328108d..d003b645eb19c 100644 --- a/doc/modules/outlier_detection.rst +++ b/doc/modules/outlier_detection.rst @@ -411,7 +411,7 @@ Note that ``fit_predict`` is not available in this case to avoid inconsistencies Novelty detection with Local Outlier Factor is illustrated below. - .. figure:: ../auto_examples/neighbors/images/sphx_glr_plot_lof_novelty_detection_001.png - :target: ../auto_examples/neighbors/plot_lof_novelty_detection.html - :align: center - :scale: 75% +.. figure:: ../auto_examples/neighbors/images/sphx_glr_plot_lof_novelty_detection_001.png + :target: ../auto_examples/neighbors/plot_lof_novelty_detection.html + :align: center + :scale: 75% diff --git a/doc/modules/preprocessing.rst b/doc/modules/preprocessing.rst index 475098c0d685c..b619b88110d63 100644 --- a/doc/modules/preprocessing.rst +++ b/doc/modules/preprocessing.rst @@ -1008,9 +1008,9 @@ For each feature, the bin edges are computed during ``fit`` and together with the number of bins, they will define the intervals. Therefore, for the current example, these intervals are defined as: - - feature 1: :math:`{[-\infty, -1), [-1, 2), [2, \infty)}` - - feature 2: :math:`{[-\infty, 5), [5, \infty)}` - - feature 3: :math:`{[-\infty, 14), [14, \infty)}` +- feature 1: :math:`{[-\infty, -1), [-1, 2), [2, \infty)}` +- feature 2: :math:`{[-\infty, 5), [5, \infty)}` +- feature 3: :math:`{[-\infty, 14), [14, \infty)}` Based on these bin intervals, ``X`` is transformed as follows:: @@ -1199,23 +1199,23 @@ below. Some of the advantages of splines over polynomials are: - - B-splines are very flexible and robust if you keep a fixed low degree, - usually 3, and parsimoniously adapt the number of knots. Polynomials - would need a higher degree, which leads to the next point. - - B-splines do not have oscillatory behaviour at the boundaries as have - polynomials (the higher the degree, the worse). This is known as `Runge's - phenomenon `_. 
- - B-splines provide good options for extrapolation beyond the boundaries, - i.e. beyond the range of fitted values. Have a look at the option - ``extrapolation``. - - B-splines generate a feature matrix with a banded structure. For a single - feature, every row contains only ``degree + 1`` non-zero elements, which - occur consecutively and are even positive. This results in a matrix with - good numerical properties, e.g. a low condition number, in sharp contrast - to a matrix of polynomials, which goes under the name - `Vandermonde matrix `_. - A low condition number is important for stable algorithms of linear - models. +- B-splines are very flexible and robust if you keep a fixed low degree, + usually 3, and parsimoniously adapt the number of knots. Polynomials + would need a higher degree, which leads to the next point. +- B-splines do not have oscillatory behaviour at the boundaries as have + polynomials (the higher the degree, the worse). This is known as `Runge's + phenomenon `_. +- B-splines provide good options for extrapolation beyond the boundaries, + i.e. beyond the range of fitted values. Have a look at the option + ``extrapolation``. +- B-splines generate a feature matrix with a banded structure. For a single + feature, every row contains only ``degree + 1`` non-zero elements, which + occur consecutively and are even positive. This results in a matrix with + good numerical properties, e.g. a low condition number, in sharp contrast + to a matrix of polynomials, which goes under the name + `Vandermonde matrix `_. + A low condition number is important for stable algorithms of linear + models. The following code snippet shows splines in action:: diff --git a/doc/modules/semi_supervised.rst b/doc/modules/semi_supervised.rst index 47e8bfffdd9a7..f8cae0a9ddcdf 100644 --- a/doc/modules/semi_supervised.rst +++ b/doc/modules/semi_supervised.rst @@ -121,11 +121,11 @@ Label propagation models have two built-in kernel methods. Choice of kernel effects both scalability and performance of the algorithms. The following are available: - * rbf (:math:`\exp(-\gamma |x-y|^2), \gamma > 0`). :math:`\gamma` is - specified by keyword gamma. +* rbf (:math:`\exp(-\gamma |x-y|^2), \gamma > 0`). :math:`\gamma` is + specified by keyword gamma. - * knn (:math:`1[x' \in kNN(x)]`). :math:`k` is specified by keyword - n_neighbors. +* knn (:math:`1[x' \in kNN(x)]`). :math:`k` is specified by keyword + n_neighbors. The RBF kernel will produce a fully connected graph which is represented in memory by a dense matrix. This matrix may be very large and combined with the cost of diff --git a/doc/modules/sgd.rst b/doc/modules/sgd.rst index b37a0209af24d..a7981e9d4ec28 100644 --- a/doc/modules/sgd.rst +++ b/doc/modules/sgd.rst @@ -36,16 +36,16 @@ different means. The advantages of Stochastic Gradient Descent are: - + Efficiency. ++ Efficiency. - + Ease of implementation (lots of opportunities for code tuning). ++ Ease of implementation (lots of opportunities for code tuning). The disadvantages of Stochastic Gradient Descent include: - + SGD requires a number of hyperparameters such as the regularization - parameter and the number of iterations. ++ SGD requires a number of hyperparameters such as the regularization + parameter and the number of iterations. - + SGD is sensitive to feature scaling. ++ SGD is sensitive to feature scaling. .. warning:: @@ -111,12 +111,12 @@ the coefficients and the input sample, plus the intercept) is given by The concrete loss function can be set via the ``loss`` parameter. 
:class:`SGDClassifier` supports the following loss functions:
 
-  * ``loss="hinge"``: (soft-margin) linear Support Vector Machine,
-  * ``loss="modified_huber"``: smoothed hinge loss,
-  * ``loss="log_loss"``: logistic regression,
-  * and all regression losses below. In this case the target is encoded as -1
-    or 1, and the problem is treated as a regression problem. The predicted
-    class then correspond to the sign of the predicted target.
+* ``loss="hinge"``: (soft-margin) linear Support Vector Machine,
+* ``loss="modified_huber"``: smoothed hinge loss,
+* ``loss="log_loss"``: logistic regression,
+* and all regression losses below. In this case the target is encoded as -1
+  or 1, and the problem is treated as a regression problem. The predicted
+  class then corresponds to the sign of the predicted target.
 
 Please refer to the :ref:`mathematical section below
 ` for formulas.
@@ -136,10 +136,10 @@ Using ``loss="log_loss"`` or ``loss="modified_huber"`` enables the
 The concrete penalty can be set via the ``penalty`` parameter. SGD supports
 the following penalties:
 
-  * ``penalty="l2"``: L2 norm penalty on ``coef_``.
-  * ``penalty="l1"``: L1 norm penalty on ``coef_``.
-  * ``penalty="elasticnet"``: Convex combination of L2 and L1;
-    ``(1 - l1_ratio) * L2 + l1_ratio * L1``.
+* ``penalty="l2"``: L2 norm penalty on ``coef_``.
+* ``penalty="l1"``: L1 norm penalty on ``coef_``.
+* ``penalty="elasticnet"``: Convex combination of L2 and L1;
+  ``(1 - l1_ratio) * L2 + l1_ratio * L1``.
 
 The default setting is ``penalty="l2"``. The L1 penalty leads to sparse
 solutions, driving most coefficients to zero. The Elastic Net [#5]_ solves
@@ -211,9 +211,9 @@ samples (> 10.000), for other problems we recommend :class:`Ridge`,
 The concrete loss function can be set via the ``loss`` parameter.
 :class:`SGDRegressor` supports the following loss functions:
 
-  * ``loss="squared_error"``: Ordinary least squares,
-  * ``loss="huber"``: Huber loss for robust regression,
-  * ``loss="epsilon_insensitive"``: linear Support Vector Regression.
+* ``loss="squared_error"``: Ordinary least squares,
+* ``loss="huber"``: Huber loss for robust regression,
+* ``loss="epsilon_insensitive"``: linear Support Vector Regression.
 
 Please refer to the :ref:`mathematical section below
 ` for formulas.
@@ -327,14 +327,14 @@ Stopping criterion
 The classes :class:`SGDClassifier` and :class:`SGDRegressor` provide two
 criteria to stop the algorithm when a given level of convergence is reached:
 
-  * With ``early_stopping=True``, the input data is split into a training set
-    and a validation set. The model is then fitted on the training set, and the
-    stopping criterion is based on the prediction score (using the `score`
-    method) computed on the validation set. The size of the validation set
-    can be changed with the parameter ``validation_fraction``.
-  * With ``early_stopping=False``, the model is fitted on the entire input data
-    and the stopping criterion is based on the objective function computed on
-    the training data.
+* With ``early_stopping=True``, the input data is split into a training set
+  and a validation set. The model is then fitted on the training set, and the
+  stopping criterion is based on the prediction score (using the `score`
+  method) computed on the validation set. The size of the validation set
+  can be changed with the parameter ``validation_fraction``.
+* With ``early_stopping=False``, the model is fitted on the entire input data
+  and the stopping criterion is based on the objective function computed on
+  the training data.
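A minimal sketch of the validation-based stopping mode described in the two
bullets above (the synthetic dataset and the explicit parameter values here
are illustrative assumptions, not prescribed defaults)::

    from sklearn.datasets import make_classification
    from sklearn.linear_model import SGDClassifier

    X, y = make_classification(n_samples=1000, random_state=0)

    # Hold out 10% of the training data as a validation set and stop once
    # the validation score fails to improve for 5 consecutive epochs.
    clf = SGDClassifier(
        early_stopping=True,
        validation_fraction=0.1,
        n_iter_no_change=5,
        random_state=0,
    )
    clf.fit(X, y)
    print(clf.n_iter_)  # epochs actually run before the criterion triggered

With ``early_stopping=False`` the same estimator instead monitors the training
objective, so no data is held out and all samples contribute to the fit.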
In both cases, the criterion is evaluated once by epoch, and the algorithm stops when the criterion does not improve ``n_iter_no_change`` times in a row. The @@ -345,45 +345,45 @@ stops in any case after a maximum number of iteration ``max_iter``. Tips on Practical Use ===================== - * Stochastic Gradient Descent is sensitive to feature scaling, so it - is highly recommended to scale your data. For example, scale each - attribute on the input vector X to [0,1] or [-1,+1], or standardize - it to have mean 0 and variance 1. Note that the *same* scaling must be - applied to the test vector to obtain meaningful results. This can be easily - done using :class:`~sklearn.preprocessing.StandardScaler`:: - - from sklearn.preprocessing import StandardScaler - scaler = StandardScaler() - scaler.fit(X_train) # Don't cheat - fit only on training data - X_train = scaler.transform(X_train) - X_test = scaler.transform(X_test) # apply same transformation to test data - - # Or better yet: use a pipeline! - from sklearn.pipeline import make_pipeline - est = make_pipeline(StandardScaler(), SGDClassifier()) - est.fit(X_train) - est.predict(X_test) - - If your attributes have an intrinsic scale (e.g. word frequencies or - indicator features) scaling is not needed. - - * Finding a reasonable regularization term :math:`\alpha` is - best done using automatic hyper-parameter search, e.g. - :class:`~sklearn.model_selection.GridSearchCV` or - :class:`~sklearn.model_selection.RandomizedSearchCV`, usually in the - range ``10.0**-np.arange(1,7)``. - - * Empirically, we found that SGD converges after observing - approximately 10^6 training samples. Thus, a reasonable first guess - for the number of iterations is ``max_iter = np.ceil(10**6 / n)``, - where ``n`` is the size of the training set. - - * If you apply SGD to features extracted using PCA we found that - it is often wise to scale the feature values by some constant `c` - such that the average L2 norm of the training data equals one. - - * We found that Averaged SGD works best with a larger number of features - and a higher eta0 +* Stochastic Gradient Descent is sensitive to feature scaling, so it + is highly recommended to scale your data. For example, scale each + attribute on the input vector X to [0,1] or [-1,+1], or standardize + it to have mean 0 and variance 1. Note that the *same* scaling must be + applied to the test vector to obtain meaningful results. This can be easily + done using :class:`~sklearn.preprocessing.StandardScaler`:: + + from sklearn.preprocessing import StandardScaler + scaler = StandardScaler() + scaler.fit(X_train) # Don't cheat - fit only on training data + X_train = scaler.transform(X_train) + X_test = scaler.transform(X_test) # apply same transformation to test data + + # Or better yet: use a pipeline! + from sklearn.pipeline import make_pipeline + est = make_pipeline(StandardScaler(), SGDClassifier()) + est.fit(X_train) + est.predict(X_test) + + If your attributes have an intrinsic scale (e.g. word frequencies or + indicator features) scaling is not needed. + +* Finding a reasonable regularization term :math:`\alpha` is + best done using automatic hyper-parameter search, e.g. + :class:`~sklearn.model_selection.GridSearchCV` or + :class:`~sklearn.model_selection.RandomizedSearchCV`, usually in the + range ``10.0**-np.arange(1,7)``. + +* Empirically, we found that SGD converges after observing + approximately 10^6 training samples. 
Thus, a reasonable first guess
+  for the number of iterations is ``max_iter = np.ceil(10**6 / n)``,
+  where ``n`` is the size of the training set.
+
+* If you apply SGD to features extracted using PCA we found that
+  it is often wise to scale the feature values by some constant `c`
+  such that the average L2 norm of the training data equals one.
+
+* We found that Averaged SGD works best with a larger number of features
+  and a higher eta0.
 
 .. topic:: References:
 
@@ -454,12 +454,12 @@ misclassification error (Zero-one loss) as shown in the Figure below.
 
 Popular choices for the regularization term :math:`R` (the `penalty`
 parameter) include:
 
-  - L2 norm: :math:`R(w) := \frac{1}{2} \sum_{j=1}^{m} w_j^2 = ||w||_2^2`,
-  - L1 norm: :math:`R(w) := \sum_{j=1}^{m} |w_j|`, which leads to sparse
-    solutions.
-  - Elastic Net: :math:`R(w) := \frac{\rho}{2} \sum_{j=1}^{n} w_j^2 +
-    (1-\rho) \sum_{j=1}^{m} |w_j|`, a convex combination of L2 and L1, where
-    :math:`\rho` is given by ``1 - l1_ratio``.
+- L2 norm: :math:`R(w) := \frac{1}{2} \sum_{j=1}^{m} w_j^2 = ||w||_2^2`,
+- L1 norm: :math:`R(w) := \sum_{j=1}^{m} |w_j|`, which leads to sparse
+  solutions.
+- Elastic Net: :math:`R(w) := \frac{\rho}{2} \sum_{j=1}^{n} w_j^2 +
+  (1-\rho) \sum_{j=1}^{m} |w_j|`, a convex combination of L2 and L1, where
+  :math:`\rho` is given by ``1 - l1_ratio``.
 
 The Figure below shows the contours of the different regularization terms
 in a 2-dimensional parameter space (:math:`m=2`) when :math:`R(w) = 1`.
diff --git a/doc/modules/svm.rst b/doc/modules/svm.rst
index 1a8b6d6c5741e..06eee7de50855 100644
--- a/doc/modules/svm.rst
+++ b/doc/modules/svm.rst
@@ -16,27 +16,27 @@ methods used for :ref:`classification `,
 
 The advantages of support vector machines are:
 
-  - Effective in high dimensional spaces.
+- Effective in high dimensional spaces.
 
-  - Still effective in cases where number of dimensions is greater
-    than the number of samples.
+- Still effective in cases where number of dimensions is greater
+  than the number of samples.
 
-  - Uses a subset of training points in the decision function (called
-    support vectors), so it is also memory efficient.
+- Uses a subset of training points in the decision function (called
+  support vectors), so it is also memory efficient.
 
-  - Versatile: different :ref:`svm_kernels` can be
-    specified for the decision function. Common kernels are
-    provided, but it is also possible to specify custom kernels.
+- Versatile: different :ref:`svm_kernels` can be
+  specified for the decision function. Common kernels are
+  provided, but it is also possible to specify custom kernels.
 
 The disadvantages of support vector machines include:
 
-  - If the number of features is much greater than the number of
-    samples, avoid over-fitting in choosing :ref:`svm_kernels` and regularization
-    term is crucial.
+- If the number of features is much greater than the number of
+  samples, avoiding over-fitting when choosing :ref:`svm_kernels` and the
+  regularization term is crucial.
 
-  - SVMs do not directly provide probability estimates, these are
-    calculated using an expensive five-fold cross-validation
-    (see :ref:`Scores and probabilities `, below).
+- SVMs do not directly provide probability estimates, these are
+  calculated using an expensive five-fold cross-validation
+  (see :ref:`Scores and probabilities `, below).
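A hedged sketch of the probability machinery mentioned in the last point above,
on toy data (the dataset is an illustrative assumption and the printed numbers
are not meaningful in themselves)::

    from sklearn.datasets import make_classification
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=200, random_state=0)

    # probability=True enables the internal five-fold cross-validation
    # (Platt scaling) mentioned above, which makes fitting more expensive.
    clf = SVC(probability=True, random_state=0).fit(X, y)
    print(clf.predict_proba(X[:2]))      # calibrated class probabilities
    print(clf.decision_function(X[:2]))  # cheaper, uncalibrated scores

When only a ranking of samples is needed, ``decision_function`` avoids the
extra cross-validation cost entirely.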
The support vector machines in scikit-learn support both dense (``numpy.ndarray`` and convertible to that by ``numpy.asarray``) and @@ -381,95 +381,95 @@ Tips on Practical Use ===================== - * **Avoiding data copy**: For :class:`SVC`, :class:`SVR`, :class:`NuSVC` and - :class:`NuSVR`, if the data passed to certain methods is not C-ordered - contiguous and double precision, it will be copied before calling the - underlying C implementation. You can check whether a given numpy array is - C-contiguous by inspecting its ``flags`` attribute. - - For :class:`LinearSVC` (and :class:`LogisticRegression - `) any input passed as a numpy - array will be copied and converted to the `liblinear`_ internal sparse data - representation (double precision floats and int32 indices of non-zero - components). If you want to fit a large-scale linear classifier without - copying a dense numpy C-contiguous double precision array as input, we - suggest to use the :class:`SGDClassifier - ` class instead. The objective - function can be configured to be almost the same as the :class:`LinearSVC` - model. - - * **Kernel cache size**: For :class:`SVC`, :class:`SVR`, :class:`NuSVC` and - :class:`NuSVR`, the size of the kernel cache has a strong impact on run - times for larger problems. If you have enough RAM available, it is - recommended to set ``cache_size`` to a higher value than the default of - 200(MB), such as 500(MB) or 1000(MB). - - - * **Setting C**: ``C`` is ``1`` by default and it's a reasonable default - choice. If you have a lot of noisy observations you should decrease it: - decreasing C corresponds to more regularization. - - :class:`LinearSVC` and :class:`LinearSVR` are less sensitive to ``C`` when - it becomes large, and prediction results stop improving after a certain - threshold. Meanwhile, larger ``C`` values will take more time to train, - sometimes up to 10 times longer, as shown in [#3]_. - - * Support Vector Machine algorithms are not scale invariant, so **it - is highly recommended to scale your data**. For example, scale each - attribute on the input vector X to [0,1] or [-1,+1], or standardize it - to have mean 0 and variance 1. Note that the *same* scaling must be - applied to the test vector to obtain meaningful results. This can be done - easily by using a :class:`~sklearn.pipeline.Pipeline`:: - - >>> from sklearn.pipeline import make_pipeline - >>> from sklearn.preprocessing import StandardScaler - >>> from sklearn.svm import SVC - - >>> clf = make_pipeline(StandardScaler(), SVC()) - - See section :ref:`preprocessing` for more details on scaling and - normalization. - - .. _shrinking_svm: - - * Regarding the `shrinking` parameter, quoting [#4]_: *We found that if the - number of iterations is large, then shrinking can shorten the training - time. However, if we loosely solve the optimization problem (e.g., by - using a large stopping tolerance), the code without using shrinking may - be much faster* - - * Parameter ``nu`` in :class:`NuSVC`/:class:`OneClassSVM`/:class:`NuSVR` - approximates the fraction of training errors and support vectors. - - * In :class:`SVC`, if the data is unbalanced (e.g. many - positive and few negative), set ``class_weight='balanced'`` and/or try - different penalty parameters ``C``. - - * **Randomness of the underlying implementations**: The underlying - implementations of :class:`SVC` and :class:`NuSVC` use a random number - generator only to shuffle the data for probability estimation (when - ``probability`` is set to ``True``). 
This randomness can be controlled - with the ``random_state`` parameter. If ``probability`` is set to ``False`` - these estimators are not random and ``random_state`` has no effect on the - results. The underlying :class:`OneClassSVM` implementation is similar to - the ones of :class:`SVC` and :class:`NuSVC`. As no probability estimation - is provided for :class:`OneClassSVM`, it is not random. - - The underlying :class:`LinearSVC` implementation uses a random number - generator to select features when fitting the model with a dual coordinate - descent (i.e. when ``dual`` is set to ``True``). It is thus not uncommon - to have slightly different results for the same input data. If that - happens, try with a smaller `tol` parameter. This randomness can also be - controlled with the ``random_state`` parameter. When ``dual`` is - set to ``False`` the underlying implementation of :class:`LinearSVC` is - not random and ``random_state`` has no effect on the results. - - * Using L1 penalization as provided by ``LinearSVC(penalty='l1', - dual=False)`` yields a sparse solution, i.e. only a subset of feature - weights is different from zero and contribute to the decision function. - Increasing ``C`` yields a more complex model (more features are selected). - The ``C`` value that yields a "null" model (all weights equal to zero) can - be calculated using :func:`l1_min_c`. +* **Avoiding data copy**: For :class:`SVC`, :class:`SVR`, :class:`NuSVC` and + :class:`NuSVR`, if the data passed to certain methods is not C-ordered + contiguous and double precision, it will be copied before calling the + underlying C implementation. You can check whether a given numpy array is + C-contiguous by inspecting its ``flags`` attribute. + + For :class:`LinearSVC` (and :class:`LogisticRegression + `) any input passed as a numpy + array will be copied and converted to the `liblinear`_ internal sparse data + representation (double precision floats and int32 indices of non-zero + components). If you want to fit a large-scale linear classifier without + copying a dense numpy C-contiguous double precision array as input, we + suggest to use the :class:`SGDClassifier + ` class instead. The objective + function can be configured to be almost the same as the :class:`LinearSVC` + model. + +* **Kernel cache size**: For :class:`SVC`, :class:`SVR`, :class:`NuSVC` and + :class:`NuSVR`, the size of the kernel cache has a strong impact on run + times for larger problems. If you have enough RAM available, it is + recommended to set ``cache_size`` to a higher value than the default of + 200(MB), such as 500(MB) or 1000(MB). + + +* **Setting C**: ``C`` is ``1`` by default and it's a reasonable default + choice. If you have a lot of noisy observations you should decrease it: + decreasing C corresponds to more regularization. + + :class:`LinearSVC` and :class:`LinearSVR` are less sensitive to ``C`` when + it becomes large, and prediction results stop improving after a certain + threshold. Meanwhile, larger ``C`` values will take more time to train, + sometimes up to 10 times longer, as shown in [#3]_. + +* Support Vector Machine algorithms are not scale invariant, so **it + is highly recommended to scale your data**. For example, scale each + attribute on the input vector X to [0,1] or [-1,+1], or standardize it + to have mean 0 and variance 1. Note that the *same* scaling must be + applied to the test vector to obtain meaningful results. 
This can be done + easily by using a :class:`~sklearn.pipeline.Pipeline`:: + + >>> from sklearn.pipeline import make_pipeline + >>> from sklearn.preprocessing import StandardScaler + >>> from sklearn.svm import SVC + + >>> clf = make_pipeline(StandardScaler(), SVC()) + + See section :ref:`preprocessing` for more details on scaling and + normalization. + +.. _shrinking_svm: + +* Regarding the `shrinking` parameter, quoting [#4]_: *We found that if the + number of iterations is large, then shrinking can shorten the training + time. However, if we loosely solve the optimization problem (e.g., by + using a large stopping tolerance), the code without using shrinking may + be much faster* + +* Parameter ``nu`` in :class:`NuSVC`/:class:`OneClassSVM`/:class:`NuSVR` + approximates the fraction of training errors and support vectors. + +* In :class:`SVC`, if the data is unbalanced (e.g. many + positive and few negative), set ``class_weight='balanced'`` and/or try + different penalty parameters ``C``. + +* **Randomness of the underlying implementations**: The underlying + implementations of :class:`SVC` and :class:`NuSVC` use a random number + generator only to shuffle the data for probability estimation (when + ``probability`` is set to ``True``). This randomness can be controlled + with the ``random_state`` parameter. If ``probability`` is set to ``False`` + these estimators are not random and ``random_state`` has no effect on the + results. The underlying :class:`OneClassSVM` implementation is similar to + the ones of :class:`SVC` and :class:`NuSVC`. As no probability estimation + is provided for :class:`OneClassSVM`, it is not random. + + The underlying :class:`LinearSVC` implementation uses a random number + generator to select features when fitting the model with a dual coordinate + descent (i.e. when ``dual`` is set to ``True``). It is thus not uncommon + to have slightly different results for the same input data. If that + happens, try with a smaller `tol` parameter. This randomness can also be + controlled with the ``random_state`` parameter. When ``dual`` is + set to ``False`` the underlying implementation of :class:`LinearSVC` is + not random and ``random_state`` has no effect on the results. + +* Using L1 penalization as provided by ``LinearSVC(penalty='l1', + dual=False)`` yields a sparse solution, i.e. only a subset of feature + weights is different from zero and contribute to the decision function. + Increasing ``C`` yields a more complex model (more features are selected). + The ``C`` value that yields a "null" model (all weights equal to zero) can + be calculated using :func:`l1_min_c`. .. _svm_kernels: @@ -479,16 +479,16 @@ Kernel functions The *kernel function* can be any of the following: - * linear: :math:`\langle x, x'\rangle`. +* linear: :math:`\langle x, x'\rangle`. - * polynomial: :math:`(\gamma \langle x, x'\rangle + r)^d`, where - :math:`d` is specified by parameter ``degree``, :math:`r` by ``coef0``. +* polynomial: :math:`(\gamma \langle x, x'\rangle + r)^d`, where + :math:`d` is specified by parameter ``degree``, :math:`r` by ``coef0``. - * rbf: :math:`\exp(-\gamma \|x-x'\|^2)`, where :math:`\gamma` is - specified by parameter ``gamma``, must be greater than 0. +* rbf: :math:`\exp(-\gamma \|x-x'\|^2)`, where :math:`\gamma` is + specified by parameter ``gamma``, must be greater than 0. - * sigmoid :math:`\tanh(\gamma \langle x,x'\rangle + r)`, - where :math:`r` is specified by ``coef0``. 
+* sigmoid :math:`\tanh(\gamma \langle x,x'\rangle + r)`, + where :math:`r` is specified by ``coef0``. Different kernels are specified by the `kernel` parameter:: @@ -530,12 +530,12 @@ python function or by precomputing the Gram matrix. Classifiers with custom kernels behave the same way as any other classifiers, except that: - * Field ``support_vectors_`` is now empty, only indices of support - vectors are stored in ``support_`` +* Field ``support_vectors_`` is now empty, only indices of support + vectors are stored in ``support_`` - * A reference (and not a copy) of the first argument in the ``fit()`` - method is stored for future reference. If that array changes between the - use of ``fit()`` and ``predict()`` you will have unexpected results. +* A reference (and not a copy) of the first argument in the ``fit()`` + method is stored for future reference. If that array changes between the + use of ``fit()`` and ``predict()`` you will have unexpected results. |details-start| diff --git a/doc/modules/tree.rst b/doc/modules/tree.rst index e0a55547f4dea..b54b913573a34 100644 --- a/doc/modules/tree.rst +++ b/doc/modules/tree.rst @@ -23,68 +23,68 @@ the tree, the more complex the decision rules and the fitter the model. Some advantages of decision trees are: - - Simple to understand and to interpret. Trees can be visualized. +- Simple to understand and to interpret. Trees can be visualized. - - Requires little data preparation. Other techniques often require data - normalization, dummy variables need to be created and blank values to - be removed. Some tree and algorithm combinations support - :ref:`missing values `. +- Requires little data preparation. Other techniques often require data + normalization, dummy variables need to be created and blank values to + be removed. Some tree and algorithm combinations support + :ref:`missing values `. - - The cost of using the tree (i.e., predicting data) is logarithmic in the - number of data points used to train the tree. +- The cost of using the tree (i.e., predicting data) is logarithmic in the + number of data points used to train the tree. - - Able to handle both numerical and categorical data. However, the scikit-learn - implementation does not support categorical variables for now. Other - techniques are usually specialized in analyzing datasets that have only one type - of variable. See :ref:`algorithms ` for more - information. +- Able to handle both numerical and categorical data. However, the scikit-learn + implementation does not support categorical variables for now. Other + techniques are usually specialized in analyzing datasets that have only one type + of variable. See :ref:`algorithms ` for more + information. - - Able to handle multi-output problems. +- Able to handle multi-output problems. - - Uses a white box model. If a given situation is observable in a model, - the explanation for the condition is easily explained by boolean logic. - By contrast, in a black box model (e.g., in an artificial neural - network), results may be more difficult to interpret. +- Uses a white box model. If a given situation is observable in a model, + the explanation for the condition is easily explained by boolean logic. + By contrast, in a black box model (e.g., in an artificial neural + network), results may be more difficult to interpret. - - Possible to validate a model using statistical tests. That makes it - possible to account for the reliability of the model. +- Possible to validate a model using statistical tests. 
That makes it + possible to account for the reliability of the model. - - Performs well even if its assumptions are somewhat violated by - the true model from which the data were generated. +- Performs well even if its assumptions are somewhat violated by + the true model from which the data were generated. The disadvantages of decision trees include: - - Decision-tree learners can create over-complex trees that do not - generalize the data well. This is called overfitting. Mechanisms - such as pruning, setting the minimum number of samples required - at a leaf node or setting the maximum depth of the tree are - necessary to avoid this problem. +- Decision-tree learners can create over-complex trees that do not + generalize the data well. This is called overfitting. Mechanisms + such as pruning, setting the minimum number of samples required + at a leaf node or setting the maximum depth of the tree are + necessary to avoid this problem. - - Decision trees can be unstable because small variations in the - data might result in a completely different tree being generated. - This problem is mitigated by using decision trees within an - ensemble. +- Decision trees can be unstable because small variations in the + data might result in a completely different tree being generated. + This problem is mitigated by using decision trees within an + ensemble. - - Predictions of decision trees are neither smooth nor continuous, but - piecewise constant approximations as seen in the above figure. Therefore, - they are not good at extrapolation. +- Predictions of decision trees are neither smooth nor continuous, but + piecewise constant approximations as seen in the above figure. Therefore, + they are not good at extrapolation. - - The problem of learning an optimal decision tree is known to be - NP-complete under several aspects of optimality and even for simple - concepts. Consequently, practical decision-tree learning algorithms - are based on heuristic algorithms such as the greedy algorithm where - locally optimal decisions are made at each node. Such algorithms - cannot guarantee to return the globally optimal decision tree. This - can be mitigated by training multiple trees in an ensemble learner, - where the features and samples are randomly sampled with replacement. +- The problem of learning an optimal decision tree is known to be + NP-complete under several aspects of optimality and even for simple + concepts. Consequently, practical decision-tree learning algorithms + are based on heuristic algorithms such as the greedy algorithm where + locally optimal decisions are made at each node. Such algorithms + cannot guarantee to return the globally optimal decision tree. This + can be mitigated by training multiple trees in an ensemble learner, + where the features and samples are randomly sampled with replacement. - - There are concepts that are hard to learn because decision trees - do not express them easily, such as XOR, parity or multiplexer problems. +- There are concepts that are hard to learn because decision trees + do not express them easily, such as XOR, parity or multiplexer problems. - - Decision tree learners create biased trees if some classes dominate. - It is therefore recommended to balance the dataset prior to fitting - with the decision tree. +- Decision tree learners create biased trees if some classes dominate. + It is therefore recommended to balance the dataset prior to fitting + with the decision tree. .. 
_tree_classification: @@ -273,19 +273,19 @@ generalization accuracy of the resulting estimator may often be increased. With regard to decision trees, this strategy can readily be used to support multi-output problems. This requires the following changes: - - Store n output values in leaves, instead of 1; - - Use splitting criteria that compute the average reduction across all - n outputs. +- Store n output values in leaves, instead of 1; +- Use splitting criteria that compute the average reduction across all + n outputs. This module offers support for multi-output problems by implementing this strategy in both :class:`DecisionTreeClassifier` and :class:`DecisionTreeRegressor`. If a decision tree is fit on an output array Y of shape ``(n_samples, n_outputs)`` then the resulting estimator will: - * Output n_output values upon ``predict``; +* Output n_output values upon ``predict``; - * Output a list of n_output arrays of class probabilities upon - ``predict_proba``. +* Output a list of n_output arrays of class probabilities upon + ``predict_proba``. The use of multi-output trees for regression is demonstrated in :ref:`sphx_glr_auto_examples_tree_plot_tree_regression_multioutput.py`. In this example, the input @@ -315,10 +315,10 @@ the lower half of those faces. **References** |details-split| - * M. Dumont et al, `Fast multi-class image annotation with random subwindows - and multiple output randomized trees - `_, International Conference on - Computer Vision Theory and Applications 2009 +* M. Dumont et al, `Fast multi-class image annotation with random subwindows + and multiple output randomized trees + `_, International Conference on + Computer Vision Theory and Applications 2009 |details-end| @@ -343,65 +343,65 @@ total cost over the entire trees (by summing the cost at each node) of Tips on practical use ===================== - * Decision trees tend to overfit on data with a large number of features. - Getting the right ratio of samples to number of features is important, since - a tree with few samples in high dimensional space is very likely to overfit. - - * Consider performing dimensionality reduction (:ref:`PCA `, - :ref:`ICA `, or :ref:`feature_selection`) beforehand to - give your tree a better chance of finding features that are discriminative. - - * :ref:`sphx_glr_auto_examples_tree_plot_unveil_tree_structure.py` will help - in gaining more insights about how the decision tree makes predictions, which is - important for understanding the important features in the data. - - * Visualize your tree as you are training by using the ``export`` - function. Use ``max_depth=3`` as an initial tree depth to get a feel for - how the tree is fitting to your data, and then increase the depth. - - * Remember that the number of samples required to populate the tree doubles - for each additional level the tree grows to. Use ``max_depth`` to control - the size of the tree to prevent overfitting. - - * Use ``min_samples_split`` or ``min_samples_leaf`` to ensure that multiple - samples inform every decision in the tree, by controlling which splits will - be considered. A very small number will usually mean the tree will overfit, - whereas a large number will prevent the tree from learning the data. Try - ``min_samples_leaf=5`` as an initial value. If the sample size varies - greatly, a float number can be used as percentage in these two parameters. 
- While ``min_samples_split`` can create arbitrarily small leaves, - ``min_samples_leaf`` guarantees that each leaf has a minimum size, avoiding - low-variance, over-fit leaf nodes in regression problems. For - classification with few classes, ``min_samples_leaf=1`` is often the best - choice. - - Note that ``min_samples_split`` considers samples directly and independent of - ``sample_weight``, if provided (e.g. a node with m weighted samples is still - treated as having exactly m samples). Consider ``min_weight_fraction_leaf`` or - ``min_impurity_decrease`` if accounting for sample weights is required at splits. - - * Balance your dataset before training to prevent the tree from being biased - toward the classes that are dominant. Class balancing can be done by - sampling an equal number of samples from each class, or preferably by - normalizing the sum of the sample weights (``sample_weight``) for each - class to the same value. Also note that weight-based pre-pruning criteria, - such as ``min_weight_fraction_leaf``, will then be less biased toward - dominant classes than criteria that are not aware of the sample weights, - like ``min_samples_leaf``. - - * If the samples are weighted, it will be easier to optimize the tree - structure using weight-based pre-pruning criterion such as - ``min_weight_fraction_leaf``, which ensure that leaf nodes contain at least - a fraction of the overall sum of the sample weights. - - * All decision trees use ``np.float32`` arrays internally. - If training data is not in this format, a copy of the dataset will be made. - - * If the input matrix X is very sparse, it is recommended to convert to sparse - ``csc_matrix`` before calling fit and sparse ``csr_matrix`` before calling - predict. Training time can be orders of magnitude faster for a sparse - matrix input compared to a dense matrix when features have zero values in - most of the samples. +* Decision trees tend to overfit on data with a large number of features. + Getting the right ratio of samples to number of features is important, since + a tree with few samples in high dimensional space is very likely to overfit. + +* Consider performing dimensionality reduction (:ref:`PCA `, + :ref:`ICA `, or :ref:`feature_selection`) beforehand to + give your tree a better chance of finding features that are discriminative. + +* :ref:`sphx_glr_auto_examples_tree_plot_unveil_tree_structure.py` will help + in gaining more insights about how the decision tree makes predictions, which is + important for understanding the important features in the data. + +* Visualize your tree as you are training by using the ``export`` + function. Use ``max_depth=3`` as an initial tree depth to get a feel for + how the tree is fitting to your data, and then increase the depth. + +* Remember that the number of samples required to populate the tree doubles + for each additional level the tree grows to. Use ``max_depth`` to control + the size of the tree to prevent overfitting. + +* Use ``min_samples_split`` or ``min_samples_leaf`` to ensure that multiple + samples inform every decision in the tree, by controlling which splits will + be considered. A very small number will usually mean the tree will overfit, + whereas a large number will prevent the tree from learning the data. Try + ``min_samples_leaf=5`` as an initial value. If the sample size varies + greatly, a float number can be used as percentage in these two parameters. 
+ While ``min_samples_split`` can create arbitrarily small leaves, + ``min_samples_leaf`` guarantees that each leaf has a minimum size, avoiding + low-variance, over-fit leaf nodes in regression problems. For + classification with few classes, ``min_samples_leaf=1`` is often the best + choice. + + Note that ``min_samples_split`` considers samples directly and independent of + ``sample_weight``, if provided (e.g. a node with m weighted samples is still + treated as having exactly m samples). Consider ``min_weight_fraction_leaf`` or + ``min_impurity_decrease`` if accounting for sample weights is required at splits. + +* Balance your dataset before training to prevent the tree from being biased + toward the classes that are dominant. Class balancing can be done by + sampling an equal number of samples from each class, or preferably by + normalizing the sum of the sample weights (``sample_weight``) for each + class to the same value. Also note that weight-based pre-pruning criteria, + such as ``min_weight_fraction_leaf``, will then be less biased toward + dominant classes than criteria that are not aware of the sample weights, + like ``min_samples_leaf``. + +* If the samples are weighted, it will be easier to optimize the tree + structure using weight-based pre-pruning criterion such as + ``min_weight_fraction_leaf``, which ensure that leaf nodes contain at least + a fraction of the overall sum of the sample weights. + +* All decision trees use ``np.float32`` arrays internally. + If training data is not in this format, a copy of the dataset will be made. + +* If the input matrix X is very sparse, it is recommended to convert to sparse + ``csc_matrix`` before calling fit and sparse ``csr_matrix`` before calling + predict. Training time can be orders of magnitude faster for a sparse + matrix input compared to a dense matrix when features have zero values in + most of the samples. .. _tree_algorithms: @@ -516,36 +516,36 @@ Log Loss or Entropy: H(Q_m) = - \sum_k p_{mk} \log(p_{mk}) |details-start| -Shannon entropy: +**Shannon entropy** |details-split| - The entropy criterion computes the Shannon entropy of the possible classes. It - takes the class frequencies of the training data points that reached a given - leaf :math:`m` as their probability. Using the **Shannon entropy as tree node - splitting criterion is equivalent to minimizing the log loss** (also known as - cross-entropy and multinomial deviance) between the true labels :math:`y_i` - and the probabilistic predictions :math:`T_k(x_i)` of the tree model :math:`T` for class :math:`k`. +The entropy criterion computes the Shannon entropy of the possible classes. It +takes the class frequencies of the training data points that reached a given +leaf :math:`m` as their probability. Using the **Shannon entropy as tree node +splitting criterion is equivalent to minimizing the log loss** (also known as +cross-entropy and multinomial deviance) between the true labels :math:`y_i` +and the probabilistic predictions :math:`T_k(x_i)` of the tree model :math:`T` for class :math:`k`. - To see this, first recall that the log loss of a tree model :math:`T` - computed on a dataset :math:`D` is defined as follows: +To see this, first recall that the log loss of a tree model :math:`T` +computed on a dataset :math:`D` is defined as follows: - .. math:: +.. 
math:: - \mathrm{LL}(D, T) = -\frac{1}{n} \sum_{(x_i, y_i) \in D} \sum_k I(y_i = k) \log(T_k(x_i)) + \mathrm{LL}(D, T) = -\frac{1}{n} \sum_{(x_i, y_i) \in D} \sum_k I(y_i = k) \log(T_k(x_i)) - where :math:`D` is a training dataset of :math:`n` pairs :math:`(x_i, y_i)`. +where :math:`D` is a training dataset of :math:`n` pairs :math:`(x_i, y_i)`. - In a classification tree, the predicted class probabilities within leaf nodes - are constant, that is: for all :math:`(x_i, y_i) \in Q_m`, one has: - :math:`T_k(x_i) = p_{mk}` for each class :math:`k`. +In a classification tree, the predicted class probabilities within leaf nodes +are constant, that is: for all :math:`(x_i, y_i) \in Q_m`, one has: +:math:`T_k(x_i) = p_{mk}` for each class :math:`k`. - This property makes it possible to rewrite :math:`\mathrm{LL}(D, T)` as the - sum of the Shannon entropies computed for each leaf of :math:`T` weighted by - the number of training data points that reached each leaf: +This property makes it possible to rewrite :math:`\mathrm{LL}(D, T)` as the +sum of the Shannon entropies computed for each leaf of :math:`T` weighted by +the number of training data points that reached each leaf: - .. math:: +.. math:: - \mathrm{LL}(D, T) = \sum_{m \in T} \frac{n_m}{n} H(Q_m) + \mathrm{LL}(D, T) = \sum_{m \in T} \frac{n_m}{n} H(Q_m) |details-end| @@ -605,50 +605,50 @@ the split with all the missing values going to the left node or the right node. Decisions are made as follows: - - By default when predicting, the samples with missing values are classified - with the class used in the split found during training:: +- By default when predicting, the samples with missing values are classified + with the class used in the split found during training:: - >>> from sklearn.tree import DecisionTreeClassifier - >>> import numpy as np + >>> from sklearn.tree import DecisionTreeClassifier + >>> import numpy as np - >>> X = np.array([0, 1, 6, np.nan]).reshape(-1, 1) - >>> y = [0, 0, 1, 1] + >>> X = np.array([0, 1, 6, np.nan]).reshape(-1, 1) + >>> y = [0, 0, 1, 1] - >>> tree = DecisionTreeClassifier(random_state=0).fit(X, y) - >>> tree.predict(X) - array([0, 0, 1, 1]) + >>> tree = DecisionTreeClassifier(random_state=0).fit(X, y) + >>> tree.predict(X) + array([0, 0, 1, 1]) - - If the criterion evaluation is the same for both nodes, - then the tie for missing value at predict time is broken by going to the - right node. The splitter also checks the split where all the missing - values go to one child and non-missing values go to the other:: +- If the criterion evaluation is the same for both nodes, + then the tie for missing value at predict time is broken by going to the + right node. 
The splitter also checks the split where all the missing + values go to one child and non-missing values go to the other:: - >>> from sklearn.tree import DecisionTreeClassifier - >>> import numpy as np + >>> from sklearn.tree import DecisionTreeClassifier + >>> import numpy as np - >>> X = np.array([np.nan, -1, np.nan, 1]).reshape(-1, 1) - >>> y = [0, 0, 1, 1] + >>> X = np.array([np.nan, -1, np.nan, 1]).reshape(-1, 1) + >>> y = [0, 0, 1, 1] - >>> tree = DecisionTreeClassifier(random_state=0).fit(X, y) + >>> tree = DecisionTreeClassifier(random_state=0).fit(X, y) - >>> X_test = np.array([np.nan]).reshape(-1, 1) - >>> tree.predict(X_test) - array([1]) + >>> X_test = np.array([np.nan]).reshape(-1, 1) + >>> tree.predict(X_test) + array([1]) - - If no missing values are seen during training for a given feature, then during - prediction missing values are mapped to the child with the most samples:: +- If no missing values are seen during training for a given feature, then during + prediction missing values are mapped to the child with the most samples:: - >>> from sklearn.tree import DecisionTreeClassifier - >>> import numpy as np + >>> from sklearn.tree import DecisionTreeClassifier + >>> import numpy as np - >>> X = np.array([0, 1, 2, 3]).reshape(-1, 1) - >>> y = [0, 1, 1, 1] + >>> X = np.array([0, 1, 2, 3]).reshape(-1, 1) + >>> y = [0, 1, 1, 1] - >>> tree = DecisionTreeClassifier(random_state=0).fit(X, y) + >>> tree = DecisionTreeClassifier(random_state=0).fit(X, y) - >>> X_test = np.array([np.nan]).reshape(-1, 1) - >>> tree.predict(X_test) - array([1]) + >>> X_test = np.array([np.nan]).reshape(-1, 1) + >>> tree.predict(X_test) + array([1]) .. _minimal_cost_complexity_pruning: @@ -693,17 +693,17 @@ be pruned. This process stops when the pruned tree's minimal **References** |details-split| - .. [BRE] L. Breiman, J. Friedman, R. Olshen, and C. Stone. Classification - and Regression Trees. Wadsworth, Belmont, CA, 1984. +.. [BRE] L. Breiman, J. Friedman, R. Olshen, and C. Stone. Classification + and Regression Trees. Wadsworth, Belmont, CA, 1984. - * https://en.wikipedia.org/wiki/Decision_tree_learning +* https://en.wikipedia.org/wiki/Decision_tree_learning - * https://en.wikipedia.org/wiki/Predictive_analytics +* https://en.wikipedia.org/wiki/Predictive_analytics - * J.R. Quinlan. C4. 5: programs for machine learning. Morgan - Kaufmann, 1993. +* J.R. Quinlan. C4. 5: programs for machine learning. Morgan + Kaufmann, 1993. - * T. Hastie, R. Tibshirani and J. Friedman. Elements of Statistical - Learning, Springer, 2009. +* T. Hastie, R. Tibshirani and J. Friedman. Elements of Statistical + Learning, Springer, 2009. |details-end| diff --git a/doc/presentations.rst b/doc/presentations.rst index 47b7f16bd74a0..19fd09218b5fd 100644 --- a/doc/presentations.rst +++ b/doc/presentations.rst @@ -37,40 +37,40 @@ Videos `_ by `Gael Varoquaux`_ at ICML 2010 - A three minute video from a very early stage of scikit-learn, explaining the - basic idea and approach we are following. + A three minute video from a very early stage of scikit-learn, explaining the + basic idea and approach we are following. - `Introduction to statistical learning with scikit-learn `_ by `Gael Varoquaux`_ at SciPy 2011 - An extensive tutorial, consisting of four sessions of one hour. - The tutorial covers the basics of machine learning, - many algorithms and how to apply them using scikit-learn. The - material corresponding is now in the scikit-learn documentation - section :ref:`stat_learn_tut_index`. 
+ An extensive tutorial, consisting of four sessions of one hour. + The tutorial covers the basics of machine learning, + many algorithms and how to apply them using scikit-learn. The + material corresponding is now in the scikit-learn documentation + section :ref:`stat_learn_tut_index`. - `Statistical Learning for Text Classification with scikit-learn and NLTK `_ (and `slides `_) by `Olivier Grisel`_ at PyCon 2011 - Thirty minute introduction to text classification. Explains how to - use NLTK and scikit-learn to solve real-world text classification - tasks and compares against cloud-based solutions. + Thirty minute introduction to text classification. Explains how to + use NLTK and scikit-learn to solve real-world text classification + tasks and compares against cloud-based solutions. - `Introduction to Interactive Predictive Analytics in Python with scikit-learn `_ by `Olivier Grisel`_ at PyCon 2012 - 3-hours long introduction to prediction tasks using scikit-learn. + 3-hours long introduction to prediction tasks using scikit-learn. - `scikit-learn - Machine Learning in Python `_ by `Jake Vanderplas`_ at the 2012 PyData workshop at Google - Interactive demonstration of some scikit-learn features. 75 minutes. + Interactive demonstration of some scikit-learn features. 75 minutes. - `scikit-learn tutorial `_ by `Jake Vanderplas`_ at PyData NYC 2012 - Presentation using the online tutorial, 45 minutes. + Presentation using the online tutorial, 45 minutes. .. _Gael Varoquaux: https://gael-varoquaux.info diff --git a/doc/support.rst b/doc/support.rst index 520bd015ff6da..bb60f49c70716 100644 --- a/doc/support.rst +++ b/doc/support.rst @@ -60,11 +60,11 @@ https://github.com/scikit-learn/scikit-learn/issues Don't forget to include: - - steps (or better script) to reproduce, +- steps (or better script) to reproduce, - - expected outcome, +- expected outcome, - - observed outcome or Python (or gdb) tracebacks +- observed outcome or Python (or gdb) tracebacks To help developers fix your bug faster, please link to a https://gist.github.com holding a standalone minimalistic python script that reproduces your bug and diff --git a/doc/tutorial/basic/tutorial.rst b/doc/tutorial/basic/tutorial.rst index d983d7806dce6..27dddb4e0e909 100644 --- a/doc/tutorial/basic/tutorial.rst +++ b/doc/tutorial/basic/tutorial.rst @@ -23,41 +23,41 @@ data), it is said to have several attributes or **features**. Learning problems fall into a few categories: - * `supervised learning `_, - in which the data comes with additional attributes that we want to predict - (:ref:`Click here ` - to go to the scikit-learn supervised learning page).This problem - can be either: - - * `classification - `_: - samples belong to two or more classes and we - want to learn from already labeled data how to predict the class - of unlabeled data. An example of a classification problem would - be handwritten digit recognition, in which the aim is - to assign each input vector to one of a finite number of discrete - categories. Another way to think of classification is as a discrete - (as opposed to continuous) form of supervised learning where one has a - limited number of categories and for each of the n samples provided, - one is to try to label them with the correct category or class. - - * `regression `_: - if the desired output consists of one or more - continuous variables, then the task is called *regression*. An - example of a regression problem would be the prediction of the - length of a salmon as a function of its age and weight. 
- - * `unsupervised learning `_, - in which the training data consists of a set of input vectors x - without any corresponding target values. The goal in such problems - may be to discover groups of similar examples within the data, where - it is called `clustering `_, - or to determine the distribution of data within the input space, known as - `density estimation `_, or - to project the data from a high-dimensional space down to two or three - dimensions for the purpose of *visualization* - (:ref:`Click here ` - to go to the Scikit-Learn unsupervised learning page). +* `supervised learning `_, + in which the data comes with additional attributes that we want to predict + (:ref:`Click here ` + to go to the scikit-learn supervised learning page).This problem + can be either: + + * `classification + `_: + samples belong to two or more classes and we + want to learn from already labeled data how to predict the class + of unlabeled data. An example of a classification problem would + be handwritten digit recognition, in which the aim is + to assign each input vector to one of a finite number of discrete + categories. Another way to think of classification is as a discrete + (as opposed to continuous) form of supervised learning where one has a + limited number of categories and for each of the n samples provided, + one is to try to label them with the correct category or class. + + * `regression `_: + if the desired output consists of one or more + continuous variables, then the task is called *regression*. An + example of a regression problem would be the prediction of the + length of a salmon as a function of its age and weight. + +* `unsupervised learning `_, + in which the training data consists of a set of input vectors x + without any corresponding target values. The goal in such problems + may be to discover groups of similar examples within the data, where + it is called `clustering `_, + or to determine the distribution of data within the input space, known as + `density estimation `_, or + to project the data from a high-dimensional space down to two or three + dimensions for the purpose of *visualization* + (:ref:`Click here ` + to go to the Scikit-Learn unsupervised learning page). .. topic:: Training set and testing set diff --git a/doc/tutorial/statistical_inference/model_selection.rst b/doc/tutorial/statistical_inference/model_selection.rst index dd0cec4de4db0..bf0290c9f7337 100644 --- a/doc/tutorial/statistical_inference/model_selection.rst +++ b/doc/tutorial/statistical_inference/model_selection.rst @@ -98,7 +98,7 @@ scoring method. ... scoring='precision_macro') array([0.96578289, 0.92708922, 0.96681476, 0.96362897, 0.93192644]) - **Cross-validation generators** +**Cross-validation generators** .. list-table:: @@ -185,8 +185,8 @@ scoring method. estimator with a linear kernel as a function of parameter ``C`` (use a logarithmic grid of points, from 1 to 10). - .. literalinclude:: ../../auto_examples/exercises/plot_cv_digits.py - :lines: 13-23 + .. literalinclude:: ../../auto_examples/exercises/plot_cv_digits.py + :lines: 13-23 .. 
image:: /auto_examples/exercises/images/sphx_glr_plot_cv_digits_001.png :target: ../../auto_examples/exercises/plot_cv_digits.html diff --git a/doc/tutorial/statistical_inference/putting_together.rst b/doc/tutorial/statistical_inference/putting_together.rst index 033bed2e33884..b28ba77bfac33 100644 --- a/doc/tutorial/statistical_inference/putting_together.rst +++ b/doc/tutorial/statistical_inference/putting_together.rst @@ -25,7 +25,7 @@ Face recognition with eigenfaces The dataset used in this example is a preprocessed excerpt of the "Labeled Faces in the Wild", also known as LFW_: - http://vis-www.cs.umass.edu/lfw/lfw-funneled.tgz (233MB) +http://vis-www.cs.umass.edu/lfw/lfw-funneled.tgz (233MB) .. _LFW: http://vis-www.cs.umass.edu/lfw/ diff --git a/doc/tutorial/statistical_inference/supervised_learning.rst b/doc/tutorial/statistical_inference/supervised_learning.rst index d7477b279662d..45fc4cf5b9bc0 100644 --- a/doc/tutorial/statistical_inference/supervised_learning.rst +++ b/doc/tutorial/statistical_inference/supervised_learning.rst @@ -157,10 +157,10 @@ of the model as small as possible. Linear models: :math:`y = X\beta + \epsilon` - * :math:`X`: data - * :math:`y`: target variable - * :math:`\beta`: Coefficients - * :math:`\epsilon`: Observation noise +* :math:`X`: data +* :math:`y`: target variable +* :math:`\beta`: Coefficients +* :math:`\epsilon`: Observation noise .. image:: /auto_examples/linear_model/images/sphx_glr_plot_ols_001.png :target: ../../auto_examples/linear_model/plot_ols.html diff --git a/doc/tutorial/statistical_inference/unsupervised_learning.rst b/doc/tutorial/statistical_inference/unsupervised_learning.rst index e385eccaf592c..fd827cc75b212 100644 --- a/doc/tutorial/statistical_inference/unsupervised_learning.rst +++ b/doc/tutorial/statistical_inference/unsupervised_learning.rst @@ -12,7 +12,8 @@ Clustering: grouping observations together **clustering task**: split the observations into well-separated group called *clusters*. -.. +:: + >>> # Set the PRNG >>> import numpy as np >>> np.random.seed(1) @@ -100,18 +101,18 @@ A :ref:`hierarchical_clustering` method is a type of cluster analysis that aims to build a hierarchy of clusters. In general, the various approaches of this technique are either: - * **Agglomerative** - bottom-up approaches: each observation starts in its - own cluster, and clusters are iteratively merged in such a way to - minimize a *linkage* criterion. This approach is particularly interesting - when the clusters of interest are made of only a few observations. When - the number of clusters is large, it is much more computationally efficient - than k-means. - - * **Divisive** - top-down approaches: all observations start in one - cluster, which is iteratively split as one moves down the hierarchy. - For estimating large numbers of clusters, this approach is both slow (due - to all observations starting as one cluster, which it splits recursively) - and statistically ill-posed. +* **Agglomerative** - bottom-up approaches: each observation starts in its + own cluster, and clusters are iteratively merged in such a way to + minimize a *linkage* criterion. This approach is particularly interesting + when the clusters of interest are made of only a few observations. When + the number of clusters is large, it is much more computationally efficient + than k-means. + +* **Divisive** - top-down approaches: all observations start in one + cluster, which is iteratively split as one moves down the hierarchy. 
+ For estimating large numbers of clusters, this approach is both slow (due + to all observations starting as one cluster, which it splits recursively) + and statistically ill-posed. Connectivity-constrained clustering ..................................... diff --git a/doc/tutorial/text_analytics/working_with_text_data.rst b/doc/tutorial/text_analytics/working_with_text_data.rst index 0880fe3118e4f..43fd305c3b8b6 100644 --- a/doc/tutorial/text_analytics/working_with_text_data.rst +++ b/doc/tutorial/text_analytics/working_with_text_data.rst @@ -10,14 +10,14 @@ documents (newsgroups posts) on twenty different topics. In this section we will see how to: - - load the file contents and the categories +- load the file contents and the categories - - extract feature vectors suitable for machine learning +- extract feature vectors suitable for machine learning - - train a linear model to perform categorization +- train a linear model to perform categorization - - use a grid search strategy to find a good configuration of both - the feature extraction components and the classifier +- use a grid search strategy to find a good configuration of both + the feature extraction components and the classifier Tutorial setup @@ -38,13 +38,13 @@ The source can also be found `on Github The tutorial folder should contain the following sub-folders: - * ``*.rst files`` - the source of the tutorial document written with sphinx +* ``*.rst files`` - the source of the tutorial document written with sphinx - * ``data`` - folder to put the datasets used during the tutorial +* ``data`` - folder to put the datasets used during the tutorial - * ``skeletons`` - sample incomplete scripts for the exercises +* ``skeletons`` - sample incomplete scripts for the exercises - * ``solutions`` - solutions of the exercises +* ``solutions`` - solutions of the exercises You can already copy the skeletons into a new folder somewhere @@ -180,13 +180,13 @@ Bags of words The most intuitive way to do so is to use a bags of words representation: - 1. Assign a fixed integer id to each word occurring in any document - of the training set (for instance by building a dictionary - from words to integer indices). +1. Assign a fixed integer id to each word occurring in any document + of the training set (for instance by building a dictionary + from words to integer indices). - 2. For each document ``#i``, count the number of occurrences of each - word ``w`` and store it in ``X[i, j]`` as the value of feature - ``#j`` where ``j`` is the index of word ``w`` in the dictionary. +2. For each document ``#i``, count the number of occurrences of each + word ``w`` and store it in ``X[i, j]`` as the value of feature + ``#j`` where ``j`` is the index of word ``w`` in the dictionary. The bags of words representation implies that ``n_features`` is the number of distinct words in the corpus: this number is typically diff --git a/doc/whats_new/older_versions.rst b/doc/whats_new/older_versions.rst index 5a1d6a1c7c13f..12ed10a6206f4 100644 --- a/doc/whats_new/older_versions.rst +++ b/doc/whats_new/older_versions.rst @@ -40,14 +40,14 @@ Changelog People ------ - * 14 `Peter Prettenhofer`_ - * 12 `Gael Varoquaux`_ - * 10 `Andreas Müller`_ - * 5 `Lars Buitinck`_ - * 3 :user:`Virgile Fritsch ` - * 1 `Alexandre Gramfort`_ - * 1 `Gilles Louppe`_ - * 1 `Mathieu Blondel`_ +* 14 `Peter Prettenhofer`_ +* 12 `Gael Varoquaux`_ +* 10 `Andreas Müller`_ +* 5 `Lars Buitinck`_ +* 3 :user:`Virgile Fritsch ` +* 1 `Alexandre Gramfort`_ +* 1 `Gilles Louppe`_ +* 1 `Mathieu Blondel`_ .. 
_changes_0_12: @@ -194,53 +194,53 @@ API changes summary People ------ - * 267 `Andreas Müller`_ - * 94 `Gilles Louppe`_ - * 89 `Gael Varoquaux`_ - * 79 `Peter Prettenhofer`_ - * 60 `Mathieu Blondel`_ - * 57 `Alexandre Gramfort`_ - * 52 `Vlad Niculae`_ - * 45 `Lars Buitinck`_ - * 44 Nelle Varoquaux - * 37 `Jaques Grobler`_ - * 30 Alexis Mignon - * 30 Immanuel Bayer - * 27 `Olivier Grisel`_ - * 16 Subhodeep Moitra - * 13 Yannick Schwartz - * 12 :user:`@kernc ` - * 11 :user:`Virgile Fritsch ` - * 9 Daniel Duckworth - * 9 `Fabian Pedregosa`_ - * 9 `Robert Layton`_ - * 8 John Benediktsson - * 7 Marko Burjek - * 5 `Nicolas Pinto`_ - * 4 Alexandre Abraham - * 4 `Jake Vanderplas`_ - * 3 `Brian Holt`_ - * 3 `Edouard Duchesnay`_ - * 3 Florian Hoenig - * 3 flyingimmidev - * 2 Francois Savard - * 2 Hannes Schulz - * 2 Peter Welinder - * 2 `Yaroslav Halchenko`_ - * 2 Wei Li - * 1 Alex Companioni - * 1 Brandyn A. White - * 1 Bussonnier Matthias - * 1 Charles-Pierre Astolfi - * 1 Dan O'Huiginn - * 1 David Cournapeau - * 1 Keith Goodman - * 1 Ludwig Schwardt - * 1 Olivier Hervieu - * 1 Sergio Medina - * 1 Shiqiao Du - * 1 Tim Sheerman-Chase - * 1 buguen +* 267 `Andreas Müller`_ +* 94 `Gilles Louppe`_ +* 89 `Gael Varoquaux`_ +* 79 `Peter Prettenhofer`_ +* 60 `Mathieu Blondel`_ +* 57 `Alexandre Gramfort`_ +* 52 `Vlad Niculae`_ +* 45 `Lars Buitinck`_ +* 44 Nelle Varoquaux +* 37 `Jaques Grobler`_ +* 30 Alexis Mignon +* 30 Immanuel Bayer +* 27 `Olivier Grisel`_ +* 16 Subhodeep Moitra +* 13 Yannick Schwartz +* 12 :user:`@kernc ` +* 11 :user:`Virgile Fritsch ` +* 9 Daniel Duckworth +* 9 `Fabian Pedregosa`_ +* 9 `Robert Layton`_ +* 8 John Benediktsson +* 7 Marko Burjek +* 5 `Nicolas Pinto`_ +* 4 Alexandre Abraham +* 4 `Jake Vanderplas`_ +* 3 `Brian Holt`_ +* 3 `Edouard Duchesnay`_ +* 3 Florian Hoenig +* 3 flyingimmidev +* 2 Francois Savard +* 2 Hannes Schulz +* 2 Peter Welinder +* 2 `Yaroslav Halchenko`_ +* 2 Wei Li +* 1 Alex Companioni +* 1 Brandyn A. 
White +* 1 Bussonnier Matthias +* 1 Charles-Pierre Astolfi +* 1 Dan O'Huiginn +* 1 David Cournapeau +* 1 Keith Goodman +* 1 Ludwig Schwardt +* 1 Olivier Hervieu +* 1 Sergio Medina +* 1 Shiqiao Du +* 1 Tim Sheerman-Chase +* 1 buguen @@ -431,54 +431,55 @@ API changes summary People ------ - * 282 `Andreas Müller`_ - * 239 `Peter Prettenhofer`_ - * 198 `Gael Varoquaux`_ - * 129 `Olivier Grisel`_ - * 114 `Mathieu Blondel`_ - * 103 Clay Woolam - * 96 `Lars Buitinck`_ - * 88 `Jaques Grobler`_ - * 82 `Alexandre Gramfort`_ - * 50 `Bertrand Thirion`_ - * 42 `Robert Layton`_ - * 28 flyingimmidev - * 26 `Jake Vanderplas`_ - * 26 Shiqiao Du - * 21 `Satrajit Ghosh`_ - * 17 `David Marek`_ - * 17 `Gilles Louppe`_ - * 14 `Vlad Niculae`_ - * 11 Yannick Schwartz - * 10 `Fabian Pedregosa`_ - * 9 fcostin - * 7 Nick Wilson - * 5 Adrien Gaidon - * 5 `Nicolas Pinto`_ - * 4 `David Warde-Farley`_ - * 5 Nelle Varoquaux - * 5 Emmanuelle Gouillart - * 3 Joonas Sillanpää - * 3 Paolo Losi - * 2 Charles McCarthy - * 2 Roy Hyunjin Han - * 2 Scott White - * 2 ibayer - * 1 Brandyn White - * 1 Carlos Scheidegger - * 1 Claire Revillet - * 1 Conrad Lee - * 1 `Edouard Duchesnay`_ - * 1 Jan Hendrik Metzen - * 1 Meng Xinfan - * 1 `Rob Zinkov`_ - * 1 Shiqiao - * 1 Udi Weinsberg - * 1 Virgile Fritsch - * 1 Xinfan Meng - * 1 Yaroslav Halchenko - * 1 jansoe - * 1 Leon Palafox + +* 282 `Andreas Müller`_ +* 239 `Peter Prettenhofer`_ +* 198 `Gael Varoquaux`_ +* 129 `Olivier Grisel`_ +* 114 `Mathieu Blondel`_ +* 103 Clay Woolam +* 96 `Lars Buitinck`_ +* 88 `Jaques Grobler`_ +* 82 `Alexandre Gramfort`_ +* 50 `Bertrand Thirion`_ +* 42 `Robert Layton`_ +* 28 flyingimmidev +* 26 `Jake Vanderplas`_ +* 26 Shiqiao Du +* 21 `Satrajit Ghosh`_ +* 17 `David Marek`_ +* 17 `Gilles Louppe`_ +* 14 `Vlad Niculae`_ +* 11 Yannick Schwartz +* 10 `Fabian Pedregosa`_ +* 9 fcostin +* 7 Nick Wilson +* 5 Adrien Gaidon +* 5 `Nicolas Pinto`_ +* 4 `David Warde-Farley`_ +* 5 Nelle Varoquaux +* 5 Emmanuelle Gouillart +* 3 Joonas Sillanpää +* 3 Paolo Losi +* 2 Charles McCarthy +* 2 Roy Hyunjin Han +* 2 Scott White +* 2 ibayer +* 1 Brandyn White +* 1 Carlos Scheidegger +* 1 Claire Revillet +* 1 Conrad Lee +* 1 `Edouard Duchesnay`_ +* 1 Jan Hendrik Metzen +* 1 Meng Xinfan +* 1 `Rob Zinkov`_ +* 1 Shiqiao +* 1 Udi Weinsberg +* 1 Virgile Fritsch +* 1 Xinfan Meng +* 1 Yaroslav Halchenko +* 1 jansoe +* 1 Leon Palafox .. _changes_0_10: @@ -634,37 +635,37 @@ People The following people contributed to scikit-learn since last release: - * 246 `Andreas Müller`_ - * 242 `Olivier Grisel`_ - * 220 `Gilles Louppe`_ - * 183 `Brian Holt`_ - * 166 `Gael Varoquaux`_ - * 144 `Lars Buitinck`_ - * 73 `Vlad Niculae`_ - * 65 `Peter Prettenhofer`_ - * 64 `Fabian Pedregosa`_ - * 60 Robert Layton - * 55 `Mathieu Blondel`_ - * 52 `Jake Vanderplas`_ - * 44 Noel Dawe - * 38 `Alexandre Gramfort`_ - * 24 :user:`Virgile Fritsch ` - * 23 `Satrajit Ghosh`_ - * 3 Jan Hendrik Metzen - * 3 Kenneth C. 
Arnold - * 3 Shiqiao Du - * 3 Tim Sheerman-Chase - * 3 `Yaroslav Halchenko`_ - * 2 Bala Subrahmanyam Varanasi - * 2 DraXus - * 2 Michael Eickenberg - * 1 Bogdan Trach - * 1 Félix-Antoine Fortin - * 1 Juan Manuel Caicedo Carvajal - * 1 Nelle Varoquaux - * 1 `Nicolas Pinto`_ - * 1 Tiziano Zito - * 1 Xinfan Meng +* 246 `Andreas Müller`_ +* 242 `Olivier Grisel`_ +* 220 `Gilles Louppe`_ +* 183 `Brian Holt`_ +* 166 `Gael Varoquaux`_ +* 144 `Lars Buitinck`_ +* 73 `Vlad Niculae`_ +* 65 `Peter Prettenhofer`_ +* 64 `Fabian Pedregosa`_ +* 60 Robert Layton +* 55 `Mathieu Blondel`_ +* 52 `Jake Vanderplas`_ +* 44 Noel Dawe +* 38 `Alexandre Gramfort`_ +* 24 :user:`Virgile Fritsch ` +* 23 `Satrajit Ghosh`_ +* 3 Jan Hendrik Metzen +* 3 Kenneth C. Arnold +* 3 Shiqiao Du +* 3 Tim Sheerman-Chase +* 3 `Yaroslav Halchenko`_ +* 2 Bala Subrahmanyam Varanasi +* 2 DraXus +* 2 Michael Eickenberg +* 1 Bogdan Trach +* 1 Félix-Antoine Fortin +* 1 Juan Manuel Caicedo Carvajal +* 1 Nelle Varoquaux +* 1 `Nicolas Pinto`_ +* 1 Tiziano Zito +* 1 Xinfan Meng @@ -993,20 +994,20 @@ People that made this release possible preceded by number of commits: - 25 `Peter Prettenhofer`_ - 22 `Nicolas Pinto`_ - 11 :user:`Virgile Fritsch ` - - 7 Lars Buitinck - - 6 Vincent Michel - - 5 `Bertrand Thirion`_ - - 4 Thouis (Ray) Jones - - 4 Vincent Schut - - 3 Jan Schlüter - - 2 Julien Miotte - - 2 `Matthieu Perrot`_ - - 2 Yann Malet - - 2 `Yaroslav Halchenko`_ - - 1 Amit Aides - - 1 `Andreas Müller`_ - - 1 Feth Arezki - - 1 Meng Xinfan +- 7 Lars Buitinck +- 6 Vincent Michel +- 5 `Bertrand Thirion`_ +- 4 Thouis (Ray) Jones +- 4 Vincent Schut +- 3 Jan Schlüter +- 2 Julien Miotte +- 2 `Matthieu Perrot`_ +- 2 Yann Malet +- 2 `Yaroslav Halchenko`_ +- 1 Amit Aides +- 1 `Andreas Müller`_ +- 1 Feth Arezki +- 1 Meng Xinfan .. 
_changes_0_7: @@ -1175,31 +1176,31 @@ People People that made this release possible preceded by number of commits: - * 207 `Olivier Grisel`_ +* 207 `Olivier Grisel`_ - * 167 `Fabian Pedregosa`_ +* 167 `Fabian Pedregosa`_ - * 97 `Peter Prettenhofer`_ +* 97 `Peter Prettenhofer`_ - * 68 `Alexandre Gramfort`_ +* 68 `Alexandre Gramfort`_ - * 59 `Mathieu Blondel`_ +* 59 `Mathieu Blondel`_ - * 55 `Gael Varoquaux`_ +* 55 `Gael Varoquaux`_ - * 33 Vincent Dubourg +* 33 Vincent Dubourg - * 21 `Ron Weiss`_ +* 21 `Ron Weiss`_ - * 9 Bertrand Thirion +* 9 Bertrand Thirion - * 3 `Alexandre Passos`_ +* 3 `Alexandre Passos`_ - * 3 Anne-Laure Fouque +* 3 Anne-Laure Fouque - * 2 Ronan Amicel +* 2 Ronan Amicel - * 1 `Christian Osendorfer`_ +* 1 `Christian Osendorfer`_ @@ -1304,20 +1305,20 @@ Authors The following is a list of authors for this release, preceded by number of commits: - * 262 Fabian Pedregosa - * 240 Gael Varoquaux - * 149 Alexandre Gramfort - * 116 Olivier Grisel - * 40 Vincent Michel - * 38 Ron Weiss - * 23 Matthieu Perrot - * 10 Bertrand Thirion - * 7 Yaroslav Halchenko - * 9 VirgileFritsch - * 6 Edouard Duchesnay - * 4 Mathieu Blondel - * 1 Ariel Rokem - * 1 Matthieu Brucher +* 262 Fabian Pedregosa +* 240 Gael Varoquaux +* 149 Alexandre Gramfort +* 116 Olivier Grisel +* 40 Vincent Michel +* 38 Ron Weiss +* 23 Matthieu Perrot +* 10 Bertrand Thirion +* 7 Yaroslav Halchenko +* 9 VirgileFritsch +* 6 Edouard Duchesnay +* 4 Mathieu Blondel +* 1 Ariel Rokem +* 1 Matthieu Brucher Version 0.4 =========== @@ -1368,13 +1369,13 @@ Authors The committer list for this release is the following (preceded by number of commits): - * 143 Fabian Pedregosa - * 35 Alexandre Gramfort - * 34 Olivier Grisel - * 11 Gael Varoquaux - * 5 Yaroslav Halchenko - * 2 Vincent Michel - * 1 Chris Filo Gorgolewski +* 143 Fabian Pedregosa +* 35 Alexandre Gramfort +* 34 Olivier Grisel +* 11 Gael Varoquaux +* 5 Yaroslav Halchenko +* 2 Vincent Michel +* 1 Chris Filo Gorgolewski Earlier versions diff --git a/doc/whats_new/v0.13.rst b/doc/whats_new/v0.13.rst index 00be322bf38fc..6c24d1c52b150 100644 --- a/doc/whats_new/v0.13.rst +++ b/doc/whats_new/v0.13.rst @@ -33,21 +33,22 @@ Changelog People ------ List of contributors for release 0.13.1 by number of commits. - * 16 `Lars Buitinck`_ - * 12 `Andreas Müller`_ - * 8 `Gael Varoquaux`_ - * 5 Robert Marchman - * 3 `Peter Prettenhofer`_ - * 2 Hrishikesh Huilgolkar - * 1 Bastiaan van den Berg - * 1 Diego Molla - * 1 `Gilles Louppe`_ - * 1 `Mathieu Blondel`_ - * 1 `Nelle Varoquaux`_ - * 1 Rafael Cunha de Almeida - * 1 Rolando Espinoza La fuente - * 1 `Vlad Niculae`_ - * 1 `Yaroslav Halchenko`_ + +* 16 `Lars Buitinck`_ +* 12 `Andreas Müller`_ +* 8 `Gael Varoquaux`_ +* 5 Robert Marchman +* 3 `Peter Prettenhofer`_ +* 2 Hrishikesh Huilgolkar +* 1 Bastiaan van den Berg +* 1 Diego Molla +* 1 `Gilles Louppe`_ +* 1 `Mathieu Blondel`_ +* 1 `Nelle Varoquaux`_ +* 1 Rafael Cunha de Almeida +* 1 Rolando Espinoza La fuente +* 1 `Vlad Niculae`_ +* 1 `Yaroslav Halchenko`_ .. _changes_0_13: @@ -323,69 +324,69 @@ People ------ List of contributors for release 0.13 by number of commits. 
- * 364 `Andreas Müller`_ - * 143 `Arnaud Joly`_ - * 137 `Peter Prettenhofer`_ - * 131 `Gael Varoquaux`_ - * 117 `Mathieu Blondel`_ - * 108 `Lars Buitinck`_ - * 106 Wei Li - * 101 `Olivier Grisel`_ - * 65 `Vlad Niculae`_ - * 54 `Gilles Louppe`_ - * 40 `Jaques Grobler`_ - * 38 `Alexandre Gramfort`_ - * 30 `Rob Zinkov`_ - * 19 Aymeric Masurelle - * 18 Andrew Winterman - * 17 `Fabian Pedregosa`_ - * 17 Nelle Varoquaux - * 16 `Christian Osendorfer`_ - * 14 `Daniel Nouri`_ - * 13 :user:`Virgile Fritsch ` - * 13 syhw - * 12 `Satrajit Ghosh`_ - * 10 Corey Lynch - * 10 Kyle Beauchamp - * 9 Brian Cheung - * 9 Immanuel Bayer - * 9 mr.Shu - * 8 Conrad Lee - * 8 `James Bergstra`_ - * 7 Tadej Janež - * 6 Brian Cajes - * 6 `Jake Vanderplas`_ - * 6 Michael - * 6 Noel Dawe - * 6 Tiago Nunes - * 6 cow - * 5 Anze - * 5 Shiqiao Du - * 4 Christian Jauvin - * 4 Jacques Kvam - * 4 Richard T. Guy - * 4 `Robert Layton`_ - * 3 Alexandre Abraham - * 3 Doug Coleman - * 3 Scott Dickerson - * 2 ApproximateIdentity - * 2 John Benediktsson - * 2 Mark Veronda - * 2 Matti Lyra - * 2 Mikhail Korobov - * 2 Xinfan Meng - * 1 Alejandro Weinstein - * 1 `Alexandre Passos`_ - * 1 Christoph Deil - * 1 Eugene Nizhibitsky - * 1 Kenneth C. Arnold - * 1 Luis Pedro Coelho - * 1 Miroslav Batchkarov - * 1 Pavel - * 1 Sebastian Berg - * 1 Shaun Jackman - * 1 Subhodeep Moitra - * 1 bob - * 1 dengemann - * 1 emanuele - * 1 x006 +* 364 `Andreas Müller`_ +* 143 `Arnaud Joly`_ +* 137 `Peter Prettenhofer`_ +* 131 `Gael Varoquaux`_ +* 117 `Mathieu Blondel`_ +* 108 `Lars Buitinck`_ +* 106 Wei Li +* 101 `Olivier Grisel`_ +* 65 `Vlad Niculae`_ +* 54 `Gilles Louppe`_ +* 40 `Jaques Grobler`_ +* 38 `Alexandre Gramfort`_ +* 30 `Rob Zinkov`_ +* 19 Aymeric Masurelle +* 18 Andrew Winterman +* 17 `Fabian Pedregosa`_ +* 17 Nelle Varoquaux +* 16 `Christian Osendorfer`_ +* 14 `Daniel Nouri`_ +* 13 :user:`Virgile Fritsch ` +* 13 syhw +* 12 `Satrajit Ghosh`_ +* 10 Corey Lynch +* 10 Kyle Beauchamp +* 9 Brian Cheung +* 9 Immanuel Bayer +* 9 mr.Shu +* 8 Conrad Lee +* 8 `James Bergstra`_ +* 7 Tadej Janež +* 6 Brian Cajes +* 6 `Jake Vanderplas`_ +* 6 Michael +* 6 Noel Dawe +* 6 Tiago Nunes +* 6 cow +* 5 Anze +* 5 Shiqiao Du +* 4 Christian Jauvin +* 4 Jacques Kvam +* 4 Richard T. Guy +* 4 `Robert Layton`_ +* 3 Alexandre Abraham +* 3 Doug Coleman +* 3 Scott Dickerson +* 2 ApproximateIdentity +* 2 John Benediktsson +* 2 Mark Veronda +* 2 Matti Lyra +* 2 Mikhail Korobov +* 2 Xinfan Meng +* 1 Alejandro Weinstein +* 1 `Alexandre Passos`_ +* 1 Christoph Deil +* 1 Eugene Nizhibitsky +* 1 Kenneth C. Arnold +* 1 Luis Pedro Coelho +* 1 Miroslav Batchkarov +* 1 Pavel +* 1 Sebastian Berg +* 1 Shaun Jackman +* 1 Subhodeep Moitra +* 1 bob +* 1 dengemann +* 1 emanuele +* 1 x006 diff --git a/doc/whats_new/v0.14.rst b/doc/whats_new/v0.14.rst index 4bd04ad180c4e..74ef162e20e5a 100644 --- a/doc/whats_new/v0.14.rst +++ b/doc/whats_new/v0.14.rst @@ -297,91 +297,91 @@ People ------ List of contributors for release 0.14 by number of commits. - * 277 Gilles Louppe - * 245 Lars Buitinck - * 187 Andreas Mueller - * 124 Arnaud Joly - * 112 Jaques Grobler - * 109 Gael Varoquaux - * 107 Olivier Grisel - * 102 Noel Dawe - * 99 Kemal Eren - * 79 Joel Nothman - * 75 Jake VanderPlas - * 73 Nelle Varoquaux - * 71 Vlad Niculae - * 65 Peter Prettenhofer - * 64 Alexandre Gramfort - * 54 Mathieu Blondel - * 38 Nicolas Trésegnie - * 35 eustache - * 27 Denis Engemann - * 25 Yann N. 
Dauphin - * 19 Justin Vincent - * 17 Robert Layton - * 15 Doug Coleman - * 14 Michael Eickenberg - * 13 Robert Marchman - * 11 Fabian Pedregosa - * 11 Philippe Gervais - * 10 Jim Holmström - * 10 Tadej Janež - * 10 syhw - * 9 Mikhail Korobov - * 9 Steven De Gryze - * 8 sergeyf - * 7 Ben Root - * 7 Hrishikesh Huilgolkar - * 6 Kyle Kastner - * 6 Martin Luessi - * 6 Rob Speer - * 5 Federico Vaggi - * 5 Raul Garreta - * 5 Rob Zinkov - * 4 Ken Geis - * 3 A. Flaxman - * 3 Denton Cockburn - * 3 Dougal Sutherland - * 3 Ian Ozsvald - * 3 Johannes Schönberger - * 3 Robert McGibbon - * 3 Roman Sinayev - * 3 Szabo Roland - * 2 Diego Molla - * 2 Imran Haque - * 2 Jochen Wersdörfer - * 2 Sergey Karayev - * 2 Yannick Schwartz - * 2 jamestwebber - * 1 Abhijeet Kolhe - * 1 Alexander Fabisch - * 1 Bastiaan van den Berg - * 1 Benjamin Peterson - * 1 Daniel Velkov - * 1 Fazlul Shahriar - * 1 Felix Brockherde - * 1 Félix-Antoine Fortin - * 1 Harikrishnan S - * 1 Jack Hale - * 1 JakeMick - * 1 James McDermott - * 1 John Benediktsson - * 1 John Zwinck - * 1 Joshua Vredevoogd - * 1 Justin Pati - * 1 Kevin Hughes - * 1 Kyle Kelley - * 1 Matthias Ekman - * 1 Miroslav Shubernetskiy - * 1 Naoki Orii - * 1 Norbert Crombach - * 1 Rafael Cunha de Almeida - * 1 Rolando Espinoza La fuente - * 1 Seamus Abshere - * 1 Sergey Feldman - * 1 Sergio Medina - * 1 Stefano Lattarini - * 1 Steve Koch - * 1 Sturla Molden - * 1 Thomas Jarosch - * 1 Yaroslav Halchenko +* 277 Gilles Louppe +* 245 Lars Buitinck +* 187 Andreas Mueller +* 124 Arnaud Joly +* 112 Jaques Grobler +* 109 Gael Varoquaux +* 107 Olivier Grisel +* 102 Noel Dawe +* 99 Kemal Eren +* 79 Joel Nothman +* 75 Jake VanderPlas +* 73 Nelle Varoquaux +* 71 Vlad Niculae +* 65 Peter Prettenhofer +* 64 Alexandre Gramfort +* 54 Mathieu Blondel +* 38 Nicolas Trésegnie +* 35 eustache +* 27 Denis Engemann +* 25 Yann N. Dauphin +* 19 Justin Vincent +* 17 Robert Layton +* 15 Doug Coleman +* 14 Michael Eickenberg +* 13 Robert Marchman +* 11 Fabian Pedregosa +* 11 Philippe Gervais +* 10 Jim Holmström +* 10 Tadej Janež +* 10 syhw +* 9 Mikhail Korobov +* 9 Steven De Gryze +* 8 sergeyf +* 7 Ben Root +* 7 Hrishikesh Huilgolkar +* 6 Kyle Kastner +* 6 Martin Luessi +* 6 Rob Speer +* 5 Federico Vaggi +* 5 Raul Garreta +* 5 Rob Zinkov +* 4 Ken Geis +* 3 A. Flaxman +* 3 Denton Cockburn +* 3 Dougal Sutherland +* 3 Ian Ozsvald +* 3 Johannes Schönberger +* 3 Robert McGibbon +* 3 Roman Sinayev +* 3 Szabo Roland +* 2 Diego Molla +* 2 Imran Haque +* 2 Jochen Wersdörfer +* 2 Sergey Karayev +* 2 Yannick Schwartz +* 2 jamestwebber +* 1 Abhijeet Kolhe +* 1 Alexander Fabisch +* 1 Bastiaan van den Berg +* 1 Benjamin Peterson +* 1 Daniel Velkov +* 1 Fazlul Shahriar +* 1 Felix Brockherde +* 1 Félix-Antoine Fortin +* 1 Harikrishnan S +* 1 Jack Hale +* 1 JakeMick +* 1 James McDermott +* 1 John Benediktsson +* 1 John Zwinck +* 1 Joshua Vredevoogd +* 1 Justin Pati +* 1 Kevin Hughes +* 1 Kyle Kelley +* 1 Matthias Ekman +* 1 Miroslav Shubernetskiy +* 1 Naoki Orii +* 1 Norbert Crombach +* 1 Rafael Cunha de Almeida +* 1 Rolando Espinoza La fuente +* 1 Seamus Abshere +* 1 Sergey Feldman +* 1 Sergio Medina +* 1 Stefano Lattarini +* 1 Steve Koch +* 1 Sturla Molden +* 1 Thomas Jarosch +* 1 Yaroslav Halchenko diff --git a/doc/whats_new/v0.20.rst b/doc/whats_new/v0.20.rst index 55c3aa5ef59e2..b295205bbbe57 100644 --- a/doc/whats_new/v0.20.rst +++ b/doc/whats_new/v0.20.rst @@ -53,7 +53,7 @@ The bundled version of joblib was upgraded from 0.13.0 to 0.13.2. restored from a pickle if ``sample_weight`` had been used. 
:issue:`13772` by :user:`Aditya Vyas `. - .. _changes_0_20_3: +.. _changes_0_20_3: Version 0.20.3 ============== diff --git a/sklearn/datasets/descr/breast_cancer.rst b/sklearn/datasets/descr/breast_cancer.rst index a532ef960737f..ceabd33e14ddc 100644 --- a/sklearn/datasets/descr/breast_cancer.rst +++ b/sklearn/datasets/descr/breast_cancer.rst @@ -5,77 +5,77 @@ Breast cancer wisconsin (diagnostic) dataset **Data Set Characteristics:** - :Number of Instances: 569 - - :Number of Attributes: 30 numeric, predictive attributes and the class - - :Attribute Information: - - radius (mean of distances from center to points on the perimeter) - - texture (standard deviation of gray-scale values) - - perimeter - - area - - smoothness (local variation in radius lengths) - - compactness (perimeter^2 / area - 1.0) - - concavity (severity of concave portions of the contour) - - concave points (number of concave portions of the contour) - - symmetry - - fractal dimension ("coastline approximation" - 1) - - The mean, standard error, and "worst" or largest (mean of the three - worst/largest values) of these features were computed for each image, - resulting in 30 features. For instance, field 0 is Mean Radius, field - 10 is Radius SE, field 20 is Worst Radius. - - - class: - - WDBC-Malignant - - WDBC-Benign - - :Summary Statistics: - - ===================================== ====== ====== - Min Max - ===================================== ====== ====== - radius (mean): 6.981 28.11 - texture (mean): 9.71 39.28 - perimeter (mean): 43.79 188.5 - area (mean): 143.5 2501.0 - smoothness (mean): 0.053 0.163 - compactness (mean): 0.019 0.345 - concavity (mean): 0.0 0.427 - concave points (mean): 0.0 0.201 - symmetry (mean): 0.106 0.304 - fractal dimension (mean): 0.05 0.097 - radius (standard error): 0.112 2.873 - texture (standard error): 0.36 4.885 - perimeter (standard error): 0.757 21.98 - area (standard error): 6.802 542.2 - smoothness (standard error): 0.002 0.031 - compactness (standard error): 0.002 0.135 - concavity (standard error): 0.0 0.396 - concave points (standard error): 0.0 0.053 - symmetry (standard error): 0.008 0.079 - fractal dimension (standard error): 0.001 0.03 - radius (worst): 7.93 36.04 - texture (worst): 12.02 49.54 - perimeter (worst): 50.41 251.2 - area (worst): 185.2 4254.0 - smoothness (worst): 0.071 0.223 - compactness (worst): 0.027 1.058 - concavity (worst): 0.0 1.252 - concave points (worst): 0.0 0.291 - symmetry (worst): 0.156 0.664 - fractal dimension (worst): 0.055 0.208 - ===================================== ====== ====== - - :Missing Attribute Values: None - - :Class Distribution: 212 - Malignant, 357 - Benign - - :Creator: Dr. William H. Wolberg, W. Nick Street, Olvi L. Mangasarian - - :Donor: Nick Street - - :Date: November, 1995 +:Number of Instances: 569 + +:Number of Attributes: 30 numeric, predictive attributes and the class + +:Attribute Information: + - radius (mean of distances from center to points on the perimeter) + - texture (standard deviation of gray-scale values) + - perimeter + - area + - smoothness (local variation in radius lengths) + - compactness (perimeter^2 / area - 1.0) + - concavity (severity of concave portions of the contour) + - concave points (number of concave portions of the contour) + - symmetry + - fractal dimension ("coastline approximation" - 1) + + The mean, standard error, and "worst" or largest (mean of the three + worst/largest values) of these features were computed for each image, + resulting in 30 features. 
For instance, field 0 is Mean Radius, field + 10 is Radius SE, field 20 is Worst Radius. + + - class: + - WDBC-Malignant + - WDBC-Benign + +:Summary Statistics: + +===================================== ====== ====== + Min Max +===================================== ====== ====== +radius (mean): 6.981 28.11 +texture (mean): 9.71 39.28 +perimeter (mean): 43.79 188.5 +area (mean): 143.5 2501.0 +smoothness (mean): 0.053 0.163 +compactness (mean): 0.019 0.345 +concavity (mean): 0.0 0.427 +concave points (mean): 0.0 0.201 +symmetry (mean): 0.106 0.304 +fractal dimension (mean): 0.05 0.097 +radius (standard error): 0.112 2.873 +texture (standard error): 0.36 4.885 +perimeter (standard error): 0.757 21.98 +area (standard error): 6.802 542.2 +smoothness (standard error): 0.002 0.031 +compactness (standard error): 0.002 0.135 +concavity (standard error): 0.0 0.396 +concave points (standard error): 0.0 0.053 +symmetry (standard error): 0.008 0.079 +fractal dimension (standard error): 0.001 0.03 +radius (worst): 7.93 36.04 +texture (worst): 12.02 49.54 +perimeter (worst): 50.41 251.2 +area (worst): 185.2 4254.0 +smoothness (worst): 0.071 0.223 +compactness (worst): 0.027 1.058 +concavity (worst): 0.0 1.252 +concave points (worst): 0.0 0.291 +symmetry (worst): 0.156 0.664 +fractal dimension (worst): 0.055 0.208 +===================================== ====== ====== + +:Missing Attribute Values: None + +:Class Distribution: 212 - Malignant, 357 - Benign + +:Creator: Dr. William H. Wolberg, W. Nick Street, Olvi L. Mangasarian + +:Donor: Nick Street + +:Date: November, 1995 This is a copy of UCI ML Breast Cancer Wisconsin (Diagnostic) datasets. https://goo.gl/U2Uwz2 @@ -108,15 +108,15 @@ cd math-prog/cpo-dataset/machine-learn/WDBC/ **References** |details-split| -- W.N. Street, W.H. Wolberg and O.L. Mangasarian. Nuclear feature extraction - for breast tumor diagnosis. IS&T/SPIE 1993 International Symposium on +- W.N. Street, W.H. Wolberg and O.L. Mangasarian. Nuclear feature extraction + for breast tumor diagnosis. IS&T/SPIE 1993 International Symposium on Electronic Imaging: Science and Technology, volume 1905, pages 861-870, San Jose, CA, 1993. -- O.L. Mangasarian, W.N. Street and W.H. Wolberg. Breast cancer diagnosis and - prognosis via linear programming. Operations Research, 43(4), pages 570-577, +- O.L. Mangasarian, W.N. Street and W.H. Wolberg. Breast cancer diagnosis and + prognosis via linear programming. Operations Research, 43(4), pages 570-577, July-August 1995. - W.H. Wolberg, W.N. Street, and O.L. Mangasarian. Machine learning techniques - to diagnose breast cancer from fine-needle aspirates. Cancer Letters 77 (1994) + to diagnose breast cancer from fine-needle aspirates. Cancer Letters 77 (1994) 163-171. 
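The characteristics above are easy to verify with the bundled loader; a small
sketch, assuming the standard ``load_breast_cancer`` helper::

    from sklearn.datasets import load_breast_cancer

    X, y = load_breast_cancer(return_X_y=True)
    print(X.shape)               # (569, 30): 569 instances, 30 features
    # Benign is encoded as 1, so the sum recovers the class distribution.
    print(int(y.sum()), len(y))  # 357 benign out of 569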
-|details-end| \ No newline at end of file +|details-end| diff --git a/sklearn/datasets/descr/california_housing.rst b/sklearn/datasets/descr/california_housing.rst index f5756533b2769..33ff111fef541 100644 --- a/sklearn/datasets/descr/california_housing.rst +++ b/sklearn/datasets/descr/california_housing.rst @@ -5,21 +5,21 @@ California Housing dataset **Data Set Characteristics:** - :Number of Instances: 20640 +:Number of Instances: 20640 - :Number of Attributes: 8 numeric, predictive attributes and the target +:Number of Attributes: 8 numeric, predictive attributes and the target - :Attribute Information: - - MedInc median income in block group - - HouseAge median house age in block group - - AveRooms average number of rooms per household - - AveBedrms average number of bedrooms per household - - Population block group population - - AveOccup average number of household members - - Latitude block group latitude - - Longitude block group longitude +:Attribute Information: + - MedInc median income in block group + - HouseAge median house age in block group + - AveRooms average number of rooms per household + - AveBedrms average number of bedrooms per household + - Population block group population + - AveOccup average number of household members + - Latitude block group latitude + - Longitude block group longitude - :Missing Attribute Values: None +:Missing Attribute Values: None This dataset was obtained from the StatLib repository. https://www.dcc.fc.up.pt/~ltorgo/Regression/cal_housing.html diff --git a/sklearn/datasets/descr/covtype.rst b/sklearn/datasets/descr/covtype.rst index 0090b8e4a6b7d..f4b752ade17a7 100644 --- a/sklearn/datasets/descr/covtype.rst +++ b/sklearn/datasets/descr/covtype.rst @@ -14,12 +14,12 @@ while others are discrete or continuous measurements. **Data Set Characteristics:** - ================= ============ - Classes 7 - Samples total 581012 - Dimensionality 54 - Features int - ================= ============ +================= ============ +Classes 7 +Samples total 581012 +Dimensionality 54 +Features int +================= ============ :func:`sklearn.datasets.fetch_covtype` will load the covertype dataset; it returns a dictionary-like 'Bunch' object diff --git a/sklearn/datasets/descr/diabetes.rst b/sklearn/datasets/descr/diabetes.rst index 173d9561bf511..b977c36cf29a0 100644 --- a/sklearn/datasets/descr/diabetes.rst +++ b/sklearn/datasets/descr/diabetes.rst @@ -10,23 +10,23 @@ quantitative measure of disease progression one year after baseline. 
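A short sketch of the dictionary-like 'Bunch' interface mentioned in the
fetcher descriptions above, assuming the standard
``fetch_california_housing`` helper (the first call downloads and caches the
data)::

    from sklearn.datasets import fetch_california_housing

    housing = fetch_california_housing()
    print(housing.data.shape)     # (20640, 8)
    print(housing.feature_names)  # ['MedInc', 'HouseAge', 'AveRooms', ...]
    print(housing.target[:3])     # target values for the first rows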
**Data Set Characteristics:** - :Number of Instances: 442 - - :Number of Attributes: First 10 columns are numeric predictive values - - :Target: Column 11 is a quantitative measure of disease progression one year after baseline - - :Attribute Information: - - age age in years - - sex - - bmi body mass index - - bp average blood pressure - - s1 tc, total serum cholesterol - - s2 ldl, low-density lipoproteins - - s3 hdl, high-density lipoproteins - - s4 tch, total cholesterol / HDL - - s5 ltg, possibly log of serum triglycerides level - - s6 glu, blood sugar level +:Number of Instances: 442 + +:Number of Attributes: First 10 columns are numeric predictive values + +:Target: Column 11 is a quantitative measure of disease progression one year after baseline + +:Attribute Information: + - age age in years + - sex + - bmi body mass index + - bp average blood pressure + - s1 tc, total serum cholesterol + - s2 ldl, low-density lipoproteins + - s3 hdl, high-density lipoproteins + - s4 tch, total cholesterol / HDL + - s5 ltg, possibly log of serum triglycerides level + - s6 glu, blood sugar level Note: Each of these 10 feature variables have been mean centered and scaled by the standard deviation times the square root of `n_samples` (i.e. the sum of squares of each column totals 1). diff --git a/sklearn/datasets/descr/digits.rst b/sklearn/datasets/descr/digits.rst index 40d819e92b7ab..3b07233721d69 100644 --- a/sklearn/datasets/descr/digits.rst +++ b/sklearn/datasets/descr/digits.rst @@ -5,12 +5,12 @@ Optical recognition of handwritten digits dataset **Data Set Characteristics:** - :Number of Instances: 1797 - :Number of Attributes: 64 - :Attribute Information: 8x8 image of integer pixels in the range 0..16. - :Missing Attribute Values: None - :Creator: E. Alpaydin (alpaydin '@' boun.edu.tr) - :Date: July; 1998 +:Number of Instances: 1797 +:Number of Attributes: 64 +:Attribute Information: 8x8 image of integer pixels in the range 0..16. +:Missing Attribute Values: None +:Creator: E. Alpaydin (alpaydin '@' boun.edu.tr) +:Date: July; 1998 This is a copy of the test set of the UCI ML hand-written digits datasets https://archive.ics.uci.edu/ml/datasets/Optical+Recognition+of+Handwritten+Digits @@ -47,4 +47,4 @@ L. Wilson, NIST Form-Based Handprint Recognition System, NISTIR 5469, - Claudio Gentile. A New Approximate Maximal Margin Classification Algorithm. NIPS. 2000. 
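The normalization noted in the diabetes description above can be checked
directly; a minimal sketch assuming the standard ``load_diabetes`` loader::

    import numpy as np
    from sklearn.datasets import load_diabetes

    X, y = load_diabetes(return_X_y=True)
    # Each feature column is mean centered ...
    print(np.allclose(X.mean(axis=0), 0.0, atol=1e-12))
    # ... and scaled so that its sum of squares equals 1.
    print(np.allclose((X ** 2).sum(axis=0), 1.0))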
-|details-end| \ No newline at end of file +|details-end| diff --git a/sklearn/datasets/descr/iris.rst b/sklearn/datasets/descr/iris.rst index 02236dcb1c19f..771c92faa9899 100644 --- a/sklearn/datasets/descr/iris.rst +++ b/sklearn/datasets/descr/iris.rst @@ -5,34 +5,34 @@ Iris plants dataset **Data Set Characteristics:** - :Number of Instances: 150 (50 in each of three classes) - :Number of Attributes: 4 numeric, predictive attributes and the class - :Attribute Information: - - sepal length in cm - - sepal width in cm - - petal length in cm - - petal width in cm - - class: - - Iris-Setosa - - Iris-Versicolour - - Iris-Virginica - - :Summary Statistics: +:Number of Instances: 150 (50 in each of three classes) +:Number of Attributes: 4 numeric, predictive attributes and the class +:Attribute Information: + - sepal length in cm + - sepal width in cm + - petal length in cm + - petal width in cm + - class: + - Iris-Setosa + - Iris-Versicolour + - Iris-Virginica - ============== ==== ==== ======= ===== ==================== - Min Max Mean SD Class Correlation - ============== ==== ==== ======= ===== ==================== - sepal length: 4.3 7.9 5.84 0.83 0.7826 - sepal width: 2.0 4.4 3.05 0.43 -0.4194 - petal length: 1.0 6.9 3.76 1.76 0.9490 (high!) - petal width: 0.1 2.5 1.20 0.76 0.9565 (high!) - ============== ==== ==== ======= ===== ==================== +:Summary Statistics: - :Missing Attribute Values: None - :Class Distribution: 33.3% for each of 3 classes. - :Creator: R.A. Fisher - :Donor: Michael Marshall (MARSHALL%PLU@io.arc.nasa.gov) - :Date: July, 1988 +============== ==== ==== ======= ===== ==================== + Min Max Mean SD Class Correlation +============== ==== ==== ======= ===== ==================== +sepal length: 4.3 7.9 5.84 0.83 0.7826 +sepal width: 2.0 4.4 3.05 0.43 -0.4194 +petal length: 1.0 6.9 3.76 1.76 0.9490 (high!) +petal width: 0.1 2.5 1.20 0.76 0.9565 (high!) +============== ==== ==== ======= ===== ==================== + +:Missing Attribute Values: None +:Class Distribution: 33.3% for each of 3 classes. +:Creator: R.A. Fisher +:Donor: Michael Marshall (MARSHALL%PLU@io.arc.nasa.gov) +:Date: July, 1988 The famous Iris database, first used by Sir R.A. Fisher. The dataset is taken from Fisher's paper. Note that it's the same as in R, but not as in the UCI @@ -64,4 +64,4 @@ latter are NOT linearly separable from each other. conceptual clustering system finds 3 classes in the data. - Many, many more ... -|details-end| \ No newline at end of file +|details-end| diff --git a/sklearn/datasets/descr/kddcup99.rst b/sklearn/datasets/descr/kddcup99.rst index d53a7c878dd17..fe8a0c8f4168c 100644 --- a/sklearn/datasets/descr/kddcup99.rst +++ b/sklearn/datasets/descr/kddcup99.rst @@ -30,50 +30,50 @@ We thus transform the KDD Data set into two different data sets: SA and SF. * http and smtp are two subsets of SF corresponding with third feature equal to 'http' (resp. to 'smtp'). -General KDD structure : - - ================ ========================================== - Samples total 4898431 - Dimensionality 41 - Features discrete (int) or continuous (float) - Targets str, 'normal.' or name of the anomaly type - ================ ========================================== - - SA structure : - - ================ ========================================== - Samples total 976158 - Dimensionality 41 - Features discrete (int) or continuous (float) - Targets str, 'normal.' 
or name of the anomaly type - ================ ========================================== - - SF structure : - - ================ ========================================== - Samples total 699691 - Dimensionality 4 - Features discrete (int) or continuous (float) - Targets str, 'normal.' or name of the anomaly type - ================ ========================================== - - http structure : - - ================ ========================================== - Samples total 619052 - Dimensionality 3 - Features discrete (int) or continuous (float) - Targets str, 'normal.' or name of the anomaly type - ================ ========================================== - - smtp structure : - - ================ ========================================== - Samples total 95373 - Dimensionality 3 - Features discrete (int) or continuous (float) - Targets str, 'normal.' or name of the anomaly type - ================ ========================================== +General KDD structure: + +================ ========================================== +Samples total 4898431 +Dimensionality 41 +Features discrete (int) or continuous (float) +Targets str, 'normal.' or name of the anomaly type +================ ========================================== + +SA structure: + +================ ========================================== +Samples total 976158 +Dimensionality 41 +Features discrete (int) or continuous (float) +Targets str, 'normal.' or name of the anomaly type +================ ========================================== + +SF structure: + +================ ========================================== +Samples total 699691 +Dimensionality 4 +Features discrete (int) or continuous (float) +Targets str, 'normal.' or name of the anomaly type +================ ========================================== + +http structure: + +================ ========================================== +Samples total 619052 +Dimensionality 3 +Features discrete (int) or continuous (float) +Targets str, 'normal.' or name of the anomaly type +================ ========================================== + +smtp structure: + +================ ========================================== +Samples total 95373 +Dimensionality 3 +Features discrete (int) or continuous (float) +Targets str, 'normal.' or name of the anomaly type +================ ========================================== :func:`sklearn.datasets.fetch_kddcup99` will load the kddcup99 dataset; it returns a dictionary-like object with the feature matrix in the ``data`` member diff --git a/sklearn/datasets/descr/lfw.rst b/sklearn/datasets/descr/lfw.rst index 8105d7d6d633a..f7d80558be373 100644 --- a/sklearn/datasets/descr/lfw.rst +++ b/sklearn/datasets/descr/lfw.rst @@ -6,7 +6,7 @@ The Labeled Faces in the Wild face recognition dataset This dataset is a collection of JPEG pictures of famous people collected over the internet, all details are available on the official website: - http://vis-www.cs.umass.edu/lfw/ +http://vis-www.cs.umass.edu/lfw/ Each picture is centered on a single face. The typical task is called Face Verification: given a pair of two pictures, a binary classifier @@ -25,12 +25,12 @@ face detector from various online websites. 
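As a sketch of loading the faces data for the recognition-style task
described above, assuming the standard ``fetch_lfw_people`` helper (the first
call downloads a large archive)::

    from sklearn.datasets import fetch_lfw_people

    lfw = fetch_lfw_people(min_faces_per_person=70, resize=0.4)
    print(lfw.images.shape)      # (n_samples, height, width) grey-level faces
    print(lfw.target_names[:3])  # names of the people pictured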
**Data Set Characteristics:** - ================= ======================= - Classes 5749 - Samples total 13233 - Dimensionality 5828 - Features real, between 0 and 255 - ================= ======================= +================= ======================= +Classes 5749 +Samples total 13233 +Dimensionality 5828 +Features real, between 0 and 255 +================= ======================= |details-start| **Usage** diff --git a/sklearn/datasets/descr/linnerud.rst b/sklearn/datasets/descr/linnerud.rst index 81c970bb6e3e6..108611a4722ad 100644 --- a/sklearn/datasets/descr/linnerud.rst +++ b/sklearn/datasets/descr/linnerud.rst @@ -5,9 +5,9 @@ Linnerrud dataset **Data Set Characteristics:** - :Number of Instances: 20 - :Number of Attributes: 3 - :Missing Attribute Values: None +:Number of Instances: 20 +:Number of Attributes: 3 +:Missing Attribute Values: None The Linnerud dataset is a multi-output regression dataset. It consists of three exercise (data) and three physiological (target) variables collected from @@ -25,4 +25,4 @@ twenty middle-aged men in a fitness club: * Tenenhaus, M. (1998). La regression PLS: theorie et pratique. Paris: Editions Technic. -|details-end| \ No newline at end of file +|details-end| diff --git a/sklearn/datasets/descr/olivetti_faces.rst b/sklearn/datasets/descr/olivetti_faces.rst index 4feadcc4b2fb1..060c866213e8e 100644 --- a/sklearn/datasets/descr/olivetti_faces.rst +++ b/sklearn/datasets/descr/olivetti_faces.rst @@ -3,7 +3,7 @@ The Olivetti faces dataset -------------------------- -`This dataset contains a set of face images`_ taken between April 1992 and +`This dataset contains a set of face images`_ taken between April 1992 and April 1994 at AT&T Laboratories Cambridge. The :func:`sklearn.datasets.fetch_olivetti_faces` function is the data fetching / caching function that downloads the data @@ -17,20 +17,20 @@ As described on the original website: subjects, the images were taken at different times, varying the lighting, facial expressions (open / closed eyes, smiling / not smiling) and facial details (glasses / no glasses). All the images were taken against a dark - homogeneous background with the subjects in an upright, frontal position + homogeneous background with the subjects in an upright, frontal position (with tolerance for some side movement). **Data Set Characteristics:** - ================= ===================== - Classes 40 - Samples total 400 - Dimensionality 4096 - Features real, between 0 and 1 - ================= ===================== +================= ===================== +Classes 40 +Samples total 400 +Dimensionality 4096 +Features real, between 0 and 1 +================= ===================== -The image is quantized to 256 grey levels and stored as unsigned 8-bit -integers; the loader will convert these to floating point values on the +The image is quantized to 256 grey levels and stored as unsigned 8-bit +integers; the loader will convert these to floating point values on the interval [0, 1], which are easier to work with for many algorithms. The "target" for this database is an integer from 0 to 39 indicating the diff --git a/sklearn/datasets/descr/rcv1.rst b/sklearn/datasets/descr/rcv1.rst index afaadbfb45afc..7cf3730a17554 100644 --- a/sklearn/datasets/descr/rcv1.rst +++ b/sklearn/datasets/descr/rcv1.rst @@ -3,20 +3,20 @@ RCV1 dataset ------------ -Reuters Corpus Volume I (RCV1) is an archive of over 800,000 manually -categorized newswire stories made available by Reuters, Ltd. 
for research +Reuters Corpus Volume I (RCV1) is an archive of over 800,000 manually +categorized newswire stories made available by Reuters, Ltd. for research purposes. The dataset is extensively described in [1]_. **Data Set Characteristics:** - ============== ===================== - Classes 103 - Samples total 804414 - Dimensionality 47236 - Features real, between 0 and 1 - ============== ===================== +============== ===================== +Classes 103 +Samples total 804414 +Dimensionality 47236 +Features real, between 0 and 1 +============== ===================== -:func:`sklearn.datasets.fetch_rcv1` will load the following +:func:`sklearn.datasets.fetch_rcv1` will load the following version: RCV1-v2, vectors, full sets, topics multilabels:: >>> from sklearn.datasets import fetch_rcv1 @@ -28,32 +28,32 @@ It returns a dictionary-like object, with the following attributes: The feature matrix is a scipy CSR sparse matrix, with 804414 samples and 47236 features. Non-zero values contains cosine-normalized, log TF-IDF vectors. A nearly chronological split is proposed in [1]_: The first 23149 samples are -the training set. The last 781265 samples are the testing set. This follows -the official LYRL2004 chronological split. The array has 0.16% of non zero +the training set. The last 781265 samples are the testing set. This follows +the official LYRL2004 chronological split. The array has 0.16% of non zero values:: >>> rcv1.data.shape (804414, 47236) ``target``: -The target values are stored in a scipy CSR sparse matrix, with 804414 samples -and 103 categories. Each sample has a value of 1 in its categories, and 0 in +The target values are stored in a scipy CSR sparse matrix, with 804414 samples +and 103 categories. Each sample has a value of 1 in its categories, and 0 in others. The array has 3.15% of non zero values:: >>> rcv1.target.shape (804414, 103) ``sample_id``: -Each sample can be identified by its ID, ranging (with gaps) from 2286 +Each sample can be identified by its ID, ranging (with gaps) from 2286 to 810596:: >>> rcv1.sample_id[:3] array([2286, 2287, 2288], dtype=uint32) ``target_names``: -The target values are the topics of each sample. Each sample belongs to at -least one topic, and to up to 17 topics. There are 103 topics, each -represented by a string. Their corpus frequencies span five orders of +The target values are the topics of each sample. Each sample belongs to at +least one topic, and to up to 17 topics. There are 103 topics, each +represented by a string. Their corpus frequencies span five orders of magnitude, from 5 occurrences for 'GMIL', to 381327 for 'CCAT':: >>> rcv1.target_names[:3].tolist() # doctest: +SKIP @@ -67,6 +67,6 @@ The compressed size is about 656 MB. .. topic:: References - .. [1] Lewis, D. D., Yang, Y., Rose, T. G., & Li, F. (2004). - RCV1: A new benchmark collection for text categorization research. + .. [1] Lewis, D. D., Yang, Y., Rose, T. G., & Li, F. (2004). + RCV1: A new benchmark collection for text categorization research. The Journal of Machine Learning Research, 5, 361-397. diff --git a/sklearn/datasets/descr/twenty_newsgroups.rst b/sklearn/datasets/descr/twenty_newsgroups.rst index 669e158244134..d1a049869dd7f 100644 --- a/sklearn/datasets/descr/twenty_newsgroups.rst +++ b/sklearn/datasets/descr/twenty_newsgroups.rst @@ -20,12 +20,12 @@ extractor. 
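The multilabel structure of the RCV1 targets described above can be inspected
directly; a sketch assuming ``fetch_rcv1`` (the first call downloads the
~656 MB archive)::

    from sklearn.datasets import fetch_rcv1

    rcv1 = fetch_rcv1()
    # Each row of the sparse `target` matrix marks the topics of one sample.
    n_topics = rcv1.target.sum(axis=1)
    print(n_topics.min(), n_topics.max())  # at least 1 topic, at most 17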
**Data Set Characteristics:** - ================= ========== - Classes 20 - Samples total 18846 - Dimensionality 1 - Features text - ================= ========== +================= ========== +Classes 20 +Samples total 18846 +Dimensionality 1 +Features text +================= ========== |details-start| **Usage** diff --git a/sklearn/datasets/descr/wine_data.rst b/sklearn/datasets/descr/wine_data.rst index e20efea9ba719..0325af6233c17 100644 --- a/sklearn/datasets/descr/wine_data.rst +++ b/sklearn/datasets/descr/wine_data.rst @@ -5,53 +5,52 @@ Wine recognition dataset **Data Set Characteristics:** - :Number of Instances: 178 - :Number of Attributes: 13 numeric, predictive attributes and the class - :Attribute Information: - - Alcohol - - Malic acid - - Ash - - Alcalinity of ash - - Magnesium - - Total phenols - - Flavanoids - - Nonflavanoid phenols - - Proanthocyanins - - Color intensity - - Hue - - OD280/OD315 of diluted wines - - Proline - +:Number of Instances: 178 +:Number of Attributes: 13 numeric, predictive attributes and the class +:Attribute Information: + - Alcohol + - Malic acid + - Ash + - Alcalinity of ash + - Magnesium + - Total phenols + - Flavanoids + - Nonflavanoid phenols + - Proanthocyanins + - Color intensity + - Hue + - OD280/OD315 of diluted wines + - Proline - class: - - class_0 - - class_1 - - class_2 - - :Summary Statistics: - - ============================= ==== ===== ======= ===== - Min Max Mean SD - ============================= ==== ===== ======= ===== - Alcohol: 11.0 14.8 13.0 0.8 - Malic Acid: 0.74 5.80 2.34 1.12 - Ash: 1.36 3.23 2.36 0.27 - Alcalinity of Ash: 10.6 30.0 19.5 3.3 - Magnesium: 70.0 162.0 99.7 14.3 - Total Phenols: 0.98 3.88 2.29 0.63 - Flavanoids: 0.34 5.08 2.03 1.00 - Nonflavanoid Phenols: 0.13 0.66 0.36 0.12 - Proanthocyanins: 0.41 3.58 1.59 0.57 - Colour Intensity: 1.3 13.0 5.1 2.3 - Hue: 0.48 1.71 0.96 0.23 - OD280/OD315 of diluted wines: 1.27 4.00 2.61 0.71 - Proline: 278 1680 746 315 - ============================= ==== ===== ======= ===== - - :Missing Attribute Values: None - :Class Distribution: class_0 (59), class_1 (71), class_2 (48) - :Creator: R.A. Fisher - :Donor: Michael Marshall (MARSHALL%PLU@io.arc.nasa.gov) - :Date: July, 1988 + - class_0 + - class_1 + - class_2 + +:Summary Statistics: + +============================= ==== ===== ======= ===== + Min Max Mean SD +============================= ==== ===== ======= ===== +Alcohol: 11.0 14.8 13.0 0.8 +Malic Acid: 0.74 5.80 2.34 1.12 +Ash: 1.36 3.23 2.36 0.27 +Alcalinity of Ash: 10.6 30.0 19.5 3.3 +Magnesium: 70.0 162.0 99.7 14.3 +Total Phenols: 0.98 3.88 2.29 0.63 +Flavanoids: 0.34 5.08 2.03 1.00 +Nonflavanoid Phenols: 0.13 0.66 0.36 0.12 +Proanthocyanins: 0.41 3.58 1.59 0.57 +Colour Intensity: 1.3 13.0 5.1 2.3 +Hue: 0.48 1.71 0.96 0.23 +OD280/OD315 of diluted wines: 1.27 4.00 2.61 0.71 +Proline: 278 1680 746 315 +============================= ==== ===== ======= ===== + +:Missing Attribute Values: None +:Class Distribution: class_0 (59), class_1 (71), class_2 (48) +:Creator: R.A. Fisher +:Donor: Michael Marshall (MARSHALL%PLU@io.arc.nasa.gov) +:Date: July, 1988 This is a copy of UCI ML Wine recognition datasets. https://archive.ics.uci.edu/ml/machine-learning-databases/wine/wine.data @@ -61,10 +60,10 @@ region in Italy by three different cultivators. There are thirteen different measurements taken for different constituents found in the three types of wine. -Original Owners: +Original Owners: -Forina, M. 
et al, PARVUS - -An Extendible Package for Data Exploration, Classification and Correlation. +Forina, M. et al, PARVUS - +An Extendible Package for Data Exploration, Classification and Correlation. Institute of Pharmaceutical and Food Analysis and Technologies, Via Brigata Salerno, 16147 Genoa, Italy. @@ -72,28 +71,28 @@ Citation: Lichman, M. (2013). UCI Machine Learning Repository [https://archive.ics.uci.edu/ml]. Irvine, CA: University of California, -School of Information and Computer Science. +School of Information and Computer Science. |details-start| **References** |details-split| -(1) S. Aeberhard, D. Coomans and O. de Vel, -Comparison of Classifiers in High Dimensional Settings, -Tech. Rep. no. 92-02, (1992), Dept. of Computer Science and Dept. of -Mathematics and Statistics, James Cook University of North Queensland. -(Also submitted to Technometrics). - -The data was used with many others for comparing various -classifiers. The classes are separable, though only RDA -has achieved 100% correct classification. -(RDA : 100%, QDA 99.4%, LDA 98.9%, 1NN 96.1% (z-transformed data)) -(All results using the leave-one-out technique) - -(2) S. Aeberhard, D. Coomans and O. de Vel, -"THE CLASSIFICATION PERFORMANCE OF RDA" -Tech. Rep. no. 92-01, (1992), Dept. of Computer Science and Dept. of -Mathematics and Statistics, James Cook University of North Queensland. +(1) S. Aeberhard, D. Coomans and O. de Vel, +Comparison of Classifiers in High Dimensional Settings, +Tech. Rep. no. 92-02, (1992), Dept. of Computer Science and Dept. of +Mathematics and Statistics, James Cook University of North Queensland. +(Also submitted to Technometrics). + +The data was used with many others for comparing various +classifiers. The classes are separable, though only RDA +has achieved 100% correct classification. +(RDA : 100%, QDA 99.4%, LDA 98.9%, 1NN 96.1% (z-transformed data)) +(All results using the leave-one-out technique) + +(2) S. Aeberhard, D. Coomans and O. de Vel, +"THE CLASSIFICATION PERFORMANCE OF RDA" +Tech. Rep. no. 92-01, (1992), Dept. of Computer Science and Dept. of +Mathematics and Statistics, James Cook University of North Queensland. (Also submitted to Journal of Chemometrics). -|details-end| \ No newline at end of file +|details-end| From 93e199d517aca98c0eeb222ebfaa2ac99b368e0a Mon Sep 17 00:00:00 2001 From: Yao Xiao <108576690+Charlie-XIAO@users.noreply.github.com> Date: Mon, 15 Jan 2024 01:29:39 +0800 Subject: [PATCH 050/554] FIX improve error message in `check_array` when getting a `Series` and expecting a 2D container (#28090) Co-authored-by: Stanislas Furrer --- doc/whats_new/v1.4.rst | 7 ++++++ sklearn/preprocessing/tests/test_encoders.py | 2 +- sklearn/utils/tests/test_validation.py | 15 ++++++++++++ sklearn/utils/validation.py | 25 +++++++++++++++----- 4 files changed, 42 insertions(+), 7 deletions(-) diff --git a/doc/whats_new/v1.4.rst b/doc/whats_new/v1.4.rst index 25a3f600c5446..c0261a51384c6 100644 --- a/doc/whats_new/v1.4.rst +++ b/doc/whats_new/v1.4.rst @@ -19,6 +19,13 @@ Changelog is read-only, e.g., a `numpy.memmap` instance. :pr:`28111` by :user:`Yao Xiao `. +:mod:`sklearn.utils` +.................... + +- |Fix| Fix the function :func:`~utils.check_array` to output the right error message + when the input is Series instead of a DataFrame. + :pr:`28090` by :user:`Stan Furrer ` and :user:`Yao Xiao `. + .. 
_changes_1_4: diff --git a/sklearn/preprocessing/tests/test_encoders.py b/sklearn/preprocessing/tests/test_encoders.py index df7e02355db3d..ee5e1152fc710 100644 --- a/sklearn/preprocessing/tests/test_encoders.py +++ b/sklearn/preprocessing/tests/test_encoders.py @@ -387,7 +387,7 @@ def test_X_is_not_1D_pandas(method): X = pd.Series([6, 3, 4, 6]) oh = OneHotEncoder() - msg = "Expected 2D array, got 1D array instead" + msg = f"Expected a 2-dimensional container but got {type(X)} instead." with pytest.raises(ValueError, match=msg): getattr(oh, method)(X) diff --git a/sklearn/utils/tests/test_validation.py b/sklearn/utils/tests/test_validation.py index b627c55a7ef12..ee26772d8731b 100644 --- a/sklearn/utils/tests/test_validation.py +++ b/sklearn/utils/tests/test_validation.py @@ -305,6 +305,21 @@ def test_check_array_force_all_finite_object_unsafe_casting( check_array(X, dtype=int, force_all_finite=force_all_finite) +def test_check_array_series_err_msg(): + """ + Check that we raise a proper error message when passing a Series and we expect a + 2-dimensional container. + + Non-regression test for: + https://github.com/scikit-learn/scikit-learn/issues/27498 + """ + pd = pytest.importorskip("pandas") + ser = pd.Series([1, 2, 3]) + msg = f"Expected a 2-dimensional container but got {type(ser)} instead." + with pytest.raises(ValueError, match=msg): + check_array(ser, ensure_2d=True) + + @ignore_warnings def test_check_array(): # accept_sparse == False diff --git a/sklearn/utils/validation.py b/sklearn/utils/validation.py index a7553993f7ded..e58fb41501c96 100644 --- a/sklearn/utils/validation.py +++ b/sklearn/utils/validation.py @@ -802,6 +802,8 @@ def check_array( # DataFrame), and store them. If not, store None. dtypes_orig = None pandas_requires_conversion = False + # track if we have a Series-like object to raise a better error message + type_if_series = None if hasattr(array, "dtypes") and hasattr(array.dtypes, "__array__"): # throw warning if columns are sparse. If all columns are sparse, then # array.sparse exists and sparsity will be preserved (later). @@ -831,6 +833,7 @@ def is_sparse(dtype): array, "dtype" ): # array is a pandas series + type_if_series = type(array) pandas_requires_conversion = _pandas_dtype_needs_early_conversion(array.dtype) if isinstance(array.dtype, np.dtype): dtype_orig = array.dtype @@ -962,12 +965,22 @@ def is_sparse(dtype): ) # If input is 1D raise error if array.ndim == 1: - raise ValueError( - "Expected 2D array, got 1D array instead:\narray={}.\n" - "Reshape your data either using array.reshape(-1, 1) if " - "your data has a single feature or array.reshape(1, -1) " - "if it contains a single sample.".format(array) - ) + # If input is a Series-like object (eg. pandas Series or polars Series) + if type_if_series is not None: + msg = ( + f"Expected a 2-dimensional container but got {type_if_series} " + "instead. Pass a DataFrame containing a single row (i.e. " + "single sample) or a single column (i.e. single feature) " + "instead." + ) + else: + msg = ( + f"Expected 2D array, got 1D array instead:\narray={array}.\n" + "Reshape your data either using array.reshape(-1, 1) if " + "your data has a single feature or array.reshape(1, -1) " + "if it contains a single sample." 
+ ) + raise ValueError(msg) if dtype_numeric and hasattr(array.dtype, "kind") and array.dtype.kind in "USV": raise ValueError( From be07298637c384f934541ad10e4d9c1fb1d4cacb Mon Sep 17 00:00:00 2001 From: Yao Xiao <108576690+Charlie-XIAO@users.noreply.github.com> Date: Mon, 15 Jan 2024 03:39:37 +0800 Subject: [PATCH 051/554] DOC fix the confusing ordering of `whats_new/v1.5.rst` (#28120) --- doc/whats_new/v1.5.rst | 31 ++++++++++++++++--------------- 1 file changed, 16 insertions(+), 15 deletions(-) diff --git a/doc/whats_new/v1.5.rst b/doc/whats_new/v1.5.rst index 0e3e37caeeb05..6f4778d9784e8 100644 --- a/doc/whats_new/v1.5.rst +++ b/doc/whats_new/v1.5.rst @@ -25,26 +25,18 @@ Changelog :pr:`123456` by :user:`Joe Bloggs `. where 123455 is the *pull request* number, not the issue number. -:mod:`sklearn.impute` -..................... -- |Enhancement| :class:`impute.SimpleImputer` now supports custom strategies - by passing a function in place of a strategy name. - :pr:`28053` by :user:`Mark Elliot `. - -Code and Documentation Contributors ------------------------------------ - -Thanks to everyone who has contributed to the maintenance and improvement of -the project since version 1.4, including: - -TODO: update at the time of the release. - :mod:`sklearn.compose` ...................... - |Feature| A fitted :class:`compose.ColumnTransformer` now implements `__getitem__` which returns the fitted transformers by name. :pr:`27990` by `Thomas Fan`_. +:mod:`sklearn.impute` +..................... + +- |Enhancement| :class:`impute.SimpleImputer` now supports custom strategies + by passing a function in place of a strategy name. + :pr:`28053` by :user:`Mark Elliot `. :mod:`sklearn.metrics` ...................... @@ -56,4 +48,13 @@ TODO: update at the time of the release. and `from_predictions` in :class:`~metrics.RocCurveDisplay`, :class:`~metrics.PrecisionRecallDisplay`, :class:`~metrics.DetCurveDisplay`, :class:`~calibration.CalibrationDisplay`. - :pr:`28051` by :user:`Pierre de Fréminville ` + :pr:`28051` by :user:`Pierre de Fréminville `. + + +Code and Documentation Contributors +----------------------------------- + +Thanks to everyone who has contributed to the maintenance and improvement of +the project since version 1.4, including: + +TODO: update at the time of the release. From 6f9d565064cecac7d5242e7adcbc9b4524668f63 Mon Sep 17 00:00:00 2001 From: Yao Xiao <108576690+Charlie-XIAO@users.noreply.github.com> Date: Mon, 15 Jan 2024 09:15:46 +0800 Subject: [PATCH 052/554] FIX `KNeighborsClassifier` raise when all neighbors of some sample have zero weights (#26410) Co-authored-by: Thomas J. Fan --- doc/whats_new/v1.4.rst | 9 ++++++ sklearn/neighbors/_classification.py | 14 +++++++- sklearn/neighbors/tests/test_neighbors.py | 27 ++++++++++++++++ sklearn/utils/arrayfuncs.pyx | 39 ++++++++++++++++++++++- sklearn/utils/tests/test_arrayfuncs.py | 14 +++++++- 5 files changed, 100 insertions(+), 3 deletions(-) diff --git a/doc/whats_new/v1.4.rst b/doc/whats_new/v1.4.rst index c0261a51384c6..ba0facf7cdd13 100644 --- a/doc/whats_new/v1.4.rst +++ b/doc/whats_new/v1.4.rst @@ -19,6 +19,15 @@ Changelog is read-only, e.g., a `numpy.memmap` instance. :pr:`28111` by :user:`Yao Xiao `. +:mod:`sklearn.neighbors` +........................ + +- |Fix| :meth:`neighbors.KNeighborsClassifier.predict` and + :meth:`neighbors.KNeighborsClassifier.predict_proba` now raises an error when the + weights of all neighbors of some sample are zero. This can happen when `weights` + is a user-defined function. 
+ :pr:`26410` by :user:`Yao Xiao `. + :mod:`sklearn.utils` .................... diff --git a/sklearn/neighbors/_classification.py b/sklearn/neighbors/_classification.py index e921ec3a9d165..26ffa273d0a60 100644 --- a/sklearn/neighbors/_classification.py +++ b/sklearn/neighbors/_classification.py @@ -20,6 +20,7 @@ RadiusNeighborsClassMode, ) from ..utils._param_validation import StrOptions +from ..utils.arrayfuncs import _all_with_any_reduction_axis_1 from ..utils.extmath import weighted_mode from ..utils.fixes import _mode from ..utils.validation import _is_arraylike, _num_samples, check_is_fitted @@ -281,6 +282,12 @@ def predict(self, X): n_outputs = len(classes_) n_queries = _num_samples(X) weights = _get_weights(neigh_dist, self.weights) + if weights is not None and _all_with_any_reduction_axis_1(weights, value=0): + raise ValueError( + "All neighbors of some sample is getting zero weights. " + "Please modify 'weights' to avoid this case if you are " + "using a user-defined function." + ) y_pred = np.empty((n_queries, n_outputs), dtype=classes_[0].dtype) for k, classes_k in enumerate(classes_): @@ -372,6 +379,12 @@ def predict_proba(self, X): weights = _get_weights(neigh_dist, self.weights) if weights is None: weights = np.ones_like(neigh_ind) + elif _all_with_any_reduction_axis_1(weights, value=0): + raise ValueError( + "All neighbors of some sample is getting zero weights. " + "Please modify 'weights' to avoid this case if you are " + "using a user-defined function." + ) all_rows = np.arange(n_queries) probabilities = [] @@ -385,7 +398,6 @@ def predict_proba(self, X): # normalize 'votes' into real [0,1] probabilities normalizer = proba_k.sum(axis=1)[:, np.newaxis] - normalizer[normalizer == 0.0] = 1.0 proba_k /= normalizer probabilities.append(proba_k) diff --git a/sklearn/neighbors/tests/test_neighbors.py b/sklearn/neighbors/tests/test_neighbors.py index 2be0237cd5f7e..d3fc71478e6f5 100644 --- a/sklearn/neighbors/tests/test_neighbors.py +++ b/sklearn/neighbors/tests/test_neighbors.py @@ -2343,3 +2343,30 @@ def test_nearest_neighbours_works_with_p_less_than_1(): y = neigh.kneighbors(X[0].reshape(1, -1), return_distance=False) assert_allclose(y[0], [0, 1, 2]) + + +def test_KNeighborsClassifier_raise_on_all_zero_weights(): + """Check that `predict` and `predict_proba` raises on sample of all zeros weights. + + Related to Issue #25854. + """ + X = [[0, 1], [1, 2], [2, 3], [3, 4]] + y = [0, 0, 1, 1] + + def _weights(dist): + return np.vectorize(lambda x: 0 if x > 0.5 else 1)(dist) + + est = neighbors.KNeighborsClassifier(n_neighbors=3, weights=_weights) + est.fit(X, y) + + msg = ( + "All neighbors of some sample is getting zero weights. " + "Please modify 'weights' to avoid this case if you are " + "using a user-defined function." + ) + + with pytest.raises(ValueError, match=msg): + est.predict([[1.1, 1.1]]) + + with pytest.raises(ValueError, match=msg): + est.predict_proba([[1.1, 1.1]]) diff --git a/sklearn/utils/arrayfuncs.pyx b/sklearn/utils/arrayfuncs.pyx index d060c7bada92a..59dc43084b3d9 100644 --- a/sklearn/utils/arrayfuncs.pyx +++ b/sklearn/utils/arrayfuncs.pyx @@ -10,8 +10,16 @@ from libc.float cimport DBL_MAX, FLT_MAX from ._cython_blas cimport _copy, _rotg, _rot +ctypedef fused real_numeric: + short + int + long + float + double + + def min_pos(const floating[:] X): - """Find the minimum value of an array over positive values + """Find the minimum value of an array over positive values. 
Returns the maximum representable value of the input dtype
    if none of the values are positive.

@@ -24,6 +32,35 @@ def min_pos(const floating[:] X):
     return min_val
 
 
+def _all_with_any_reduction_axis_1(real_numeric[:, :] array, real_numeric value):
+    """Check that all values are equal to `value` along a specific axis.
+
+    It is equivalent to `np.any(np.all(X == value, axis=1))`, but it avoids
+    materializing the temporary boolean matrices in memory.
+
+    Parameters
+    ----------
+    array: array-like
+        The array to be checked.
+    value: short, int, long, float, or double
+        The value to use for the comparison.
+
+    Returns
+    -------
+    any_all_equal: bool
+        Whether or not any row contains all values equal to `value`.
+    """
+    cdef Py_ssize_t i, j
+
+    for i in range(array.shape[0]):
+        for j in range(array.shape[1]):
+            if array[i, j] != value:
+                break
+        else:  # no break
+            return True
+    return False
+
+
 # General Cholesky Delete.
 # Remove an element from the cholesky factorization
 # m = columns
diff --git a/sklearn/utils/tests/test_arrayfuncs.py b/sklearn/utils/tests/test_arrayfuncs.py
index b0a02e13d1639..1da4dcbc088b5 100644
--- a/sklearn/utils/tests/test_arrayfuncs.py
+++ b/sklearn/utils/tests/test_arrayfuncs.py
@@ -2,7 +2,7 @@
 import numpy as np
 import pytest
 
 from sklearn.utils._testing import assert_allclose
-from sklearn.utils.arrayfuncs import min_pos
+from sklearn.utils.arrayfuncs import _all_with_any_reduction_axis_1, min_pos
 
 
 def test_min_pos():
@@ -24,3 +24,15 @@ def test_min_pos_no_positive(dtype):
     X = np.full(100, -1.0).astype(dtype, copy=False)
 
     assert min_pos(X) == np.finfo(dtype).max
+
+
+@pytest.mark.parametrize("dtype", [np.int16, np.int32, np.float32, np.float64])
+@pytest.mark.parametrize("value", [0, 1.5, -1])
+def test_all_with_any_reduction_axis_1(dtype, value):
+    # Check that return value is False when there is no row/column equal to `value`
+    X = np.arange(12, dtype=dtype).reshape(3, 4)
+    assert not _all_with_any_reduction_axis_1(X, value=value)
+
+    # Make a row equal to `value`
+    X[1, :] = value
+    assert _all_with_any_reduction_axis_1(X, value=value)

From 4d8b6d7ea10e5a075676042681678c433b08bf8b Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Filip=20Karlo=20Do=C5=A1ilovi=C4=87?= 
Date: Mon, 15 Jan 2024 11:22:27 +0100
Subject: [PATCH 053/554] DOC Add examples section to docstring of functions
 from the base module. (#28123)

---
 sklearn/base.py | 37 +++++++++++++++++++++++++++++++++++++
 1 file changed, 37 insertions(+)

diff --git a/sklearn/base.py b/sklearn/base.py
index c48a5f2d99628..a6313947ba469 100644
--- a/sklearn/base.py
+++ b/sklearn/base.py
@@ -70,6 +70,21 @@ def clone(estimator, *, safe=True):
         results. Otherwise, *statistical clone* is returned: the clone might
         return different results from the original estimator. More details can
         be found in :ref:`randomness`.
+
+    Examples
+    --------
+    >>> from sklearn.base import clone
+    >>> from sklearn.linear_model import LogisticRegression
+    >>> X = [[-1, 0], [0, 1], [0, -1], [1, 0]]
+    >>> y = [0, 0, 1, 1]
+    >>> classifier = LogisticRegression().fit(X, y)
+    >>> cloned_classifier = clone(classifier)
+    >>> hasattr(classifier, "classes_")
+    True
+    >>> hasattr(cloned_classifier, "classes_")
+    False
+    >>> classifier is cloned_classifier
+    False
     """
     if hasattr(estimator, "__sklearn_clone__") and not inspect.isclass(estimator):
         return estimator.__sklearn_clone__()
@@ -1208,6 +1223,17 @@ def is_classifier(estimator):
     -------
     out : bool
         True if estimator is a classifier and False otherwise.
+
+    Examples
+    --------
+    >>> from sklearn.base import is_classifier
+    >>> from sklearn.svm import SVC, SVR
+    >>> classifier = SVC()
+    >>> regressor = SVR()
+    >>> is_classifier(classifier)
+    True
+    >>> is_classifier(regressor)
+    False
     """
     return getattr(estimator, "_estimator_type", None) == "classifier"
 
@@ -1224,6 +1250,17 @@ def is_regressor(estimator):
     -------
     out : bool
         True if estimator is a regressor and False otherwise.
+
+    Examples
+    --------
+    >>> from sklearn.base import is_regressor
+    >>> from sklearn.svm import SVC, SVR
+    >>> classifier = SVC()
+    >>> regressor = SVR()
+    >>> is_regressor(classifier)
+    False
+    >>> is_regressor(regressor)
+    True
     """
     return getattr(estimator, "_estimator_type", None) == "regressor"
 

From fe551922b38f47d1261dab59b98696bed058202d Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Filip=20Karlo=20Do=C5=A1ilovi=C4=87?= 
Date: Mon, 15 Jan 2024 11:26:07 +0100
Subject: [PATCH 054/554] DOC Add Examples section to docstrings to functions
 from utils.discovery module. (#28124)

---
 sklearn/utils/discovery.py | 43 ++++++++++++++++++++++++++++++++++++++
 1 file changed, 43 insertions(+)

diff --git a/sklearn/utils/discovery.py b/sklearn/utils/discovery.py
index 733fe294e3637..c1fdca3beafb2 100644
--- a/sklearn/utils/discovery.py
+++ b/sklearn/utils/discovery.py
@@ -41,6 +41,34 @@ def all_estimators(type_filter=None):
     estimators : list of tuples
         List of (name, class), where ``name`` is the class name as string
         and ``class`` is the actual type of the class.
+
+    Examples
+    --------
+    >>> from sklearn.utils.discovery import all_estimators
+    >>> estimators = all_estimators()
+    >>> type(estimators)
+    <class 'list'>
+    >>> type(estimators[0])
+    <class 'tuple'>
+    >>> estimators[:2]
+    [('ARDRegression', <class 'sklearn.linear_model._bayes.ARDRegression'>),
+     ('AdaBoostClassifier',
+      <class 'sklearn.ensemble._weight_boosting.AdaBoostClassifier'>)]
+    >>> classifiers = all_estimators(type_filter="classifier")
+    >>> classifiers[:2]
+    [('AdaBoostClassifier',
+      <class 'sklearn.ensemble._weight_boosting.AdaBoostClassifier'>),
+     ('BaggingClassifier', <class 'sklearn.ensemble._bagging.BaggingClassifier'>)]
+    >>> regressors = all_estimators(type_filter="regressor")
+    >>> regressors[:2]
+    [('ARDRegression', <class 'sklearn.linear_model._bayes.ARDRegression'>),
+     ('AdaBoostRegressor',
+      <class 'sklearn.ensemble._weight_boosting.AdaBoostRegressor'>)]
+    >>> both = all_estimators(type_filter=["classifier", "regressor"])
+    >>> both[:2]
+    [('ARDRegression', <class 'sklearn.linear_model._bayes.ARDRegression'>),
+     ('AdaBoostClassifier',
+      <class 'sklearn.ensemble._weight_boosting.AdaBoostClassifier'>)]
     """
     # lazy import to avoid circular imports from sklearn.base
    from ..base import (
@@ -140,6 +168,13 @@ def all_displays():
     displays : list of tuples
         List of (name, class), where ``name`` is the display class name as
         string and ``class`` is the actual type of the class.
+
+    Examples
+    --------
+    >>> from sklearn.utils.discovery import all_displays
+    >>> displays = all_displays()
+    >>> displays[0]
+    ('CalibrationDisplay', <class 'sklearn.calibration.CalibrationDisplay'>)
     """
     # lazy import to avoid circular imports from sklearn.base
     from ._testing import ignore_warnings
@@ -190,6 +225,14 @@ def all_functions():
     functions : list of tuples
         List of (name, function), where ``name`` is the function name as
         string and ``function`` is the actual function.
+ + Examples + -------- + >>> from sklearn.utils.discovery import all_functions + >>> functions = all_functions() + >>> name, function = functions[0] + >>> name + 'accuracy_score' """ # lazy import to avoid circular imports from sklearn.base from ._testing import ignore_warnings From fcd22cac49a0f6f40a37334f1072c5a8fadeed61 Mon Sep 17 00:00:00 2001 From: Yao Xiao <108576690+Charlie-XIAO@users.noreply.github.com> Date: Mon, 15 Jan 2024 23:12:02 +0800 Subject: [PATCH 055/554] DOC make up for errors in #26410 (#28128) --- sklearn/utils/arrayfuncs.pyx | 2 +- sklearn/utils/tests/test_arrayfuncs.py | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/sklearn/utils/arrayfuncs.pyx b/sklearn/utils/arrayfuncs.pyx index 59dc43084b3d9..b005bab896925 100644 --- a/sklearn/utils/arrayfuncs.pyx +++ b/sklearn/utils/arrayfuncs.pyx @@ -33,7 +33,7 @@ def min_pos(const floating[:] X): def _all_with_any_reduction_axis_1(real_numeric[:, :] array, real_numeric value): - """Check that all values are equal to `value` along a specific axis. + """Check whether any row contains all values equal to `value`. It is equivalent to `np.any(np.all(X == value, axis=1))`, but it avoids to materialize the temporary boolean matrices in memory. diff --git a/sklearn/utils/tests/test_arrayfuncs.py b/sklearn/utils/tests/test_arrayfuncs.py index 1da4dcbc088b5..4a80a4c1edefd 100644 --- a/sklearn/utils/tests/test_arrayfuncs.py +++ b/sklearn/utils/tests/test_arrayfuncs.py @@ -29,7 +29,7 @@ def test_min_pos_no_positive(dtype): @pytest.mark.parametrize("dtype", [np.int16, np.int32, np.float32, np.float64]) @pytest.mark.parametrize("value", [0, 1.5, -1]) def test_all_with_any_reduction_axis_1(dtype, value): - # Check that return value is False when there is no row/column equal to `value` + # Check that return value is False when there is no row equal to `value` X = np.arange(12, dtype=dtype).reshape(3, 4) assert not _all_with_any_reduction_axis_1(X, value=value) From 09cc3c1853f84e9e9c1c40b1f3013bac8ae591a2 Mon Sep 17 00:00:00 2001 From: Joel Nothman Date: Tue, 16 Jan 2024 04:17:38 +1100 Subject: [PATCH 056/554] FIX _get_doc_link when a _-prefixed package contains a nonprefixed module (#28024) --- doc/whats_new/v1.4.rst | 4 +- sklearn/utils/_estimator_html_repr.py | 13 ++++--- .../utils/tests/test_estimator_html_repr.py | 39 ++++++++++++++----- 3 files changed, 39 insertions(+), 17 deletions(-) diff --git a/doc/whats_new/v1.4.rst b/doc/whats_new/v1.4.rst index ba0facf7cdd13..64c7ade810231 100644 --- a/doc/whats_new/v1.4.rst +++ b/doc/whats_new/v1.4.rst @@ -86,8 +86,8 @@ Changes impacting all modules documentation and is color-coded to denote whether the estimator is fitted or not (unfitted estimators are orange, fitted estimators are blue). :pr:`26616` by :user:`Riccardo Cappuzzo `, - :user:`Ines Ibnukhsein `, :user:`Gael Varoquaux `, and - :user:`Lilian Boulard `. + :user:`Ines Ibnukhsein `, :user:`Gael Varoquaux `, + `Joel Nothman`_ and :user:`Lilian Boulard `. - |Fix| Fixed a bug in most estimators and functions where setting a parameter to a large integer would cause a `TypeError`. 
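
To make the next diff easier to follow: `_get_doc_link` builds the documentation
URL from the estimator's module path, keeping only the leading *public*
components of the dotted path. The sketch below is not part of the patch; the
helper name is made up, while the sample paths are taken from the test cases
further down. It illustrates the `itertools.takewhile` idiom the fix relies on::

    import itertools

    def public_module_path(module_path):
        # Keep the leading dotted components up to, but not including, the
        # first component that starts with an underscore.
        return ".".join(
            itertools.takewhile(
                lambda part: not part.startswith("_"),
                module_path.split("."),
            )
        )

    # An estimator defined in a private submodule is documented at the level
    # of the public package that re-exports it.
    assert public_module_path("prefix.mypackage._mymodule.submodule") == "prefix.mypackage"
    # A fully public path is kept unchanged.
    assert public_module_path("prefix.mymodule") == "prefix.mymodule"
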
diff --git a/sklearn/utils/_estimator_html_repr.py b/sklearn/utils/_estimator_html_repr.py index d259016504685..dd51a8bbb71de 100644 --- a/sklearn/utils/_estimator_html_repr.py +++ b/sklearn/utils/_estimator_html_repr.py @@ -1,4 +1,5 @@ import html +import itertools from contextlib import closing from inspect import isclass from io import StringIO @@ -471,12 +472,14 @@ def _get_doc_link(self): if self._doc_link_url_param_generator is None: estimator_name = self.__class__.__name__ + # Construct the estimator's module name, up to the first private submodule. + # This works because in scikit-learn all public estimators are exposed at + # that level, even if they actually live in a private sub-module. estimator_module = ".".join( - [ - _ - for _ in self.__class__.__module__.split(".") - if not _.startswith("_") - ] + itertools.takewhile( + lambda part: not part.startswith("_"), + self.__class__.__module__.split("."), + ) ) return self._doc_link_template.format( estimator_module=estimator_module, estimator_name=estimator_name diff --git a/sklearn/utils/tests/test_estimator_html_repr.py b/sklearn/utils/tests/test_estimator_html_repr.py index d3a395d5cfe86..d59658998432d 100644 --- a/sklearn/utils/tests/test_estimator_html_repr.py +++ b/sklearn/utils/tests/test_estimator_html_repr.py @@ -433,7 +433,33 @@ def test_html_documentation_link_mixin_sklearn(mock_version): ) -def test_html_documentation_link_mixin_get_doc_link(): +@pytest.mark.parametrize( + "module_path,expected_module", + [ + ("prefix.mymodule", "prefix.mymodule"), + ("prefix._mymodule", "prefix"), + ("prefix.mypackage._mymodule", "prefix.mypackage"), + ("prefix.mypackage._mymodule.submodule", "prefix.mypackage"), + ("prefix.mypackage.mymodule.submodule", "prefix.mypackage.mymodule.submodule"), + ], +) +def test_html_documentation_link_mixin_get_doc_link(module_path, expected_module): + """Check the behaviour of the `_get_doc_link` with various parameter.""" + + class FooBar(_HTMLDocumentationLinkMixin): + pass + + FooBar.__module__ = module_path + est = FooBar() + # if we set `_doc_link`, then we expect to infer a module and name for the estimator + est._doc_link_module = "prefix" + est._doc_link_template = ( + "https://website.com/{estimator_module}.{estimator_name}.html" + ) + assert est._get_doc_link() == f"https://website.com/{expected_module}.FooBar.html" + + +def test_html_documentation_link_mixin_get_doc_link_out_of_library(): """Check the behaviour of the `_get_doc_link` with various parameter.""" mixin = _HTMLDocumentationLinkMixin() @@ -442,16 +468,9 @@ def test_html_documentation_link_mixin_get_doc_link(): mixin._doc_link_module = "xxx" assert mixin._get_doc_link() == "" - # if we set `_doc_link`, then we expect to infer a module and name for the estimator - mixin._doc_link_module = "sklearn" - mixin._doc_link_template = ( - "https://website.com/{estimator_module}.{estimator_name}.html" - ) - assert ( - mixin._get_doc_link() - == "https://website.com/sklearn.utils._HTMLDocumentationLinkMixin.html" - ) +def test_html_documentation_link_mixin_doc_link_url_param_generator(): + mixin = _HTMLDocumentationLinkMixin() # we can bypass the generation by providing our own callable mixin._doc_link_template = ( "https://website.com/{my_own_variable}.{another_variable}.html" From 5aeeeefe31acd5444148fd89f3a8df9443b04889 Mon Sep 17 00:00:00 2001 From: Guillaume Lemaitre Date: Mon, 15 Jan 2024 18:24:47 +0100 Subject: [PATCH 057/554] MAINT remove deprecated 'full' and 'auto' option from KMeans (#28115) --- sklearn/cluster/_kmeans.py | 23 
+++-------------------- sklearn/cluster/tests/test_k_means.py | 15 --------------- 2 files changed, 3 insertions(+), 35 deletions(-) diff --git a/sklearn/cluster/_kmeans.py b/sklearn/cluster/_kmeans.py index 0732b75f982b8..178242e60be57 100644 --- a/sklearn/cluster/_kmeans.py +++ b/sklearn/cluster/_kmeans.py @@ -389,16 +389,13 @@ def k_means( `copy_x` is False. If the original data is sparse, but not in CSR format, a copy will be made even if `copy_x` is False. - algorithm : {"lloyd", "elkan", "auto", "full"}, default="lloyd" + algorithm : {"lloyd", "elkan"}, default="lloyd" K-means algorithm to use. The classical EM-style algorithm is `"lloyd"`. The `"elkan"` variation can be more efficient on some datasets with well-defined clusters, by using the triangle inequality. However it's more memory intensive due to the allocation of an extra array of shape `(n_samples, n_clusters)`. - `"auto"` and `"full"` are deprecated and they will be removed in - Scikit-Learn 1.3. They are both aliases for `"lloyd"`. - .. versionchanged:: 0.18 Added Elkan algorithm @@ -1294,16 +1291,13 @@ class KMeans(_BaseKMeans): copy_x is False. If the original data is sparse, but not in CSR format, a copy will be made even if copy_x is False. - algorithm : {"lloyd", "elkan", "auto", "full"}, default="lloyd" + algorithm : {"lloyd", "elkan"}, default="lloyd" K-means algorithm to use. The classical EM-style algorithm is `"lloyd"`. The `"elkan"` variation can be more efficient on some datasets with well-defined clusters, by using the triangle inequality. However it's more memory intensive due to the allocation of an extra array of shape `(n_samples, n_clusters)`. - `"auto"` and `"full"` are deprecated and they will be removed in - Scikit-Learn 1.3. They are both aliases for `"lloyd"`. - .. versionchanged:: 0.18 Added Elkan algorithm @@ -1404,9 +1398,7 @@ class KMeans(_BaseKMeans): _parameter_constraints: dict = { **_BaseKMeans._parameter_constraints, "copy_x": ["boolean"], - "algorithm": [ - StrOptions({"lloyd", "elkan", "auto", "full"}, deprecated={"auto", "full"}) - ], + "algorithm": [StrOptions({"lloyd", "elkan"})], } def __init__( @@ -1439,15 +1431,6 @@ def _check_params_vs_input(self, X): super()._check_params_vs_input(X, default_n_init=10) self._algorithm = self.algorithm - if self._algorithm in ("auto", "full"): - warnings.warn( - ( - f"algorithm='{self._algorithm}' is deprecated, it will be " - "removed in 1.3. Using 'lloyd' instead." - ), - FutureWarning, - ) - self._algorithm = "lloyd" if self._algorithm == "elkan" and self.n_clusters == 1: warnings.warn( ( diff --git a/sklearn/cluster/tests/test_k_means.py b/sklearn/cluster/tests/test_k_means.py index 030f35bb748bb..5b0c7ab9aace8 100644 --- a/sklearn/cluster/tests/test_k_means.py +++ b/sklearn/cluster/tests/test_k_means.py @@ -200,21 +200,6 @@ def test_kmeans_convergence(algorithm, global_random_seed): assert km.n_iter_ < max_iter -@pytest.mark.parametrize("algorithm", ["auto", "full"]) -def test_algorithm_auto_full_deprecation_warning(algorithm): - X = np.random.rand(100, 2) - kmeans = KMeans(algorithm=algorithm) - with pytest.warns( - FutureWarning, - match=( - f"algorithm='{algorithm}' is deprecated, it will " - "be removed in 1.3. Using 'lloyd' instead." 
- ), - ): - kmeans.fit(X) - assert kmeans._algorithm == "lloyd" - - @pytest.mark.parametrize("Estimator", [KMeans, MiniBatchKMeans]) def test_predict_sample_weight_deprecation_warning(Estimator): X = np.random.rand(100, 2) From 7f131e04b01b43a05846bbc3d0b8c2988e6dd03e Mon Sep 17 00:00:00 2001 From: Christian Lorentzen Date: Mon, 15 Jan 2024 19:06:02 +0100 Subject: [PATCH 058/554] MNT support cross 32bit/64bit pickles for HGBT (#28074) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Co-authored-by: Loïc Estève Co-authored-by: Olivier Grisel --- doc/whats_new/v1.4.rst | 8 ++ .../_hist_gradient_boosting/predictor.py | 29 ++++-- .../tests/test_gradient_boosting.py | 91 ++++++++++++++++++- 3 files changed, 121 insertions(+), 7 deletions(-) diff --git a/doc/whats_new/v1.4.rst b/doc/whats_new/v1.4.rst index 64c7ade810231..cb4404c7855d8 100644 --- a/doc/whats_new/v1.4.rst +++ b/doc/whats_new/v1.4.rst @@ -506,6 +506,14 @@ Changelog support missing values if all `estimators` support missing values. :pr:`27710` by :user:`Guillaume Lemaitre `. +- |Fix| Support loading pickles of :class:`ensemble.HistGradientBoostingClassifier` and + :class:`ensemble.HistGradientBoostingRegressor` when the pickle has + been generated on a platform with a different bitness. A typical example is + to train and pickle the model on 64 bit machine and load the model on a 32 + bit machine for prediction. + :pr:`28074` by :user:`Christian Lorentzen ` and + :user:`Loïc Estève `. + - |API| In :class:`ensemble.AdaBoostClassifier`, the `algorithm` argument `SAMME.R` was deprecated and will be removed in 1.6. :pr:`26830` by :user:`Stefanie Senger `. diff --git a/sklearn/ensemble/_hist_gradient_boosting/predictor.py b/sklearn/ensemble/_hist_gradient_boosting/predictor.py index 600e55e43467f..b939712d18893 100644 --- a/sklearn/ensemble/_hist_gradient_boosting/predictor.py +++ b/sklearn/ensemble/_hist_gradient_boosting/predictor.py @@ -10,7 +10,7 @@ _predict_from_binned_data, _predict_from_raw_data, ) -from .common import Y_DTYPE +from .common import PREDICTOR_RECORD_DTYPE, Y_DTYPE class TreePredictor: @@ -20,15 +20,12 @@ class TreePredictor: ---------- nodes : ndarray of PREDICTOR_RECORD_DTYPE The nodes of the tree. - binned_left_cat_bitsets : ndarray of shape (n_categorical_splits, 8), \ - dtype=uint32 + binned_left_cat_bitsets : ndarray of shape (n_categorical_splits, 8), dtype=uint32 Array of bitsets for binned categories used in predict_binned when a split is categorical. - raw_left_cat_bitsets : ndarray of shape (n_categorical_splits, 8), \ - dtype=uint32 + raw_left_cat_bitsets : ndarray of shape (n_categorical_splits, 8), dtype=uint32 Array of bitsets for raw categories used in predict when a split is categorical. - """ def __init__(self, nodes, binned_left_cat_bitsets, raw_left_cat_bitsets): @@ -68,6 +65,7 @@ def predict(self, X, known_cat_bitsets, f_idx_map, n_threads): The raw predicted values. """ out = np.empty(X.shape[0], dtype=Y_DTYPE) + _predict_from_raw_data( self.nodes, X, @@ -125,3 +123,22 @@ def compute_partial_dependence(self, grid, target_features, out): point. """ _compute_partial_dependence(self.nodes, grid, target_features, out) + + def __setstate__(self, state): + try: + super().__setstate__(state) + except AttributeError: + self.__dict__.update(state) + + # The dtype of feature_idx is np.intp which is platform dependent. Here, we + # make sure that saving and loading on different bitness systems works without + # errors. 
For instance, on a 64 bit Python runtime, np.intp = np.int64, + # while on 32 bit np.intp = np.int32. + # + # TODO: consider always using platform agnostic dtypes for fitted + # estimator attributes. For this particular estimator, this would + # mean replacing the intp field of PREDICTOR_RECORD_DTYPE by an int32 + # field. Ideally this should be done consistently throughout + # scikit-learn along with a common test. + if self.nodes.dtype != PREDICTOR_RECORD_DTYPE: + self.nodes = self.nodes.astype(PREDICTOR_RECORD_DTYPE, casting="same_kind") diff --git a/sklearn/ensemble/_hist_gradient_boosting/tests/test_gradient_boosting.py b/sklearn/ensemble/_hist_gradient_boosting/tests/test_gradient_boosting.py index 8adc0a19dc483..bdc85eccd6607 100644 --- a/sklearn/ensemble/_hist_gradient_boosting/tests/test_gradient_boosting.py +++ b/sklearn/ensemble/_hist_gradient_boosting/tests/test_gradient_boosting.py @@ -1,9 +1,14 @@ +import copyreg +import io +import pickle import re import warnings from unittest.mock import Mock +import joblib import numpy as np import pytest +from joblib.numpy_pickle import NumpyPickler from numpy.testing import assert_allclose, assert_array_equal import sklearn @@ -24,12 +29,13 @@ from sklearn.ensemble._hist_gradient_boosting.binning import _BinMapper from sklearn.ensemble._hist_gradient_boosting.common import G_H_DTYPE from sklearn.ensemble._hist_gradient_boosting.grower import TreeGrower +from sklearn.ensemble._hist_gradient_boosting.predictor import TreePredictor from sklearn.exceptions import NotFittedError from sklearn.metrics import get_scorer, mean_gamma_deviance, mean_poisson_deviance from sklearn.model_selection import cross_val_score, train_test_split from sklearn.pipeline import make_pipeline from sklearn.preprocessing import KBinsDiscretizer, MinMaxScaler, OneHotEncoder -from sklearn.utils import shuffle +from sklearn.utils import _IS_32BIT, shuffle from sklearn.utils._openmp_helpers import _openmp_effective_n_threads from sklearn.utils._testing import _convert_container @@ -1580,3 +1586,86 @@ def test_categorical_features_warn(): msg = "The categorical_features parameter will change to 'from_dtype' in v1.6" with pytest.warns(FutureWarning, match=msg): hist.fit(X, y) + + +def get_different_bitness_node_ndarray(node_ndarray): + new_dtype_for_indexing_fields = np.int64 if _IS_32BIT else np.int32 + + # field names in Node struct with np.intp types (see + # sklearn/ensemble/_hist_gradient_boosting/common.pyx) + indexing_field_names = ["feature_idx"] + + new_dtype_dict = { + name: dtype for name, (dtype, _) in node_ndarray.dtype.fields.items() + } + for name in indexing_field_names: + new_dtype_dict[name] = new_dtype_for_indexing_fields + + new_dtype = np.dtype( + {"names": list(new_dtype_dict.keys()), "formats": list(new_dtype_dict.values())} + ) + return node_ndarray.astype(new_dtype, casting="same_kind") + + +def reduce_predictor_with_different_bitness(predictor): + cls, args, state = predictor.__reduce__() + + new_state = state.copy() + new_state["nodes"] = get_different_bitness_node_ndarray(new_state["nodes"]) + + return (cls, args, new_state) + + +def test_different_bitness_pickle(): + X, y = make_classification(random_state=0) + + clf = HistGradientBoostingClassifier(random_state=0, max_depth=3) + clf.fit(X, y) + score = clf.score(X, y) + + def pickle_dump_with_different_bitness(): + f = io.BytesIO() + p = pickle.Pickler(f) + p.dispatch_table = copyreg.dispatch_table.copy() + p.dispatch_table[TreePredictor] = reduce_predictor_with_different_bitness + + 
p.dump(clf) + f.seek(0) + return f + + # Simulate loading a pickle of the same model trained on a platform with different + # bitness that than the platform it will be used to make predictions on: + new_clf = pickle.load(pickle_dump_with_different_bitness()) + new_score = new_clf.score(X, y) + assert score == pytest.approx(new_score) + + +def test_different_bitness_joblib_pickle(): + # Make sure that a platform specific pickle generated on a 64 bit + # platform can be converted at pickle load time into an estimator + # with Cython code that works with the host's native integer precision + # to index nodes in the tree data structure when the host is a 32 bit + # platform (and vice versa). + # + # This is in particular useful to be able to train a model on a 64 bit Linux + # server and deploy the model as part of a (32 bit) WASM in-browser + # application using pyodide. + X, y = make_classification(random_state=0) + + clf = HistGradientBoostingClassifier(random_state=0, max_depth=3) + clf.fit(X, y) + score = clf.score(X, y) + + def joblib_dump_with_different_bitness(): + f = io.BytesIO() + p = NumpyPickler(f) + p.dispatch_table = copyreg.dispatch_table.copy() + p.dispatch_table[TreePredictor] = reduce_predictor_with_different_bitness + + p.dump(clf) + f.seek(0) + return f + + new_clf = joblib.load(joblib_dump_with_different_bitness()) + new_score = new_clf.score(X, y) + assert score == pytest.approx(new_score) From 8a71b840d3d7f6e5db9f9faf3b6c44f8ed6a3850 Mon Sep 17 00:00:00 2001 From: thebabush <1985669+thebabush@users.noreply.github.com> Date: Tue, 16 Jan 2024 04:12:56 +0900 Subject: [PATCH 059/554] ENH ensure no copy if not requested and improve transform performance in TFIDFTransformer (#18843) Co-authored-by: Guillaume Lemaitre --- doc/whats_new/v1.5.rst | 8 ++++ sklearn/feature_extraction/tests/test_text.py | 21 +++++++++ sklearn/feature_extraction/text.py | 46 ++++--------------- 3 files changed, 39 insertions(+), 36 deletions(-) diff --git a/doc/whats_new/v1.5.rst b/doc/whats_new/v1.5.rst index 6f4778d9784e8..96cbd21021f08 100644 --- a/doc/whats_new/v1.5.rst +++ b/doc/whats_new/v1.5.rst @@ -31,6 +31,14 @@ Changelog - |Feature| A fitted :class:`compose.ColumnTransformer` now implements `__getitem__` which returns the fitted transformers by name. :pr:`27990` by `Thomas Fan`_. +:mod:`sklearn.feature_extraction` +................................. + +- |Efficiency| :class:`feature_extraction.text.TfidfTransformer` is now faster + and more memory-efficient by using a NumPy vector instead of a sparse matrix + for storing the inverse document frequency. + :pr:`18843` by :user:`Paolo Montesel `. + :mod:`sklearn.impute` ..................... 
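
Before the feature-extraction diffs below, a short sketch of why the new
in-place scaling in `TfidfTransformer.transform` is equivalent to the old
multiplication by a sparse diagonal idf matrix. The toy matrix and idf values
here are made up for illustration; `X.data`, `X.indices` and
`scipy.sparse.diags` are standard SciPy APIs::

    import numpy as np
    from scipy import sparse

    X = sparse.csr_matrix(np.array([[1.0, 0.0, 2.0], [0.0, 3.0, 1.0]]))
    idf = np.array([2.0, 0.5, 10.0])

    # Old formulation: right-multiplying by diag(idf) scales column j by idf[j].
    expected = (X @ sparse.diags(idf)).toarray()

    # New formulation: for a CSR matrix, X.indices holds the column index of
    # each stored entry of X.data, so fancy-indexing idf with it scales every
    # nonzero by the idf of its column without building a diagonal matrix.
    X.data *= idf[X.indices]
    assert np.allclose(X.toarray(), expected)
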
diff --git a/sklearn/feature_extraction/tests/test_text.py b/sklearn/feature_extraction/tests/test_text.py index 7c7cac85ccc6b..06a4f0e805e29 100644 --- a/sklearn/feature_extraction/tests/test_text.py +++ b/sklearn/feature_extraction/tests/test_text.py @@ -1653,3 +1653,24 @@ def test_vectorizers_do_not_have_set_output(Estimator): """Check that vectorizers do not define set_output.""" est = Estimator() assert not hasattr(est, "set_output") + + +@pytest.mark.parametrize("csr_container", CSR_CONTAINERS) +def test_tfidf_transformer_copy(csr_container): + """Check the behaviour of TfidfTransformer.transform with the copy parameter.""" + X = sparse.rand(10, 20000, dtype=np.float64, random_state=42) + X_csr = csr_container(X) + + # keep a copy of the original matrix for later comparison + X_csr_original = X_csr.copy() + + transformer = TfidfTransformer().fit(X_csr) + + X_transform = transformer.transform(X_csr, copy=True) + assert_allclose_dense_sparse(X_csr, X_csr_original) + assert X_transform is not X_csr + + X_transform = transformer.transform(X_csr, copy=False) + assert X_transform is X_csr + with pytest.raises(AssertionError): + assert_allclose_dense_sparse(X_csr, X_csr_original) diff --git a/sklearn/feature_extraction/text.py b/sklearn/feature_extraction/text.py index 29104c29e74ac..cef6f340e83c8 100644 --- a/sklearn/feature_extraction/text.py +++ b/sklearn/feature_extraction/text.py @@ -1679,14 +1679,10 @@ def fit(self, X, y=None): # log+1 instead of log makes sure terms with zero idf don't get # suppressed entirely. - idf = np.log(n_samples / df) + 1 - self._idf_diag = sp.diags( - idf, - offsets=0, - shape=(n_features, n_features), - format="csr", - dtype=dtype, - ) + self.idf_ = np.log(n_samples / df) + 1.0 + # FIXME: for backward compatibility, we force idf_ to be np.float64 + # In the future, we should preserve the `dtype` of `X`. + self.idf_ = self.idf_.astype(np.float64, copy=False) return self @@ -1700,13 +1696,14 @@ def transform(self, X, copy=True): copy : bool, default=True Whether to copy X and operate on the copy or perform in-place - operations. + operations. `copy=False` will only be effective with CSR sparse matrix. Returns ------- vectors : sparse matrix of shape (n_samples, n_features) Tf-idf-weighted document-term matrix. """ + check_is_fitted(self) X = self._validate_data( X, accept_sparse="csr", dtype=FLOAT_DTYPES, copy=copy, reset=False ) @@ -1717,39 +1714,16 @@ def transform(self, X, copy=True): np.log(X.data, X.data) X.data += 1 - if self.use_idf: - # idf_ being a property, the automatic attributes detection - # does not work as usual and we need to specify the attribute - # name: - check_is_fitted(self, attributes=["idf_"], msg="idf vector is not fitted") - - X = X @ self._idf_diag + if hasattr(self, "idf_"): + # the columns of X (CSR matrix) can be accessed with `X.indices `and + # multiplied with the corresponding `idf` value + X.data *= self.idf_[X.indices] if self.norm is not None: X = normalize(X, norm=self.norm, copy=False) return X - @property - def idf_(self): - """Inverse document frequency vector, only defined if `use_idf=True`. 
- - Returns - ------- - ndarray of shape (n_features,) - """ - # if _idf_diag is not set, this will raise an attribute error, - # which means hasattr(self, "idf_") is False - return np.ravel(self._idf_diag.sum(axis=0)) - - @idf_.setter - def idf_(self, value): - value = np.asarray(value, dtype=np.float64) - n_features = value.shape[0] - self._idf_diag = sp.spdiags( - value, diags=0, m=n_features, n=n_features, format="csr" - ) - def _more_tags(self): return {"X_types": ["2darray", "sparse"]} From 6ee983c1b212b63ef0823f101bddd8d3b5ea6f67 Mon Sep 17 00:00:00 2001 From: Yao Xiao <108576690+Charlie-XIAO@users.noreply.github.com> Date: Tue, 16 Jan 2024 03:42:09 +0800 Subject: [PATCH 060/554] DOC solve some sphinx errors when updating to `pydata-sphinx-theme` (#28134) --- doc/modules/ensemble.rst | 2 ++ examples/release_highlights/plot_release_highlights_1_4_0.py | 2 +- 2 files changed, 3 insertions(+), 1 deletion(-) diff --git a/doc/modules/ensemble.rst b/doc/modules/ensemble.rst index 334e00e35a848..ee8cac3e715c2 100644 --- a/doc/modules/ensemble.rst +++ b/doc/modules/ensemble.rst @@ -286,10 +286,12 @@ model. For a predictor :math:`F` with two features: - a **monotonic increase constraint** is a constraint of the form: + .. math:: x_1 \leq x_1' \implies F(x_1, x_2) \leq F(x_1', x_2) - a **monotonic decrease constraint** is a constraint of the form: + .. math:: x_1 \leq x_1' \implies F(x_1, x_2) \geq F(x_1', x_2) diff --git a/examples/release_highlights/plot_release_highlights_1_4_0.py b/examples/release_highlights/plot_release_highlights_1_4_0.py index d8112699e04ed..0d2924d9e8bb4 100644 --- a/examples/release_highlights/plot_release_highlights_1_4_0.py +++ b/examples/release_highlights/plot_release_highlights_1_4_0.py @@ -155,7 +155,7 @@ # ------------------------ # Many meta-estimators and cross-validation routines now support metadata # routing, which are listed in the :ref:`user guide -# <_metadata_routing_models>`. For instance, this is how you can do a nested +# `. For instance, this is how you can do a nested # cross-validation with sample weights and :class:`~model_selection.GroupKFold`: import sklearn from sklearn.metrics import get_scorer From 0040de8c0be2fe3d7885919b9384c163b7c38c81 Mon Sep 17 00:00:00 2001 From: Guillaume Lemaitre Date: Mon, 15 Jan 2024 20:42:24 +0100 Subject: [PATCH 061/554] DOC fix underline for Examples section in clear_data_home (#28135) --- sklearn/datasets/_base.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/sklearn/datasets/_base.py b/sklearn/datasets/_base.py index ab2b8bd3f5110..d5c9a66b76167 100644 --- a/sklearn/datasets/_base.py +++ b/sklearn/datasets/_base.py @@ -87,7 +87,7 @@ def clear_data_home(data_home=None): is `~/scikit_learn_data`. 
Examples - ---------- + -------- >>> from sklearn.datasets import clear_data_home >>> clear_data_home() # doctest: +SKIP """ From 65c907d31fd4a6e68fbf59d93052521c48556f61 Mon Sep 17 00:00:00 2001 From: Xiao Yuan Date: Tue, 16 Jan 2024 04:56:43 +0800 Subject: [PATCH 062/554] DOC add examples in docstring for decomposition (#28131) Co-authored-by: Guillaume Lemaitre --- sklearn/decomposition/_dict_learning.py | 67 +++++++++++++++++++++++++ 1 file changed, 67 insertions(+) diff --git a/sklearn/decomposition/_dict_learning.py b/sklearn/decomposition/_dict_learning.py index 561ffd32a0551..51350aa5e05bd 100644 --- a/sklearn/decomposition/_dict_learning.py +++ b/sklearn/decomposition/_dict_learning.py @@ -338,6 +338,23 @@ def sparse_encode( sklearn.linear_model.Lasso : Train Linear Model with L1 prior as regularizer. SparseCoder : Find a sparse representation of data from a fixed precomputed dictionary. + + Examples + -------- + >>> import numpy as np + >>> from sklearn.decomposition import sparse_encode + >>> X = np.array([[-1, -1, -1], [0, 0, 3]]) + >>> dictionary = np.array( + ... [[0, 1, 0], + ... [-1, -1, 2], + ... [1, 1, 1], + ... [0, 1, 1], + ... [0, 2, 1]], + ... dtype=np.float64 + ... ) + >>> sparse_encode(X, dictionary, alpha=1e-10) + array([[ 0., 0., -1., 0., 0.], + [ 0., 1., 1., 0., 0.]]) """ if check_input: if algorithm == "lasso_cd": @@ -804,6 +821,32 @@ def dict_learning_online( learning algorithm. SparsePCA : Sparse Principal Components Analysis. MiniBatchSparsePCA : Mini-batch Sparse Principal Components Analysis. + + Examples + -------- + >>> import numpy as np + >>> from sklearn.datasets import make_sparse_coded_signal + >>> from sklearn.decomposition import dict_learning_online + >>> X, _, _ = make_sparse_coded_signal( + ... n_samples=30, n_components=15, n_features=20, n_nonzero_coefs=10, + ... random_state=42, + ... ) + >>> U, V = dict_learning_online( + ... X, n_components=15, alpha=0.2, max_iter=20, batch_size=3, random_state=42 + ... ) + + We can check the level of sparsity of `U`: + + >>> np.mean(U == 0) + 0.53... + + We can compare the average squared euclidean norm of the reconstruction + error of the sparse coded signal relative to the squared euclidean norm of + the original signal: + + >>> X_hat = U @ V + >>> np.mean(np.sum((X_hat - X) ** 2, axis=1) / np.sum(X ** 2, axis=1)) + 0.05... """ # TODO(1.6): remove in 1.6 if max_iter is None: @@ -982,6 +1025,30 @@ def dict_learning( of the dictionary learning algorithm. SparsePCA : Sparse Principal Components Analysis. MiniBatchSparsePCA : Mini-batch Sparse Principal Components Analysis. + + Examples + -------- + >>> import numpy as np + >>> from sklearn.datasets import make_sparse_coded_signal + >>> from sklearn.decomposition import dict_learning + >>> X, _, _ = make_sparse_coded_signal( + ... n_samples=30, n_components=15, n_features=20, n_nonzero_coefs=10, + ... random_state=42, + ... ) + >>> U, V, errors = dict_learning(X, n_components=15, alpha=0.1, random_state=42) + + We can check the level of sparsity of `U`: + + >>> np.mean(U == 0) + 0.6... + + We can compare the average squared euclidean norm of the reconstruction + error of the sparse coded signal relative to the squared euclidean norm of + the original signal: + + >>> X_hat = U @ V + >>> np.mean(np.sum((X_hat - X) ** 2, axis=1) / np.sum(X ** 2, axis=1)) + 0.01... 
""" estimator = DictionaryLearning( n_components=n_components, From 54de8300b2079f71d26137bdb2d62b3b277950cc Mon Sep 17 00:00:00 2001 From: Salim Dohri <104096451+dohrisalim@users.noreply.github.com> Date: Mon, 15 Jan 2024 22:01:01 +0100 Subject: [PATCH 063/554] DOC Add a docstring example for the BiclusterMixin class (#28129) Co-authored-by: Guillaume Lemaitre --- sklearn/base.py | 27 ++++++++++++++++++++++++++- 1 file changed, 26 insertions(+), 1 deletion(-) diff --git a/sklearn/base.py b/sklearn/base.py index a6313947ba469..c2b119cbf63e5 100644 --- a/sklearn/base.py +++ b/sklearn/base.py @@ -889,7 +889,32 @@ def _more_tags(self): class BiclusterMixin: - """Mixin class for all bicluster estimators in scikit-learn.""" + """Mixin class for all bicluster estimators in scikit-learn. + + This mixin defines the following functionality: + + - `biclusters_` property that returns the row and column indicators; + - `get_indices` method that returns the row and column indices of a bicluster; + - `get_shape` method that returns the shape of a bicluster; + - `get_submatrix` method that returns the submatrix corresponding to a bicluster. + + Examples + -------- + >>> import numpy as np + >>> from sklearn.base import BaseEstimator, BiclusterMixin + >>> class DummyBiClustering(BiclusterMixin, BaseEstimator): + ... def fit(self, X, y=None): + ... self.rows_ = np.ones(shape=(1, X.shape[0]), dtype=bool) + ... self.columns_ = np.ones(shape=(1, X.shape[1]), dtype=bool) + ... return self + >>> X = np.array([[1, 1], [2, 1], [1, 0], + ... [4, 7], [3, 5], [3, 6]]) + >>> bicluster = DummyBiClustering().fit(X) + >>> hasattr(bicluster, "biclusters_") + True + >>> bicluster.get_indices(0) + (array([0, 1, 2, 3, 4, 5]), array([0, 1])) + """ @property def biclusters_(self): From f3b13e5dae57d57e3c455b05e747e0341b755fe2 Mon Sep 17 00:00:00 2001 From: Christian Lorentzen Date: Mon, 15 Jan 2024 22:29:03 +0100 Subject: [PATCH 064/554] FIX divide by zero in line search of GradientBoostingClassifier (#28095) --- doc/whats_new/v1.4.rst | 2 +- sklearn/ensemble/_gb.py | 15 ++++++--- .../ensemble/tests/test_gradient_boosting.py | 32 +++++++++++++++++-- 3 files changed, 42 insertions(+), 7 deletions(-) diff --git a/doc/whats_new/v1.4.rst b/doc/whats_new/v1.4.rst index cb4404c7855d8..ae830dd9346d8 100644 --- a/doc/whats_new/v1.4.rst +++ b/doc/whats_new/v1.4.rst @@ -470,7 +470,7 @@ Changelog - |Efficiency| :class:`ensemble.GradientBoostingClassifier` is faster, for binary and in particular for multiclass problems thanks to the private loss function module. - :pr:`26278` by :user:`Christian Lorentzen `. + :pr:`26278` and :pr:`28095` by :user:`Christian Lorentzen `. - |Efficiency| Improves runtime and memory usage for :class:`ensemble.GradientBoostingClassifier` and diff --git a/sklearn/ensemble/_gb.py b/sklearn/ensemble/_gb.py index f478d94a5828f..7c5dd6fbdac3c 100644 --- a/sklearn/ensemble/_gb.py +++ b/sklearn/ensemble/_gb.py @@ -65,16 +65,23 @@ def _safe_divide(numerator, denominator): """Prevents overflow and division by zero.""" - try: + # This is used for classifiers where the denominator might become zero exatly. + # For instance for log loss, HalfBinomialLoss, if proba=0 or proba=1 exactly, then + # denominator = hessian = 0, and we should set the node value in the line search to + # zero as there is no improvement of the loss possible. + # For numerical safety, we do this already for extremely tiny values. + if abs(denominator) < 1e-150: + return 0.0 + else: + # Cast to Python float to trigger Python errors, e.g. 
ZeroDivisionError, + # without relying on `np.errstate` that is not supported by Pyodide. + result = float(numerator) / float(denominator) # Cast to Python float to trigger a ZeroDivisionError without relying # on `np.errstate` that is not supported by Pyodide. result = float(numerator) / float(denominator) if math.isinf(result): warnings.warn("overflow encountered in _safe_divide", RuntimeWarning) return result - except ZeroDivisionError: - warnings.warn("divide by zero encountered in _safe_divide", RuntimeWarning) - return 0.0 def _init_raw_predictions(X, estimator, loss, use_predict_proba): diff --git a/sklearn/ensemble/tests/test_gradient_boosting.py b/sklearn/ensemble/tests/test_gradient_boosting.py index f721767b96aa7..4bfbf7c2ff6ee 100644 --- a/sklearn/ensemble/tests/test_gradient_boosting.py +++ b/sklearn/ensemble/tests/test_gradient_boosting.py @@ -1452,9 +1452,9 @@ def test_huber_vs_mean_and_median(): def test_safe_divide(): """Test that _safe_divide handles division by zero.""" - with pytest.warns(RuntimeWarning, match="divide"): + with warnings.catch_warnings(): + warnings.simplefilter("error") assert _safe_divide(np.float64(1e300), 0) == 0 - with pytest.warns(RuntimeWarning, match="divide"): assert _safe_divide(np.float64(0.0), np.float64(0.0)) == 0 with pytest.warns(RuntimeWarning, match="overflow"): # np.finfo(float).max = 1.7976931348623157e+308 @@ -1680,3 +1680,31 @@ def test_multinomial_error_exact_backward_compat(): ] ) assert_allclose(gbt.train_score_[-10:], train_score, rtol=1e-8) + + +def test_gb_denominator_zero(global_random_seed): + """Test _update_terminal_regions denominator is not zero. + + For instance for log loss based binary classification, the line search step might + become nan/inf as denominator = hessian = prob * (1 - prob) and prob = 0 or 1 can + happen. + Here, we create a situation were this happens (at least with roughly 80%) based + on the random seed. 
+ """ + X, y = datasets.make_hastie_10_2(n_samples=100, random_state=20) + + params = { + "learning_rate": 1.0, + "subsample": 0.5, + "n_estimators": 100, + "max_leaf_nodes": 4, + "max_depth": None, + "random_state": global_random_seed, + "min_samples_leaf": 2, + } + + clf = GradientBoostingClassifier(**params) + # _safe_devide would raise a RuntimeWarning + with warnings.catch_warnings(): + warnings.simplefilter("error") + clf.fit(X, y) From db971d1e63acdac0d00ac1c32636ceec904f48df Mon Sep 17 00:00:00 2001 From: John Cant Date: Mon, 15 Jan 2024 21:30:43 +0000 Subject: [PATCH 065/554] FIX return proper instance class in displays classmethod (#27675) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Co-authored-by: Guillaume Lemaitre Co-authored-by: Jérémie du Boisberranger <34657725+jeremiedbb@users.noreply.github.com> --- doc/whats_new/v1.4.rst | 12 +++++ sklearn/inspection/_plot/decision_boundary.py | 2 +- .../inspection/_plot/partial_dependence.py | 2 +- .../tests/test_boundary_decision_display.py | 16 +++++++ .../tests/test_plot_partial_dependence.py | 21 +++++++++ sklearn/metrics/_plot/det_curve.py | 2 +- .../metrics/_plot/precision_recall_curve.py | 2 +- sklearn/metrics/_plot/regression.py | 2 +- sklearn/metrics/_plot/roc_curve.py | 2 +- .../_plot/tests/test_common_curve_display.py | 44 +++++++++++++++++++ sklearn/model_selection/tests/test_plot.py | 23 ++++++++++ 11 files changed, 122 insertions(+), 6 deletions(-) diff --git a/doc/whats_new/v1.4.rst b/doc/whats_new/v1.4.rst index ae830dd9346d8..b738c89d5b7d3 100644 --- a/doc/whats_new/v1.4.rst +++ b/doc/whats_new/v1.4.rst @@ -557,6 +557,11 @@ Changelog both binary and multiclass classifiers. :pr:`27291` by :user:`Guillaume Lemaitre `. +- |Fix| :meth:`inspection.DecisionBoundaryDisplay.from_estimator` and + :class:`inspection.PartialDependenceDisplay.from_estimator` now return the correct + type for subclasses. + :pr:`27675` by :user:`John Cant `. + - |API| :class:`inspection.DecisionBoundaryDisplay` raise an `AttributeError` instead of a `ValueError` when an estimator does not implement the requested response method. :pr:`27291` by :user:`Guillaume Lemaitre `. @@ -646,6 +651,13 @@ Changelog `predict_proba`). Such scorer are specific to classification. :pr:`26840` by :user:`Guillaume Lemaitre `. +- |Fix| :meth:`metrics.DetCurveDisplay.from_predictions`, + :class:`metrics.PrecisionRecallDisplay.from_predictions`, + :class:`metrics.PredictionErrorDisplay.from_predictions`, and + :class:`metrics.RocCurveDisplay.from_predictions` now return the correct type + for subclasses. + :pr:`27675` by :user:`John Cant `. + - |API| Deprecated `needs_threshold` and `needs_proba` from :func:`metrics.make_scorer`. These parameters will be removed in version 1.6. 
Instead, use `response_method` that accepts `"predict"`, `"predict_proba"` or `"decision_function"` or a list of such diff --git a/sklearn/inspection/_plot/decision_boundary.py b/sklearn/inspection/_plot/decision_boundary.py index a42e744261e0b..12162b25c53ed 100644 --- a/sklearn/inspection/_plot/decision_boundary.py +++ b/sklearn/inspection/_plot/decision_boundary.py @@ -396,7 +396,7 @@ def from_estimator( if ylabel is None: ylabel = X.columns[1] if hasattr(X, "columns") else "" - display = DecisionBoundaryDisplay( + display = cls( xx0=xx0, xx1=xx1, response=response.reshape(xx0.shape), diff --git a/sklearn/inspection/_plot/partial_dependence.py b/sklearn/inspection/_plot/partial_dependence.py index 7414433ed3f56..f640df909e2d4 100644 --- a/sklearn/inspection/_plot/partial_dependence.py +++ b/sklearn/inspection/_plot/partial_dependence.py @@ -744,7 +744,7 @@ def from_estimator( X_col = _safe_indexing(X, fx, axis=1) deciles[fx] = mquantiles(X_col, prob=np.arange(0.1, 1.0, 0.1)) - display = PartialDependenceDisplay( + display = cls( pd_results=pd_results, features=features, feature_names=feature_names, diff --git a/sklearn/inspection/_plot/tests/test_boundary_decision_display.py b/sklearn/inspection/_plot/tests/test_boundary_decision_display.py index 37e8adf3aac2d..7bb38f55445a0 100644 --- a/sklearn/inspection/_plot/tests/test_boundary_decision_display.py +++ b/sklearn/inspection/_plot/tests/test_boundary_decision_display.py @@ -591,3 +591,19 @@ def test_class_of_interest_multiclass(pyplot, response_method): response_method=response_method, class_of_interest=None, ) + + +def test_subclass_named_constructors_return_type_is_subclass(pyplot): + """Check that named constructors return the correct type when subclassed. + + Non-regression test for: + https://github.com/scikit-learn/scikit-learn/pull/27675 + """ + clf = LogisticRegression().fit(X, y) + + class SubclassOfDisplay(DecisionBoundaryDisplay): + pass + + curve = SubclassOfDisplay.from_estimator(estimator=clf, X=X) + + assert isinstance(curve, SubclassOfDisplay) diff --git a/sklearn/inspection/_plot/tests/test_plot_partial_dependence.py b/sklearn/inspection/_plot/tests/test_plot_partial_dependence.py index e98fdebaeaf03..57fc68d07e887 100644 --- a/sklearn/inspection/_plot/tests/test_plot_partial_dependence.py +++ b/sklearn/inspection/_plot/tests/test_plot_partial_dependence.py @@ -1117,3 +1117,24 @@ def test_partial_dependence_display_with_constant_sample_weight( assert np.array_equal( disp.pd_results[0]["average"], disp_sw.pd_results[0]["average"] ) + + +def test_subclass_named_constructors_return_type_is_subclass( + pyplot, diabetes, clf_diabetes +): + """Check that named constructors return the correct type when subclassed. 
+ + Non-regression test for: + https://github.com/scikit-learn/scikit-learn/pull/27675 + """ + + class SubclassOfDisplay(PartialDependenceDisplay): + pass + + curve = SubclassOfDisplay.from_estimator( + clf_diabetes, + diabetes.data, + [0, 2, (0, 2)], + ) + + assert isinstance(curve, SubclassOfDisplay) diff --git a/sklearn/metrics/_plot/det_curve.py b/sklearn/metrics/_plot/det_curve.py index 98997e01750bc..e7336b10f5bb6 100644 --- a/sklearn/metrics/_plot/det_curve.py +++ b/sklearn/metrics/_plot/det_curve.py @@ -265,7 +265,7 @@ def from_predictions( sample_weight=sample_weight, ) - viz = DetCurveDisplay( + viz = cls( fpr=fpr, fnr=fnr, estimator_name=name, diff --git a/sklearn/metrics/_plot/precision_recall_curve.py b/sklearn/metrics/_plot/precision_recall_curve.py index a28d69d3b320e..852dbf3981b2c 100644 --- a/sklearn/metrics/_plot/precision_recall_curve.py +++ b/sklearn/metrics/_plot/precision_recall_curve.py @@ -486,7 +486,7 @@ def from_predictions( class_count = Counter(y_true) prevalence_pos_label = class_count[pos_label] / sum(class_count.values()) - viz = PrecisionRecallDisplay( + viz = cls( precision=precision, recall=recall, average_precision=average_precision, diff --git a/sklearn/metrics/_plot/regression.py b/sklearn/metrics/_plot/regression.py index ef0e0c39b1c4e..393a9524e2af4 100644 --- a/sklearn/metrics/_plot/regression.py +++ b/sklearn/metrics/_plot/regression.py @@ -392,7 +392,7 @@ def from_predictions( y_true = _safe_indexing(y_true, indices, axis=0) y_pred = _safe_indexing(y_pred, indices, axis=0) - viz = PredictionErrorDisplay( + viz = cls( y_true=y_true, y_pred=y_pred, ) diff --git a/sklearn/metrics/_plot/roc_curve.py b/sklearn/metrics/_plot/roc_curve.py index cf465392ef5ba..292fb6e2e2f69 100644 --- a/sklearn/metrics/_plot/roc_curve.py +++ b/sklearn/metrics/_plot/roc_curve.py @@ -402,7 +402,7 @@ def from_predictions( ) roc_auc = auc(fpr, tpr) - viz = RocCurveDisplay( + viz = cls( fpr=fpr, tpr=tpr, roc_auc=roc_auc, diff --git a/sklearn/metrics/_plot/tests/test_common_curve_display.py b/sklearn/metrics/_plot/tests/test_common_curve_display.py index 47ac750f9b278..7fe0f0fc6fa7f 100644 --- a/sklearn/metrics/_plot/tests/test_common_curve_display.py +++ b/sklearn/metrics/_plot/tests/test_common_curve_display.py @@ -8,8 +8,10 @@ from sklearn.exceptions import NotFittedError from sklearn.linear_model import LogisticRegression from sklearn.metrics import ( + ConfusionMatrixDisplay, DetCurveDisplay, PrecisionRecallDisplay, + PredictionErrorDisplay, RocCurveDisplay, ) from sklearn.pipeline import make_pipeline @@ -223,3 +225,45 @@ def test_display_curve_error_pos_label(pyplot, data_binary, Display): msg = r"y_true takes value in {10, 11} and pos_label is not specified" with pytest.raises(ValueError, match=msg): Display.from_predictions(y, y_pred) + + +@pytest.mark.parametrize( + "Display", + [ + CalibrationDisplay, + DetCurveDisplay, + PrecisionRecallDisplay, + RocCurveDisplay, + PredictionErrorDisplay, + ConfusionMatrixDisplay, + ], +) +@pytest.mark.parametrize( + "constructor", + ["from_predictions", "from_estimator"], +) +def test_classifier_display_curve_named_constructor_return_type( + pyplot, data_binary, Display, constructor +): + """Check that named constructors return the correct type when subclassed. 
+ + Non-regression test for: + https://github.com/scikit-learn/scikit-learn/pull/27675 + """ + X, y = data_binary + + # This can be anything - we just need to check the named constructor return + # type so the only requirement here is instantiating the class without error + y_pred = y + + classifier = LogisticRegression().fit(X, y) + + class SubclassOfDisplay(Display): + pass + + if constructor == "from_predictions": + curve = SubclassOfDisplay.from_predictions(y, y_pred) + else: # constructor == "from_estimator" + curve = SubclassOfDisplay.from_estimator(classifier, X, y) + + assert isinstance(curve, SubclassOfDisplay) diff --git a/sklearn/model_selection/tests/test_plot.py b/sklearn/model_selection/tests/test_plot.py index a3dad60f7bf40..1a7268150fd90 100644 --- a/sklearn/model_selection/tests/test_plot.py +++ b/sklearn/model_selection/tests/test_plot.py @@ -570,3 +570,26 @@ def test_validation_curve_xscale_from_param_range_provided_as_a_list( ) assert display.ax_.get_xscale() == xscale + + +@pytest.mark.parametrize( + "Display, params", + [ + (LearningCurveDisplay, {}), + (ValidationCurveDisplay, {"param_name": "max_depth", "param_range": [1, 3, 5]}), + ], +) +def test_subclassing_displays(pyplot, data, Display, params): + """Check that named constructors return the correct type when subclassed. + + Non-regression test for: + https://github.com/scikit-learn/scikit-learn/pull/27675 + """ + X, y = data + estimator = DecisionTreeClassifier(random_state=0) + + class SubclassOfDisplay(Display): + pass + + display = SubclassOfDisplay.from_estimator(estimator, X, y, **params) + assert isinstance(display, SubclassOfDisplay) From dcf6c2733fef8dcf22fc1e0593305d00c0ef3fab Mon Sep 17 00:00:00 2001 From: Harmanan Kohli <17681934+Harmanankohli@users.noreply.github.com> Date: Tue, 16 Jan 2024 18:54:57 +0530 Subject: [PATCH 066/554] DOC add example in docstring of silhouette_score (#28125) Co-authored-by: Guillaume Lemaitre --- sklearn/metrics/cluster/_unsupervised.py | 10 ++++++++++ 1 file changed, 10 insertions(+) diff --git a/sklearn/metrics/cluster/_unsupervised.py b/sklearn/metrics/cluster/_unsupervised.py index ccbe473a5f645..21d99d950b844 100644 --- a/sklearn/metrics/cluster/_unsupervised.py +++ b/sklearn/metrics/cluster/_unsupervised.py @@ -119,6 +119,16 @@ def silhouette_score( .. [2] `Wikipedia entry on the Silhouette Coefficient `_ + + Examples + -------- + >>> from sklearn.datasets import make_blobs + >>> from sklearn.cluster import KMeans + >>> from sklearn.metrics import silhouette_score + >>> X, y = make_blobs(random_state=42) + >>> kmeans = KMeans(n_clusters=2, random_state=42) + >>> silhouette_score(X, kmeans.fit_predict(X)) + 0.49... 
""" if sample_size is not None: X, labels = check_X_y(X, labels, accept_sparse=["csc", "csr"]) From 064228769bb3c4b30030cbe42c4d548ea5f4380f Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Lo=C3=AFc=20Est=C3=A8ve?= Date: Tue, 16 Jan 2024 14:32:27 +0100 Subject: [PATCH 067/554] CI Fix lock-file update workflow (#28140) --- .github/workflows/update-lock-files.yml | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/.github/workflows/update-lock-files.yml b/.github/workflows/update-lock-files.yml index b259617494e9c..50d62c85d00a6 100644 --- a/.github/workflows/update-lock-files.yml +++ b/.github/workflows/update-lock-files.yml @@ -17,16 +17,16 @@ jobs: matrix: include: - name: main - update_script_args: "--select-build-tag main-ci" + update_script_args: "--select-tag main-ci" additional_commit_message: "[doc build]" - name: scipy-dev - update_script_args: "--select-build-tag scipy-dev" + update_script_args: "--select-tag scipy-dev" additional_commit_message: "[scipy-dev]" - name: cirrus-arm - update_script_args: "--select-build-tag arm" + update_script_args: "--select-tag arm" additional_commit_message: "[cirrus arm]" - name: pypy - update_script_args: "--select-build-tag pypy" + update_script_args: "--select-tag pypy" additional_commit_message: "[pypy]" steps: From 27c3277d0986e550c3dc2b89d3474e605b3edbd6 Mon Sep 17 00:00:00 2001 From: Xiao Yuan Date: Tue, 16 Jan 2024 22:07:14 +0800 Subject: [PATCH 068/554] DOC add example in docstring for sklearn.neighbors.sort_graph_by_row_values (#28143) --- sklearn/neighbors/_base.py | 14 ++++++++++++++ 1 file changed, 14 insertions(+) diff --git a/sklearn/neighbors/_base.py b/sklearn/neighbors/_base.py index 848c8b7c9dc5a..6df0f2030877e 100644 --- a/sklearn/neighbors/_base.py +++ b/sklearn/neighbors/_base.py @@ -224,6 +224,20 @@ def sort_graph_by_row_values(graph, copy=False, warn_when_not_sorted=True): graph : sparse matrix of shape (n_samples, n_samples) Distance matrix to other samples, where only non-zero elements are considered neighbors. Matrix is in CSR format. + + Examples + -------- + >>> from scipy.sparse import csr_matrix + >>> from sklearn.neighbors import sort_graph_by_row_values + >>> X = csr_matrix( + ... [[0., 3., 1.], + ... [3., 0., 2.], + ... [1., 2., 0.]]) + >>> X.data + array([3., 1., 3., 2., 1., 2.]) + >>> X_ = sort_graph_by_row_values(X) + >>> X_.data + array([1., 3., 2., 3., 1., 2.]) """ if graph.format == "csr" and _is_sorted_by_data(graph): return graph From 9eb5b91143eb316dc6c1058cfc11f951c97c121d Mon Sep 17 00:00:00 2001 From: Claudio Salvatore Arcidiacono <22871978+ClaudioSalvatoreArcidiacono@users.noreply.github.com> Date: Tue, 16 Jan 2024 15:32:33 +0100 Subject: [PATCH 069/554] DOC add docstring example to `sklearn.datasets.make_classification` (#28141) Co-authored-by: Guillaume Lemaitre --- sklearn/datasets/_samples_generator.py | 11 +++++++++++ 1 file changed, 11 insertions(+) diff --git a/sklearn/datasets/_samples_generator.py b/sklearn/datasets/_samples_generator.py index cd0bb4b3dbba8..561b22012f69c 100644 --- a/sklearn/datasets/_samples_generator.py +++ b/sklearn/datasets/_samples_generator.py @@ -195,6 +195,17 @@ def make_classification( ---------- .. [1] I. Guyon, "Design of experiments for the NIPS 2003 variable selection benchmark", 2003. 
+ + Examples + -------- + >>> from sklearn.datasets import make_classification + >>> X, y = make_classification(random_state=42) + >>> X.shape + (100, 20) + >>> y.shape + (100,) + >>> list(y[:5]) + [0, 0, 1, 1, 0] """ generator = check_random_state(random_state) From 48e2f72dcb9eb0957891f597db42856bf5606b4b Mon Sep 17 00:00:00 2001 From: Claudio Salvatore Arcidiacono <22871978+ClaudioSalvatoreArcidiacono@users.noreply.github.com> Date: Tue, 16 Jan 2024 15:38:15 +0100 Subject: [PATCH 070/554] DOC add examples to make_friedman 1 2 and 3 (#28142) Co-authored-by: Guillaume Lemaitre --- sklearn/datasets/_samples_generator.py | 33 ++++++++++++++++++++++++++ 1 file changed, 33 insertions(+) diff --git a/sklearn/datasets/_samples_generator.py b/sklearn/datasets/_samples_generator.py index 561b22012f69c..dd170942eb224 100644 --- a/sklearn/datasets/_samples_generator.py +++ b/sklearn/datasets/_samples_generator.py @@ -1119,6 +1119,17 @@ def make_friedman1(n_samples=100, n_features=10, *, noise=0.0, random_state=None .. [2] L. Breiman, "Bagging predictors", Machine Learning 24, pages 123-140, 1996. + + Examples + -------- + >>> from sklearn.datasets import make_friedman1 + >>> X, y = make_friedman1(random_state=42) + >>> X.shape + (100, 10) + >>> y.shape + (100,) + >>> list(y[:3]) + [16.8..., 5.8..., 9.4...] """ generator = check_random_state(random_state) @@ -1190,6 +1201,17 @@ def make_friedman2(n_samples=100, *, noise=0.0, random_state=None): .. [2] L. Breiman, "Bagging predictors", Machine Learning 24, pages 123-140, 1996. + + Examples + -------- + >>> from sklearn.datasets import make_friedman2 + >>> X, y = make_friedman2(random_state=42) + >>> X.shape + (100, 4) + >>> y.shape + (100,) + >>> list(y[:3]) + [1229.4..., 27.0..., 65.6...] """ generator = check_random_state(random_state) @@ -1263,6 +1285,17 @@ def make_friedman3(n_samples=100, *, noise=0.0, random_state=None): .. [2] L. Breiman, "Bagging predictors", Machine Learning 24, pages 123-140, 1996. + + Examples + -------- + >>> from sklearn.datasets import make_friedman3 + >>> X, y = make_friedman3(random_state=42) + >>> X.shape + (100, 4) + >>> y.shape + (100,) + >>> list(y[:3]) + [1.5..., 0.9..., 0.4...] 
""" generator = check_random_state(random_state) From b9a7a5e3d911c4e9bd24f1355db1aa59397a6f4b Mon Sep 17 00:00:00 2001 From: Alexis IMBERT <97242148+Alexis-IMBERT@users.noreply.github.com> Date: Tue, 16 Jan 2024 16:09:32 +0100 Subject: [PATCH 071/554] DOC Add doc link to SVC reference (#28073) Co-authored-by: Guillaume Lemaitre --- doc/conf.py | 3 ++ .../statistical_inference/model_selection.rst | 49 ++++++++++++++++--- examples/exercises/plot_cv_digits.py | 43 ---------------- sklearn/svm/_classes.py | 3 ++ 4 files changed, 49 insertions(+), 49 deletions(-) delete mode 100644 examples/exercises/plot_cv_digits.py diff --git a/doc/conf.py b/doc/conf.py index 20181c0a84769..c0846cb9ae29e 100644 --- a/doc/conf.py +++ b/doc/conf.py @@ -306,6 +306,9 @@ "auto_examples/decomposition/plot_pca_3d": ( "auto_examples/decomposition/plot_pca_iris" ), + "auto_examples/exercises/plot_cv_digits.py": ( + "auto_examples/model_selection/plot_nested_cross_validation_iris.py" + ), } html_context["redirects"] = redirects for old_link in redirects: diff --git a/doc/tutorial/statistical_inference/model_selection.rst b/doc/tutorial/statistical_inference/model_selection.rst index bf0290c9f7337..87423ef1c3925 100644 --- a/doc/tutorial/statistical_inference/model_selection.rst +++ b/doc/tutorial/statistical_inference/model_selection.rst @@ -185,15 +185,52 @@ scoring method. estimator with a linear kernel as a function of parameter ``C`` (use a logarithmic grid of points, from 1 to 10). - .. literalinclude:: ../../auto_examples/exercises/plot_cv_digits.py - :lines: 13-23 + :: - .. image:: /auto_examples/exercises/images/sphx_glr_plot_cv_digits_001.png - :target: ../../auto_examples/exercises/plot_cv_digits.html + >>> import numpy as np + >>> from sklearn import datasets, svm + >>> from sklearn.model_selection import cross_val_score + >>> X, y = datasets.load_digits(return_X_y=True) + >>> svc = svm.SVC(kernel="linear") + >>> C_s = np.logspace(-10, 0, 10) + >>> scores = list() + >>> scores_std = list() + + |details-start| + **Solution** + |details-split| + + .. 
plot:: + :context: close-figs :align: center - :scale: 90 - **Solution:** :ref:`sphx_glr_auto_examples_exercises_plot_cv_digits.py` + import numpy as np + from sklearn import datasets, svm + from sklearn.model_selection import cross_val_score + X, y = datasets.load_digits(return_X_y=True) + svc = svm.SVC(kernel="linear") + C_s = np.logspace(-10, 0, 10) + scores = list() + scores_std = list() + for C in C_s: + svc.C = C + this_scores = cross_val_score(svc, X, y, n_jobs=1) + scores.append(np.mean(this_scores)) + scores_std.append(np.std(this_scores)) + + import matplotlib.pyplot as plt + + plt.figure() + plt.semilogx(C_s, scores) + plt.semilogx(C_s, np.array(scores) + np.array(scores_std), "b--") + plt.semilogx(C_s, np.array(scores) - np.array(scores_std), "b--") + locs, labels = plt.yticks() + plt.yticks(locs, list(map(lambda x: "%g" % x, locs))) + plt.ylabel("CV score") + plt.xlabel("Parameter C") + plt.ylim(0, 1.1) + plt.show() + |details-end| Grid-search and cross-validated estimators ============================================ diff --git a/examples/exercises/plot_cv_digits.py b/examples/exercises/plot_cv_digits.py deleted file mode 100644 index ebad3a55098b5..0000000000000 --- a/examples/exercises/plot_cv_digits.py +++ /dev/null @@ -1,43 +0,0 @@ -""" -============================================= -Cross-validation on Digits Dataset Exercise -============================================= - -A tutorial exercise using Cross-validation with an SVM on the Digits dataset. - -This exercise is used in the :ref:`cv_generators_tut` part of the -:ref:`model_selection_tut` section of the :ref:`stat_learn_tut_index`. - -""" - -import numpy as np - -from sklearn import datasets, svm -from sklearn.model_selection import cross_val_score - -X, y = datasets.load_digits(return_X_y=True) - -svc = svm.SVC(kernel="linear") -C_s = np.logspace(-10, 0, 10) - -scores = list() -scores_std = list() -for C in C_s: - svc.C = C - this_scores = cross_val_score(svc, X, y, n_jobs=1) - scores.append(np.mean(this_scores)) - scores_std.append(np.std(this_scores)) - -# Do the plotting -import matplotlib.pyplot as plt - -plt.figure() -plt.semilogx(C_s, scores) -plt.semilogx(C_s, np.array(scores) + np.array(scores_std), "b--") -plt.semilogx(C_s, np.array(scores) - np.array(scores_std), "b--") -locs, labels = plt.yticks() -plt.yticks(locs, list(map(lambda x: "%g" % x, locs))) -plt.ylabel("CV score") -plt.xlabel("Parameter C") -plt.ylim(0, 1.1) -plt.show() diff --git a/sklearn/svm/_classes.py b/sklearn/svm/_classes.py index 37a1a4eb302d9..00854f47d9a84 100644 --- a/sklearn/svm/_classes.py +++ b/sklearn/svm/_classes.py @@ -640,6 +640,9 @@ class SVC(BaseSVC): other, see the corresponding section in the narrative documentation: :ref:`svm_kernels`. + To learn how to tune SVC's hyperparameters, see the following example: + :ref:`sphx_glr_auto_examples_model_selection_plot_nested_cross_validation_iris.py` + Read more in the :ref:`User Guide `. 
Parameters From 059de51458d9a7f6140ff822212b4760b0d1a2a9 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?J=C3=A9r=C3=B4me=20Dock=C3=A8s?= Date: Tue, 16 Jan 2024 17:44:58 +0100 Subject: [PATCH 072/554] API Forbid pd.NA in ColumnTransformer output unless transform output is configured as "pandas" (#27734) Co-authored-by: Guillaume Lemaitre --- doc/whats_new/v1.4.rst | 9 +++++ sklearn/compose/_column_transformer.py | 33 +++++++++++++++++ .../compose/tests/test_column_transformer.py | 35 +++++++++++++++++++ 3 files changed, 77 insertions(+) diff --git a/doc/whats_new/v1.4.rst b/doc/whats_new/v1.4.rst index b738c89d5b7d3..f79a15d7a15f0 100644 --- a/doc/whats_new/v1.4.rst +++ b/doc/whats_new/v1.4.rst @@ -365,6 +365,15 @@ Changelog with `np.int64` indices are not supported. :pr:`27240` by :user:`Yao Xiao `. +- |API| outputs that use pandas extension dtypes and contain `pd.NA` in + :class:`~compose.ColumnTransformer` now result in a `FutureWarning` and will + cause a `ValueError` in version 1.6, unless the output container has been + configured as "pandas" with `set_output(transform="pandas")`. Before, such + outputs resulted in numpy arrays of dtype `object` containing `pd.NA` which + could not be converted to numpy floats and caused errors when passed to other + scikit-learn estimators. + :pr:`27734` by :user:`Jérôme Dockès `. + :mod:`sklearn.covariance` ......................... diff --git a/sklearn/compose/_column_transformer.py b/sklearn/compose/_column_transformer.py index 6740bdf4e8993..ee1ee88635516 100644 --- a/sklearn/compose/_column_transformer.py +++ b/sklearn/compose/_column_transformer.py @@ -7,6 +7,7 @@ # Author: Andreas Mueller # Joris Van den Bossche # License: BSD +import warnings from collections import Counter from itertools import chain from numbers import Integral, Real @@ -682,6 +683,38 @@ def _validate_output(self, result): "The output of the '{0}' transformer should be 2D (numpy array, " "scipy sparse array, dataframe).".format(name) ) + if _get_output_config("transform", self)["dense"] == "pandas": + return + try: + import pandas as pd + except ImportError: + return + for Xs, name in zip(result, names): + if not _is_pandas_df(Xs): + continue + for col_name, dtype in Xs.dtypes.to_dict().items(): + if getattr(dtype, "na_value", None) is not pd.NA: + continue + if pd.NA not in Xs[col_name].values: + continue + class_name = self.__class__.__name__ + # TODO(1.6): replace warning with ValueError + warnings.warn( + ( + f"The output of the '{name}' transformer for column" + f" '{col_name}' has dtype {dtype} and uses pandas.NA to" + " represent null values. Storing this output in a numpy array" + " can cause errors in downstream scikit-learn estimators, and" + " inefficiencies. Starting with scikit-learn version 1.6, this" + " will raise a ValueError. To avoid this problem you can (i)" + " store the output in a pandas DataFrame by using" + f" {class_name}.set_output(transform='pandas') or (ii) modify" + f" the input data or the '{name}' transformer to avoid the" + " presence of pandas.NA (for example by using" + " pandas.DataFrame.astype)." 
+ ), + FutureWarning, + ) def _record_output_indices(self, Xs): """ diff --git a/sklearn/compose/tests/test_column_transformer.py b/sklearn/compose/tests/test_column_transformer.py index aa7dfe62fc1a8..fe417e8575e81 100644 --- a/sklearn/compose/tests/test_column_transformer.py +++ b/sklearn/compose/tests/test_column_transformer.py @@ -3,6 +3,7 @@ """ import pickle import re +import warnings import numpy as np import pytest @@ -2275,6 +2276,40 @@ def test_remainder_set_output(): assert isinstance(out, np.ndarray) +# TODO(1.6): replace the warning by a ValueError exception +def test_transform_pd_na(): + """Check behavior when a tranformer's output contains pandas.NA + + It should emit a warning unless the output config is set to 'pandas'. + """ + pd = pytest.importorskip("pandas") + if not hasattr(pd, "Float64Dtype"): + pytest.skip( + "The issue with pd.NA tested here does not happen in old versions that do" + " not have the extension dtypes" + ) + df = pd.DataFrame({"a": [1.5, None]}) + ct = make_column_transformer(("passthrough", ["a"])) + # No warning with non-extension dtypes and np.nan + with warnings.catch_warnings(): + warnings.simplefilter("error") + ct.fit_transform(df) + df = df.convert_dtypes() + # Error with extension dtype and pd.NA + with pytest.warns(FutureWarning, match=r"set_output\(transform='pandas'\)"): + ct.fit_transform(df) + # No warning when output is set to pandas + with warnings.catch_warnings(): + warnings.simplefilter("error") + ct.set_output(transform="pandas") + ct.fit_transform(df) + ct.set_output(transform="default") + # No warning when there are no pd.NA + with warnings.catch_warnings(): + warnings.simplefilter("error") + ct.fit_transform(df.fillna(-1.0)) + + def test_dataframe_different_dataframe_libraries(): """Check fitting and transforming on pandas and polars dataframes.""" pd = pytest.importorskip("pandas") From 4c0b9c4e1d8fc599eb3d631e5961128bb049af26 Mon Sep 17 00:00:00 2001 From: Guillaume Lemaitre Date: Wed, 17 Jan 2024 08:40:14 +0100 Subject: [PATCH 073/554] MAINT fix scipy-dev tests by passing 1d vector to unique (#28137) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Co-authored-by: Loïc Estève --- sklearn/datasets/tests/test_samples_generator.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/sklearn/datasets/tests/test_samples_generator.py b/sklearn/datasets/tests/test_samples_generator.py index 6c5d822163e63..9a9cc41d7229c 100644 --- a/sklearn/datasets/tests/test_samples_generator.py +++ b/sklearn/datasets/tests/test_samples_generator.py @@ -136,7 +136,7 @@ def test_make_classification_informative_features(): # Cluster by sign, viewed as strings to allow uniquing signs = np.sign(X) - signs = signs.view(dtype="|S{0}".format(signs.strides[0])) + signs = signs.view(dtype="|S{0}".format(signs.strides[0])).ravel() unique_signs, cluster_index = np.unique(signs, return_inverse=True) assert ( From 7e72bee19d52064b8b67195781e1c9927f4439ab Mon Sep 17 00:00:00 2001 From: Arturo Amor <86408019+ArturoAmorQ@users.noreply.github.com> Date: Wed, 17 Jan 2024 11:00:23 +0100 Subject: [PATCH 074/554] DOC Improve docs of permutation importance on the user guide (#27154) Co-authored-by: ArturoAmorQ Co-authored-by: Guillaume Lemaitre --- .../permuted_non_predictive_feature.png | Bin 0 -> 45027 bytes doc/images/permuted_predictive_feature.png | Bin 0 -> 43181 bytes doc/modules/permutation_importance.rst | 105 +++++++++++++----- .../inspection/plot_permutation_importance.py | 4 +- 4 files changed, 80 
insertions(+), 29 deletions(-)
 create mode 100644 doc/images/permuted_non_predictive_feature.png
 create mode 100644 doc/images/permuted_predictive_feature.png

diff --git a/doc/images/permuted_non_predictive_feature.png b/doc/images/permuted_non_predictive_feature.png
new file mode 100644
index 0000000000000000000000000000000000000000..3ba908cbfbe8393fdd83db83753c927ca731d53b
GIT binary patch
literal 45027

[base85-encoded binary data for the two new figures, doc/images/permuted_non_predictive_feature.png and doc/images/permuted_predictive_feature.png, elided]
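The snippet below is not part of the patch above; it is a minimal, self-contained sketch of the behaviour the two new figures illustrate: permuting a feature the model relies on degrades the held-out score, while permuting a pure-noise feature leaves it essentially unchanged. It uses only the public `sklearn.inspection.permutation_importance` API; the dataset construction is illustrative::

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.inspection import permutation_importance
    from sklearn.model_selection import train_test_split

    # Three informative features, plus one appended pure-noise column
    # that carries no signal about y.
    X, y = make_classification(
        n_samples=500, n_features=3, n_informative=3, n_redundant=0,
        random_state=0,
    )
    rng = np.random.RandomState(0)
    X = np.hstack([X, rng.normal(size=(X.shape[0], 1))])

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)

    # Each feature is shuffled n_repeats times; the importance is the
    # mean drop in test score. The noise column (feature 3) stays ~0.
    result = permutation_importance(clf, X_test, y_test, n_repeats=10,
                                    random_state=0)
    for i in range(X.shape[1]):
        print(f"feature {i}: {result.importances_mean[i]:.3f}"
              f" +/- {result.importances_std[i]:.3f}")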
zE~^lsE5prPG&k05m(;c?6n+qr@EpG;BqX%eUJKDgUH&o4pek*kzdx5|-PVJsdcUz8 zyI!1No%ftyoG6w z`Zk4H%laq(x*|Qv4 zIw1{Z6OSJ4k;%qF3t-i4jNO*kbILTP`TF}GqS$?Stm9#L<%;MfS68b9kEyi@qE<~D zG&k%!pWjw%Xly*Elc}p?n98?imewL9PUgCy&DY0!_U;v;WFc>l)&y_%SePhVX`(E)6kwWp#*ZszA7LMpCZIi{#6#}_$usso`-723zNd2?C;qs}QKk-TfO z7he_^CnFj>qOtS+)^86&tfkxWJ<34f>=Y0P@fz#v^A3|)pj-Ss=B>@Q#)9$8bGxJX zM+Li9`|tXCxb$1TUa+x{P&1s^w^=(8tEcI$~YT z9qx;cj*d1bD{WYQhU7W&F*`GN8Mjo~(7>==(xtl%g=%4TBBw=iY9j1~Lsu^zEge1m zt?=;h)@7}sHZ@sa+}BFciEwU8`Ou-gd)I6}a8?dky{EUg@||At@pT0kKF|X+>KYCr zkFgqy*|e^HmmpgQv>@_E3){12V4$Y#32Fr%N1C${vasgW5}P(|aY@M}1Rs5!j=nxS zu5qsTD(!j>xznfDlkDu41;( zTKS`Af9b{h_u7_;j<l!5`CA;?ARr}98Uw!A3 zNDhiz`q@HZO84)vPSX%GbRO)op0C~vJW5UF(-uGM^yB0Hv&C1k>z(N+r{kpqU%nKN zMT*HtFm!h>s+>({c(6=e<5BOrj}SDNJ|1bm`pdFZEKkv%@Xa$h3Sz z85+rcmA02%9|#Bt6sWv2$y0aL!#;bg`pJ*`hhRod=>k30gk8o%e5;5x1sCeb;g9EU z^M%Wpb6()WgE+~{nlHn*Ff(gqnN9i1Cu*jcaEDjCeY*}pqci^?X`ydR5I{X5Ht+He zT>--aI<RqpV}cp#;wbUfm~W15rtPk=0E@ zex919OwJcPecJkcg8Se>CfuJQ;-2B{bm(@eJ8^NhfsOgUoYFq*7+5r5Jsra@@LyZ* zI$58O&usM&EL)NLwA?4qXx6`dyN2R1^?Dxyt@Y6fmFa=9-GF+TKi6IYV40lzX@SVR zsn+oJZOyGU45n6AX(B=QMNcQlWH+Y`R(R9S{h94*DY|T{udiQTSxH4|Arf?wNQB58 z*IvV|{rk%17v<{x?FE%%u4iT-)k=HJshhnVMg>`#X;a&J{aaEE_}`SOJ{%>qV6EWqt534-GjsU!s=FTCkHaOe>8mMvQ--@1+p zv@?nb32A)4G~y}Pu{22Eki$K|%~k!fIgVTY%^d)dNmuhJ+xFbV$`2P#;c3&O;ZiET zGS*z|>fqz+%e8wq_5A}zmB_evV`D>bugU0^hCk=;8+?}#Mp6f;0O1FWDCtUQi-gYA zGf~|3B_o@cBYE_E|Gw7C%jZ8Ti@5WxQEa;jQdHga)siC$*5 z4Ga>s(oXAxA^-((@Y-*?6gR>Hh+O=_j=NI=tl!AQbYq`dd^c8k4mi*8XK&4}S&gFz ziE0!GA;)ex4jB)2gv2dvzM5Y@KIJ*}9bc9>k$vWcAJ9a{3x_cD0O*ZrWdF=CBOD|6 zoc@im_bmlq_V}x(eN?_kU{{%rXvHMu(V4ZBkNP{zH4(aLXv2EY}7VA{^?$V!y%OBD5akB+S z`km4pK#I-+2}VAnv7;keC&$>x>`YJxw@!a&;rX{)bu%>gag#=1KHw{cQ>1Ct1N+4DfsmcM`hW~9i*J}u&^)5(^_yXO}c_#S;Nxpx1VedoQv z8uP)CkvFCDgWH~4Hn7p$_%lD0pwdt;xPSlcm1RI5eL$6B&p(pWzkcnXD!#vA;)v}> zfs|QwQ*k>q1&-YmTeMEh!8MH)6*r$bbj2H$xa*khn0IW&ZR;rOJ(CT7(-tdnDeA=w zF=l$-EMT=!kO=hCj~B-WP?c&cD=VY)1Xv(xpqE^-QkU03U3r zbD~$U&+NdjKlfVkq`846Wry_Dht8ejUka-T)9~f%*9U;Q(^FICV;#>dV#FMDi)H)Y z|G9U`*||Mr%OQRAa#89$=bDpHHp)fizF@D9{t17ArS?Ja~J7u)n;|Psd4C zAi0oT2et*Ro;P+~=62y2@Wn#Mv~70zFlpM|y?b{KPc`q#m?4t8kB^V(pM_ua3a6Oy zNvWZKKZy{`;E~n-PcY4_cVc51L;G^gYv^&id$#{KeQfA;1;h`Ssn@SxOTODonVX+a zM6%ubuJehJR7t=h21%v{}vH6<8qtPgFr(NQqhg?KmQdM=L``5I0d9BzFXEE(|{57HjhPq|=^wv+= zXZ*pn-5xS2^^gEI^u>O)K+u;5$uB`=!3Dzn1 z=~H1B-GI=qFP*5VDi(jN*2^)jd2}M2lt$eCmID@Rvd+9%4}S3DoXR&O$?kyxUrx`t ziu9>tDi;CV3AuYx+bLSqeygPGc$N$8Q&43{q1$}q&0Y(i2-#XAGOZIWGzQKg-K0zg zWV%byms@Q64Si4wZ9QmjN1Xll>hPr$3euL=)rsPX(yAyS3zaB^K8@6I5u00m;^2FM zJ=i7rFPzx3Nx`aO&stykd}$};Q>T2hUk>=Mp<9%*J&4p|Csf%2dZ3K{RxPFG)r(xiOyq*l}fK&KM= zZ&I-n)dPV32uQJU!yPGDTl3&y`a@%=1kH^0Tr0-Q&3K}#EDQu?1-`-9*qE&2DhiN- zJg722GX@YWSoza7TM&WA@OpP|FV(hf+lXKQbfFoI-MOp)z-@-#MO6-A7otMp$(x`IXxxM~<|oUl#(I_o=_QXoTBcy?V7N7!eh>)^lDNww=bgB_7BLxwfAU zd?|Jfdlz2B?>A-f3+*Q1f?Gw0r}PB`OrkHwcu0(d4uN&7uBqYT;h~dHWcv2)8~YLW zf({G5vZV(5_FOKovjA;*=U#70aXW0Zq0Wh2_#EBGCq}_2AV+yv%+V7r{f-xRqC%tU zo0-Xv4xMS+w&2)Uj#nP@o4}&nLZclU7e{vI!CLySg7al+ZlBu#!IoY5yN2fZ#q+J+ z9k23Rr}5;hL_m!>fSDQ4mP+THyLWG*%bZGhx3W$BR|%TifG%S-H8sKr3YwH^7nVU4 z2m;Kg!jr>eijuf=Uc-cy*OH|TyFKlMYJ-}w@!q|A%|1ObT5=v3@Jk)AF*n}@7I76G z3Z1V!$z}is^wZc)JwO}c?lU}PJLcmUE?>9MjbQ85ke$OVZJ$qJ{{rMf!`YJk0X}L2N93>)? 
ziJhG;QN3O)zh8VkS{KEz9Y?x{hv@(TLO5l2ga}DrouqJHpV`R4!9heViA#elg(j4- zCJAR}=Yyazu`m??ZqJ`Tw{Cs37F1YP?mEOjo?nXrT0MzvoutsR4OnDIJH?aFkmltda$Bf^JI%q}(*2&i(|2%KjA{Z7z_oAaC zv+Lcptz^!)}o;Lf+!-~I%N_-M#Qe2 zX;a{81k^DgO`40j;|}ih=v=_-$xeP+?AG z{z=Io^t`f~S{=xgIJDo_3O21r{ivgg>4|3D!OyBX~6?CW}UtXOV z-uQc>D-1B2zeLvIn>4vNyq&~guIC?jl7OUK#}$!HS5ZhcEA#wAQV=8iK?(%{o_-~c zdEe=36bz<~8*k!Dd$4}`udWI;#UIIZY^*wtv5R>BeQz>T31sgV9H+!q3%PXTuG#hp>QqY`~1H-KWSG_dhwL*PkSJ=AC7r(r`i3+`2RFsXhT7W`3?-sx0z!pwNNhrd z=0Zt+{K1n?;CY$*FM4Pdhl*S$LP5Rup%s}MxNpL=WlP{8tEPv*8qi#80D8$5^;r5@ z=W)!$Qy{Z7$3zxH_9hT@wPIZ!-et=R3=mOXUK!i^{NmTg2OXS34NA-{m67hK%gNGT zi3gQVb7L=AJ4K(h4HV_%E)c#U@5<;j*36}|Vl zy*(EYQf9xjbnGfqu*xmg+Xmd?04WlnU)Q2I^ofUNo0P??Ard61%K;+h3V^K+Q3wz1 z(E&*LpsY-GdV1Ods9dO4|F5oqFs^@h|L)xiw3h6YE9ir3*)59T0Ehm!soIrQzx?)8rGXK1dr&C))WTna{LYN9DUK{X+dcRaX&v#_;giO z#uazi0qEV{zWem){wh<{;=VRbEv*MYUaG9B zGQf(kw9YsQJZ`uz+L=0j^*^LK{o-Ar+Is>9m6Tk_N{_$%zSR(i+?jAMF7EAxk6EwL zBI=($9s2Rds&ZtMViegoWBaeFidi?4)&X%;pN7=@6ter%$;nCiSq6*84e=mv4qdKs zshHT0*3j3_?XdJU^UMMV=q@<<_+uGWGL}R`D(lR6u`B<1@2VTn~+QTcy z@L7;rLsQdvSQAFW)nqgtm?T{zrSH3BYDOO@{*LGNnRil8xji143kV502ZD><1{gt$ zM{od7N&BAfCMVXzH7 z>p|0U%AS4u`cNg8Ja+8ZG5n?QkZi*3+v~gzpBb<8Wq5h{2b;{|=pkvZ<=C!$Afl|% zW&@3;w%Gq%g3f;nTZy{9(-pK}HMBmE9BwjyuoWQk%UfHg>oznA0v+H&drXRpijq_! z`D!{$QcjuFB^^kYj2E--@IkfF&NkHHD*KntUwUj|V-twt{8#8Ox3IVy9UXLOsEO6~ zNKa1>v4|9wmR?`v6Ol^S%h|zhG2Hn=FyTC?BBWU!jzdNNtXxB)RKVjSR|kS=v#r~I z1PQ+loKKnyUwN%8J32b5UuZf9jdRBl_sJZ1N;HtwOYHbtHU!zQw9(g!*jsDG56a54 zwzd7l`uq6%_aUXC)(H->WeiPBgn?L0SrUavf{vGd{x)bmFe`Z$bwOvJq@|_FGi;)z zrJWi1z_7A3y`Zg!A-ZesODxb9U;g=E0_VuMI^9uI(2PFtxW|w=*ZQ7TngI zn3!PP2a8nG=(mz>lu00H{znZHrfeaHie%2842mAIQQotOcl&$ttz$(*$e5XNfry-O z_V|She6rUE-mrSiesvW`Xkja-95kt*s%~fqmURq{(F2#Rou#jlVV{wloLngiq`U!C zh?ZgM659$|4dM-fqy=28Y~zO5(YSgYygx-wRW+nzDK#yv>hos<@BIyJk4`WXDIOwI zE%GsJHQ?#XY2!&c$8(1|e7Ev-!gF{LB3kJ^hoZ-GT2pf@G&XB*e+Q^UyddvsLYL-s z>e||vf$ivtvO;s`(|8f=0){zC->Q`&Z`JujNTHN>cAnuaF@{qJOyh6 z$II7a8&f}A+J+AxdizRg?Bl6wTwkICjgT=(3aj396^a({s)76eFS){|I}B@8y7Zn^ z$8r|T*CZgyJ)Wj?$>URXThJtDe@eEG-599IV^aD`a{+-9+*tOUP~kwP-Dsq*F8h-Fmj^7OAg>& z5xoy^?m@SybHp#RXHyTvKLQ3KpJ)!0$!q=d>hVdddC80)(H>_=!wp>hI992%RA`Fd z56Qd=*hQoGihZefFW&bPhBSC0WapL_Cr(DV8jTBH-uHBk)QKNKrJ=`k3ciZ!)a9#k zE8y7*xsIQWOyJeu$a}#4lUBQfT1=P0vc>Vub(KH+>RKSZi`aLtpvk-C?@tRB?M_rw z`qSO7WZ09Wxr!8ovaj-_mZaIZxeqmMFR=T(3B{>~5W!<(>yV{MfXXgSDrD3?(9Je9 z0~y69yE|toBI5av7SSf_sUsu1nz9V?X%Jb*aSK8h-t!<%D?r(k%6(}lw6;OxSX=P5 zS~j%1j}&g+yeSTbZ0q3*+QVVpAa|}VFG+YVj*0o6zINioiIlr<NvT1DeF~RArdqgc$a^aI5|1z=H|K)j3kwM{df*UJ_j#Q9@I*rwn!-azQ~F;uP*>km(D+pV=r*=%i=0Q?eL_MQeTz?_fmu!Q6xnH% zmG;QRNx%+GD{M$a)VSKDX>OiW+rXpjne`3S1rb)5Ci7%WUlX_zQ1nDbVt)jMeBagisl6-LM0DS$o$yDdr_az1;?+v!DY+jL z)4=&^D*DoLv@@;-rB<77UO<(2LSWH>A``9FP+wpE)Uu%)yxi=BbzAB>tMgD% z@G{H2&CowU9yG=u*#;!9Vc13FqR4mLr=q*j42fSFWaKmj7+4DtOEp0zj`LRG!Kv^c zw|s*{MZTI$$;FiI&)4SD=`fk{h%7LOl?gp|dA{|0Cw;>NJd{7t=JS@^fAAnMBqU_l z{kyo@4g3uK_I2=J^?+pMWGK>CJAo$-&Qm_asJQ6*dLGVXP~zR_b~j`fh;(Sj_&GL|Eln{XXU}h(SN?odD0{R9iQF*$u_Q_4I>!APdd3++hoz`L#DOLW?duL+e}3Js zbXZ&eW0BVS%9m$0Z8VgK<4h4<*&@HiwYVol_M4P)AcN|taEjXL2j1HT$c;KyU02tK zLTdfB=rHz}BXX)8oCatH6hKC|$A>?SD9e zN2XHVKF?^&w^FnF@^T|J?fPoq<(71f-5^*=3%X70Vt>L54R%l+jZs3>AUuf{O%yVu zd(+QPEthiZNiayjXn?jvKi71Xc&LJ@>CP+TUoxH^DOuWMgB1egBbQus`G+60I&<*W z0YO2PRteYrAn`9j)PoyBb3}^;F^x+w#m^^aW_-%Z%HGKAAgmfh3mb6`aNnus6J%AIskJPz8hp^zW2rwi|E|gi>^s*efDU_Y-rmn88CTr}G{|yb8DSk}M zT+w0l{z+g(BZtSY|)V12*ZSEeXHWDuKXsse0>u>Ez>(u#ub z`aRsT+cO>)_7wH<5VQ!$(4XOu1mCx)y}sqFqxkpA^oGo$f%L7%XTU>d0oOIYejNyP zKo|Jy`K6)LDeu04!Mi-%e3P_dSfhOCNk0`@*f^$U(3qQno-)T(TF58fL3VhGhUh7z 
z>ZxgKNojtI5Tin8-&Lj+6&3W~!L?XY+ED~_F4|Tk-uGJgzK@NGsT}GA%T)G-vV-?t za5Aj-fBb#9=+e;o;D7*Ae4EIq=lk!Mc})9$A%#MHtRdq^DX0Q8jYS;SO4mB4V}>7nuVU8_|{RKl?Jb z9sLse_N*DV%T>}14c=*L34D(-7f#7sUx>I4c-UB5>n9;4m5;1(Wq!&s<>@~36xV-! z&};ngA<~H>P5s*vOmbN;q~IH$!ZG+1uG2`%R_sXw^nVW^4pHX1ZWMKOBHQ(f86&d#qrPHQjYLV19l+5BVn- z+AIH90>rU=>?mP8W*RvzhyMowZR}3pbO*U)UvNyqXHuJ$uXmngK_`A?skgT%E1H9{Z8U0+tFW1q=gmnGvEvZW$r zCxoAy08~970AG5{KP4V*n}_Y%B1YAzOK_m91y_vALF(Zz@f!ak zp?K^Vw$mzTj--GcIQ#OU_kJ$tdwxw6ApscMFaP+M2wN)scQ~kP5b?B$>edfqZ6*K^ zluw+vnUKIqc+lZtYXz%}>OGs*VtcIKIZL2NYni#_cHK?)m;j2l{_;`?{+^ASH`BuOe=L}d z3+ldAQ!FuFxjZwl+^Q4(<|U{|Q*10kj0)`U;#`j`t^n-^YX151FE&;{20{?%X-By&s!D3tu4V8hm%ba;)O6o40S@)^xvk@c`NaI3t7O_Su(-0YDSC1Uw-9 zZSd_88dF9bjzgA04ke|!U9-mk)QAhkw*VY69EFU?lqnc)k(AHNuuE=#G>pecct&p3 zs2jK|o)THlrK?aAh}8z^k(vTLPK~Rmed*Mf3$gn4^r`pBgKm~*(K!8Fn!hyKRhFRE z@cMNH^n{Z{q2|=tL;$q+KVpplY)#Sy$dcD$UVg!^UzO%(L9sc z1$2O@6~`-DX5n871BPht?5sktfeEdH86o0cTgu7J02$>FJP7Q-=L9u6KQ(ey1__sp^#W+Rsk}!X6(bj=de^6%b@Q{A5YnXS z5fv3>7rV&OwZjuu1!D04o&dE;3|BiRMlJvz5#ej!zI`4uElS`Sqbr|cztV;F5t-Bj zMZxjPmBAbVgkc{_9pME;yzuPlkDeHZe~5;Xa+!@=Ol&i7bIVmv3d%j%q{QI$VTg`v zM8htr{m}{I1{J%lqIT@WPAp>6syMJnoHvS!t4++PDDbg>bg)3WHLN?FM1oNqF(P8! zOab}HpYYu3-41w2li}LolVyhf1OSnUKg2fxnVSbTfs zMhAT2vzD3ow^J-rQ&NaR$LTS-YOwKvH=cek?D8L8{+WfBjWn9X4iCnSj09jOXF40B z!2n1iAOwIx5t{+z8oaS}@ABasAbxF76JXS6Qq>YdN#;e9vLuZ*{OErmrpNriebTw! zjg0gqdlg%1>p<~AU@#KTH(tqKLE}mU4)7qWcjkDV0>X3}>o^QbndZ`9BVn}2$kWdS zBz{7TrIbN>^PP{E^$JCrd7ov#3=0y8#ua1G!E6HhV5MkG;ff;|`X)6k;hF&nA$4yg za22iqUsNS7;4EHFqaqi=r$NDPF7xuDl*8HsdFt%w^?BIRC~{yT4@a+H$B;Q0%%#8- zA%tz`+b2@m z{j7+KNXNMNv!xz|`fhvI9Mt3nVJ%D<#bk{gEKQByzQsdg2H}hAGDAj&fqn2_qO-wk z9LR=|at5x<%BkPrG$Rg}7<)p|AP~aR((^S%U+q?X?Wwx2># z=mWe;FuZf##^yD6V%EXnz(Dw?i5(;j?oDE+KW}gUHeTlX8+PKyGb5H2FuIS8hR{ZA zpIc!da>xiGz-R&J~-r)>H09X*QO>wosrC){_q@9pi~>iNVVpBq*c z3s`}N+71}Mst4ix<3r}=1U3X6B^Rt5|A3lj$Y7U42PW1`*RNj>H*9vjTiIMMgDBF$ zb7a5g7e4TP1l>V%p??9MPa0LLYvN;QekO{|L+X$JzIDkn>}Gz+KQ@L1Ij13aRoFK>C3RM;?x%V!_`2y zK}<-WIt1aaCru)>3WG9t!o$7M(@-doq9GzDHj6xZh<6hakbZmQ*ne}p)2CL=^x$qu zH6^6*L?`?aJw(O~38^N87#{7e`_nB-oMy0;sd&xjd|Y({R$BCs+6VeTx%b3epvbAH z&;teBfSI|frp8w-MKLn0w+56pRYR8my*7TE8qMw)4$|H<~Mtr<@ z8Za16{vJE?{;>`N=+D76#GyYjtvLo6=%*P#39eqv^XRow|J|-LI{v5`|AIv2#_?W3J9)G`$8D=Z_b$)$aD?Mg((XBV+n&$~a3p|S3? 
z1kXz3JoPfq>%Oy(b+dzz;HfA?f5rlk29~&43%0HcWJ=mW($q1bp>~WghCt!&f%Ysg zwZ)h;?C7u#!pcS-3PnBb^!AV)N4CN@bUj0sy78<=N+5+oBm(jv;iNn6;!@DTzY{DK zZ0Hv{&o$(X5N8DP5uqPY4uBx&05OR76Sc-ym0N2r*4Nkf%>dA>sm|S@6x1=kuCi;h zPc(P(1z{eM#2c>F3|1n*;tiPnA&=IsW0phrL$0 z6dn7N!)|Ju8$Q*7nkH{3uKA<(J_zz!ERccIrrCZgq*?w%fvQpkT+@j_U`>I(gl{2$g+Nn zTW(>L!FLvL(f#*mAaX$u0#i$wEQmapGvb}ntwc*T6>UaLq|oU7Rk{BTYQm6fPKJgt z_5trWc1U8xKDX)h=%${{qbt$<{7S)duj=WqMhtEbpTW<4wZ64=SJ%bMi>a_&x^KBS zi~L)OryhdJYYK^-zKUhD|I*^?u2%}%-P-(t`%$ z78(T0R^Jg*Ke5cPc3x?o`UXPgp?OV^M+2Rv@cXntY{rRK@Vt_ei_6uMGQm#Zic}G@ ze8ZA5OGbXo?hohe{!%)Zl@evjgk5qb#H1_~ zUH6m?^eS!kRck3LuL5sX859>5CL-qbi(GiS&P%`vwBH_DSUO4%1b~b?6X06_>z38s zo9;?Rum3^czwRsK#xpR|m?4CG<3)@>WWl67WCDddXyx<(yG4}bWj+ee+5K<3s+GoZ z+i*y-R+~E|c1dt4UL%F(dr(GADjGI(ARv0LslID~>_E)6iIK0+7IXsykRb>!v7iV> zw4AS^qM~*Ut=+Ie34zty*XPtzxrR*k>FVkdi4Kj54_M?xESNP+OzBkCxu4Qw4f zze&>=5~65mxE0VZWk%p1(AWq;#pXqp7L@i}VuxRa_^O34ly^bwbifXEApAXg_sq4HrH(sK6{`>z1Ym1@ z8mH71M$+`)p%Vkx5^O=rs;Y{xi6Dcgwp`h`j@`6c#(eeEh98Dex2lTXfbR`Kb`%1e zHaa$D0C4=5QNwV}oEz+PuSY)Ryn`)adS>PvUONvG5!L`Otn>5#(f1=hYB&b8-9p_1 zT!QYC8MjYD;=w9Y3>SCbx3RTtARb@C7Z=~6ebdKWx<1AcWf?)H*?)WW;Mv8m(cs2~ z(60{m^;JQRHBeCrs+^j@dtkC`1muWP-eZm=WeGM#J4)$8?h}}Z;nyZ)8yN&dkNlVZ zyEu(lC&tPA=g)Pw_WmODABuM4*ROG~mIz@T(O*d2_S<}ntEy;_(^>cH?Cvgrzw;Sq zmr=s>laJpf=4ar$EYyNgWL)~`djHFj8`5GDQn60ciQ|IY+{bN-ryaTgMhxIni~{{Y z*f&68tYF^WllKn7X)*a3xIYr0AQw&UL9k+jqoa*bj@psajjeaA<%&*hR|MP$cOUWN zD7(HWY763n-=tKUc(P#v6p)dLht9vbNIGeIsL*y3iM|5Ju>KC6E3W zmgxV3X^Q$&We;a3q?pxU+#ETqhDTvHet;o&bhU7Fo=|%vc*7jtWA@`t7U7@uqKRnp zJMas>c4h(&3)qK}67$Wpc|xK3#uJ7FToaR{XQrrdz+kci=K7>&1qPSA}u*!wR57ol;y z^N<*F4M80Mc?p9k#Q--w7=ySVdmpn&2r%9ya`uNGo+y*NYD|l6G}L%T3fy|PG*&>C-#nhB|X*VkL?=pXoF;GKNJKn(}3>D?}Cm4ji;ki=Y z^N;yZB$ERAkeft$&CXsI#|l$B$5W)Tq(x{*s9*Vp+|jCfaj;S zYLdAs{0)24o)Wo{9|A8w2djVTTEj+FWU4^NRU7oi-n?0*4dC!L`eZEJSc!$XIq`!N z%n17k@sNQB351Yq0V+sRR^k!=Otp4Po~`?DBJKqtZ6E2J~ z_e$<-9khPy#vj|>BtK?`afCph5Xm(}ohW{dQ8V+~q>e3zer6T~1B z7L|)F_-6$zrzvDh-$6?pG?T8Ku?aKBeA?+ZkjbYfV51|RB0tP9gKs8ub`M?$dVKvF zTsQ7W5u6%~#Xljb^&&LHul*LlAr#bbE}}*1LxZ?9)ZZy=)4JznQBfsGK!8X^gjEA(QgNd>34%BpbK@~SlhC@&-9%3I6 z_pJNO@G97r!N>8xT|p;E6j{PhfxHUt|Nm~%-pKw1WOakO2{0z z3NleaYzdgJFNZ8igSBsr5!--GHr(-ypFDQ7=F>B!QP z7oC~IL2<2nbvGr*`<4_}wSLq|w?`9*SvIH=H;Oju^v7|mJDl_e4O$Dvg>DbMYiggb}s54t0tMz!W)XLV3syzLppcvjDwD#suKFBR-iB%rD}o5RI+3 z7SX=}BUQu9iURmL2mYH9@^5l+XhMjuilD`^F`waR-x~^N7nv9hFYQ=E+ZkCFYRjoP zns}A>C9`~pZ()Se!hsioHi?67HRl@RIDWj(NdB;d&KrZ>M!I~3aGG6ZP?xvu#z{GIW671C#Y6Oc?=H3KL@5f+!&mmOB zo0U}oP~L99JOUAeR%QN`%lAz<>x575Nmi~sIi2T)J+WP*Q-JTq2V>PsIA$VxR8+oz zjdkQ^&5;Yqx4e5l=(s9r1PVT__m-m2kNYrr3#UyfCk%)R?RQDbhzcnTPlkQ1()Om@ zjay-T?l39I{sC`GAzo3i@hJxr)OmdPpx=-qM@U%7F3(Yu=)?>Tk&|&q57Fu;fBmWf z5?Bot9?B9(eZF5uo<4or3yw+a0eARok*_X$!Diot#*TeW%fP_kJKKK>Czky12b(|I z@r=Q1=_f5369R!fAX(NJ3v!Jp`no?%Nua)%({QaMNzLf}y5Fs@UwccxtXugVCSOz> zcx1CLllekphIFe{;VD&Go#>zwZB4DuK2*w zkM2Km5NDFH;ZcD6oPU+xJ2rM37}Ffm8ZiQRLMz6RVI{9Qt$)lH%RQ_rKUopSWG?Vu z*GY}*d>tVv@f;ly8ASuB(){o!HE}!PngNAcJuX<_KBT4%=PB)^2{`U_#7& z%BN2s$9*FkB)v~qOF=Y+;LbQ;TdBnKpn9h6e(CEAEVLVTD55KFMw`mWDZ>FWjo2!1 zVv-~5$8c`{5GeEE$;-WAe^txVa40yf`!t=y83mIHg+9YRYqt#6(2} zpm*qvaVV_-f2skCDH%P&_fS*dAtCcID}QD=;jhEd24o`a=svf-imW%)=@L>-{(3=0 zHjRZD_rTq64ff_U;<~x{!4KTfzHw~dem6dTJut5CEJFQGM8s+gI0q_ zv3F0IpZxaKmvb=pM0{3|!Ost8vHSkBqsnJk%WS~GKxxqr@HP46fZ*+TJ*uhGq2r+< zN%a5r0E2LAMyhM4(;E7|>;npTRN8ba=DKmXwAN_9G9B-B;lR~m+YZxEv}N3=Zh0sS zW5iHMM*AZ3W0CJSd~fjoGOKBLd}~9|-fHcr!Xq1!e_BGV{$@JT}IGpl7Tk_Lj{VK~u*<3Z;CG z)hju+VO_XD+b`pb%*jeE?^pkVtj zcD=jP)5&epi$A;SDudp*m_JQ#J%_WblE^VpXqI|N;2~QIfdY-39_;TQwktVs-~btO 
z|2##vYAV`6wZn;3wdUFs0g8Mg(>pQ98k?@VZ!m!G6t@ZwOC{z~U|0k%%Y;4?V&oNy zn`;ufvH(IOVxA`*C%sTei6Bq_3yA@{TjcZ@fTtzl6ir?-gGBN)&ho*qnh3j=Krkz#Z3YvzwxvMx8ML~eA&wGPR zIWQRO$9%ZKwV#~bf&)eH81?GY_(FsTO$TuUh>(B`kUI0TK-#xK3YLHSwgXz=+-Uy0 zJ2>X2Ot6gqIm8JoNd7d4_Ng8WUtWL1R|IuZW|vXwe!IPbkF^hgLyTj>eH(E2m|V{x$QnK=DSL4@-k=A~ zK=qLE<`09TXjPTUZEJ;=tWmRY=y8lQS%D?D(pHk|{ zdDge^b9=u1p5J!c)|y>>-Z8HlC$fKCYk^3Kz~Ah2XX^fk^#(cf!e6<5Ji||K${^t` zS?{D{u4Au4bQ2|uKm(5=AV*J}eun)_O##@ZMsz~{*}`#O?7}IiwxpzELlb@zm7kPk zA_h~)>>On2v@2hMsK~(vL@82KRHRVISzEJ4XU|e7&>WV!*J1oL>OayG2JI#jB+%R7 z%)KCRdjiI<*g?mWQ9#N=j3c}RCWFy2QB;sXTAU;B?=c>0;3CGH#C@+70QT-QYAULO zUdtXHb3Oi7F;|T+nVTh!bnu6(C^$YM7}oz@((qiljFU29x2(mCPJ>r1+Gv=hsl`M4 zrWbHT;zP_HsLV^_YRD6T3=LL*QjXOohoaeid$k3_CFMXl#4U}X2v&+ZPCaYWV~k2NRb;8@%Vmk0dMFP%l4LHdGqzO3zZ>HU0kwslk@xTco z8-Y2GVkna3xh%xxu4Mp-t!RG~!1lmk(3BRF|8IYD^y|(7j;Z4uMZSg=b3qp#Y4zh(F?Ih&pV`Mvg;-m=Cgx7%#vF zli^^3-^irMQQ;{u?>_O<83r76PB=5QrfCS86FK<Me&#wJ<3_+_=~ZWX?q*3_x;u;fEjrn-ol#G|U!iq<|*-zbJbTa4y@w5BM^&GD8tU zMpnwm-h1znWMorhiwI?B?=5>pqU=yM*_C8xCz6pUdcT+M|9GC~J>KJZJ9Ol}4}RA? z&-45JtP{=ywE)kGU6DM7e+O0X-xbBBo?=o-aa?0#t|Z_s(;uv`YeXu&5gbB52VC7%X+KU1&jF91g4s zXlKzegK~XgWK0YRy=XKHn%)L|GTdEo#}J~nN$`l6gRXaV{24vW0;uqhqR|g^5Q7I9 zYGqV-2ZROsVSfeu0NbHLX7pE~S-$=IDZBbqt3N%eNCAx91!x(VzXQI-=xf274hEY5 zkZoIlLIi7{sw)IISfn!3u@CtH5?8!A`VdgnMn zsOTLOP=Mk41Q>ffVq#Q;3_U>StoPw^#$B{E1_3d^k~mNeUD@930^u`i8Ab<13Ggx~ zgA*Cqe0&r2tMl8Rfd%Cepzh|7j*A3`rJDJCeAfQoD&>IZ!%ZHBTer8=@+t@ffSX zWSCTAi$b0%k(*b}{G}(97|~c-mM0FY&N2}f``>m<*KQyPMw6$ER^}ZkrZ_IjL;kcJmUexL6cgzQ)R8DyltA}&S3?tBhNJ~6{B(N z|6jm&A~5wZ+k!7#dP7Vxniiwm1=_bljqw2&TBx3i#;ng+xC;?~kPfUOCcqXN9BzP) zO@{(~YUjm=L#c+ZrsQN9vcyaJW39AMy-qY&SK+Jago0HsCMJy5lyAzGl}02HbD(*~*> zMGN}*dB`@w2899i|L5-7{!LleQFAX=K9weycdUN;G+vO*z%nHsL(jOL#&k2phE(ia zABMY!M(qM>h=~AX^c?6y)KZ8d1W-qzaS8weL-Z^yiaJ33@*LETiC4@nKozV|l#UxA z0nrg4>wE((6UxZ`hDf-dFde-ZTmX^`CB7kpEr2HkCgmp(1cmbBz-J{b=fGSK4NElO z)Neq@L2^N?4fvH80917;B&UTJaGM~=afX02!&w?3MId4bWS0bTRT2mU9YRa}h6I|~ zeUrJD%x3Srqf%41ZeMOEp-iWqcB00xzmY!b`+EEtC1=h*H5iB*AQhSjFgd8AE<%$A z0iqE*kNHw<$qDzb&MM-YI79zv;ZKTTK# z5@mvCDWGS7Nsw0DhXU{q-E?SVexXo2U`&uB^pk**=Puw1b;b$^@{st#aIgHS{B|l= zr`h>pkEjy>VNt?A+E_gu6XBqf#OZfkBFxOU7Mx zEa9f2E@!mT0pGwm&MTOK$kVpo38_z9jMN2w9j!djPWAt*I6{eyzlkAeQ-(I(P+017 zd%+z-TQKOKf4~;0a+re+dPznx2emrQnHWW3m9(RJ;9>DgtAzkQ&K+AI8%9I>LD6J4zF`3U;DfC~bR)q%+s7lEeL z^6SUv5KxGN3P1n}heF+S$<^=wwP;~~OAeY67`v#Rt^=KxQJ)on!vs61mj?oyWZ|_L zfkFz+c0m<$b=^>JqS|Vh;ZOtGMPXrT5Tb;%1<^+<;<0D+xuX2LTif%8blQ<&p#1`g zKbl1jgfNKmKf@b^`t#mDuz~ImYOZ)4A2g-agaqmNH$}?WC&(|vSoDTA=2n{An;3JW zrQgCewBnjD2{t+IC`1x|e5t#VKS|vW?fMB$s-N6y9VGx1x?4P~-A>mWfI)ZP4(g50 zE(QuIG5d2DYra!<9q1iVrUZ3Kz#QQW5+xLR27n6%y${mR9vSq8%cw|-06aT|N!Bt- z%J|r%uV~&D%$aj6%P^c}S{;@?st8yxUs~(cByZ3EDA&goh*VYkNIK^2qJW_7yQCs` z>2HBo5Y6^NoZm%YTQ|U`hAqo&G}n{T<#le zI#TT>okK$&(P#a17E$TTduiqW>P%Y~9& z_pcP&y;9?-3)h~tzXAP4R}6j3mDb;9ptAum58WdG#-sV_Q;eS5@%yXR^OBj0S8C=M zs`H9`g6vAv+-sA1V?7RdB`bD1ZLQ}`W3mxs_F6gP)dXocV9_PP%S3e<%i?^zs~@$m zi-@j!_~}r$17)Aq(9$P$I@G+?g;nI#dzGxLePOt*ToPSdKfpbhPJxAD!p~!S+h$spiuX29{#;b; zfCph;GZISJ#U#R1cVm{cpi-C?)0}-e6SaIXPAOvVV9HnW0fuGC?zW_!&jCI5DUDs% zwSwT)@ZTSk==C>bRm9{L3FCS;d9ui(&?2axqwQo2iE0SG_1!+{@n@rVB-* z~AF;-E zJ+bY$B!VBQ@i9+d)GIm?u)!HW{13LkcJI}V6L4W-)-hM8_`Y?^D6{RYTiQ>_hYhaz zGEiP;Jg)O+Q0k?5vpuqYvp%s#9`O7Zd06qJ`FVA5NiVE&SI7!L{q!_~V6W}-g_mT1 zCM5~hHd&hcEjX5HpLYnm$tJPVh5g^I6xV`1-?Pc);et6GMUUosI#1+N=Bnq?OJl{m z)pvw-@i#l}X;G`}K3qTo=Z@UjNA?=P#&EeKr(X|)TL6}SK3p-i1rV3D47lgSPv z>0`2bx8j*(GiR$pN6|mH$m++<`&PyT`h&I(YQP3BNLjOuoKrn_4%_UNMXX24d$*cu z(iOs7H4@=U-Lq$_i0%?hZWSSWeC+y%_qGLj%=oL%Fh-ORve=YIW@l(;j*&Eu6)<7u 
zl1iwioEM?7CVn}|@`*PjE(tocd&B-fWBe+%CJkeACToAfo0!UHlAV#J)8L4nE?5OD zMqk+1sj5BV;v@I^^s+t6bAzG(?K>z>m^d-6PW7npQZw5Q$+-UDdJ-CH)(;rnGutau z>l3G|xf~uyzsu)MTs3yOalUf=Y4G~B+l+#j2ZkruCB#!E8?7i@1_?oroyzaCT}N#} zC|viAVdY0wb7<~bps!US0da&O!8D!xG7n)n{wXhu>v|z)UM2LlRae7Ti5Q2x4gCsX zK>EsBnA*kH0}4?XNRnoD01SvWVGf#-X>NAz^Lz269J1lwoa-w#@Gd^yvl0OyLq~@F-J1And}MttajmCj|zp7l98-68;_Uu(NrW12QBo}9O_h63L>MQ%7xJ; zWj*Ta@h9Bf>U%}7iO(bf7NIkjwOkwL)zt@&%A( zTS&~joJ`UrA7h5s!+q7f^lAT1MFXJFB7*z0@^F*6x$#;&-w#i))x5nnNR6qV?k>*| zFc3<=JqAHtO(r_D*4}z7GTU(fQ|Uwmv7}ry3?aFK2{R547$t~$3?e0G=pXFzmkovL zdN7A1P|I$P7FfE2k);Oeq{$$3L#q6X{e5pC?je7JVP zQ=WA4mzfYG5SOHFbYui>Ju5ugj-TjbC32{{%?inqCM|C@6D*#pK?LI(XOw3X=(G$|gMcZi-SI~PqBmL;YYVlO ze3!eX4&Bf-ViEl`!$?+x(FST{;B#g}r>7rVlVWT%zTCy15Hs>;AcKK4oG^Dx zbicZEn6pYGQ=CrVpsM+wcRxDAw~vnzuoaV9h9@_@Yz;FpW`{l5ytz8@2 zL$F+U(Il<(1P|w+lz*v>yZxzTh2FGQ=mbToIk&m@^^{R^+RXC=mG3^Y63sJpB(W+U zE%)4rfiGW3ku>N)Wg#r*vg{Qforuv9sbDX5Fe%fVs62m9(vBp*5CRWPN%*5YFSvy z{_>*}4my4S7DCaXI^!2XTzoaFB&4rmiwf6X#i#&ves;ek3TCakLzuO$PvzH6)E-^_ zsNe^yH_Ai7O?Fse3QA2LWs_B&V2DYoB$f%@9gR$KcQW9Q*htS0WN;7WY4+O<%VAx< z3?TF6BUyx16|ZN(jYKl4SKtX5 z#j(}ij)sv-p?fUTn%#OD+vesi%$YiE|EJ3W5~0nUVXW^JvQ8H4SAF!H?Y>z9hoAGb zIW7*SS>G84E}u;2=E-`}rxcGD#4U!!u%a7T+lvv~D0Pw6gEf=sl6P{Zmi#Ku#*2T13H@*$6J6CA;>_;iP zV%>fbOR9%8aF!o6z$}}JCn7j@zSDCLP63a-$;1N zgu)6$S(hQtKtO{_p-}gd_S!pT=U}m{EK->;mnWbf*~=<=@q(Y<;T;~JgxzR|4N)YZ zb$AU;`mhmGSJVi++?;d`VJW z#mv5LG6~kIXcOy~UCZxnRdAxs4jbuSJXzwiH8F!>N0c?ulU2+;l^qJeK~A+v$Lp+U)bpL6uX5y9f}7vJx2Jh5ys z>X=KKAzYUfSNbn~Lc7|ssqwpd8X2@Fi#s)RnXC!9&=#V!H7KNl!L*E$#Xxz7e;NXG z72wynKRZ8uaCg#@%jRA3&%iIGc(%XMzV7!6DSbEGz@ttv6&20i!3xj9w0S^VTsb z1aXL5YNTa*(6sAN(tO*Jj|G(f^|EJb@7n_zLxvj8Vm-m02C#0~Q6#7iwbw+%wa7GRd~f5 zL@M6zpQmwYqY!Pzpq0l*&`lZOHly_c8(Em#)@*YxOhqsmMjIXFqIA@&q2fwNgxj*xx{5BqzmQS-7=|l*K`S^T!9m&u zu%kz^u&(7v5tC?NIPGNk7mP`d#`*dQKQLQ!D=Ol`FEY(B7Xqt{#cfHox7CWR4nCLY5X8uHCwTv^oB~>ippY&LA1o za^V=3h1^2}eKHN|&tEy8KN*07n%^etNKQ5+0)H(a>TiaIG>g-if)d3~UVTzr#5qR{ zACr$zjZ5&{mEHxE4o{y!ySl7l4`Y0jBcL z=v0o?KWG3?)L>D4dpm<}_koW^<4nyHmqO(qm2?FV9fLjvU^dn->lP^`Ck5jk;UY-0 z{|$xLU8HYc_($+?jKRdsV+d6|;j=$`{~4 z+rT1>3jXc9QN{xho4g&uGcsWkMna-vt7xh4Ho411!^EG_(%kX%^+_4?73_VHHRopu z#DQ7lIi}}dRPw8XlLH9nHgG@Q%6gXN3wAnGVv}VPO`6|@uG~Q$S`6Z29df;j24%j7 ziM^g0A2Kot&GRg0%rq26c2N1@g>QY%uhmHOqMIM~#Tlx+WX}*DZSwgkVS&9lWxsldH1Fx$dzmuQGcttC{;iNw%OtyX{O04N@AHL< zML!>QiPNQQZ~JgYbG=Q1xt`w9N>6$EvXWoi_a`Yt2C1w8eFX-=_sT_3!eqK!B*XN) zvr-%tK}NW)0=|;2Crh;gB<@Km^VCC$v-fuhtL_VC`BQIOuec)HI7V&F=L3EZ>aj z_eS00W%ISqd1KaP)=o^EuTCe^BDud07Th>n{mmW|#4v_>u!LxTvSwpv0w>VD&vJDPqDp@UvjiBF^w%L&6qLMc+*rB6-^)5o7f+&=Eqc7b`z_?0tI23Xu`R} zcx4pq68HpMSBE|M%A|k$Y9NSp(y3$zH$?wV58J)|!CD&ANpHL`Vyz1nCRpx6K1Ss3 zzGHP%?plV}y&_3m0mWr?l3(YXxoqpj$Yj(~{!Sg&HHfn>o8Ta9^rBLNBt%!rbo4Ss z|7`RAp73v5j*j?7babvt_j{(k{`2lnT|yS|WFw8E*a!s3z6PqxBkP*nejbq4I06_2uO~id$5%QP7mdOot)5A$K{zJiJbM~mgmAP%w*Grc-dnJlu*Py+2p{$a$ zZ=n490}+3OH`j9kQ`Gj}<}2IPJF%m30K4#2ul$*B9fZBWDFx|6pWKw+IF~mxU;ns_ zB585Z|G1ualT6%41~=jwX$KfgV$k>$NCrmGfXJ_4=Yg4A0W4J@pbLWit)QkdTcH-X zp~KD_ePN*(pXc`Bo&3reTf$NYK?zYMmwWF!_hv|*rTbqHzGgUU)j%76+;h8Ve)cfU zW`GqkriXWcC~Ii(UA_JYz9VS_fAlk}5a01x9Rr^Zi;CaJU1r?U5iW6PXACgyGyrA%h_Y!?xsj9vfYqpZBG= zrb}eFb||Z}S4`H(jqqOEUpLY%i6t5e3Q%B!rV?*PjP0sPFNKx#KxEAI?6*%ZjVF)~ed_0~dlJX*uUwlhiO<5S4EsRDTZ$8F#R&U!?BP{}@76WX@C={pli1+>TTBMi%oqP0@*8;_OY`9IxL7w5vNY~3o=2o&k0?x|a(2eH-SzN3 z%r-9hCG+k?iP)~|JZ(`ElNBiE?wV8OFRR(K*SAL{;&@~G@VZyUe^E7 z>U`ndQzL&hL!_fL_PxiIbvPxAR5n+FRId2!_|UJM(_*dZf>)!bi?Y8BiIsAFxN`oz z(aMW5esD$K0viar6JyJ^Wi@n1cr6bA>ScgdLH5H^|L+&COU`RFVGk@26n=3UuJXs= zr0Fqz)i&*kN_G5cAGW=g_gV+5tVN!XQVsTR{wE9FdK9dG*os&s^4NLADl%3)zJ4SF 
zQNQkayf?JU)`S3wG3j4cy*pm27oK9S`L`aXQ*d%k= zC0$D&BO!(&rGLI+(In1-o%?~cdW_ejk_Jr1j;9onssYF8wzqE&=Xn32H|?GrSlBKS zWx+JnCrT;imE*W?CVa%XcIpex<2BmaAXpc{0)h`bbr<+$$(E z<>pbM#qKpD_)ZM)rwaGsUJzp0?bad4P#PZq0S{X=!rXYbH<_32ZsW|2zAUdwZXN!< z6ZB%cvX{1hFktr=2ZB8QGmTt&XDQ*%6Y!9_+Hs2Q*~42nNrZt~VG%oLK--h|aOQ^e zu>7ep#u&lBCC%5PorTEx=mbWi>usPlku*8mGjyxMt%h>+v}hSe$o*3XGG^15D2rV=!rY)pVPbMkEL}K<5Zu!zJgEL z-Snh~Ij#QX00>R2Bc&ErPA9RX^Bc}_UDkfEW&WU1>EMFvO_$dXdBrKXMkF0hNh2ju zsz|`WwHZyCcJtFt`g)ovn15wtsvcRL+E!D&=WMll+`=RLh;OS9enf|5LW5|+t_&j; zWd~AFV2iq6`SG|6eW;mmQvS>rB3gxR5u1v!Og{PcZl?x6ov$<_8oAf78*8a^BVvr; zVX9zosDqbA;FGxp`Y>fSj!nDswta7epE4RtQXf)jSeTbxs4r_@b{*v*8F3nvuzKYkBCj+zCfR#F zrouA;195zsV%a?&7xKzgc3|OM-ZH@_z|ILsNGYfso{(cciM%PzdVX^DyI4p?YXCvR zw}trTZT|Xc=g1ARKm*22oVScTLBCguCX!_U6*gJ)m2X70UAO#dYf}**u%5@#N2nNyOAAx%vKv;m+kx0F+ ztG6_HnMoNdq;6sc@C%v0$+mh?LWbn8c3QT5H0M}yby?7mb|)@3gHuI!0D)jh3)qGe zuebRI2U&Ds@e0bEd`cq`Ny}%SD>s>-4h>b)hA7h0)c%H&%xZ~5*oK$-`f>3Pd7kV$f`>h6 zH%@&Te*T)GZz=C{wAdh!mg(wxYe|6ZYE!NJ)odb2S(N9*d4y*R&JN;(CMd zG&|rZ(MDsjj`l>YPd!w2$QJVUmkAE&UDd*}hbgTtrnElwv0n>Vjjo~O7)>MP7*JHZ zc`U?~P?}XSCw%8Pf;_7|IrGE*AFvxAJWzvtnEIVf&x?rI{UVE1RdR}dJ2ox0T)ZgK zG-pFIBll(PckJW$*NQqm{g^fjln~$EJri%%(EI#_!ndA3xn7HZVjT~TxyeJDN6OU> z`l5vM6Vxq;7nfoib?HnJWdXFsHfs%;TOW_Cr6$kijiS(aK)k+b*R@iGllMKV22C zcw4eoUumyXL_HQPW*J4a%mwgzh0N66j#wS9+JdM$xV>edIsv2OV*n(-+e3^Ano$L+ z!ced%$={{E7W5m}@YO|Pf#?N>s^@{gzueTOqjTd82#@ym$7v~Nf>zt{)uqac{UEOK z4u7?6+)53#aqoa@!JLAC+TGKL30>K}hxfGN@o_*UPe|sZ7mByGMcRHeKIml)sCKDV zeru3scS<@!?|K@mt)6vyTo;a;E#urULlvz=*;+78;8Ntuy{os*n?>7vBi@Ce8HoYd ztO?vY+WTZ<#RCZi>KkRL&GcG_Zst2pp+Q8@%JBOZYg@h->7h;CY#pyabkKe;0N_+@sb$_2#aXQX-fFuBi3Tv*>Rb#}ujAz?tF zl;bgcWG*`aE3oY0bn3%EUlW4^g{0@zxdD1qX@nCgSK7}=HEDU}-`*LbT2RFQ-0_)r z|NdnadwxUG(BLH^O5?Zl7E$3+Y(Yb#g#haP*xH7Ju-)jL)`WFjbA!w+Ayw6S*uzP(V#GCTM?ru0z&>q6zbKq~ zble1en3hX_R04B1MXU$Y3 zj+=OE2d~QxS*yGPhvu1OZ6A^5zcwR^?61o3Yz+ZJLo`a#gII5~OXB6@=r*wT!Yj8PY!s=N*$wYX??2H!~l zl{@)vV^2%-$*2E#>9@oG&0nD<`6Bfd%&z8wimuUD zdd4cNl21!6^L+NYWIl$g{jl^tm$(wEPdzAYuJ}4272_`u?pT(p28wAN1To0I;O&>l z&j18ulQlNNusVlPBV-@9AIGGiFpfpiYiW3VmRwSh&+ajPr%+mHW6o7=^B1j#SY0@} z>?x0Fyp+~PeRi99bEzm9JNd^NI1~(v!z%FrpP>`6TfLJ=1=Mp^STjSp^6XUreJ||N zBdkd>>B$D?jhL-7)#Af5GG<;r)Y8wrS|lABQ9jq+7v9V3e&=eiYePvr6Kt3?cQnR} zCuhC?x-&%V?)&;t=7iqeG?In9ZxOL^r?B#0iC71fFtKW1L@}_Zxo*;ZnBLZ_71(MV zbHxYl-jSo8!?|&FJI9ju@gCEe>=gZ?!|1!e#ylAmd!~=xwr}*H{(OkY-S=gh?o zWZ#h`B(j6lOyYNJ4y%IH(2-+lB;T#w`z&(b)XGNr{SMtD=|s;$X9B@C4|8jDeQ|u# zdDt<&gegZb5a9QU-r{+baQ5H2Z=Exr1b76Cc%wfT7_9D>g-9XVc5IUBO3JBfg==6I zLu1MSalf_tG%;0Md?2e*BaH~k0h*WX$Rd%EN_+RYghG}~$@M|a+Ye!17cG!**|(A( zjaBhl^;tjE?o_s@#=c1QTktNgyFf&d^i*7T8t+iRN(HO7y$_h%P$(UBSwJp9S#FJ` z!Ri+Z#t3MEopY{A?;JSZ(b~(|@p%hqFsYltKO<_$SpD+{RE|9Yc5}Jr`&)*$z`y++ z4N{Z@mg`g8e@hZ{c5#(*A)m-EZj+aG&#Xxsn;cY~4O--*PlFAb@1-s?^`h1E?cjqPzCE#WC(|IgTb$0k^X<}Dz zq$lHmC>wH+&Z;pAP?JX_eBxW4Ezh-hRGQe|86{faPVKSN3tJ`Kwsz)yHx_vswU~{+ z#?ZAdDr3-sfSEP|s+=8r2zvoLOFM+Oqw4X;nE7A8ABfk#^;(a%%g4Hp3I;@;9EW_@ zh-Jv&DIV9NGZ`h3OdWVjF?>3uw{|SB2|Z>UfOCUF4z*_y*aM~{4X&r}urUCKek$I` zC3xY0NYU#8RSL?yRM*uhmiYP}d7%j^kVaPMF4HH5B{^R3RDw#EgKvhFz92H+mX3@` z%fRvxHyp;F<(1U5IS&_hCJ=c7z*<3Yr0^rj@s89Bkwz*I;3EwlE_3C;-6KPKrfQ7< z>20~N%-8+cG6B5QH14J5nQMRBFtWedJ~#u$MwYHgMIa!i5; z&R2qWj!rR$brjJJV`GLbdhM>g9=FC!a$${p@XSWpB-8sJy@*T9ry@FAwa8xm^tTT& za%|L80z@q0y$X0kz>uA1-roGn)5)9`HvrWtj&O8LTV-hBoro>0P^ipG&6>7TxnlU6 zK2h`|0LdjCRo-<{Bj$)u+u}lGDL4QysJ899R93$vPJqRwexig2d9n@LHv0Oy7~KHJ zg5rZ4NvL)-KD#hc;>E;OE8}I8EX}_jWjoUA8fL;Q1zJTnS*i3hnORjq-0fy{?h>>;q{sifTTU%DwncE9Qk?ZH3wwgY~wj`f_mCrchJQ z0Ui&wV;{CI+X5bsV%cZnPU`oB-ZLt7XYoebpnVNXy`ShmE&84C{E$!(YI#rPv|s}n 
zNz`${ns1j9VB?dQZ+!NAoE)odJ@r+a+V<1xzQcn)nM|H*diJ~s+Efso|7`Op1THjW zkeK~pGt=^dVT7UEA|20K!Dn#|E`ftkOWrP6qu{IUotShBXoSRQ<$L^zOj9!+vEbU? z>V0iQy0i=;lpyeC7ADyHnG*tU{$^4!2D$btFcsI5Yk^I}p|{8XSx5vCs{XJaCk7oH z-n@pcAF_U z)S&19e^I_CmD4+7-N_fd|EN+HPumf;Jj-d@-y=r|0W&DEC+G3$y4gjon#s&!IsCB= zwM0k*maSlvmZ(eVrPoc(*3A9br7L!&LeHLKJ@WUL#9aF^nA9&r0u=~P1g7YcG8Epf z8zh7TvakmB3CTpJGzu+jvT-!EN6~CGCqdF|0V%8k6s&*iU>DZ4Q`r*gSTjVU?>IV+ z{%lO&ch=iG=>9(B#NVV$*e@@8CMrp_LZ_MW4fb9AQZ0Y2EaDH;l#eW>S>BbteYc*d z^vYbcZ%dc#fkkkw#dA+7PV3PR4A+CtpUxelC_I&?lVMJ0C-nv=9<~h(0Pl=Zb?@eb zp^2Rb16Ipl=efiVh*y4Pu1sBM3d9iapv4t3UJ=_lIIfByh?$CCdC94!AvJ={JXeOx zb9w*TcL&S2w5^&;?Ktc@@x_j{k1#~mUuVvh2@_W?zMqO!q4UP^)aJ*)cNYtrct;z3 zf2dVckvl|c^ls7e;j*)pyhRZp{Aq@{MCJ8~YFlu}Iksr6a zRO3U&o0CQin+(PU9cKDOE=wGGe;U0(L`&gntDTmR{|28tuU~v=sDEthwe_`Nd|hBSnuXfAZT7#SLf3QYy@Rm+EgtKLUNJlhUPajiCuKr=@hk08=+tOSN6~#X0GXRDNYq zR`tB^*XEoxCe3<@dK4(s&wl)BLs)PoiYF#KsNEseHUE6kREd(%X`fG2`LdH)T7NFg*nRB>-u{6&KCam5sj-M~lN%<(I!Q*-g&4ddS3kM5jY;`B(3E?vJA zuU#ZQB6Ohv!`||HebVF(vU@|(^6ln~*+PPzjHMc~MFEeNOrAXUa`caD_|dd!&s$je zyRd2BzR5(C7ICaD*=RGUy6;}2-p&|bB(d2{4u#v)nfuYdMu=iDsX9Qub{Sjhq9DVs zEj9803AY=TY6Ss9#R;ma`TYp(+uUEA4Hy$adA&G~ry~R1b9ssq`OE`0NOih$Qi|tq z-@X%y@$8}|YiIAT@{Orj^&Q@?d3M!pHi8S>v_75$y^m?dO0xd&DaZ5gdX;;3G)DWA|SeDIYbUd}8baSIny zWarQE5IoRXo=Zrj)XbW2Mt4Os9wss@&rEC3o=Efl>XL%|n@xKS!h2-DIAb5&v`0_E z5?Ld|hZN{TLS>za_;XW#YO9sx<_`~7d1I9qdzqiNb|rc2@IHkzer+MRups~DmY?go zp7N-n=`jmAf!LMN(LDc~K55KrU-xH{Bqduz%_2*d9+BkrBZqC$-ewb ze*5}Stt^7v!}Bf_kq^vl>CCDmt9z2(dJZALEsq4InXm!S$SNZER7whR} z!g99sJ5?VC>Q&k1t31_n7?nAUw=u)u!$J%d^9*?i^`hy_AwXI?mi)woY^H!SUhhioNP>j|t5WWsNN(7J6G^%h>m*c6fE7 z^6-1k(D|u{JgacgmpofnUU~Q3^p(2?V=W&A4X#}a_1N#vP#&%d5ymj>Y>sq)Vuu-E z>JURU;OAEUd3tN)xzlEbvB`bauq#(IC5US=49)$;FHWvle{6ru!1et70F~62cLHCv zE#3CTwId6Oop7xpGFGsQ;ZB8et}HQc6My~R&(@*muIc(n zt$eV!DBi}Jr<4lMrM5>?LRT&&cr_0$yKpXXA-kigj9B_-2+ErF;}4jVf*zjdvodFG zR|!|TT`uuzN5ju&B`e}Z?I&VfDa^0Wx~+EpzIS0k?T1!h9MQiKL3HlkR8l&wza{k5 zFEA>boN@17fh8kx!E1g!dveml{76)!>X@$d)wC!77S*;@{719=Y9G@O{b=rVFt-+~ zayQ690xSmy$0(8D)DzerYl67^TeK&i)4?3pdYziHCqWpc@I9@`XDtfo-?&G(n)OE= zaV$67*V!mTE$;5I5ubXw`awHgsG8X%7jvQ?M&{;1t+((*^MWDb7?f_;e4mP$bI#W*}AI~yN>Mb!#_n)(WqPh1w-~flqVHWQNzK*Q? 
zYUkh(b6pS37_#zwK=f(|{u_PzJFZu@qiv1EoIKn|^h`|XcD!NN zF7$%H!0z5jX?Q(?0BP*|{;PU<@8$cwsPwvOjmI$$vTI)$VBrSqUGolF7#zkl?LPkW z>P~vft(^S3L!zc`ZZ*vIb5a*(RFf8lG)BG-OuOO`KB>b$ENE=e_G)c5QvYRQ7c(a+9{kcO}<}I-)fF9JF9yNTCiHZKIHfxCeOXdL-YVQvC)5AeAgHFRudWfMO?cM49Wirnfr#?#*|O47<#-B$;Kc5UV$;W7o?ql9U!Q_?dF{Hwx1awiny&NM z^hm*Ab8sN;7SCT9i<7uy8M4KD!O&cz$KmULouqmJ4{&ai${+7X9IFst`09j`l zJoKUcP<_37%+x%W7%aI~lm&13dq{3;SZD}4720i=MAzt^RF}>sE^JEYT<&G?h`jJq zbe|}PIm8qjs~-DTd4HUjMh|{>1(ag6vjg|uHh3Eh;=v^&b8;HJ2}ox952z4yL-hZ5 zAzAI|5n7`=5uDA6h`23e&b<)BZxe=UO5{+UbfOD?Ix-HD%hNM;>x@obbEN8RlpCh^ znE+jFe(NyZMrtfLz@jn&nyW;{UUxnmr0sSO94^u}owtJ>{q2ht6^G;E`%3YH_m-&- zurT_`8WR60mI90$95om;-_-PoR!YI|YpmHsaw(wY%gU1d&rjzkZy|mw)xv>d_|GnW ztH-Ka`9DXbn~r4hP{vF9Y7_a@!`F=bLNZSCfaA#tCcp`5DGDxp*7{m;sx$WqJhzBr z^3|7L4i%zuTK^W(CO;BhHfPtJyEtEk)~g}xf*fa-C-CbNj90L?nVZ9!7l_5i5^vQu zQ8WF5da&2=c9hZV$aoM|6!X6-(K?r3nNrz7E3sZT1M_+@GvJagiys62k;0?^RSts0BJ%&#Zcua23AHqf zO)?^gSWxqVFZ{r8xTK`n$~(teI(SK#dPFzdHoX7GZ=oNWC3nMkZ!OE~%b4gly@HqT zJwZj!r8wh)i&id!pH3FX5o-h)jc8sVxM`0G9}s{IH|t5lkFap=*4e5{%X|n@ssO(c z!HNh$<)Xc?^6c-@k#b~J0j14lg;PimT+BWcCJ3GK&A&^-8=m!y^!CPIO9RYTVOtou z{@^_PfR&euk;^R&`rnn~FbrBB{a`@bH#?MXe zITTf^J4d%DDK9i!d&uX91h_MNtH*`s=N`YXU+Vjj0R|r;_7FzER5l1n|La*-vZbHp zzFAT;D5Cn>Tnn7#`i_nj8?orm+zVQGTm;)L=|cpufxdo{z}=arr@8B;AfR;^Vmq8f z`O%-cF!wXh9zxaACHxwlT8})VsCZ;Rn{M#nL5r!+8p~uZ`c7>jiml@4#(SS$*e6`@ z-Fm$$(mMMLWRJHY7uBV8?9-<*NF7>TfgGnoaSa4H!zhRm&?MYwhwjnvZf2$u656_- z(u>`z0?U7X{VYUZWdnUTX)TUVfbf=E&Rpxd9^E0~N{$H0-gQeD5)#Tdg4OQ>wm5ZaCwLVK{mm)F7IciSCwV;DHyi%j1!kwySHQwD_=2q$k{mv_QMY?TEB zg%4H6N{h40AUbOUf^}_ufng_otKgApx$s zniu$VRp32q=%OvJa)0MRzou3B1-huv%{vPV3zPHQh+W5czG_Ze-uY~K6a5k$so#Yo zHrn+K47~3=XZAEb{rt3ZJ5hXr5_2AW57`6divFRq-}Zk~Ybe9U3iP6Sw2`X5Ctv%G zG#OfKR*dIYm;|3ovb_1&(Or*zrP$r{Nywlz;a467XkK4iI~O)CxeanK@LvbQ-bYU$ zqjK8Q4tlCDD0zo$V(!~FOU-+wMjb-Z5a@bA1}o-qp}2L8-h{jcfeKeLZ@ zAPA%gup0OP#m1K&3r(_CpTYj+?-Yx}@= zB&2VGb;9M~oByBkzCM+t#6&hyyj2K-`~X3eycCYh%%a?<&vFfqxBR%aw6s+E^r_nZ z%>8F?=AA?G1|q#?ycI#;V_MN0B3i~LNKd=P?4pNU3F#`iaNa!m;?GJCI5Q%r` z{l%i6J?h`%G8(IsGYD@w!U@1udgINzqpHhr_@*=%1Z%UctrM^AdrfWJZ8@;1feb`l zLql71>tQ~}PyDb^!`s&5y=R1q*`~eGs-YnvIBth)#TgL%F?ACTo`x^*D4s6m!v_OC z*tFJd+G9KoVnoLI`T0|!j7tNm+Dd49SH<%tq(3^r`5D0mZT5C{S6lxaeN$If4sU=< zQVsW7VP68pW?j>eiP#2dYI(0{W50b};tJ39mLE(|~{$^c|w-Uh9Tnumvn4Q2>a zY)Ye|Q$)-Z1$_g%gmFW7UhfiQVp(!V;~%k&Arpt6FZ59-;<6R=uEOWMp62a`+|H z1EvjLAj8w)Q=FNLqc(Bw1wb2x7KVcLC6-R4aYzI`O2hOFtA3veo~y1t7=jKS~XTFxWEhPz>bjc>yyy| zS!pjJ{Kv#=+JicLq7=3^qvDH>=fCwP=)#bqv*fnU{>hv0nMtrvqs(w*GmLl$`8&Ftlj%iaG_JsG;PqiJ=$pW)aHSDcx=b)`YDifq7$O~gmp?UReo))0K6V1 zXg9}8R&9Ch+O4DxMM zUYiel3VnbLH_aeI?LF*%oBp`(xvM| z-#A9DDV7VXfGK1JegNaqUeoN62IRt)Yz!lYOjqb+o&I@tJ}!m5of_hMdP|Foc95&X zvksZB6|ga5(5xREY!^0;`%+OiGYZk4IfGaExF=jD(CuGg-%0gsG2uq;GvP|{8X{}V z&={)&NvV2wn0h$l&lMIH7LkL+ge1uHH}TqRSV0?)P+~eANGZ8t`Tulwr9n+yQ5dO0 zLF)z-s1~#$iU^cVp;yu;+v$v5rhl40Z<6=s<-T|C{mwb}p7VVz7kgu$(jz7|q43~fpQv|Q z&5nQlY$icG)O@`Kami18g646jCr_RP)ZGQcshgMZ!+&&W;!YbK``0|)AfEpal}ZhC zS6k~urN-gSRxI@Hi=1;Yg(=Esv7Ui?T7Qj?^=67TKjU<~c&fD88@Z9~H~b?--Q|P% z1?c6Elai9W^yZ5PnirGpb*c`ad+F*vc5`*+;w+SC6x!}}7SzHp+_D!A*!J3F+E3;R zg~F78A9KhUmf>&mRQy}>8ozUKX2(VKLuL0N=y*7&fZ#xVME_XriVmnlc64@n#_H0L znan#ct@8cJMZDxR zmDPC#1^NIKYeuShBntvjNbmh(;+{q0CZA8nA~4-SM4R_E3)hVgXkHBhqbDf**tMziDZ-rOTm1 zG+IlBH-swFXKOm%Vm~sT;t?2YWn~ppCT!HK2;*ZG7(i%<+=~W<0cHBSsf71fnWuTU ztl@aZZ+R&sqPnhL_??p_K1e&cKP4JGhRC!7%T2LuE( z?So(i3DUv4j%mLi^@3aw8*!0Uw9t$F?V#RZejbx)0e#LiR^RKK%N`)CdqZ1(h48YMKE#{J;I#$ znW3x;zc5>j`Rh#==f}NORV6HV3F%RtulCAalZu?{xDnXxS2~b-Jo66|kJ644=xRnz ze*QK)MxUTdzt!X>M21fQ9Z9Q3D+FfbK428ffO;wuG-MF}^<(6?I2v!xQ&Ktia*64} 
zL`V7C?5o)g-k`A!0Wt)(TPfzTSdL(Ba<#NPk98?$3Ob|?E@t=pjAQefx>bI#=(-vQwU8-6S^Mk3 zGu5ed%whNL=mbwRYc;3&&e2Crn(WH8oKet!5~ss-hPp)Jq~%FTXSycULe)Kpdl013 zP7|lmq89VTuqv4C5WL8nL$I_8D=%|l%;x9AswSTuhTwfBA-aN~$Ucl>-N;^75QxTW zs~VlMpqv?Q`}FLKEJ&2tbd{ErXvoV>&h|-47%eZ8vDNxPok^FKl{E;P8A}O5X4@^e z%msolKE?gsDyitXnSQz8=iV;Q6(C!DDRJlk!{Pmc%^|q-k>DM8HMCZ zK>y%+x|-IPMSBn;#68#EvW5S+8pH7*hNBOnR&T<11&IFH{^1}I7q$9q4e1G?h+av5 z(^V}FYTAJrZ`Fw}KwmA+sMvpq&_!iou&BiB+DOeX za>7#K+*1#6HfY>@KzTdxiL6A_?rZ5$W_K3$E4-uufOtr#r;Cnrxm+3wrPOtFS|^6O zTA>77TRHY{dAm6X{T^s&Xy^t92NQpWr&v7`li&%IS=Ym3RgCIIOMRrra6niEY-}|X zHMxb3l;kp*HDQwl9^l$6ydt{e=kD&FoI)Hu?aHxT1P7(CNy57SN|u8HP5C%P-ZBIZ zW@C*3F2jZY@@Pe$Z;)+j{gj6rSC;aQU?MY2rb&DIew{)!yC`!7vmSz^VX2yEE>B<4x9Cp1Nd1q0sA+f2b0q;*BYk1lH4X zC(b)O8~t`mQ@!=?+KIW1_ls!^s<+6U$l%?Uol|_NBk4=KU(1O%Ef+FPOgCq$jU;GY zZeHOP88Q2l`^Gt2OC!^d@!39JKN;Fn)^6@PDeKnVmp--11m3=VD|r1QL(Yc}9{nN#EQe_*6T=PMzkhz&@FnR2b)}VNn$Dfm(}}?` zv%VDXO*>arT8Y>W-%os}#<{StpxaY)XL)2w?}?#a;&oSdA)U*E+Sx4t%yOnMmUzQWSd(qbYt;k(fG-e&K)=b{D=t5j73 zX7ZzLLKxIuh*U5iw&S7Y-0JbR@rdi9?flY`X&%SctzJLByr!l+5jJ0c_Uu_UF?&UG z_FK1Z*$#iEdaaQvknPgm-X5codhOJiGlzzq{M+M`b{{#y^yRfitHY`om3{Za!`DWu z(B8StDl}YOUr%q|p3z(XwUNF40k=Y5e5BjmteC4jA;pc2TOK{C$k#SW9#>Kw;^FjU zUL}F&GW^D7sBVuW=NV;Xg;S?|T5J^*Dy^$7E%2H|DxTyCd91YODAxSm!_WCrO?9^X)ald3K|95Y ztk^_?WYyHRhy+pV(@FjL%@~>TdwIczJjbrCE(TvYS~|M2(Xa2KRrWQvq^j&ov+R{= zeyu^Tm(DC8AmHQYS0eRynK~+}Ti)JY_+qkVu|@35LD3&SelUwW2zGXM?%L)**>c%J zx7=^*o?q%FCMFL?zKd+)RtPG4a@17igT1*=xt^z&SMSFM+^o}Gh3*`YE>r8s?``o_ z$-a4&PrKmCh}G5Xn>!0u7EK)-9gExBSr1;zD=NA}t!rxPFLvu!)aC4(X_=W`;o;le z7Juxe6qWmLZ%DZ+*4EZm)Y`h;v^C|z<+%*wre?_vdjRdzVn>G?6m3`C8af$RuN*i1l;C(*Um34%5Xi{#l^KM zT;QtOQw_&w7F~J10#|d3>gqQAS(;m`VtGwKGmT<5(&DAbckk)b9eR3t6Qf@bXW9u? z2-*xPHYcc3W7QgUWbxz6NCmoV{SyCUyzs9)-mmBT_tkjd&8gZP8@3C4pN?Oy35yK= zIh52MZ`|}ssYhg!H$6v7+iYXp*MFp3jAOqI}T0Hp(5i2fh8=EgJArTQi_G9hEU%qU`Pq3p-ojRrRG(N{hy>?W$#Bu2FEG{_oWuD$dHRN`NB)ba%gpL zZ!fd388c4An>Kv)e&2oWhED5=j zCwuefx@03IUDJ%4cw80+!V@%7+3^u{Q9Z^=AMD5OKYDbpAy%RE+qa!q_vN@vc2%uE zn&Pec%GUiJ>)3wOq*>a;gm=r9EuUX0?YL6lLJH!Z6#ap!Fo%VyJ@Ns}RF4(!d2QGp zbM=FLj>BY3yX4@|5ci=&Tlg<$m36+iS+J`tE1Rg*PzYr83JqoAyOzg1RQJ4evcFQ- zz+eq!e(RI|Un`CoW_aqEW^+}4tomTfzP&7|ezJlmZ|T2mFi<=QRS{4V^vBJ016 zzh_{;>vRZDaYY3Uc@frMy5sbXyro|sw6aw@=j5YRq7;JJMeIkJ?`_<1uct@nN}j`J zspSc3yz@O=$;8O%Q>Ucymbh2+np#?V9a-jNGe@cH6VxlCrNoIH^S7S)dExg+D?Pp) zv#9Mplh%|stw;SB)J_Y@&^D;*>2Xm<%_#5Ql2|)3GE$vuKbDlfiDFjfv)RP>@QoTB zl#0>zOjDB|Av{WY+qP{Rt$QvOAXKr5O{9pDrWz}6oU9S8GSP&W@rjCR#k*>y=}=J@ zT}-`J8$U2Gkba}~iO}WCmuLIR{kP1W-Z5acFcj%NpV5(RaX4MGrptBis-*cuPjO9Y zsfUE)l$`lOf$P$twQJY%gq>m057;U0aGy%&sAzh9@hU0zW!Wt^U#u6WrhYB^6`O>I5{KYZ)sLmTKAwjky(u`6=h=FP>F zG(4{<?l1qD@<1{lR%b&#M-d_^c74)CCGfA=s2-#14KOJap z8uIXA2}RIXuH@IZ50&wM|IBsMQFe&g^F_WAufuIoyIhIe&qB$%QOlb7``d?GujAt5 z;}1uxd?a_%zTn*BLta!WVul+y>X(;Y2FJ(iU0hvp4fV}gvAHRmx#f-i@UiH#MmiU| z{fT(}`T%IJIB`MDCa#Dt?#l<&T z<*Qdwi8=jF&{}-B=PWbDetF(XQmJET2Mfz5la}PfM;DvlC)O{F;o(x!$d{{qJwhJ8 zqw2J{j87NoG^5`>XpC93rd*A?)?E>}qao?S0g>a!dr|I|wE=W?g;y`|nB#M4Dy@J_ zzG!-94IaxSz@ziqw}w=0DaqQZs`at*fif2_K57{}n~RTSDJdzLZ&H$lI*WiEvxf?fl>Rw2^&Z-+89_CuoK&|tPw70dD8*9(h)z|ls zbX}bIk!jj`FE*AN<@v(TKaJjR-tbE<4s)+$s_c_fWB7+rSY?uJ?3xb%Y`#T$Qnm6uY%)|Ni~Vhi?SPM;0}{ zSmJP*plWPt`tpQwvpr+odaqUFFDSdXxvAG~Vv|OBo0u9n>(>23($(+d$5U4d-FD86 zcegrs>^OE)&eBrwXLDk#eGRtPT8fD>%LOUz0rg*4v7Nbg8y-J?9M?YA2jqsLrN&*D zdnJ~ZGi>|OdT!5QesS^6sOXYZ?ZP1Uzkjkn4&DSzq1?t3DrstBtgEY|=|EG@R(*Yu zKTv0VRocgFi*A9LTvM!?2iusKRwthKs|@Cl-Nn$eFg>K1WybP||MDxlomH+Hyh*2@ zzjY7i#STPUlkHc?^ziVI*M2$`W4?6m+_`z6jfGaco4mC2?FyGWckYOqbMM`|c5-s^ zl%ir8n%ozz${&ldxe4_qrS~?P7(aUSC_y#$o>u$JNb7^gkJo8vXuKJ@=ytElCUGL` 
z%9ShV`{&J8Zg%CZqqu||mjw#D>)}D&lByky3!vU-^rSWF1oi4QYp5tUJ9DTim|vZH z?gzYVa#h*MDObx&eS2%Fb|rT7ndLg_BGm59J9hY^xytZ{CDn%G^Kc#cSy#81cMDxj zRrS1I<E27#t z#l61B=jWux8D5?FYD6}CEc^@VS5R4*9Kn^fPmbdDPy=^%7r1WhI<38Xi}Crit?SoZ zxq20qkM&J;X!Bg{c{#4lPP3!FQY-V+fS8BbC7e9_D}(i0k}oZbSM?u5n+=RVcqaGa z%?taP))PoNJyhSiSdg84?DpL%y`|YP)%bHeYa*pM2##IvMK&G*WX~nFUF$8(yVH}L z0=Yc%2qp6Qv}X0{T?YogO^P7tUk8)WB0<0 z6z#&42f#!}H7{SDmU=lhI*Pr1wsUUkrAyT$z;os+?e-UM!?N~kTA^fFEle2*Ry4P` z19?1u)SY3>h?*>9(Iqaa#1qnsr6f8Oolsv@U!PiLK)q^Jnnkx{hEYRI+vnygn;(la zBMFz@@kFl7Pxfz#T;A{T_)4yAVw=s$?)o7QddVWctq0QF|GHgyZ@nf!sKA(=A{lw{ z;zb$>1-z4GZ^@bri|&Fpaf^-#>+zxSIUx%J={KXWV_bIgyc%k55Ne_YQd{Y_}F#i;Yan z5yCVB^wwY8SgO*-ynfi2*==QE@m-o5f9;j_=+pK2fpBeGt*(+1nf^~-f~8gxT0`EV zb0^Hy9tRRzSy{4>EYY^Y7p8uHC5Ywl4Wqp9$jEw*)J;1@i%?eabz-&L?0oZD5>&ZS z4fKoeu3@b#oOn+!Jyf3{>%9pWxB@qqe1A`G@7G6_ZS3q$0w1Q?{!n`v7iY|;-ENbi z94VRO{({T8F7ZUZ&hLkWgjDt4$J8mg){XZ~y{`VJFEoT;A$4I@luP-l ze#vvR$rKyTeIls4D(mJqKZ26MbZIwikOfT=xR&Rm^i-Jh=+UiUi3$K8larMk`VBEB zbKL%j#>B(`hgkj|R?RT)*aoJwngW8Pi@g#d+BR#pa}xCdYdrcQ;=6LFl{jVk4h zX8hxscTtgahS}FGSV;;uZwe6Ti@)FcmF?^WIl0wfLJS-n!PwWa>hb5&-o7nCJxsUi zm*=~Z!;tSZbJyR0Lq#x0@cg`k?MTZ;vLxTz45gX23YZiDeOj%*A{syB4f6;ak71JPHr5 ztS8HVu;y_oo?fY^45y8il`iPf4Xlgaq0{Hi-FS03`^FYuIquMR!olnkdLA?z^|1I1 zWuounU&`j%XWI9ghc3N$nlZVE2FSC;R}-x$^IpB%@8Qimg!EQV&(7XJX(9M8)1vzY z_F_P!0?S<|5H8kgK!w8~qJh8BhA)D)KcDTdd3-n-7+rB6(;?6|VfPi6H=K_-BwYx) zs~R00Wu{1$*{E_>W|~WhiauFbTuh1sAosT$YY#iUd&^*7UrE7y--aweADY{LW?Hpb zM^t9OfmWv9Wo8=Snem|(nqJ@2A7k4PC=MJyFbma=$^u&bD7GKXy(3?Y)fPNAC zxnEb`EZI*FDhcZpQ3ikgs!L8zj(|Eb((-0U2PCWqm9mKm<4V`Au|Jh?7Jhv>TY%A5 zuIAGxUoS7O^qd@#M~@$Ws?W9?xkx^dXPf_H>}&GM%LOj)>xOvrb0fRUT`TSKX*jXQf)|f$mZ24T|LhrCX6vEHbZ5f_PQKB~ta?)-2XS!N@82KKbve@mlaftz z=y??JN~fAtVa%B>m?-0MwSd|Nh}XeiVT;VHE{x59*`JkmCp3 zQGo@)43G)MXCj*kDxh6WvRth#xOB-l?Re5}oKmpl%ba$X7;yojT^v~oA zsM#q#72e2J`B-!Zjd~ofQq_hwpb~wkYKJY+;PJ!SQMV)Dm)kaMV?xt*@7bd}KhgVN zf{2F&HN|aZ!7!<3$E8cv_JE)ecEWyE$eh$S{z(_HLt9(>`}gnYcva1pIBD#`S!q{W z{9+``-pb0#sOih6?@N}7$l&k)Dygyv_jzc<|n3^h~ms-}jxWHwQZ8LOk`FE4D zL6QiN*#^|rMu$}kXh^ZZ4iEqhs&@;So_}@WS_NMd`0MRgN>90jg3WfY}kMYc1kU= zx%pDUeK{`GxaHpaFJ5dy2~jtUZRV|7NLJamE98^n^Nk+qX=%4L`BKg+?e*c|ZQ?zs z|Hydj`)9fVoO}0HHh=wm+RaUJYHI4AS7u~*YVrx&3De!eb)7NHY*eEzJ0{X>IFdM`8c=qts?YS&l79P@34HlfAQfUiTJ>_z=wMIm|v*qBFP z;6@<8bo()_Ks!QDP$TZWdv`eb%^RX+kG7@D>K9;#MjO_@NJ~$@0~Sm6EVQ^c3yV;U zHgDa!23Q<0lN$me$j3ozDZjT^$5{BFFyYD_d5(v%M4iX;Ao#F4HV!y!{gRb$>AV2gpi>#NGsksF zI5Ixw7c+1D7xb_`^vR~SwX!e#%LmT3x~iwdCrlqdc5FNJ0;s~Cr*_{3*hVeY{C+&{ zj9oqou(ur7ox(z?M|_uJgi=#FdjE&gi>BZs$2HoO|FpYXzcNS{Jjv3^>JBI>x;!Lu zZ=%8#-KMg%w!Rw@!iaaJtG_3)Ufh_{-cs*{v3-$On7$m7@32$+>TdeK>Yx;dmF43V0W8!O78bbl)igAEv!h>C z_A!;Sul+K!0!4=u)G|FG85x-gJZ#gN*VXo+-L-%)dVp`Vw6wQR?e2Se$(qd@T3#=R z79n|nVS;u)&kI&qqRr$@*NLTNXM3Yi4c0!@=(D2wy;S=EH`AhEfd+is($examLS@p z7anKQmwO`U?fR!Jbj*Sj!XdHHC_RvnjJooUzf{`~z19ohkvI**4Y6wp z628^#x}9=AB*X(>z8?x4^HCEf0B9l*vx!(rb4wAq0pDWtwr%&?+BAU|S5`Rzc!Lp) zEFUMpl&z9D6A9Xdl9V32D>?>I6{WhtL?1kR761zeq#9pY$fS86pH>DfZsG~-L(0To zsvf8#r2Ny+vj+?h8^Ot7gS;V!75lU0&Fj|5MqHMwS51v#=Z9Ot7B>w|O<&tx+rE8! 
zvR3B$Znq_V_-a7Esaf>*#JA6U(m0**ydFx4gwyXLY-K^SHo?_v*Y*Q*&-Z$BSb~f` zU;HlaJWB@va{>y=XCQI2wlpd{D{qy3WqTuCR#BWRhVUHP(sZfigV=llgi7()vG3MW zT(BH1Z{1Qr^C8=ek#hURjd3u2Vu)F zY~K#COjHL=FUM(yNYj>!zl}3({Qsq3JLKbljAz#`{+ zJm@EW#7A<8in99p`VN2laLj(ZD-gQas@q*1rxg`F6?sZQ)?jM&jE>e6&K0la*d-#u z-0eEI(|)w=4!Bgw*RL!DenG?+be=VX$w`AXR@q@u7zq1d#O|5jfe!6FhdY0l3jfMN z@f5TjR>^Z&*iyK%cz}zW+v;cY{v$@8De&ar&lJJJnEdr&d}^!f!29G{-1juS^7X`$ zC5{Ur&t;XB{Z(NH=7DFM2l;D{Ury27`Y?4uZEJ% zubFn6N(Z#6XQ;Za?go@IItme`^#g>wWQwud%W>_GWujF&-0{DbbbSR$rGy;{YH;KP6HSol$+&$r*_=Dy9#6VK14zP)bs zs#)|R<8R$1L$>d3)G}9EjW#Ju4Kz(nPhY}n)|8WzORw}28f$%lR+bhc(BvhRjo;nI z%31{lH2%>|ef?PbuI*j)14EBqyr={98v%cJ?>bb`hT;Hu-DSaAQpqt7*ZwJfU~upd ztm|1giKev&g~UnoJ8b*-A`@_AOQxwt<`LJF`1ZPBfccB>ZYX6EGa%{e!7x32{mg3{ zGI#0^6hPLwY~BUMMG%~bh+vt&(IJEY>+B&91kSuxOSv^YSnHw%m74IXjXQ*!oyDrg zU*+dZoK;qC%`F74>~9JkScfZx9MRtt*1h(NO)*2^adOEwZ>DIq1Bpez30?XgqCB60 zJ80@MzJA=h5q%t92C@d33!we=lr+@S31b5Z!)ofI^-ZNfUP-gLP#SXVMz;MW#1#Ec z8n9;a`zOYggM55+u+&U~kL=m9PU;T?W^u=<)nGe!fm*Gt-?rOBa6~tIsd3Glj-^Cd zOjI-mmj_rz%IW-X-0aeZ232zo`(K^D6BCx`4iA;VMa!|T?>~9sR~>$EdfL&!AqocT zvI}grDC-mql!_-r->%0M7iDH@rX|_?~^*A98Y{@Rj03uT@;UHWb5YBo=w>W~UFzUo+en z{z4#5YD(tvwsK?%P^y+qp*KWCFu#nCCu}^f{TwtEfQ^is7VwTtYBxPOV!Wz?8U3jy z{*tCymx^J0@TrJRCgVfGgxxb z$DN;P{bqLdYXVDWs@8@XpMSMgpy_vX{ml89<&QNrWl%~OHgCQI?~n8F;di0V=Z|2s z0$P}S&&JaN;!~@(_>W;?_5I_6Ev>+9i7j`_y1Llm1s4HQOrV~rnGXRL3_Z=+`pT*6 zKy^sTK)}?J&ai2!m)9CB`rCIeva+zSe5|imz=J<~wT^TwiXh-ertJV@4PG5x|MhLm zprhy(|8my7nIRz|S|5JFS>!%^m=PkDR_g=o9_7D~DCZYPQmuN5sEDm=UQn`~0Y1q} z8Nd0poos9>vGN-Ov*I>w+GMh1iCDl9<3>6C=T!>JMBPVoyxCn42u&++=1a=e%VRUp ztEeaq4Gkv;kKtor1&EGbx&!lxc)mGS{WL_v59bum>|UqM~+D){Lb*ze*dA|gVe{;{T4TkBH2CQMmw+9VV=ur)@{S?g1_|Cl$Pux6@n-x;%#_E@4I)_V}L=53k z;T$pwgFbW|+v$L}ilCGkF z4oH<)&5m9o4mj(r#Pj2yL-(Yfj5{25?&rN%o#DrA2BQ!f(i#G~444Y)MLoA1Y2iPD zJ}@4YW7NQP-N2w1X7>DShedVhEm`)|7`Knxe(g5jhx=?2J6>Qc770570LdFA$Mote z%aboNet$b>)h*##V3;%%z**;oya9nbM&ELe5`OOB;6P+&G=D9zlJK?aSCoBvp(Kc& zf0Z9SGV>cZz^J=G0@|Dh-m$c!V~0#M1F|b+goHO+zXEhY;3B5QAT%uUZR0Jm#Ycy8 zU4riIGlU}?c!zpz3oSXuK{Bs@j%q!J(C5 zq&gbj^Eq19Ve$8Vx4(aGWu9|}`&RMsMmljLAQ1%G8ceT8#b5awcQ(9;SzT;l!@<|1 zwl^&-f`P=CcLzi4AvgzB$O`vurNNvaTx=z?&YaEG@AJwwJ-A1knLIM7)@8@X=9N5g zb*t>aYw^1Bn5nq>g011*2%|)yX2f;RUH=~U~?U@yW4FT+Aj5mxR^ z&;5OT95(@BCd)sIcA2~OKng8EkzN>&eIFyKMAICv%mK@_CPGNIKOaKT%J&N`&n}o& z5jzuvE%0G|ABi@^pL<>cn|C2&bkvMsWj8lBm%l*>_4EJ}n{WojsqH7;U{(Rp((2W# zb$)fe|5#Pk*wzI|cQ&jZ1)vu|oJ2~riwJ77x3`}MT$^X8DOX*|5pYp$@J^{bd+Bph zmcgUy0oKa1k5k_Y`yU9lNl$S3@y?=rU=Fed(T|UGRU_7dU-unBE{;Ue1jW}*22Uhy^Iw>VNkRL|?E1d{r&HXUXR4dxqm0 zxU<_SpM7>ExM&&w8c+;#Y0Lmxfk<$f+ZE2VftJPp3l~5JXruT0cTWISBIhGW2yNg- zL$X2R%hU5?+1;Pt+Zd7`z~Chcfts2c@zV<=X0S!lp_URlN!$}}@3kZr;O-cT(zG)E zcZCoQd1%r?MxS>PWe4|M1lV(QY>Y68{OPYS{0UBHtDIEY0fqT9 zgEhxM8ggQLK6U(Gd>Q)RdkmApe*I89$Opn%I&tCzNio$$N(CEKKbq8iurdAlTWbKK z@vi(y!2Y}aB4lZzvukxU9dO?btoc7#Aq@b7#5+auiGUaM#-K>I#dQGStge0tDG-;E z{QSj>7m$@(+bc`U%h#aK%Y#9}!1|0_&6Rwot;Dz`7n2!9PgKr}M|hil9{@BMnaO~o z16OvqD_@KVKw)8C+HMPLQ1nloJ*$4%5G)x?kVJ`~o0K4X25cjXZP(M^?}0t0U*fqA z%erW4pqj{fsHDT8-Rl^6bWo^p-B}(z{x+~@}09k zP9!7p!#Z++clF>~gKAA&X2_}CHFb4CD15<6&qN+bI85A*9D>`Mj5R{yI?Bq*!d889 zSqOu?;e4{eJ4E1|sw2ONM0<%(SDs@CGMVAK#ml7!%spcGyLxNXDAUU0PCSYO4gkC+YzbrF=NFOAp~ zpD1Z0H}I{sBs$!EVPHEU1vhkcqaVRo@k+njuPMTolu|i@_LdA!RdM$gp<~C62^&-~ zWkDAR><tr`rkPqiA_TDke2}gY9loH zZlH{rP4<`xEf(-! 
z%Zos!$s0!9KiA|kZwdA8@SDmHX(%-LB?BREqYnjEWlZgB%Ght>cr`U1@YVpJ$WSjm z0=9WNw&6QSj;3sp$Y8!DyirU{j7FN@0=HFH>B$jYIByH{^J%~X+gYryT)xI~fcBUa zWQA?a%pU=0Uth}?HHEVE6xy5Ak&~w@bE7ux)@iTFh%WnLb2hVkDNI+F)Ujr@~0SEKu%UqOPQ|Niwq-Vu5IDWQZ$xuj4) z9VGu+5;_$B3DJ^7^-ueCoSqRPwv^X=T5%}1aM#`U^$CnwZ4r0=SmzY5X*$1&+et*6n>o|f$^w2Kj>oXON6~_WU_9 z;QO(NMEOUChT@y7P=!igo(>`MdE3{oclvXz7tClA{SvMuF(+K%h2kI$eG^h46~tZS zId>uOeb}%LGS2}kM|>oymCr@(?xLr_>inFhD+?o9S|HrCCK3rv(7fXAZjLR!tAW`_ zy+_bVi~WO!!KEjn)MXivgo#@N1d$X6!<2dM*Ef$-yAP?>{k{cO0@bM)vYly5@(C!H zO3&jCg&-^I?@z5Fc>8rU{7Mp_{d|gk{rW@mSyfGN96smTnSu`Jz?lOIDFN`cn(RCI zOw?|@OmteA9RVPag}k)1diyPb4r^XW>;Ny7fe~bhAfLytwVa++du$X2)ILFFC3-Jv zY9BNTFW42J1-?*DN4pC-prukOtxgKK12PglkpNyuplh9-ok>@Wbe&}(+G4V1`rXK5 zaM(aQNEkwLY4oszvvWV>7|lEfA;>H~xRcVmq6~N6lMZ{un}TIWlDeQptIt*+i!TKc zCwlE(Wgy~PiCi`@u4-a0l?UW?6PXKR8$y9y5v3>wWUQ%emLtVeW9KgStB$09{b>MaLD+o!hiB-%n3c!cfgl zkU!29{@r@~FTN-P3kzYI*jai=-0Wj>bq6d$n8oAy^PPZ0s;So`5Xn%{m-yfOt=O{V zEaN8mk)63@Yh&F?pLl8mk((lo>eDSpWS1~p;?pQq~$=>)=WF)fBf2BKm>BXfDp^N zuh!Mn$XQz-0+Kyj(~qvz@aFQdf|At~1iz%QUWCosqNWQ52L?FB#6EnSL$C|*EO~m7 zmUA)I+P7-Y)hUy@frl#l&c7DipO3ULB`UqPBuOJR<)SX{%>}>>3I(GJ&4V)&6BNp+ z-4FG$PuePA4cFnqvD`~73jb8Z1Idt>;&4ZHYKsA|kDzYJY6QMkfi#tYwx%Hg2W_Me zjeFVU^y$;cRF*?3AafUFPU5LWK`{!`k(Pyj1L1G7yGjtnRY5N!bXYW-sv9vcm
    ztzal87G~@}MA~t{`Z%Cj&*0!a&>?p4NTgCvg4#>fBgR6KJBYXGqR9~|LgL|UQf|*8 zhyFEK{(fAB7VV|TBMZtjQOV&x96H_ynz4(MlS;bS%Ni|}c-AC5Ln(qc=}n3RN-P}U zEx6HZv3!1yepMeDtG<2+!kb(_R-UqX0l-)c23&{}M#4YfcT?kFgMIjbG6-veP)*6+ zLpa9=!B;0IQM~Ov2vxmhzKjUTtx;1~KRVU$;g-dzIJFwM*FYg_D9b43m^2Wu(*D(5 z7z`oz$jwg8{(Ojb&%|z>i`OUO4zU4|F$Z&Fl|u8XI5Ds#?CjS3E4)>3OsJU7`_W~( z08DZ+&fh&!+k-Gq``~MMM~CgJ9w=I1wZ0E zuR(xN3%*th8b}4RKHy6-OuS`lAWjI(aA2ETvYqE{2HFlRo(EVFG&MEl3Hdai0a2Oe z_M0p3A1YVZ)*3)lA*iO+*d3&y0?fX+@EUTb?=mv3Bf4rbeG;)7$|$_fp(dvFYew2L z4=}Fx8bupJ)LcH8y%u}(60Yz;2it$6vB$l9DM;Z%q|Ka&EJs>ccJ8bNfLLe^=8(EU z(KItluE>LYWC*0GF7{}>*WIHQUC;2iUw?EtK(%jbJiFUXz52jEYJcWH*ph*TMXJ@a z9Y>6cAXa>Zo3N09l`VKjx(LQmQ9J_YhbVy2ZmbLHFJhA_m7d}yrq86#F!CtsP@01t z2qSE52-We%+yOp}=ADj6Q`2BTC8wh~^)nlA;$Ic6=zCVAdDlP`ISa-g5 zbl}%s{Kz*$l5E4eb-Ebw7{x9!8UI6aDK@sY=f$o;i({RB@HQG4#6p-1AP^4)80p~K zb!Gokn)w^mdQ>++1GJ-$^{*FC%>&9gOnqMmNpubRJH)Ka8H{};W$JEQ??p>bUx_Mt zrsglMB^^K=a|6N3&Zw)!R#zS;YKn-cC>Mm0H;$)t!Cpuv zTg%V@5dt(HI%IWRY;I58fA9cBb5j(AwB>YHeE9#GhdyW}F`hTQ360_PWV}=5Cd$A1 zwcSo^Z^`Z})cbt;Y*v;XN&4l@{>z}24e&=Gi>mAIY>(mEh|pkw5Pq1}=KAr|r`~Bz z;OvV}sp@EBPK{I2Zw>BF!aW$FWTSE)9#T!$R5D+}(u)zzGPy zX~;)l|3Y|yMRj&059wqD$j?|mnFW#mPgS;{_KjNaY!N0|KKtzM2k%F*$u3()s+(lj zWneC#JZzNAhbe)ww-F;P(ZYq&yQEQnzchxbI)(-hO965aZxsCg9&%Yo=6NDBzl)dG zPkRt8*{JcQue=*5(88RotnAqucjL|+E9w9t(z?M~zlb75Et;2?=XX^Zq}gvQAPMs= z7q0Ksf~uWKV%{{AW~jgVT@dDhd=z5U&3o;JWf!>j9XqDcFJWM$;o4z~Qd*s?|M{-1=O0M!%(a{$stMoRG3pQ~R79y-XcmGOCT z1*`qY`q+m^M)WN&Ot-x=q=zrzg_$8vAt8c4RW{6A(qgML{&vi~|DaK6muV3Gv=k&|Z&!=MQl-T3R%n^3>XTlXV zLv-9E;0is2Lxxe|NVa5d3XO3dcViM@zHcP8u*JdsM`Ij91Wm)O9nc;~R%`FMXHjQD z?qH&ovTyh^3nQZks>uf2*n`#uKo6ssQozORo4sp2{FzhSb&(074lzOkjl#41Ns&YVMqB_nFdS3g`FlL7c5l5(aH2bdiV#l>@1RMdWXAoYfP=UcdI#Anvb zwcXG0vk5bZu!O#R40-G8otkR#V8Pzny0N`X*72BPVuBT0sMj0N%k2J<@z{G8waFH=!H^>c*lvJVK_$WlE zUB{2JpxF#Ik<8|{HR~u+gPv14yl_cl2zWu@y=5<7(#TO^#CHi>IdCyqrFst9^#qu` z$r1j2en`lYK#y5>!A=ae3^dJ-cL$^2CI-f#q8vb4@%7Gk0ReP@tA@}J2)@Cf-qH33 zMwtK)vghc^nFbm$5AhkQ7vep-sL97_E=B@-Hu19Gv@ zOH{dQnOgmJdefkCIc*xd;)Pg;9by4pRl}*=E#kZ0rUehe3+(R zxe2Vt%RPxqU?aXAg)NB55gKAcakwvU##{At+*5vXWL@E(*&SpS5ub*g#)4ZY4Ub>g zqU*{iZ6}$z9juQewl7i0fGsrNUSCIIDlK)_Fue)`lFZ-%Ap@gtW@Yt*F-anWMql6V zAub81PYiSia!UvyglyRh8ncJJ5YIptxPkQtB6gS^(f6(1Za`9zo$n6=1KmS)BXclh zc7TnIt+`@)Umd>n9e6jWBk5)5vJs~##=0iCEfOn-7!z^Wa1%ZuC+@Fz$s8fOd}Uk1hCOYv6fjn%6!%&M1ANo0dX=DzX_g6s@P{ z82&B0p}cDEf5Q|e&>PYX%$BeV5N86t(tqUL61ZSYqvK`)I!P|?Q5Mjw$qmwgv2n2|qbB>4bL zI~e>cK*1r#ywR(y6-;Kly_WAJgd`?X$Cp5eI|^Jsw2QaPd_b50H~kWN=wp?A*RiT} zoM%THa_doJp>gXX7(gyquQTjO2+JQD)`efuu8K18U#PKan< zfVwt7Oqp)gouUbY%%;a0$N!YqHSYA+(c-SI7wd8SgAkR8+biEH@?ZR})jTHG&;9q( zUBH>{_|w}1gkbL=I{Op?;BZYz$to?CN+lfc*)$s&@fc*BBX1v{C>5EtiL044{;UM>x_jh~GpP^~guIe$_ zE1y$%m~XYMtPg{pi>(A#*vYVzBl{b6I;p8Pyg$3X!*%~gs}2Efe)WoYM-iXR9TKZm zquF_CSU6uMMUQMwR2<&EozwIDmJbKxbmxD*Gm%oE^yt2#E7gcB|8>@wr|Q36X9j4+ z`<&!vVHUosqv4d_qaj#k{(3 zSk&K>fl(yvVYHQk##ZY4kM@@)hSK#LV7PQo!;H^)`R4e$QRO6S zTUs7P-Uv7$5MJ8&Q1XTh?}y*v7H&~KQ#R=aOB$5hFTNF=Oo=-+;(}YpKI|mN%k9*7 zonpPF^wU%2$-?P5tEs^zYuCC@?W19WOK)WCP??P>_0 zC7@{&4GJ8^aI{eKS*QPkV3Xu1zPcBr3~r36qB9S4*VolmK$%g&lm-bD5M_qoFyQIe zCtdsl1IzEO*+Bd%Wyyur1cQ@X4HXB>Z?ZO5@=edKFFHWvVo-GTy@3*V7 zn}vOoo;Y93X}YyV-=cSQ9^~{4-TTii;8ik+3ZPS;fB*2%A;l7fKt(mHCx}!)`AnEz z#)x3VQPXW`ao%W}R1^#(`;x3SptaTkrCYx`$;aTU(7?%t7I!;3-f*FRgp%(P-5j>fgo~Bk&@m0Nmr|s5`^-h_u@;}5cA#v6Gadwx} zS#!ZY+%rz-+5n(mJ`Uj!5=_5tW@37oWdY(fIA@LmP5=s2oI6m=$&Jfp6PBqtcRpe5 z7Dk@QWN2IcNF=^0-~=QLN@-lz3u_V$^!<6|I*Mlgkmu zEZ8OgE>Ejjgfpo7uAh;1mU8aYygVUltRhn4kv|+Hz*#m-9s*<1k+6p;o3(2=4+W4V zay9qmBYObj0&6(Dt8l^w95G@v}B-_rNM5jaiu?NtDgdIM^Z>Ny( 
zGhLR{c7ZF>kf?#3C_M-)5EqApod(U@(m26jFkHTnXVgD77N{)shlvys9_1GSaG1S# zm0p&p{W;NC*3ELaaasCPHO-B!M~uru*WUj8YJOoIr`!gQE5Wov)aQ+P_ETNmCKAv^#l(I5{MO*?bsTj9-xkR@1q?)2h(ZVmf|e4E<&B72 z2^KD1NREp^ts-ZkkO^{ead9$Ksrcl`C!M7b1or_@`oU-AaYzR;mqIu#BydHM!obAj zi5*1dBgk!M6*lvM?VyhXL`X6n_JJR%)!_ED>(`e+1hbszIe~Qe4!%N9=I35(%(%3s zn*at5kwvxo!jm(nj~)uhx^{MuiG>G8dqC45f)8iNr-@H{BX#g0q`*kkV^+Hd<~j)* zVRQxBx>0k&W=y@r*geM|<{?Z9nYSPLc>Pdw(uEC#Uw|-dplIT(9umbz8Gs711E+PU zZLqqw4Ru-pAx7dw;dGQ32!Q3Aem=#fM)tYS|6l<*>O~bZjYrJenb0<f8dPdxp-hH>eF)r@1$ibx<_rx<#s*Xrl;iRN!#a3>a8rsVBx9k(I zU(0i#!srYh3S^OMV-h&l1ki)@Fus<*zV{C~u~RbrLZ*rj+u0t;gUr?keXH%Qfq&vV zy=_%an;2tspUAEoH~g8G)R$6tVdu^rz)OLR*WC9zU1QnDU791Na-I1<*pBxAs*F59 z#4vMTkota>Q||jcUSB;CBJM{gODnY1>%5;Szt$?s-5?GR={<7pzOTghX;%6h<}BSj zA+>s3Do9!&^y2)? zedKr?2qrNrgoH#UPyiy&ioCbsD^ij z$}4t=G78?PVU8W@hjEKzjOrt$jxA%MTH_k+ov)p}N+HW*#@&OG|_nV7%mP+emVujhTW!(}rc&j&bu;~5O* zRoeO)9|>e466Ap(q>j$6SIUt=n3%Kfo<(0HbLGQFEKa5jGw2xsP5jaPOwa`9fIqp+ z3r~0x5)$^okL@K9Uo2ulYzH#kGLi?igr*|}lV0fgbXk%zNq&~@h7KY=s^@QOZCcY1 zQS#?`bwPTbna&g0V(n}8Ut1MG2JZiJCkRx;oGzv@pcWmxN6!&NrXyfQKtxGg0-fvy z5P%0tO(6sJ&@GD^8a5-_M2s&aU8+*m1;X_qSX$yh5TF@BWJd&Fb-howoX*2s8L@~a zW@qo6+AV{tb^9};0p1vcunL8OL_S<~Y=u=6NL|20m@}tG>;>Iz1D=OkM;%HS8ro|7 z9ekRk0red?SwiV44eV1`al{-%i(6j8qz5^LN!QSDEoiq3##2ryD-U&RNN$BbPmkI~ zi#LE=$SEPQ6Vnbd7?*)hF^MS_a{k25V8Aau#DGb58`V|_!%l(yuYO{CG9S853r$Q1 zF-813Q5cq&NFN|pCO!-lk{D!Yd;9JG`YyaM_wf!kaO@mBppU6h5xm{dck$01ES|N-_ zR?y=^E*{I6ZzBpZ2~pzom+dePDv_ih_9z-Jn}ib+$}%}?1?6gf09cWXr;@=R^k`Xm zc^YWo!Ft(Z7z-x4fM$l#=9DY%??T8T#~p#>v|*f&LfRdPAK?0^C}^~r@2&Sh^t_8h zcj$q}$`v?VJ>gMN?@Q>!VNlK_K}AAdoU-?Hka#N})Q1~QW`fAc0OaTt_$Vg;&gbW} z^PRk~qI>YhsB{}(oh_VsmIOwk3))DePVG-uAk>rW0Y;yD0Po8XI7`T?!66em#>T$r z6S}zT5mN4wBp#RgF%RCu)6{_Hn2CfCE(6sJbJk*voG#Gu>bq}ZTE+?D5)M(UdOPRKGa*vWGz;$xU)9MchA!DXJu)3z^Iz{uwjaL=U$B!Q$h11KocW=De zj!(6Ed@Th%2amlr8)-ZNeaIPmgIPZS=Y)`45*h12qh6T%*4e2ET2WX7z^w_j2(b=_ zkg?7yJP%H_K~KkTAkUwMLh!=NIC0O>DNxk%T!y+Grox%~|pFOnb7#jRZ)fe^iojV;;!3LxBB{8vrkEGAaS z$yWLS>4}4&>M9g`2kOl7IFJd^1?E zKKW_GkG>hZQ!8QAg>DNPn=De4o$ebN7G6@Yw@-Yl-Z_(<6ss?rv|jeg*_spYuV(u; zgv~CVRen+><2Wa*JYZ#WZU6YspT>Cl6sc%#cer3>dXU zCM?VE(_~x=T5TV;7v=T=jg*7|oXvRD^x~)!bjLu9vJ=D(AGjP+V)Qm>ikNXmpCW{e6Y<8Hn_mnA_3u#nVruIEqhnNO;>rN zbu{yj^6AsPbMC0Fq!wZ3{PuXkQjqfAFmDs}AfA;v8jE($wCl9LmMF!p9MU?Mawlca zSMFR8+v8Ee$ypZgX)E1c3(4}ob={R9qQm?iG4sfLD1=@JoqEvoV8IoGZlQ!FZr&}M$--7? 
z#ai2kekc>Y&rb5)+qXTvy?1aojq`?vhiRjte0#h(+7!f%8Vlopuk7Q$ET6w`BYHlVBQ5&cL3SXxgu3COg(*qKNK;f)Gc$W zTo1_6AY^U8B)kI`um=NEIsv^POT;4R85`S3q1l*BKFIWYCfI-4XKPi^xXsWZX+c)t z6l{H8hg2y}TGEB^I^3Gdv4xQ_vGKewtsK`p-&K?OtsBfUUS2X)K3D?rh@dG?uL@4b zi2bD#UknpBN^#zhTX5xHzT#VudQamfV4AMVbl%&Dsi+={<-lXc{T6wrk8uMSD_|G> z|FY;Y zvo;E4VrB+MQNYmb<=&>>J!MhGhobs#4_YP-&CMTCq4XB+4M3lOP?D1IPZ z{ZI;^`U`8it=sQ+rI;am>}>_TAjO49Zq2{F84gQ#YsduPz$ydI497q5jlWMiYG?$D zeW=heHxK;38hZ<{uDY*VbW;-2p_BqD4HD8I-Q6XPfP{#I2-4jp-CYVOAp#0mq^Pu% z2vUNGQVIy%xzY3gzVF;~&vSiveO~0pjs_My!dnvSE2XP_XZ zVP>QDJ1*eT1iA{eNE-+Zlv@Z33;PU^(kmFgeqUuDAXd?&7j(XSpn*l96~McP#dzS6 zg1Hq?m+>fb!p4RNwHI$7;hKZ>4OE{fw~78LY&EFC6Rdw1FJFFVwa07#Id7rQk4E+9 zwF(uw{3F@FDZ7wTyFND#=2u7-$lMgd_u2`tNh2d<>1he@RvcKXo{DtE&}f(Eo)dM) zM%NR_-cZj;6qtfXxG?Jw$0Q_l1_c;^J4^u09;y$5Fm%X=4qRJI4;}&21V*k9Yk6!yS%&SPc(&lQ#jd0QtxV1da52cZu#0Qa`Xl2oh{x19GoeIV2+PV`JQx9;iMEd!o z*=@41odeYvhgwKtIv|hJFkM9PyKA?eho2Cn2f!a*h3gGtR!jd3TYdyAK2TfAbf3!K z;msvgAEOLXU9ZQv^pA3fh$OCAn5}z0xVt9Cz*PF@!>`fbZ-2IG#C4YjGoln546w6* zB3}f~=`)`pm+unac!q_bQ5*XzrD z!l<*ugX91MLvW;?>a$yR-ytdbsiGR($6|AXYVn}Dr}stto%P!#=J(iE*omq--#!`I z{xK(q;Dkn3gWKzu*3ZR$pLsdd{rivp1*;YYxNJq18f1J8*fGZs09_72>`^+bD6(vg?e8 zWrVeNa5VhX{kvN)l+;J3W>jjEH2!Mnj&H)LyMHSDqQuA8_PQ|Y%n8G6FiONgfVmEd z?9LHKYjtFys3OKp6OgV;(kQg;A!M7YATzqh0PGm5xzlJoCzXR8`L* z6pTXuZ31v*PM$L?aKJ+Vl2ISuW7COGP+m$4B8E(`*{yGA;Dd603GiEkW#Tn}bdIE2 zf3bzA!OZPF8_2#osfT~7sj6ZkkfNa~QMiMo-s|Y@9ug9wp0&t5I`CM5?RrRz|#~w!zoKnD=CE<$|IJm-^ zr4Hse{8_s)-?!%gKWl6h0HrrAV?|ifOUm~JpC_-$Jl!=Uh5yYI6NijnlPT>&i}l4z z!Yy%$FN!sqmGB5@FWH-356oz5*5B%0@($! z5PV>`0JsMv{DP!pyC-fesYg%z78>?J02(m5dy^m$k7DRWDoE`)Oh0eN=YJ$!9m=Q?k^}6 z0JOh0;A9YkVu>nDm$icFG%C78u`<{VUI98A%G3T^c0-~alx?t~p_};&>Q|6!e1K{z zG$Dj?G1@I2Fsne7UQpAIaujXe8}l&nd;!dD;OlSr94LwVdFQvs={2nMhgQC(Dp9?E#)vMQYA+w&ou)_moPwqJNsQjDfmFX{BJ;Hn0Io)#1N7G|FIYEkIw0OEq_{5RrG%FBxfss;nt zO_XK$35$gcM_MtY-A-G}i;G3TcpyRW?tKC9vm0arsJSsZ2Z7`oZ_R^b1SO`N9Q*n* zB~gnwK&EK{ikAhw7=Y5iDnKD%iPsTeEa)LC&D(^~S}725fk;GUW1ZYasQg(i+oDQv zOEAvxms`loda0W%Xm!M15sY)w`+WBO0ojW;anDH^Za)iGj3b`zsN{Z+seP9fhw zaXXGp4nwU5sf|$OP-(cNe_Wrlg+KTkpt^iC_=6|*HxxeyZxUq-N(MlPN(K@bh^?(5 z!~~KFOaj5B@`33j+(OyoT3}8>_)dM_5g`ok_7MY71X^p0R$xlYSXQxW#gH_+02)Lh z`5QqUWFZ3x2Sg6EADtj3`?v`2UlNBLV-U>7t>IF7JZw_VNSh*CGiG?=mngjv)x zg8+T8`4<8M?1lRLLGdF)`C_Ds>5FWs0AP=RTcO$bAlcgirm&W6x4mN%`7MV?*FU=KkTpkaEGapkmhtUO? 
zsHjU$`|;5|^?V5$fM^u_{3PHC!8$4&vNSZpLLiUigTf-zCLV-)P-ln(7915`&!Nm0 zMVy|vOq_sxQ0(>s>KR9t@WG`WC3hh(!bc0-K(-YQYQqNuTM*8m>KK#=K(%maStXS0 zDSS_av>v?%l*BcF(XIoZ zU2+j)M^EXoG*($%)&jWQ z^=18mdNErWax81Yy7Lm95GQfGzN>eN8xDikJylcxMbU3?Zl*edBtfp0tw{TAzu{V2 zY0t8oT^?)g!>zCj9Ztgj08HA5!Re!DPfo83VbqOjAH6sKVs)$m%xbFV#VPq0*m3J2 ztmIH~~g>jb;&SZ~1ix+lCi8xM ztSVVr$n`hq)J>syz!>I5|0yAbk#QYfag=_cx~v^7~7#&mFz{ z;YM0gX`w_TdSg(8g?1!?JQ2129#*&48Cmg*B4z9SPf*jKp^GZkn+C_W`d zK>u;7TL;(G=7`!hNwo|hCTRAvs3WOr!qU5+~;p>;s5X{$S7Tv#HlP7yqi(vj3f^|?>|t?5_n zxR;|^;h?(U##6{p7^crxjiW>n=zxc9_Tr;jpWD3N`5W;eMnlxp^aNdQ6< z0tHQCSFiO`NM(zyI5v57w8wTt=OnERt5znw7Uey?GO8P&5ac+XuK$pl&gEg5Zn}p& z#mP)d|4Y;1@NCuxKl5YMV$LpKZS1399eoV>n8M1mNA09Bb;x+A&>%H+7-@+W`jUKx zCuBwFBka04yf~n3%UzY`QL2toSA3lNiunB0cK0O=Jg{cfwfYxW9SI4+i9f!#_A3>A z+Lfh9wA!#iXOc~(tb8)?tDu5pWED+>4iIxyWhA7|Fe@9J%-j03IL!A&+Ii`UZ zP4PHPQ=!D`;hjaaNqdwZDNT4^tpsA=4m_C;@9z9ycspg0^U<$OxtKy&DpK?>#*GE~ zg7ASaeDFdX?IBBUFpbkg=KszLc4kMsV&kU8h$ppiXI8++Bq+x2r8?!Llz$RT60n>N z>WzI$X=-wI+ir+lMoaWmL}*jmn8KB^I$pmWlW*;*_L&kCvdTEt+s?jBeFWF$qzT|} zg_g^l{WPem;4Y{No^#}JvEl-eC{b@*_5rmj?xo>UO{z`R*xVFF>Q3i{?_j8v=NpD3 zFn!N_vcr!I@O@(276DNd|GuEPf*l=sNS7%;13G0gbBl_x(<-$YSb~}5+-7kV-t%y^ z)1VqA2|w#>hErjrFTebK$@KI(%9pD&=(e+K6_ z_*0nRi1Xq?e20#2VQPdNtXT0$}?V+Hog-WUvAwqo^iNRF5%0Yil zGqoU+`CyrG_ZJQI{p}1B`A9vB;d6QGdyQslMKX0?CsW70OqlL7l^NYC-CiXMe~2mA za^{2}eFO#b($Z1~{UQ0VO}tg~k;hNrq!z;)pPxHTle=j-Z}j#Q(wNHKc|}d!4WZY+ zz|Iah`Z+N|lfO<$Pr z61qpejVUU@u2NL)3#JaJZ(@E2H0e@xu|gz;v8il+@(V3G=*<)%%kd_D_Z)%*AN=|{ z+L&00p0QN71kW3s>*X_6+|GMyU`ZFXru|JJ(=fva8N>9QDCZJLb+ahFHS~&|X_+E` zEM?a!F<56sAat5??3WwfPbken^`=DyUIJ z-YKo~B9z8hMDztH$wnp^S!4&cV6CW-D8~LuF_&;(nXug-`0~=1gLCrELWV0VZi_`+ zMXjePjIUi>S-_XG(|8|$Hm<2Om!%vvqE}8UQVSCgK4Gk#OX0~u=yYm8~I`WoB;E&PZ%9860GXsAcNz&|1hc7 z&3lCxeB62Z->y=hRbvbi5lAgyDMN1=Pp7D7;LUTL5(E)E!wt#`{@V$#*I*OiT+g`t z6M|+o%A|bB0WCVOVLLGM26!WIxNEb|G)?y8mzhs!x`1pDAn*yoOSS!WUo_9`ayR%< z{bP@z(gH+XSaFVY#VMq^?9UR&uY(3)BsB=ykrmMw&zWwWDSxM$Oj&2~v#6D-&LRs? 
zh16^lDaUMZJO&N~^;1A&Z%`VMK5X|eQ({nHjYK1^`a#k3q)s` zqgKff@~9e;$$D+T2cF){NKR+}G7LOM1gAGO3XJ`B)b(Auwtniw8U$*j;pkY18Bk#; zAq<>EtQiaVsRAseeQ+^MAMu%}J1*sL91{Mz=mJ42p{4&B63d>EAqM-te@ngLcOS;I zvmotFdYQVI;1rHu=-zFXwR685^3FL(hb#3eh#@F~egr?^28siZj}$IbfQLFLDlaaj zklK(oNMIl_e#F90TEK&G%UQ;1Z~xuL#<%WMjVE9_f8$K3Eh2(IJX!*Po#5!jGFgE8 zZec1Zx#jMe7f^Bx;8KZ-`YHrm5zfWPdGb~7d**C~sUUg@kg8h&qh#E6ha1|H#8Pn@ zqQ*DixuTHVk0vKT^Zl8H0*3fZTUlijd@h=hPyS(r-VD#Hc+(m{I792bug`}#X7T5$ z_VTI*C@;#PXKo_?ru3ngf~r;5<_19(<#@JWca!}-~XT09!3HI=f zJ(%{nI+h|LF!J!g-r$b?VW}e|l_>qZzq@`KL0cuEBm&T?@=8k1=8AwK%73XQ)%~gN zGqWV7Mevr0EAxCH7HRr1Nf2mj32M#M$i#l-UYqv|TL}fhwE8PEIj;xlti^xKTqcgF zkV1#%TglGspuA(ORkY>K!f2=!P$|>JBN(mZ=He5#}Dysqx3N@md0~7|hw0Xe#QHEV|Z-GJ&MS`Ql4#Z!*l8KAV z;;&in7e&4k<5*kaGrcxrD|z)l*Zrj}xg5RT(+-MzLm3x@&;GI?@z%2)ynxsDd%?JA zda%oCk+@rinz~c#{?L^*)nOn!qQKl^9_2Rxpa2jPT5W}bVz6t( zLXt^iJdxLTNM1n3^8&+(ignTA4ZzBbq+z66rP|WI`vOuoHT6UFGHnY9@_HA0B>ln) zE7t`G_UaKZ*r-_e%m?(9(a*}jidIP;2y!W(B#gb_fo!C_G;1( zos04Q9(C9UiXch)!`7b%?_5=xbE>!^wR2T{`g+9mF?5;s(Me#}`JGZjuXh2&1XTO; zv{b9X5V&|&z<4pSmFiC4n0@gOm)l@rS?=@7Z~Mu{2|jTUXx-RwNThQzqmeQ)Ngj5G^jFz-j)h-K#V^rb4o{u4`o_dKXL{_us7;5u3vJKan zs1-*SZ_ncQndGp7Kot8oR%V$HN?UKeGV72Ao3E^`Djb7`BKE~sGJ^qQO8%@dN1OAN zF5~A;`T~}g@-aVTdS&;MwUYzUJ5Oz=Z3@W~j#GgMad1;pRYQ?VlS*bXmSHNhzPB~J zC)li(eEMVCdI`4NG&i5I$)UawTM3CX&@Qq)#1qut=_HYNCI@g$}|s5Nz6K+_r$L%5G}dcBteM+3IL`0jHKBwXJO z4hG_UhCC>j%qpx-4 zxV!l`wdiubqk>M#BQESmKkv>sdVXW>d{_0EuVu6L{cpOU4}{{EDB!_RY2d@a$C>Sx zyJYkunL479i!v{c-&S;%_k?^MJjNvMq7#ongRYt9uEJcO7k)(duqh9Z$;;f(7=)0b zG?%^P6`u#O`;S#BY~tNp{Ej&32;rci#O>J^wo%XDRRBy`S%59o!jCED5kXs9CZ-6!;Ba-T=VBS@o0Jwq7H?J>!t-uy6cs$ZcYkiy__ zCAD4DEBVHnyMQ{+8@`-JrM#|&Ga(alJ%-&U0Dignwy$CUv+K^%MTDbI7Y8tRQxgA zh3XcSv;NQ9cz5Alpn`fwCGVXIna`Kb>;$epvw4n(%lA;sepCt^dY%mSMbw9_UXHu3 ztbOo}CSZ~+RV#ke!^$r2z1G&(H~O`im_5G6q;HEf3vfTjIPVy}CDQSDg&*#==U1xo zMB`+7gbDpIub#;mLN{`}Ess(Ae=>jp2Z7r#GKwW!bl%=*i%&Rh$@`9gCW^Hkja z*+(_5AN?z3(WlNX_`g7=7;1!A8c1>!4jABr_4UJB0Wq^OIN?N?rbkO;oQ5D#VYr>RBEO-k2SE#T5^~gkP#)lbn#H5=CU6LLkKRHC zk^>v}7rw9P#+?-N-lgx<=%c9#EyS7OaJ*taxmArtqH=9T5RDk)~B6PlhK=;q#*oOgx~tl>`i!LhTbsZ9e^1{ljNe&|L3) z%jH*6%Qn6 zGNAo%_ZnIh1jO@8ATP9`LIh>uU`(hb?z;Ib+&|G>Qx=W=v?W!(^!XAwG`8q0;lw#E znDxKE`{#R(XY=RHsVDXCbv57R2+%nk?(-YCsb7xVF1UgqYhBi#xwuq~i)}uK$v)M{ zwMx0Ub~WxhsMXM($0&awYZvE>6crWy?l?X;dD^f{gr%x8CQDo0Bucn)-+mBZBRH&_ z^xJ-}7CHJC>tgKU(!8dpz;}72(8Z-}G4ZsJPif^Kma~B#IH)X>OW&ymMf5=^M3gOx z9*M@FJ=bw@M~@>DA^RJx2gsh2SDoZv+kRloh1D#!yKA$U;IJt*?%{+Qc<=>8FIT)UEYRvj= z5dpt+8!i|M^8bM4H7a333ps#_%uqUy&(x}YCSUr6m7l$21U9_g)h}c(345KvW4{8w zjaqAf$~gka_E%6?h8_&HnF6JB5m0xF$x0`KSDczbglJ1VPn-%g#MN&jIN8ROK3oaS zalq_|z-x6IxHM%?fMao085AHszCjc#Gie2+7FD!&pMdi-ME-)F7I;i$KU;su=j$_T z*i8}=pb+?}QWt}y_cG~oMMoCshUQijlgN3!?}(>#v&iChn7(J5gakc;=Ez{!Nq12(4lO80_3Gf83cbZfKGU#-Q5JJaY?a>1=S6jZRWz6o3aY*l zY<2EMV+bS3bxS>Jb7W(@B3=CvGiPY`P8J2g+)_CxC z1#ZPVKB?Kjgra4c$nm^Yedu6y9=^dqP%aUy<2dep$75>M<5gyeW%TF5q*Ofbl?R9W ziGPz|zozz?=Z>2nkIlYMyNZwE$B;>eO0o zsz~(f5fjGJx7*bMP-$-sI&x6AIQc9_qQ@|xA{sOQiubTaXL;J>zDo`ts>Vi;*b7}W z%<@LQxFIXhW(TN5U1AehrKnhwKb*TNs~leO-?*%TfR&UXo!!8ePtS0SH4~r&nnUFI zfdFXUPJ>pw8z@FFb~{B!x1+QOZBhA+m*AOPS)7r8rf5k!#{NMr*Uo-fp4+jSF}uT3 zZOM)&^v3@-&7<4A0Ty^wog!*#`V4#_=tlEP2Q@`O6?Whx_Ze1Hr;Xe64nI_fK?m{^ zwAVHO!vKc_J#6H`xe09-|6v`y#n-PbewoZ8Pe<;VB-wwF_Rr$rRe*i``O!^dG>?|Q zK(Sougs#?pl0zdCmHOMY0$`q$swCM61+z40VIQm*=)qA@EtC&}=h*c|3G~QRDYe#Z z4d>7Pc^|OPqTw}3sgr#?kxI|tOkvx82GJ|T{p!F!);0ZcMfnc@^5G^PZ4h9Jloqz{ukf0Re zCD7=o=_6|9geuWN^%-Mk56%0)4S(Sn>UNwzNRZWoCK_cd4XrVQ0Ng>rKF~sEU9t?{ zSgLAg3Cm2E=_@zM#|f;aDr^b9HFSDV)TO!RwAC7>Oq1{$ss7f&Fg`nxoyp#haQyzh 
z@k2{`-(3OUk35@}uQUbM`5BSm+1VI@9;C<0JC63_X%<{-1%`cE0LOXUqMT18th*A zvv1@WDbv@UC@)rcm-TjaMQn60fV@1XPCN^v6WoIdGp-;Yi0uTK@aPa~tOH>{sB0vt zIV2h%?!GLI!BFl`g-ZX3L9j{nhO~I`R$pjH$KY|h3fTrb@gG%LXo|~JV|s9L;yU&1i}} z-naBLQBsLo`dGyQ{UFPf7Q)AYt1_|TX8>9m)GHJ9XaV`>4@Zb#x}oT7Zh1KpgfgRr zS@#asn?dLm_!;(QIp{+G7Cym%u@wRUd}@Z6snxxSg3hzOzPQaNjMSidJdFO?UT}5p z<=yPU69licLJNyvSmE78D)dqP-7NC%yP`5!b z(2Dr=yT0cXY$J9qzNQDHmPiO3x=hPmMmzUsC$e{tGX6480Ql_3lt#nUO<5R@s5>oK~KokXzab$oVFpz1eTH77)XrTqhY zAQQvc6Z|$R_1Kt_IH?1d$IKr}1CSj>s=HW8tbX9Fdg^5;^EP%gDYyE>ak#!D<W;LD10P4uw~@!yf4!at^@#_jC~;@rcOCYQqW-T#?-kS;A( zmr1=%MGZdeNWQpOW32sX-gAU~`WBGFN#s3LgbmI4;nb^A`q=4&UcOQSI<*i0MgL<+ zb}rkq;jx}?UgsB!c$(2(l~0Vpf>)*d_kI9fEX6)HYcu+ANy}_o!nz*&kHjOY?2%HL z>-bC&-W<$w+~`Cgp`*Cnp)L398IMiFxFL-_zZGOGCy`#FjE`d#u@)T(CbwyYT&t+;dCg>%sjOe}@#84Y&@wGbPl&d<4O7`IufTx!H3saJ9^BwG zi#4prarzp69Um{9JF>%toBkX<*>^1sbbY?~ca@;rnU0iu5nnuH7MVK!L5ZgkH`ePeOe}V8HCasj1$+x?^OI0bVKB%tJ6H$657@ixKw>)EW z&2OTFtTg;o-9Rv2{rI^&VB$WAu!kkN+cJG71v?i+fJ3uyysU+o<_oPUEvO#GbDBGM zmbg3J-Q!ws%5q@s6VseG$W{(7*IHhwufK2na~LkYCzv{6%+QWxOAW|?sK4r;&)*9_ zVZWb|FW=T3#bulR-8uxh+6HZd)sn~HY1q4@3jd;Zw~+Gk zW$`yy-65BaYXAGv&sfRa8o^S>YtlTMpO{4ODh$nk0Z;!=O#=iM73iJzwRw2n)-Xb_ z;^Sd9`T|LQ>NKatjA64RM(aDJC{YNWnbKmbvgvfUBy@|?9F}hcW!D1IBWm8pVUQoN z0GG(IT^g-_qwn1ySk?&MSiZq*8;2gNeDl(Q#_MPEwYRH5SfOknuLUDe@r{1n{?^lX zgQ|-&e;`p&kFLF4$`3NQ|02NwL?{g@Vbv(rMF-`7dOY+^zgM34vteDX?z~1xjFWbS zwj`QVmn>^t@ZAtO@zr~q!(pk0X++H$vrHh(RpuZXCD27>N`$1DMQ&SCqYP9LH;#bP zFTY+pE)uMb_kLqOzEwk(g93d*P(_~Bz4oCPs1qVu(?*FdMMzmaWX44xsbC`Q++T0D z1|m7A^XVvc0-s@(?L4&W!Gy{M2(iMTMhs0Zcw0hfTwEw+t1(-T&J55e#*3PNp|v_HBu@2+})0xA935yGr zhaVTKKO~yV5(cAh7E1G z?O3wpXbkY}Dv8a~Te7iBDkt;_EG_AOl^3Y2lNZt|mWj&-Z6_%@9OmQ%em)a8$Cg!b zyN9a#Eao?C|42aNUP#~h9LZkjaz{9n_F)ukWUoGoE|>(MtIRnf&IV8(4lk80QpB@Um$bh zGP7EBA=R}W^zQiq*fNLYP5<<;v5}0E&c3DmWPc@g@zE%YfnFSG1M6Q^+zAA(>w|Zb zZ3PPwT1*%$(2Rq>Ou<#TbHDKiDf}Y|6TAM)WDC_Np}()Ym-*=UG&@5~F8%cCBmgZQ z-FNUH62F=A?-Zp&e0C7fmxJZov5heqrO?8x1IyrIX z)nNIMp!F$@hMV_Yno`>jzKcUC2H10Co>=GOv9rS6gT90biY9=m)N_bTB8Y?;Wn`T% zkSFO+i}nD7dN9;Q3(k>xMRD*Q9y|I%7>3^+N`*EVM~4SrGAsjv!JY8LLkv#)c4qB} z=Vd2S@(VBmIk?9h=wxBv>e()W`cE7+B1^ z3MVQx23s*3KQ!>`!?*2eyM4bvh@~NLsL^#s4E2+Nx(cxYsPiWQ`Nh%usj0SH59Ond z=LaMS)$PBejwXq&3#;5;j66jiPf!#?S##kfQB<62xR`bp@t8OVQ2@Js^k`qBthH)k zPvVqJM6#vD;|RmYjytDyxrJ+NE<(RD*%d(?Tc#pi6*ql#OzSP%UmnL%CKUB;k0ux# zytn+`ep~p~JJGP=S6|>?@GIcmG8e)q;!c5Fdh+ygHd<%heg2+0_yF7={`#!4`_GD! zl&-)3^Nl<6ucO(N@DXXZ1r|iv<+i`?C(zzQyHN&5NFUq1&L=9NK6X$8F)}^96oBj0_WhV4wmKYPYDl!E1L#|Lr< zIjuiiJzF;Xev%(o?s3SH|$Vtfv@m)LLhH`I&g@!fNcDgRWx_-$D#N33S8@of}bpR$$V zyFi)LyU(5W*DU?}AIlGpT=U9J8K20uw&dceNi^p|FOa_E#_Hm;n?eT9Q!c8s+AG$4 z^B38mf}gC!6<0NzZtpF0?rW7Au{85Y?RKO%7kcr*bCi!t+pVO(EfH=ZN7XuRjhA zaphoz_B3ohO-nBS^zO^xMDwrT>2RtPjIAg4lft`e;{G!sp67NSyz!|GtlDer_lJvN zv@rIWRvjFPO*0sC8k%hO|K*VOZJhgfC%uounLCaEKH>LvO7pTIL$j{Hj;>u=lke|0 zq1DcA-X@kt8)L6{&X>AjBF5b((|>1RAk9VSL5WQ9zTZ(lT*Y>~&y>no&J=vXM@U|?FQtq+lt-rw#&&8KVZSkfCEYr{JFq+DI&?Ho8z@%S`)nIW0YdGzy(=osL! 
zJ$N_@XY$^QTzZACECzkoo) z-T)%%m6$2pE7)gv&_N5Y`a<8o#rgSeK3!_w*SM8CFhx?v*A3xK(!cWMyw8kZP~?Z# zul}5Uz^um*lkvqcrkvMog6G;ze%hRb)8>-pqPz2=9>IqCD< z-N>-d+=SidxuH@hc1mTpJz0kQ$sMz2za-^ZHjdzU69UFkZQ~qHM7(GBAd!s>esFUuS1md6Rn;(H{Tf=cfmy?pOnKsSADOVOlZ>%nd#B3jF_m#PFID$SF+FHar8izayD3ku zpa)gVY_{_??RIcwNN;d4yX-gO$lBi<$HeVA{S)s4g}hA19ka+-5d;=XFPnE}D!QN2 z9qhc_eQq#PZWr~G+=wYl(AyNdcI8{_V%g5)SR5qi?&41M*Sf%+e=FMj{;Chu4Gj$l z&pw3OBs&KOXQ)i9uCA^Zd-%v_%0)~!`@zMsmdz)8Ok{8oQ4Y7S4sC;?>_KIvGE^@$ zfKNU2m$)$Fvz#a9_xlk)|NNT15bPG87y@H2eO>k5?aR;q%z)I|l5+?j>VRO`4a{0Sw7Ip2%ha1jNZ$&HWe!X)&J11v!diq=2 z?1vAeu|H(*XMYa_4keWwwq68NH8Z)qxcGvBpcyZ8!nEH z^<`xy=T0Gvr^uOw-%xB`B0ayaeSHF&AFMK7UPx9tKL{*U6ENpQH2jxVo*YRIfYplL zzqxRxWBf7JC8XH-uu4rMe)p~!W zYvtbTK|U1r7r_ObKBJzp;3fiJHl#2e&PKu~qq%Ei(F=aQJ)}0-;!y#W?gmCitSas; zW~#9nb6x2T>rzwGsvV6J&EKxQPlJsM2?{$iUa>ska@6s=skBrcY)4uv{pAn0lgesO z|7mZZ;<)9}aCEp*l2Y1V5MX*o(jPl1OynvL?<9-J-@CO^x#w%4yLs1hcl|0l2S4EZ zkmHfSHaOpULlrMp^U1M`W6Btzv*pnXoy*Lyz9W~S!&(-eQb|~p_;*VVzLR5cg7t9W z=RVDz>cH!Cpt|eaDQ)!e00&{DNn#ED@gcG}G?_&T7XO_evAAuN>M8nv zmw9~?yPLmo%j_F1x2+xhJITru4#xz;9$6xYFaF;58$o{vVw(S7admgG6_ zPZa_!qkNBAV6kT+@woG8WAEhz54?Cm68K>d`Iu8`EKgW&IX07(zl$rK^ljc9_oF@g z->)WCRAYIsa-FC8C6!M?In^VQ)~f2Xhe$JHLbzd!S&_qJ%8i}&M$f!m@6MjN~LYO6GcKA-<~774m8 z{qHA?fjM9 z+(Uva;{OidRUun-T}$srCoh58&dyFBh(NwRKL3lR>|wil%67)_!;zH-Jc-I)-+$7s z*X`*u2)3r>O-QP%j*>7=gl!$=e7lkJ`S0xq?`$7th-%1`igb(|*)^qfDY|#u$syRp zRdh>)c}Vj2F)!Q(D?a#xErW)j|9iBFkYb}j@+f zsVy0WF@gkrvvX3~Z#cQQq+Wu?Nhb98$hKyDd_264hLL1OJsOBrjkd@AflGTzK^ApJ z&`(MqtfX9oaz^J~o5{(A!dvb+{~90GK9h5r(}m`LO3*LN(_=zi)Ph1n9MH_+j<`dp z_gIbx#UVTIWS7+B8~Ext-K3`Fp*NH$FtL8eJYa9pFw(jOtX9NQ85&~)(JwWt;K>Lw zmg63c?Fy?KbjLT&qF*8KYwi>bhiFLmFVseFUN6N%gr4K_<@8r`kHk`+%C`MhLcg#T zrx^90*<+U6{Cin{Vh!6a9fmWaL&Zq{U-23G{r_PInSHz|Tj;UGiS$}&dGKTf{CB3NoYanX+pPF6X+6X)r?@CrO zs$3r`DlhLcyE9>D52ElT@b^j3c*IBP)FvP-oC7_Ma@xMP_JgO}_{hje=H{k5cqkN6 zE1fqldeRT1ZrrOmUPv#m2nq@=d3PM-LB}dSxUai$7ZZV|x|0QobPjMbbSm~B?3Zb- ztF09R6Mg%mKYL5yM`RBPJqt7Q8%W>Zuvopu+0?aKKp+1R++qWVu#N!pQxqV8tz*_d zvD+4mI0nFx?DFc*n>TN=T)8s*=g+cKqG43}ib`8!Q&Se0*=8;0-+M<10tY)Af+lk? 
zm~nvyA5NyGrZ!Z?%(6E#;7}gG2zi?b2Ojxd3HbB7vdgh7Rz+P);%C@ipHcge-3)lg zIfE%G{hJ=rTmNmeTOS!Ov1q_gvq@k-GVY|7LU&!u?E^A$1|N1__xph- z!<9@?H9f+bBE$4V5^9fqB+i2meP#bXc;0-4n`x``=Q4gI4K>;`7V6OtS=Z~qmhJGH zgVZp%CE1V`3BqYdM@N0vYMMQvv>!dGnD2Ha2umv9Zm7a!sXe8fMZy`1E=8 zb2s-;mw@f~)$I>*KzW)9uX!Q;5`6}Q9KeD?%wKFw)WC$hx&0+$KMadcM(yjz?(+dnO?5)&} zs#Y0h?$J;}4Nd}8m)EqQ(FRz_;`)CBXS7D}wtl|1l+y|fW(ke`+wjTDFiVF)ZXpPM z7qen@QUalmA9dsQrq@jEbr<)L$Db@@4~M6*Sq&6&|3am^EogPqqbXR{@;eF z{L_T5$_3~`EIi|R|IWsz^=r#JzD>{*iwZ0z{id}7_w~3f+s$D4rM>Fw$t4?Kr9EYb zw!#6gE9kJtG`mhW`#Zw5ZCwZqK$Jnx@R(4_$#>J2Mk?SLD9BdTMpCvx^Ti1gDE5?; zk%l{tPEM=oeqt%!2jG%cRajW4CayX}ck3D1_90$x_%llCmR>1zs+IRd8@m#GttX2T zhZU@A&Mg!45!s-t$$?Jv+0e}-tF=|kHjWP#AOYxXwSBV##`k`9wv&a=B9-2?Yin(= zJDBdU$}3c^ySHC|E~`_yz6pMKnGm~g^;}HMtghzR0td1jXdb-cI1$P2nTzJxXDjmYZC=`wTEYu5&#V}nLOK>Ka6JQg0F_iuBA-{*qC_-?S$Aj8oQ z+>VX2=Wgm48{dF_ULV)_gs0s?(Q5#agdms{@+H!ZUF5-0blpruCw7(i1NZv_nHCZj zxadB1wlntxI>HM%0gaH5!W(FA5yW@``5=sVf03pC%a0ieA;L}>uwT0)4z6h*4?cm_ z1kmVb1Ck&D3ZKT7w_tdHg^cS#fmSN$fBXHF8R8?cZRig?ztMq>Lbb@^YOeC@(>GDgw)^?``eKG8=UuT1z5Cm1jV3>7D z1;P>EA1`nXEG;(^o)KsmaLsu99oF%0$$%qn0dngP3HUBC_}^#9P&tk6RN z&P!K+O?$NNPC74oiU~qfTC~~p>N*w@b@ktM>HO%HI?|av+HKx}4lT~G*@3xxCir}C zs-R)SC5TF{9&QiQ@?Vdte((T#xk+68`I-WD<@?}uQ~_qyj=d?&uoR?R^8c!4Nx&Wf z9cxyrD)NWog+HyRX@W@vikEtRi5IReE|vv6Sw~SZ0=lJLoPtO3RL&Lir4_Pm3_XIu z-H4BdhNgWAe5Bu=^PaW_>E(po)LDzWFTmCfdwRT|r^DqGG5t!LrLL}x*=3{V^-Eb` zS>b82?B1fTq7u#5m6@GQd$IQ+^gHPn0^^t^cpR(su6vt{<%NZxFC6J2(!~(zL1sch zT0Z64rU7!C-I@12Js&{+RpvS^*Vni8li!vG_)x2Y2V6ca zE$s+&n_c`h2TM%jE$FD&fR2^@`}`VFkEP#Z%}q_vPd5C}3<>%z>xy|Hwiq8z5AGK3 zf6Q(Mo+R4B#eXYj_S=)^5J#a4D!A2N{qy6M>0_I4Xm?i#k}4-qb2&LW;%8?#h-UuukH@1@fh5^S+j8@b-bqq@d-}6!-ql{2(b*jyqdmu z93N>P(<9R4FiuyM2C?N|fEGyruG0MW8>(zR+lX&>+K!;n=Y#rsUFcI#3b$4c+hL7m zhXvTM4TE)Ai9^0mdB0?B>naBry()rH(3nxjzCl?<1t*x_gWRMI?V1@9f(aWkxACLT z#g`{qVOXy2uarQfoy~6-14;f)bH||u7@UrQKUK*nB@!eegHDv4QVzyssxxP@!Q^gg zV*_%Hx8VJ$4*P0vUmriZFo6*5;Rd={3xaNmg`0Z-3JE9v%%AOmH4vn^{!&XpG$EtEsZyEJy~ zmLQn;CTcqa(t*)}A0?5)x11LQ_#c=h&h_6!gYo|#>k9hu|I4!TfBf+giH3y1`2K#z Q2>N3c1x@){S*wu$2X0BDv;Y7A literal 0 HcmV?d00001 diff --git a/doc/modules/permutation_importance.rst b/doc/modules/permutation_importance.rst index f2530aac3a388..368c6a6409aa0 100644 --- a/doc/modules/permutation_importance.rst +++ b/doc/modules/permutation_importance.rst @@ -6,15 +6,45 @@ Permutation feature importance .. currentmodule:: sklearn.inspection -Permutation feature importance is a model inspection technique that can be used -for any :term:`fitted` :term:`estimator` when the data is tabular. This is -especially useful for non-linear or opaque :term:`estimators`. The permutation -feature importance is defined to be the decrease in a model score when a single -feature value is randomly shuffled [1]_. This procedure breaks the relationship -between the feature and the target, thus the drop in the model score is -indicative of how much the model depends on the feature. This technique -benefits from being model agnostic and can be calculated many times with -different permutations of the feature. +Permutation feature importance is a model inspection technique that measures the +contribution of each feature to a :term:`fitted` model's statistical performance +on a given tabular dataset. This technique is particularly useful for non-linear +or opaque :term:`estimators`, and involves randomly shuffling the values of a +single feature and observing the resulting degradation of the model's score +[1]_. 
+determine how much the model relies on that particular feature.
+
+In the following figures, we observe the effect of permuting features on the
+correlation between the feature and the target and consequently on the model's
+statistical performance.
+
+.. image:: ../images/permuted_predictive_feature.png
+   :align: center
+
+.. image:: ../images/permuted_non_predictive_feature.png
+   :align: center
+
+On the top figure, we observe that permuting a predictive feature breaks the
+correlation between the feature and the target, and consequently the model's
+statistical performance decreases. On the bottom figure, we observe that permuting
+a non-predictive feature does not significantly degrade the model's statistical performance.
+
+One key advantage of permutation feature importance is that it is
+model-agnostic, i.e. it can be applied to any fitted estimator. Moreover, it can
+be calculated multiple times with different permutations of the feature, further
+providing a measure of the variance in the estimated feature importances for the
+specific trained model.
+
+The figure below shows the permutation feature importance of a
+:class:`~sklearn.ensemble.RandomForestClassifier` trained on an augmented
+version of the titanic dataset that contains `random_cat` and `random_num`
+features, i.e. a categorical and a numerical feature that are not correlated in
+any way with the target variable:
+
+.. figure:: ../auto_examples/inspection/images/sphx_glr_plot_permutation_importance_002.png
+   :target: ../auto_examples/inspection/plot_permutation_importance.html
+   :align: center
+   :scale: 70
 
 .. warning::
 
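
For readers skimming the patch, the procedure documented in the hunk above can
be exercised with a minimal, self-contained sketch; the dataset and model here
are illustrative choices and are not part of the patch::

    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.inspection import permutation_importance
    from sklearn.model_selection import train_test_split

    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

    # Each feature is shuffled n_repeats times on the held-out set; the spread
    # of result.importances quantifies the variance of the estimate.
    result = permutation_importance(
        model, X_test, y_test, n_repeats=10, random_state=0
    )
    print(result.importances_mean.round(3))
    print(result.importances_std.round(3))
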
@@ -74,15 +104,18 @@ highlight which features contribute the most to the generalization power of the
 inspected model. Features that are important on the training set but not on the
 held-out set might cause the model to overfit.
 
-The permutation feature importance is the decrease in a model score when a single
-feature value is randomly shuffled. The score function to be used for the
-computation of importances can be specified with the `scoring` argument,
-which also accepts multiple scorers. Using multiple scorers is more computationally
-efficient than sequentially calling :func:`permutation_importance` several times
-with a different scorer, as it reuses model predictions.
+The permutation feature importance depends on the score function that is
+specified with the `scoring` argument. This argument accepts multiple scorers,
+which is more computationally efficient than sequentially calling
+:func:`permutation_importance` several times with a different scorer, as it
+reuses model predictions.
 
-An example of using multiple scorers is shown below, employing a list of metrics,
-but more input formats are possible, as documented in :ref:`multimetric_scoring`.
+|details-start|
+**Example of permutation feature importance using multiple scorers**
+|details-split|
+
+In the example below we use a list of metrics, but more input formats are
+possible, as documented in :ref:`multimetric_scoring`.
 
 >>> scoring = ['r2', 'neg_mean_absolute_percentage_error', 'neg_mean_squared_error']
 >>> r_multi = permutation_importance(
@@ -116,7 +149,9 @@ The ranking of the features is approximately the same for different metrics
 even if the scales of the importance values are very different. However, this
 is not guaranteed and different metrics might lead to significantly different
 feature importances, in particular for models trained for imbalanced
 classification problems,
-for which the choice of the classification metric can be critical.
+for which **the choice of the classification metric can be critical**.
+
+|details-end|
 
 Outline of the permutation importance algorithm
 -----------------------------------------------
@@ -156,9 +191,9 @@ over low cardinality features such as binary features or categorical variables
 with a small number of possible categories.
 
 Permutation-based feature importances do not exhibit such a bias. Additionally,
-the permutation feature importance may be computed performance metric on the
-model predictions and can be used to analyze any model class (not
-just tree-based models).
+the permutation feature importance may be computed with any performance metric
+on the model predictions and can be used to analyze any model class (not just
+tree-based models).
 
 The following example highlights the limitations of impurity-based feature
 importance in contrast to permutation-based feature importance:
@@ -168,13 +203,29 @@ Misleading values on strongly correlated features
 -------------------------------------------------
 
 When two features are correlated and one of the features is permuted, the model
-will still have access to the feature through its correlated feature. This will
-result in a lower importance value for both features, where they might
-*actually* be important.
+still has access to the permuted feature through its correlated one. This
+results in a lower reported importance value for both features, though they
+might *actually* be important.
+
+The figure below shows the permutation feature importance of a
+:class:`~sklearn.ensemble.RandomForestClassifier` trained using the
+:ref:`breast_cancer_dataset`, which contains strongly correlated features. A
+naive interpretation would suggest that all features are unimportant:
+
+.. figure:: ../auto_examples/inspection/images/sphx_glr_plot_permutation_importance_multicollinear_002.png
+   :target: ../auto_examples/inspection/plot_permutation_importance_multicollinear.html
+   :align: center
+   :scale: 70
+
+One way to handle the issue is to cluster features that are correlated and only
+keep one feature from each cluster.
+
+.. figure:: ../auto_examples/inspection/images/sphx_glr_plot_permutation_importance_multicollinear_004.png
+   :target: ../auto_examples/inspection/plot_permutation_importance_multicollinear.html
+   :align: center
+   :scale: 70
 
-One way to handle this is to cluster features that are correlated and only
-keep one feature from each cluster. This strategy is explored in the following
-example:
+For more details on such a strategy, see the example :ref:`sphx_glr_auto_examples_inspection_plot_permutation_importance_multicollinear.py`.
 
 .. topic:: Examples:
 
diff --git a/examples/inspection/plot_permutation_importance.py b/examples/inspection/plot_permutation_importance.py
index 751413f69b59a..8cf63dd80fd4d 100644
--- a/examples/inspection/plot_permutation_importance.py
+++ b/examples/inspection/plot_permutation_importance.py
@@ -24,8 +24,6 @@
 2001. <10.1023/A:1010933404324>`
 """
 
-# %%
-import numpy as np
 
 # %%
 # Data Loading and Feature Engineering
@@ -40,6 +38,8 @@
 # values as records).
 # - ``random_cat`` is a low cardinality categorical variable (3 possible
 #   values).
+import numpy as np
+
 from sklearn.datasets import fetch_openml
 from sklearn.model_selection import train_test_split
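
The feature-clustering strategy referenced in the rst hunk above boils down to
the following sketch (illustrative settings: the Ward linkage and the cut
threshold ``t=1`` are arbitrary choices, mirroring the linked multicollinearity
example)::

    from collections import defaultdict

    import numpy as np
    from scipy.cluster import hierarchy
    from scipy.spatial.distance import squareform
    from scipy.stats import spearmanr
    from sklearn.datasets import load_breast_cancer

    X, y = load_breast_cancer(return_X_y=True)

    # Turn Spearman rank correlations between features into a distance matrix.
    corr = spearmanr(X).correlation
    corr = (corr + corr.T) / 2  # enforce exact symmetry
    np.fill_diagonal(corr, 1)
    dist_linkage = hierarchy.ward(squareform(1 - np.abs(corr)))

    # Cut the dendrogram and keep one representative feature per cluster.
    cluster_ids = hierarchy.fcluster(dist_linkage, t=1, criterion="distance")
    clusters = defaultdict(list)
    for idx, cid in enumerate(cluster_ids):
        clusters[cid].append(idx)
    selected = [feats[0] for feats in clusters.values()]
    print(f"kept {len(selected)} of {X.shape[1]} features")
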
From 7bbd7499587284fdcb748c54a036c0d2d8686182 Mon Sep 17 00:00:00 2001
From: Michael Higgins <55243596+Higgs32584@users.noreply.github.com>
Date: Wed, 17 Jan 2024 06:34:12 -0500
Subject: [PATCH 075/554] DOC add example for sklearn.utils.validation,
 check_memory and check_is_fitted (#28145)

Co-authored-by: Higgs32584
Co-authored-by: Guillaume Lemaitre
---
 sklearn/utils/validation.py | 20 ++++++++++++++++++++
 1 file changed, 20 insertions(+)

diff --git a/sklearn/utils/validation.py b/sklearn/utils/validation.py
index e58fb41501c96..6531a9da3404b 100644
--- a/sklearn/utils/validation.py
+++ b/sklearn/utils/validation.py
@@ -395,6 +395,12 @@ def check_memory(memory):
     ------
     ValueError
         If ``memory`` is not joblib.Memory-like.
+
+    Examples
+    --------
+    >>> from sklearn.utils.validation import check_memory
+    >>> check_memory("caching_dir")
+    Memory(location=caching_dir/joblib)
     """
     if memory is None or isinstance(memory, str):
         memory = joblib.Memory(location=memory, verbose=0)
@@ -1508,6 +1514,20 @@ def check_is_fitted(estimator, attributes=None, *, msg=None, all_or_any=all):
     NotFittedError
         If the attributes are not found.
 
+    Examples
+    --------
+    >>> from sklearn.linear_model import LogisticRegression
+    >>> from sklearn.utils.validation import check_is_fitted
+    >>> from sklearn.exceptions import NotFittedError
+    >>> lr = LogisticRegression()
+    >>> try:
+    ...     check_is_fitted(lr)
+    ... except NotFittedError as exc:
+    ...     print("Model is not fitted yet.")
+    Model is not fitted yet.
+    >>> lr.fit([[1, 2], [1, 3]], [1, 0])
+    LogisticRegression()
+    >>> check_is_fitted(lr)
     """
     if isclass(estimator):
         raise TypeError("{} is a class, not an instance.".format(estimator))

From 7aa7485466be233498b50f9f564b8eb9fceb3702 Mon Sep 17 00:00:00 2001
From: Olivier Grisel
Date: Wed, 17 Jan 2024 13:57:46 +0100
Subject: [PATCH 076/554] DOC add the HTML output for the polars dataframe for
 the 1.4 release highlights (#28149)

---
 examples/release_highlights/plot_release_highlights_1_4_0.py | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/examples/release_highlights/plot_release_highlights_1_4_0.py b/examples/release_highlights/plot_release_highlights_1_4_0.py
index 0d2924d9e8bb4..d7344e4a9b99f 100644
--- a/examples/release_highlights/plot_release_highlights_1_4_0.py
+++ b/examples/release_highlights/plot_release_highlights_1_4_0.py
@@ -73,6 +73,9 @@
 preprocessor.set_output(transform="polars")
 df_out = preprocessor.fit_transform(df)
+df_out
+
+# %%
 print(f"Output type: {type(df_out)}")
 
 # %%
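
As a side note, the polars output mode exercised in the patch above works with
any transformer, not only the release-highlights pipeline; a minimal sketch
(requires the optional ``polars`` dependency)::

    import polars as pl

    from sklearn.preprocessing import StandardScaler

    df = pl.DataFrame({"height": [120.0, 140.0, 150.0], "weight": [40.0, 50.0, 55.0]})

    # set_output makes transform/fit_transform return a polars DataFrame
    # instead of a NumPy array.
    scaler = StandardScaler().set_output(transform="polars")
    df_out = scaler.fit_transform(df)
    print(type(df_out))  # a polars DataFrame
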
From fe4ffd233d701cd77eda0d34f6af7343c664a14e Mon Sep 17 00:00:00 2001
From: Julien Jerphanion
Date: Wed, 17 Jan 2024 14:00:12 +0100
Subject: [PATCH 077/554] DOC 1.4 release highlights: PCA on sparse data
 improvements (#28138)

Signed-off-by: Julien Jerphanion
Co-authored-by: jeremie du boisberranger
---
 .../plot_release_highlights_1_4_0.py | 25 +++++++++++++++++++
 1 file changed, 25 insertions(+)

diff --git a/examples/release_highlights/plot_release_highlights_1_4_0.py b/examples/release_highlights/plot_release_highlights_1_4_0.py
index d7344e4a9b99f..74f0c881fc4dc 100644
--- a/examples/release_highlights/plot_release_highlights_1_4_0.py
+++ b/examples/release_highlights/plot_release_highlights_1_4_0.py
@@ -207,3 +207,28 @@
 # Setting the flag to the default `False` to avoid interference with other
 # scripts.
 sklearn.set_config(enable_metadata_routing=False)
+
+# %%
+# Improved memory and runtime efficiency for PCA on sparse data
+# -------------------------------------------------------------
+# PCA is now able to handle sparse matrices natively for the `arpack`
+# solver by leveraging `scipy.sparse.linalg.LinearOperator` to avoid
+# materializing large sparse matrices when performing the
+# eigenvalue decomposition of the data set covariance matrix.
+#
+from sklearn.decomposition import PCA
+import scipy.sparse as sp
+from time import time
+
+X_sparse = sp.random(m=1000, n=1000, random_state=0)
+X_dense = X_sparse.toarray()
+
+t0 = time()
+PCA(n_components=10, svd_solver="arpack").fit(X_sparse)
+time_sparse = time() - t0
+
+t0 = time()
+PCA(n_components=10, svd_solver="arpack").fit(X_dense)
+time_dense = time() - t0
+
+print(f"Speedup: {time_dense / time_sparse:.1f}x")

From 6a3517e0ca41fb3bc8e375dd4fea7e1bb2f906ff Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?J=C3=A9r=C3=A9mie=20du=20Boisberranger?=
 <34657725+jeremiedbb@users.noreply.github.com>
Date: Wed, 17 Jan 2024 14:01:56 +0100
Subject: [PATCH 078/554] REL Update what's new and index for 1.4.0 final
 (#28148)

Co-authored-by: Christian Lorentzen
---
 doc/developers/maintainer.rst |  2 +-
 doc/templates/index.html      | 13 +++---
 doc/whats_new/v1.4.rst        | 88 +++++++++++++++++++++++++-----------------
 3 files changed, 59 insertions(+), 44 deletions(-)

diff --git a/doc/developers/maintainer.rst b/doc/developers/maintainer.rst
index 048ad5d9906a1..e82a7993997b2 100644
--- a/doc/developers/maintainer.rst
+++ b/doc/developers/maintainer.rst
@@ -210,7 +210,7 @@ Making a release
      - Edit the ``doc/templates/index.html`` to change the 'News' entry of
        the front page (with the release month as well). Do not forget to
        remove the old entries (two years or three releases are typically good
-       enough)
+       enough) and to update the on-going development entry.
 
 2. On the branch for releasing, update the version number in
    ``sklearn/__init__.py``, the ``__version__``.
 
diff --git a/doc/templates/index.html b/doc/templates/index.html
index a20da900bafcb..460ef9d865046 100644
--- a/doc/templates/index.html
+++ b/doc/templates/index.html
@@ -167,20 +167,19 @@

    diff --git a/doc/whats_new/v1.4.rst b/doc/whats_new/v1.4.rst index f79a15d7a15f0..ad3cc404f5930 100644 --- a/doc/whats_new/v1.4.rst +++ b/doc/whats_new/v1.4.rst @@ -2,46 +2,15 @@ .. currentmodule:: sklearn -.. _changes_1_4_1: - -Version 1.4.1 -============= - -**In Development** - -Changelog ---------- - -:mod:`sklearn.datasets` -....................... - -- |Fix| :func:`datasets.dump_svmlight_file` now does not raise `ValueError` when `X` - is read-only, e.g., a `numpy.memmap` instance. - :pr:`28111` by :user:`Yao Xiao `. - -:mod:`sklearn.neighbors` -........................ - -- |Fix| :meth:`neighbors.KNeighborsClassifier.predict` and - :meth:`neighbors.KNeighborsClassifier.predict_proba` now raises an error when the - weights of all neighbors of some sample are zero. This can happen when `weights` - is a user-defined function. - :pr:`26410` by :user:`Yao Xiao `. - -:mod:`sklearn.utils` -.................... - -- |Fix| Fix the function :func:`~utils.check_array` to output the right error message - when the input is Series instead of a DataFrame. - :pr:`28090` by :user:`Stan Furrer ` and :user:`Yao Xiao `. - - .. _changes_1_4: Version 1.4.0 ============= -**In Development** +**January 2024** + +For a short description of the main highlights of the release, please refer to +:ref:`sphx_glr_auto_examples_release_highlights_plot_release_highlights_1_4_0.py`. .. include:: changelog_legend.inc @@ -395,6 +364,10 @@ Changelog which returns a dense numpy ndarray as before. :pr:`27438` by :user:`Yao Xiao `. +- |Fix| :func:`datasets.dump_svmlight_file` now does not raise `ValueError` when `X` + is read-only, e.g., a `numpy.memmap` instance. + :pr:`28111` by :user:`Yao Xiao `. + - |API| :func:`datasets.make_sparse_spd_matrix` deprecated the keyword argument ``dim`` in favor of ``n_dim``. ``dim`` will be removed in version 1.6. :pr:`27718` by :user:`Adam Li `. @@ -717,6 +690,12 @@ Changelog when it is invoked with `n_samples=n_neighbors`. :pr:`23317` by :user:`Bharat Raghunathan `. +- |Fix| :meth:`neighbors.KNeighborsClassifier.predict` and + :meth:`neighbors.KNeighborsClassifier.predict_proba` now raises an error when the + weights of all neighbors of some sample are zero. This can happen when `weights` + is a user-defined function. + :pr:`26410` by :user:`Yao Xiao `. + - |API| :class:`neighbors.KNeighborsRegressor` now accepts :class:`metrics.DistanceMetric` objects directly via the `metric` keyword argument allowing for the use of accelerated third-party @@ -810,6 +789,10 @@ Changelog `X.toarray()`. :pr:`27757` by :user:`Lucy Liu `. +- |Fix| Fix the function :func:`~utils.check_array` to output the right error message + when the input is a Series instead of a DataFrame. + :pr:`28090` by :user:`Stan Furrer ` and :user:`Yao Xiao `. + - |API| :func:`sklearn.extmath.log_logistic` is deprecated and will be removed in 1.6. Use `-np.logaddexp(0, -x)` instead. :pr:`27544` by :user:`Christian Lorentzen `. @@ -820,4 +803,37 @@ Code and Documentation Contributors Thanks to everyone who has contributed to the maintenance and improvement of the project since version 1.3, including: -TODO: update at the time of the release. 
+101AlexMartin, Abhishek Singh Kushwah, Adam Li, Adarsh Wase, Adrin Jalali, +Advik Sinha, Alex, Alexander Al-Feghali, Alexis IMBERT, AlexL, Alex Molas, Anam +Fatima, Andrew Goh, andyscanzio, Aniket Patil, Artem Kislovskiy, Arturo Amor, +ashah002, avm19, Ben Holmes, Ben Mares, Benoit Chevallier-Mames, Bharat +Raghunathan, Binesh Bannerjee, Brendan Lu, Brevin Kunde, Camille Troillard, +Carlo Lemos, Chad Parmet, Christian Clauss, Christian Lorentzen, Christian +Veenhuis, Christos Aridas, Cindy Liang, Claudio Salvatore Arcidiacono, Connor +Boyle, cynthias13w, DaminK, Daniele Ongari, Daniel Schmitz, Daniel Tinoco, +David Brochart, Deborah L. Haar, DevanshKyada27, Dimitri Papadopoulos Orfanos, +Dmitry Nesterov, DUONG, Edoardo Abati, Eitan Hemed, Elabonga Atuo, Elisabeth +Günther, Emma Carballal, Emmanuel Ferdman, epimorphic, Erwan Le Floch, Fabian +Egli, Filip Karlo Došilović, Florian Idelberger, Franck Charras, Gael +Varoquaux, Ganesh Tata, Gleb Levitski, Guillaume Lemaitre, Haoying Zhang, +Harmanan Kohli, Ily, ioangatop, IsaacTrost, Isaac Virshup, Iwona Zdzieblo, +Jakub Kaczmarzyk, James McDermott, Jarrod Millman, JB Mountford, Jérémie du +Boisberranger, Jérôme Dockès, Jiawei Zhang, Joel Nothman, John Cant, John +Hopfensperger, Jona Sassenhagen, Jon Nordby, Julien Jerphanion, Kennedy Waweru, +kevin moore, Kian Eliasi, Kishan Ved, Konstantinos Pitas, Koustav Ghosh, Kushan +Sharma, ldwy4, Linus, Lohit SundaramahaLingam, Loic Esteve, Lorenz, Louis +Fouquet, Lucy Liu, Luis Silvestrin, Lukáš Folwarczný, Lukas Geiger, Malte +Londschien, Marcus Fraaß, Marek Hanuš, Maren Westermann, Mark Elliot, Martin +Larralde, Mateusz Sokół, mathurinm, mecopur, Meekail Zain, Michael Higgins, +Miki Watanabe, Milton Gomez, MN193, Mohammed Hamdy, Mohit Joshi, mrastgoo, +Naman Dhingra, Naoise Holohan, Narendra Singh dangi, Noa Malem-Shinitski, +Nolan, Nurseit Kamchyev, Oleksii Kachaiev, Olivier Grisel, Omar Salman, partev, +Peter Hull, Peter Steinbach, Pierre de Fréminville, Pooja Subramaniam, Puneeth +K, qmarcou, Quentin Barthélemy, Rahil Parikh, Rahul Mahajan, Raj Pulapakura, +Raphael, Ricardo Peres, Riccardo Cappuzzo, Roman Lutz, Salim Dohri, Samuel O. +Ronsin, Sandip Dutta, Sayed Qaiser Ali, scaja, scikit-learn-bot, Sebastian +Berg, Shreesha Kumar Bhat, Shubhal Gupta, Søren Fuglede Jørgensen, Stefanie +Senger, Tamara, Tanjina Afroj, THARAK HEGDE, thebabush, Thomas J. Fan, Thomas +Roehr, Tialo, Tim Head, tongyu, Venkatachalam N, Vijeth Moudgalya, Vincent M, +Vivek Reddy P, Vladimir Fokow, Xiao Yuan, Xuefeng Xu, Yang Tao, Yao Xiao, +Yuchen Zhou, Yuusuke Hiramatsu From 82e1f130580414b439509d3a201b282830039c64 Mon Sep 17 00:00:00 2001 From: Salim Dohri <104096451+dohrisalim@users.noreply.github.com> Date: Wed, 17 Jan 2024 15:16:36 +0100 Subject: [PATCH 079/554] DOC Update Mixin classes documentation and examples (#28146) Co-authored-by: Guillaume Lemaitre --- sklearn/base.py | 131 ++++++++++++++++++++++++++++++++++++++++++++++-- 1 file changed, 127 insertions(+), 4 deletions(-) diff --git a/sklearn/base.py b/sklearn/base.py index c2b119cbf63e5..e73ae4c8a180e 100644 --- a/sklearn/base.py +++ b/sklearn/base.py @@ -853,7 +853,23 @@ def _more_tags(self): class ClusterMixin: - """Mixin class for all cluster estimators in scikit-learn.""" + """Mixin class for all cluster estimators in scikit-learn. + + - `_estimator_type` class attribute defaulting to `"clusterer"`; + - `fit_predict` method returning the cluster labels associated to each sample. 
+
+    Examples
+    --------
+    >>> import numpy as np
+    >>> from sklearn.base import BaseEstimator, ClusterMixin
+    >>> class MyClusterer(ClusterMixin, BaseEstimator):
+    ...     def fit(self, X, y=None):
+    ...         self.labels_ = np.ones(shape=(len(X),), dtype=np.int64)
+    ...         return self
+    >>> X = [[1, 2], [2, 3], [3, 4]]
+    >>> MyClusterer().fit_predict(X)
+    array([1, 1, 1])
+    """
 
     _estimator_type = "clusterer"
 
@@ -994,6 +1010,11 @@ def get_submatrix(self, i, data):
 class TransformerMixin(_SetOutputMixin):
     """Mixin class for all transformers in scikit-learn.
 
+    This mixin defines the following functionality:
+
+    - a `fit_transform` method that delegates to `fit` and `transform`;
+    - a `set_output` method to output `X` as a specific container type.
+
     If :term:`get_feature_names_out` is defined, then :class:`BaseEstimator` will
     automatically wrap `transform` and `fit_transform` to follow the `set_output`
     API. See the :ref:`developer_api_set_output` for details.
 
     :class:`OneToOneFeatureMixin` and
     :class:`ClassNamePrefixFeaturesOutMixin` are helpful mixins for
     defining :term:`get_feature_names_out`.
+
+    Examples
+    --------
+    >>> import numpy as np
+    >>> from sklearn.base import BaseEstimator, TransformerMixin
+    >>> class MyTransformer(TransformerMixin, BaseEstimator):
+    ...     def __init__(self, *, param=1):
+    ...         self.param = param
+    ...     def fit(self, X, y=None):
+    ...         return self
+    ...     def transform(self, X):
+    ...         return np.full(shape=len(X), fill_value=self.param)
+    >>> transformer = MyTransformer()
+    >>> X = [[1, 2], [2, 3], [3, 4]]
+    >>> transformer.fit_transform(X)
+    array([1, 1, 1])
     """
 
     def fit_transform(self, X, y=None, **fit_params):
@@ -1069,6 +1106,18 @@ class OneToOneFeatureMixin:
     This mixin assumes there's a 1-to-1 correspondence between input features
     and output features, such as :class:`~sklearn.preprocessing.StandardScaler`.
+
+    Examples
+    --------
+    >>> import numpy as np
+    >>> from sklearn.base import OneToOneFeatureMixin
+    >>> class MyEstimator(OneToOneFeatureMixin):
+    ...     def fit(self, X, y=None):
+    ...         self.n_features_in_ = X.shape[1]
+    ...         return self
+    >>> X = np.array([[1, 2], [3, 4]])
+    >>> MyEstimator().fit(X).get_feature_names_out()
+    array(['x0', 'x1'], dtype=object)
     """
 
     def get_feature_names_out(self, input_features=None):
@@ -1106,6 +1155,18 @@ class ClassNamePrefixFeaturesOutMixin:
     This mixin assumes that a `_n_features_out` attribute is defined when the
     transformer is fitted. `_n_features_out` is the number of output features
     that the transformer will return in `transform` or `fit_transform`.
+
+    Examples
+    --------
+    >>> import numpy as np
+    >>> from sklearn.base import ClassNamePrefixFeaturesOutMixin
+    >>> class MyEstimator(ClassNamePrefixFeaturesOutMixin):
+    ...     def fit(self, X, y=None):
+    ...         self._n_features_out = X.shape[1]
+    ...         return self
+    >>> X = np.array([[1, 2], [3, 4]])
+    >>> MyEstimator().fit(X).get_feature_names_out()
+    array(['myestimator0', 'myestimator1'], dtype=object)
     """
 
     def get_feature_names_out(self, input_features=None):
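
The mixins documented in this patch are designed to be combined; a minimal
hypothetical composition (the ``Centerer`` class below is illustrative and not
part of the patch)::

    import numpy as np

    from sklearn.base import BaseEstimator, OneToOneFeatureMixin, TransformerMixin

    class Centerer(OneToOneFeatureMixin, TransformerMixin, BaseEstimator):
        def fit(self, X, y=None):
            X = np.asarray(X)
            self.mean_ = X.mean(axis=0)
            self.n_features_in_ = X.shape[1]
            return self

        def transform(self, X):
            return np.asarray(X) - self.mean_

    X = np.array([[1.0, 2.0], [3.0, 4.0]])
    est = Centerer()
    print(est.fit_transform(X))         # fit_transform comes from TransformerMixin
    print(est.get_feature_names_out())  # ['x0' 'x1'] from OneToOneFeatureMixin
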
@@ -1132,7 +1193,24 @@ def get_feature_names_out(self, input_features=None):
 
 
 class DensityMixin:
-    """Mixin class for all density estimators in scikit-learn."""
+    """Mixin class for all density estimators in scikit-learn.
+
+    This mixin defines the following functionality:
+
+    - `_estimator_type` class attribute defaulting to `"DensityEstimator"`;
+    - a `score` method that by default does nothing.
+
+    Examples
+    --------
+    >>> from sklearn.base import DensityMixin
+    >>> class MyEstimator(DensityMixin):
+    ...     def fit(self, X, y=None):
+    ...         self.is_fitted_ = True
+    ...         return self
+    >>> estimator = MyEstimator()
+    >>> hasattr(estimator, "score")
+    True
+    """
 
     _estimator_type = "DensityEstimator"
 
@@ -1155,7 +1233,28 @@ def score(self, X, y=None):
 
 
 class OutlierMixin:
-    """Mixin class for all outlier detection estimators in scikit-learn."""
+    """Mixin class for all outlier detection estimators in scikit-learn.
+
+    This mixin defines the following functionality:
+
+    - `_estimator_type` class attribute defaulting to `"outlier_detector"`;
+    - a `fit_predict` method that delegates to `fit` and `predict`.
+
+    Examples
+    --------
+    >>> import numpy as np
+    >>> from sklearn.base import BaseEstimator, OutlierMixin
+    >>> class MyEstimator(OutlierMixin):
+    ...     def fit(self, X, y=None):
+    ...         self.is_fitted_ = True
+    ...         return self
+    ...     def predict(self, X):
+    ...         return np.ones(shape=len(X))
+    >>> estimator = MyEstimator()
+    >>> X = np.array([[1, 2], [2, 3], [3, 4]])
+    >>> estimator.fit_predict(X)
+    array([1., 1., 1.])
+    """
 
     _estimator_type = "outlier_detector"
 
@@ -1213,7 +1312,31 @@ def fit_predict(self, X, y=None, **kwargs):
 
 
 class MetaEstimatorMixin:
-    """Mixin class for all meta estimators in scikit-learn."""
+    """Mixin class for all meta estimators in scikit-learn.
+
+    This mixin defines the following functionality:
+
+    - a `_required_parameters` class attribute that specifies the mandatory `estimator` parameter.
+
+    Examples
+    --------
+    >>> from sklearn.base import MetaEstimatorMixin
+    >>> from sklearn.datasets import load_iris
+    >>> from sklearn.linear_model import LogisticRegression
+    >>> class MyEstimator(MetaEstimatorMixin):
+    ...     def __init__(self, *, estimator=None):
+    ...         self.estimator = estimator
+    ...     def fit(self, X, y=None):
+    ...         if self.estimator is None:
+    ...             self.estimator_ = LogisticRegression()
+    ...         else:
+    ...             self.estimator_ = self.estimator
+    ...         return self
+    >>> X, y = load_iris(return_X_y=True)
+    >>> estimator = MyEstimator().fit(X, y)
+    >>> estimator.estimator_
+    LogisticRegression()
+    """
 
     _required_parameters = ["estimator"]

From b0df1ce5c067c16e5b70e0ea30375517f4c68851 Mon Sep 17 00:00:00 2001
From: Weyb <143994332+klaurent83@users.noreply.github.com>
Date: Wed, 17 Jan 2024 15:18:45 +0100
Subject: [PATCH 080/554] DOC add example in docstring of ridge_regression
 (#28122)

Co-authored-by: Guillaume Lemaitre
---
 sklearn/linear_model/_ridge.py | 13 +++++++++++++
 1 file changed, 13 insertions(+)

diff --git a/sklearn/linear_model/_ridge.py b/sklearn/linear_model/_ridge.py
index e39af10053c34..c4f52c68e697e 100644
--- a/sklearn/linear_model/_ridge.py
+++ b/sklearn/linear_model/_ridge.py
@@ -549,6 +549,19 @@ def ridge_regression(
     :class:`~sklearn.svm.LinearSVC`. If an array is passed, penalties are
     assumed to be specific to the targets. Hence they must correspond in
     number.
+
+    Examples
+    --------
+    >>> from sklearn.datasets import make_regression
+    >>> from sklearn.linear_model import ridge_regression
+    >>> X, y = make_regression(
+    ...     n_features=4, n_informative=2, shuffle=False, random_state=0
+    ... )
+    >>> coef, intercept = ridge_regression(X, y, alpha=1.0, return_intercept=True)
+    >>> coef
+    array([20.2..., 33.7..., 0.1..., 0.0...])
+    >>> intercept
+    -0.0...
""" return _ridge_regression( X, From 74f4aaa0ba76f4f63bbc079f428bcc7014f88abc Mon Sep 17 00:00:00 2001 From: Saad Mahmood <90573707+SaadMahm00d@users.noreply.github.com> Date: Wed, 17 Jan 2024 13:00:41 -0500 Subject: [PATCH 081/554] DOC Add dropdowns to module 9.1 Python specific serialization (#26881) --- doc/model_persistence.rst | 353 ++++++++++++++++++++------------------ 1 file changed, 184 insertions(+), 169 deletions(-) diff --git a/doc/model_persistence.rst b/doc/model_persistence.rst index b8da5c8a3961f..1e0cc36be534d 100644 --- a/doc/model_persistence.rst +++ b/doc/model_persistence.rst @@ -1,169 +1,184 @@ -.. Places parent toc into the sidebar - -:parenttoc: True - -.. _model_persistence: - -================= -Model persistence -================= - -After training a scikit-learn model, it is desirable to have a way to persist -the model for future use without having to retrain. The following sections give -you some hints on how to persist a scikit-learn model. - -Python specific serialization ------------------------------ - -It is possible to save a model in scikit-learn by using Python's built-in -persistence model, namely `pickle -`_:: - - >>> from sklearn import svm - >>> from sklearn import datasets - >>> clf = svm.SVC() - >>> X, y= datasets.load_iris(return_X_y=True) - >>> clf.fit(X, y) - SVC() - - >>> import pickle - >>> s = pickle.dumps(clf) - >>> clf2 = pickle.loads(s) - >>> clf2.predict(X[0:1]) - array([0]) - >>> y[0] - 0 - -In the specific case of scikit-learn, it may be better to use joblib's -replacement of pickle (``dump`` & ``load``), which is more efficient on -objects that carry large numpy arrays internally as is often the case for -fitted scikit-learn estimators, but can only pickle to the disk and not to a -string:: - - >>> from joblib import dump, load - >>> dump(clf, 'filename.joblib') # doctest: +SKIP - -Later you can load back the pickled model (possibly in another Python process) -with:: - - >>> clf = load('filename.joblib') # doctest:+SKIP - -.. note:: - - ``dump`` and ``load`` functions also accept file-like object - instead of filenames. More information on data persistence with Joblib is - available `here - `_. - -When an estimator is unpickled with a scikit-learn version that is inconsistent -with the version the estimator was pickled with, a -:class:`~sklearn.exceptions.InconsistentVersionWarning` is raised. This warning -can be caught to obtain the original version the estimator was pickled with:: - - from sklearn.exceptions import InconsistentVersionWarning - warnings.simplefilter("error", InconsistentVersionWarning) - - try: - est = pickle.loads("model_from_prevision_version.pickle") - except InconsistentVersionWarning as w: - print(w.original_sklearn_version) - -.. _persistence_limitations: - -Security & maintainability limitations -...................................... - -pickle (and joblib by extension), has some issues regarding maintainability -and security. Because of this, - -* Never unpickle untrusted data as it could lead to malicious code being - executed upon loading. -* While models saved using one version of scikit-learn might load in - other versions, this is entirely unsupported and inadvisable. It should - also be kept in mind that operations performed on such data could give - different and unexpected results. - -In order to rebuild a similar model with future versions of scikit-learn, -additional metadata should be saved along the pickled model: - -* The training data, e.g. 
a reference to an immutable snapshot -* The python source code used to generate the model -* The versions of scikit-learn and its dependencies -* The cross validation score obtained on the training data - -This should make it possible to check that the cross-validation score is in the -same range as before. - -Aside for a few exceptions, pickled models should be portable across -architectures assuming the same versions of dependencies and Python are used. -If you encounter an estimator that is not portable please open an issue on -GitHub. Pickled models are often deployed in production using containers, like -Docker, in order to freeze the environment and dependencies. - -If you want to know more about these issues and explore other possible -serialization methods, please refer to this -`talk by Alex Gaynor -`_. - - -A more secure format: `skops` -............................. - -`skops `__ provides a more secure -format via the :mod:`skops.io` module. It avoids using :mod:`pickle` and only -loads files which have types and references to functions which are trusted -either by default or by the user. The API is very similar to ``pickle``, and -you can persist your models as explain in the `docs -`__ using -:func:`skops.io.dump` and :func:`skops.io.dumps`:: - - import skops.io as sio - obj = sio.dumps(clf) - -And you can load them back using :func:`skops.io.load` and -:func:`skops.io.loads`. However, you need to specify the types which are -trusted by you. You can get existing unknown types in a dumped object / file -using :func:`skops.io.get_untrusted_types`, and after checking its contents, -pass it to the load function:: - - unknown_types = sio.get_untrusted_types(data=obj) - clf = sio.loads(obj, trusted=unknown_types) - -If you trust the source of the file / object, you can pass ``trusted=True``:: - - clf = sio.loads(obj, trusted=True) - -Please report issues and feature requests related to this format on the `skops -issue tracker `__. - -Interoperable formats ---------------------- - -For reproducibility and quality control needs, when different architectures -and environments should be taken into account, exporting the model in -`Open Neural Network -Exchange `_ format or `Predictive Model Markup Language -(PMML) `_ format -might be a better approach than using `pickle` alone. -These are helpful where you may want to use your model for prediction in a -different environment from where the model was trained. - -ONNX is a binary serialization of the model. It has been developed to improve -the usability of the interoperable representation of data models. -It aims to facilitate the conversion of the data -models between different machine learning frameworks, and to improve their -portability on different computing architectures. More details are available -from the `ONNX tutorial `_. -To convert scikit-learn model to ONNX a specific tool `sklearn-onnx -`_ has been developed. - -PMML is an implementation of the `XML -`_ document standard -defined to represent data models together with the data used to generate them. -Being human and machine readable, -PMML is a good option for model validation on different platforms and -long term archiving. On the other hand, as XML in general, its verbosity does -not help in production when performance is critical. -To convert scikit-learn model to PMML you can use for example `sklearn2pmml -`_ distributed under the Affero GPLv3 -license. +.. Places parent toc into the sidebar + +:parenttoc: True + +.. 
_model_persistence:
+
+=================
+Model persistence
+=================
+
+After training a scikit-learn model, it is desirable to have a way to persist
+the model for future use without having to retrain. The following sections give
+you some hints on how to persist a scikit-learn model.
+
+Python specific serialization
+-----------------------------
+
+It is possible to save a model in scikit-learn by using Python's built-in
+persistence model, namely `pickle
+`_::
+
+  >>> from sklearn import svm
+  >>> from sklearn import datasets
+  >>> clf = svm.SVC()
+  >>> X, y = datasets.load_iris(return_X_y=True)
+  >>> clf.fit(X, y)
+  SVC()
+
+  >>> import pickle
+  >>> s = pickle.dumps(clf)
+  >>> clf2 = pickle.loads(s)
+  >>> clf2.predict(X[0:1])
+  array([0])
+  >>> y[0]
+  0
+
+In the specific case of scikit-learn, it may be better to use joblib's
+replacement of pickle (``dump`` & ``load``), which is more efficient on
+objects that carry large numpy arrays internally as is often the case for
+fitted scikit-learn estimators, but can only pickle to the disk and not to a
+string::
+
+  >>> from joblib import dump, load
+  >>> dump(clf, 'filename.joblib') # doctest: +SKIP
+
+Later you can load back the pickled model (possibly in another Python process)
+with::
+
+  >>> clf = load('filename.joblib') # doctest:+SKIP
+
+.. note::
+
+   ``dump`` and ``load`` functions also accept file-like objects
+   instead of filenames. More information on data persistence with Joblib is
+   available `here
+   `_.
+
+|details-start|
+**InconsistentVersionWarning**
+|details-split|
+
+When an estimator is unpickled with a scikit-learn version that is inconsistent
+with the version the estimator was pickled with, a
+:class:`~sklearn.exceptions.InconsistentVersionWarning` is raised. This warning
+can be caught to obtain the original version the estimator was pickled with::
+
+  from sklearn.exceptions import InconsistentVersionWarning
+  warnings.simplefilter("error", InconsistentVersionWarning)
+
+  try:
+      est = pickle.loads("model_from_previous_version.pickle")
+  except InconsistentVersionWarning as w:
+      print(w.original_sklearn_version)
+
+|details-end|
+
+.. _persistence_limitations:
+
+Security & maintainability limitations
+......................................
+
+pickle (and joblib by extension) has some issues regarding maintainability
+and security. Because of this,
+
+* Never unpickle untrusted data as it could lead to malicious code being
+  executed upon loading.
+* While models saved using one version of scikit-learn might load in
+  other versions, this is entirely unsupported and inadvisable. It should
+  also be kept in mind that operations performed on such data could give
+  different and unexpected results.
+
+In order to rebuild a similar model with future versions of scikit-learn,
+additional metadata should be saved along the pickled model:
+
+* The training data, e.g. a reference to an immutable snapshot
+* The python source code used to generate the model
+* The versions of scikit-learn and its dependencies
+* The cross validation score obtained on the training data
+
+This should make it possible to check that the cross-validation score is in the
+same range as before.
+
+Aside from a few exceptions, pickled models should be portable across
+architectures assuming the same versions of dependencies and Python are used.
+If you encounter an estimator that is not portable please open an issue on
+GitHub.
Pickled models are often deployed in production using containers, like
+Docker, in order to freeze the environment and dependencies.
+
+If you want to know more about these issues and explore other possible
+serialization methods, please refer to this
+`talk by Alex Gaynor
+`_.
+
+
+A more secure format: `skops`
+.............................
+
+`skops `__ provides a more secure
+format via the :mod:`skops.io` module. It avoids using :mod:`pickle` and only
+loads files which have types and references to functions which are trusted
+either by default or by the user. 
+
+|details-start|
+**Using skops**
+
+|details-split|
+
+The API is very similar to ``pickle``, and
+you can persist your models as explained in the `docs
+`__ using
+:func:`skops.io.dump` and :func:`skops.io.dumps`::
+
+    import skops.io as sio
+    obj = sio.dumps(clf)
+
+And you can load them back using :func:`skops.io.load` and
+:func:`skops.io.loads`. However, you need to specify the types which are
+trusted by you. You can get existing unknown types in a dumped object / file
+using :func:`skops.io.get_untrusted_types`, and after checking its contents,
+pass it to the load function::
+
+    unknown_types = sio.get_untrusted_types(data=obj)
+    clf = sio.loads(obj, trusted=unknown_types)
+
+If you trust the source of the file / object, you can pass ``trusted=True``::
+
+    clf = sio.loads(obj, trusted=True)
+
+Please report issues and feature requests related to this format on the `skops
+issue tracker `__.
+
+|details-end|
+
+Interoperable formats
+---------------------
+
+For reproducibility and quality control needs, when different architectures
+and environments should be taken into account, exporting the model in
+`Open Neural Network
+Exchange `_ format or `Predictive Model Markup Language
+(PMML) `_ format
+might be a better approach than using `pickle` alone.
+These are helpful where you may want to use your model for prediction in a
+different environment from where the model was trained.
+
+ONNX is a binary serialization of the model. It has been developed to improve
+the usability of the interoperable representation of data models.
+It aims to facilitate the conversion of the data
+models between different machine learning frameworks, and to improve their
+portability on different computing architectures. More details are available
+from the `ONNX tutorial `_.
+To convert a scikit-learn model to ONNX, a specific tool `sklearn-onnx
+`_ has been developed.
+
+PMML is an implementation of the `XML
+`_ document standard
+defined to represent data models together with the data used to generate them.
+Being human and machine readable,
+PMML is a good option for model validation on different platforms and
+long term archiving. On the other hand, as XML in general, its verbosity does
+not help in production when performance is critical.
+To convert a scikit-learn model to PMML, you can use for example `sklearn2pmml
+`_ distributed under the Affero GPLv3
+license.
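
For illustration only, a minimal ONNX conversion sketch with the third-party
``skl2onnx`` package could look as follows, reusing the iris classifier ``clf``
fitted in the pickle example above. This assumes ``skl2onnx`` is installed; the
input name ``float_input`` and the file name ``model.onnx`` are arbitrary
choices here, and the exact converter API may differ between versions::

    from skl2onnx import convert_sklearn
    from skl2onnx.common.data_types import FloatTensorType

    # the iris dataset has 4 float features per sample; the batch size is
    # left unspecified (None) and the input name is an arbitrary label
    initial_types = [("float_input", FloatTensorType([None, 4]))]
    onnx_model = convert_sklearn(clf, initial_types=initial_types)

    # ONNX models are protobuf messages, serialized to a binary file
    with open("model.onnx", "wb") as f:
        f.write(onnx_model.SerializeToString())

The resulting file can then be loaded for prediction in any environment that
provides an ONNX runtime (for instance ``onnxruntime``), without requiring
scikit-learn at inference time.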
From 8c9505263973480a59234f79ffb292b1db0e4a29 Mon Sep 17 00:00:00 2001 From: Mavs <32366550+tvdboom@users.noreply.github.com> Date: Wed, 17 Jan 2024 23:40:11 +0100 Subject: [PATCH 082/554] add feature_names_in_ and n_features_in_ attributes to dummy estimators (#27937) --- doc/whats_new/v1.5.rst | 8 ++++++++ sklearn/dummy.py | 18 ++++++++++++++++++ sklearn/tests/test_dummy.py | 19 ++++++++++++++++++- 3 files changed, 44 insertions(+), 1 deletion(-) diff --git a/doc/whats_new/v1.5.rst b/doc/whats_new/v1.5.rst index 96cbd21021f08..4a44bd6666615 100644 --- a/doc/whats_new/v1.5.rst +++ b/doc/whats_new/v1.5.rst @@ -25,12 +25,20 @@ Changelog :pr:`123456` by :user:`Joe Bloggs `. where 123455 is the *pull request* number, not the issue number. + :mod:`sklearn.compose` ...................... - |Feature| A fitted :class:`compose.ColumnTransformer` now implements `__getitem__` which returns the fitted transformers by name. :pr:`27990` by `Thomas Fan`_. +:mod:`sklearn.dummy` +....................... + +- |Enhancement| :class:`dummy.DummyClassifier` and :class:`dummy.DummyRegressor` now + have the `n_features_in_` and `feature_names_in_` attributes after `fit`. + :pr:`27937` by :user:`Marco vd Boom `. + :mod:`sklearn.feature_extraction` ................................. diff --git a/sklearn/dummy.py b/sklearn/dummy.py index 63318b07ce580..17812fe1b3d05 100644 --- a/sklearn/dummy.py +++ b/sklearn/dummy.py @@ -110,6 +110,13 @@ class prior probabilities. Frequency of each class observed in `y`. For multioutput classification problems, this is computed independently for each output. + n_features_in_ : int + Number of features seen during :term:`fit`. + + feature_names_in_ : ndarray of shape (`n_features_in_`,) + Names of features seen during :term:`fit`. Defined only when `X` has + feature names that are all strings. + n_outputs_ : int Number of outputs. @@ -170,6 +177,8 @@ def fit(self, X, y, sample_weight=None): self : object Returns the instance itself. """ + self._validate_data(X, cast_to_ndarray=False) + self._strategy = self.strategy if self._strategy == "uniform" and sp.issparse(y): @@ -488,6 +497,13 @@ class DummyRegressor(MultiOutputMixin, RegressorMixin, BaseEstimator): Mean or median or quantile of the training targets or constant value given by the user. + n_features_in_ : int + Number of features seen during :term:`fit`. + + feature_names_in_ : ndarray of shape (`n_features_in_`,) + Names of features seen during :term:`fit`. Defined only when `X` has + feature names that are all strings. + n_outputs_ : int Number of outputs. @@ -545,6 +561,8 @@ def fit(self, X, y, sample_weight=None): self : object Fitted estimator. 
""" + self._validate_data(X, cast_to_ndarray=False) + y = check_array(y, ensure_2d=False, input_name="y") if len(y) == 0: raise ValueError("y must not be empty.") diff --git a/sklearn/tests/test_dummy.py b/sklearn/tests/test_dummy.py index 14bab1e0ffe97..e398894095b18 100644 --- a/sklearn/tests/test_dummy.py +++ b/sklearn/tests/test_dummy.py @@ -72,6 +72,23 @@ def _check_equality_regressor(statistic, y_learn, y_pred_learn, y_test, y_pred_t assert_array_almost_equal(np.tile(statistic, (y_test.shape[0], 1)), y_pred_test) +def test_feature_names_in_and_n_features_in_(global_random_seed, n_samples=10): + pd = pytest.importorskip("pandas") + + random_state = np.random.RandomState(seed=global_random_seed) + + X = pd.DataFrame([[0]] * n_samples, columns=["feature_1"]) + y = random_state.rand(n_samples) + + est = DummyRegressor().fit(X, y) + assert hasattr(est, "feature_names_in_") + assert hasattr(est, "n_features_in_") + + est = DummyClassifier().fit(X, y) + assert hasattr(est, "feature_names_in_") + assert hasattr(est, "n_features_in_") + + def test_most_frequent_and_prior_strategy(): X = [[0], [0], [0], [0]] # ignored y = [1, 2, 1, 1] @@ -376,7 +393,7 @@ def test_quantile_invalid(): def test_quantile_strategy_empty_train(): est = DummyRegressor(strategy="quantile", quantile=0.4) - with pytest.raises(ValueError): + with pytest.raises(IndexError): est.fit([], []) From 4b59708ee8cd2d1bfadabaeba6eaec4db948a276 Mon Sep 17 00:00:00 2001 From: Ian Faust Date: Wed, 17 Jan 2024 23:46:45 +0100 Subject: [PATCH 083/554] EFF remove superfluous check_array call from LocalOutlierFactor._predict (#28113) --- sklearn/neighbors/_lof.py | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/sklearn/neighbors/_lof.py b/sklearn/neighbors/_lof.py index 05dfdb13a1cbe..fcf1c1ce990bd 100644 --- a/sklearn/neighbors/_lof.py +++ b/sklearn/neighbors/_lof.py @@ -373,9 +373,9 @@ def _predict(self, X=None): check_is_fitted(self) if X is not None: - X = check_array(X, accept_sparse="csr") - is_inlier = np.ones(X.shape[0], dtype=int) - is_inlier[self.decision_function(X) < 0] = -1 + shifted_opposite_lof_scores = self.decision_function(X) + is_inlier = np.ones(shifted_opposite_lof_scores.shape[0], dtype=int) + is_inlier[shifted_opposite_lof_scores < 0] = -1 else: is_inlier = np.ones(self.n_samples_fit_, dtype=int) is_inlier[self.negative_outlier_factor_ < self.offset_] = -1 From d825fb18ed2f4c4734d7d077c35518411784a0ad Mon Sep 17 00:00:00 2001 From: Yao Xiao <108576690+Charlie-XIAO@users.noreply.github.com> Date: Thu, 18 Jan 2024 13:53:59 +0800 Subject: [PATCH 084/554] MAINT fix `update_environments_and_lock_files` for non-posix systems (#28133) --- .../update_environments_and_lock_files.py | 48 ++++++++++++++----- 1 file changed, 36 insertions(+), 12 deletions(-) diff --git a/build_tools/update_environments_and_lock_files.py b/build_tools/update_environments_and_lock_files.py index 1115e89408dd9..27d7d3fe85b31 100644 --- a/build_tools/update_environments_and_lock_files.py +++ b/build_tools/update_environments_and_lock_files.py @@ -39,7 +39,6 @@ import json import logging import re -import shlex import subprocess import sys from importlib.metadata import version @@ -481,11 +480,21 @@ def write_all_conda_environments(build_metadata_list): def conda_lock(environment_path, lock_file_path, platform): - command = ( - f"conda-lock lock --mamba --kind explicit --platform {platform} " - f"--file {environment_path} --filename-template {lock_file_path}" + execute_command( + [ + "conda-lock", + "lock", + "--mamba", + 
"--kind", + "explicit", + "--platform", + platform, + "--file", + str(environment_path), + "--filename-template", + str(lock_file_path), + ] ) - execute_command(shlex.split(command)) def create_conda_lock_file(build_metadata): @@ -533,8 +542,15 @@ def write_all_pip_requirements(build_metadata_list): def pip_compile(pip_compile_path, requirements_path, lock_file_path): - command = f"{pip_compile_path} --upgrade {requirements_path} -o {lock_file_path}" - execute_command(shlex.split(command)) + execute_command( + [ + str(pip_compile_path), + "--upgrade", + str(requirements_path), + "-o", + str(lock_file_path), + ] + ) def write_pip_lock_file(build_metadata): @@ -546,13 +562,21 @@ def write_pip_lock_file(build_metadata): # create a conda environment with the correct Python version and # pip-compile and run pip-compile in this environment - command = ( - "conda create -c conda-forge -n" - f" pip-tools-python{python_version} python={python_version} pip-tools -y" + execute_command( + [ + "conda", + "create", + "-c", + "conda-forge", + "-n", + f"pip-tools-python{python_version}", + f"python={python_version}", + "pip-tools", + "-y", + ] ) - execute_command(shlex.split(command)) - json_output = execute_command(shlex.split("conda info --json")) + json_output = execute_command(["conda", "info", "--json"]) conda_info = json.loads(json_output) environment_folder = [ each for each in conda_info["envs"] if each.endswith(environment_name) From b83f206bb3651a91111b670209eafc54c0eb5c96 Mon Sep 17 00:00:00 2001 From: Christian Lorentzen Date: Thu, 18 Jan 2024 10:16:40 +0100 Subject: [PATCH 085/554] ENH multinomial Cython loss (#28028) --- sklearn/_loss/_loss.pyx.tp | 20 +++++++++----------- sklearn/_loss/tests/test_loss.py | 5 +++-- 2 files changed, 12 insertions(+), 13 deletions(-) diff --git a/sklearn/_loss/_loss.pyx.tp b/sklearn/_loss/_loss.pyx.tp index da974a3c3f4fd..61043162ae51a 100644 --- a/sklearn/_loss/_loss.pyx.tp +++ b/sklearn/_loss/_loss.pyx.tp @@ -271,7 +271,7 @@ cdef inline void sum_exp_minus_max( const floating_in[:, :] raw_prediction, # IN floating_in *p # OUT ) noexcept nogil: - # Thread local buffers are used to stores results of this function via p. + # Thread local buffers are used to store results of this function via p. 
# The results are stored as follows: # p[k] = exp(raw_prediction_i_k - max_value) for k = 0 to n_classes-1 # p[-2] = max(raw_prediction_i_k, k = 0 to n_classes-1) @@ -1185,10 +1185,9 @@ cdef class CyHalfMultinomialLoss(CyLossFunction): sum_exps = p[n_classes + 1] # p[-1] loss_out[i] = log(sum_exps) + max_value - for k in range(n_classes): - # label decode y_true - if y_true[i] == k: - loss_out[i] -= raw_prediction[i, k] + # label encoded y_true + k = int(y_true[i]) + loss_out[i] -= raw_prediction[i, k] free(p) else: @@ -1201,10 +1200,9 @@ cdef class CyHalfMultinomialLoss(CyLossFunction): sum_exps = p[n_classes + 1] # p[-1] loss_out[i] = log(sum_exps) + max_value - for k in range(n_classes): - # label decode y_true - if y_true[i] == k: - loss_out[i] -= raw_prediction[i, k] + # label encoded y_true + k = int(y_true[i]) + loss_out[i] -= raw_prediction[i, k] loss_out[i] *= sample_weight[i] @@ -1241,7 +1239,7 @@ cdef class CyHalfMultinomialLoss(CyLossFunction): for k in range(n_classes): # label decode y_true - if y_true [i] == k: + if y_true[i] == k: loss_out[i] -= raw_prediction[i, k] p[k] /= sum_exps # p_k = y_pred_k = prob of class k # gradient_k = p_k - (y_true == k) @@ -1260,7 +1258,7 @@ cdef class CyHalfMultinomialLoss(CyLossFunction): for k in range(n_classes): # label decode y_true - if y_true [i] == k: + if y_true[i] == k: loss_out[i] -= raw_prediction[i, k] p[k] /= sum_exps # p_k = y_pred_k = prob of class k # gradient_k = (p_k - (y_true == k)) * sw diff --git a/sklearn/_loss/tests/test_loss.py b/sklearn/_loss/tests/test_loss.py index 9c8bba4d717d1..cf441001d0ccb 100644 --- a/sklearn/_loss/tests/test_loss.py +++ b/sklearn/_loss/tests/test_loss.py @@ -120,7 +120,8 @@ def test_loss_boundary(loss): """Test interval ranges of y_true and y_pred in losses.""" # make sure low and high are always within the interval, used for linspace if loss.is_multiclass: - y_true = np.linspace(0, 9, num=10) + n_classes = 3 # default value + y_true = np.tile(np.linspace(0, n_classes - 1, num=n_classes), 3) else: low, high = _inclusive_low_high(loss.interval_y_true) y_true = np.linspace(low, high, num=10) @@ -136,7 +137,7 @@ def test_loss_boundary(loss): n = y_true.shape[0] low, high = _inclusive_low_high(loss.interval_y_pred) if loss.is_multiclass: - y_pred = np.empty((n, 3)) + y_pred = np.empty((n, n_classes)) y_pred[:, 0] = np.linspace(low, high, num=n) y_pred[:, 1] = 0.5 * (1 - y_pred[:, 0]) y_pred[:, 2] = 0.5 * (1 - y_pred[:, 0]) From fd814a05ab85c6156e25629417aaa06eef87b72d Mon Sep 17 00:00:00 2001 From: Christian Lorentzen Date: Thu, 18 Jan 2024 10:22:10 +0100 Subject: [PATCH 086/554] MNT replace Cython loss functions in SGD part 1 (#27999) --- sklearn/linear_model/_sag_fast.pyx.tp | 18 ++++---- sklearn/linear_model/_sgd_fast.pxd | 20 ++++---- sklearn/linear_model/_sgd_fast.pyx.tp | 66 +++++++++++++-------------- 3 files changed, 52 insertions(+), 52 deletions(-) diff --git a/sklearn/linear_model/_sag_fast.pyx.tp b/sklearn/linear_model/_sag_fast.pyx.tp index 97bf3020d6602..9bfeed559bc13 100644 --- a/sklearn/linear_model/_sag_fast.pyx.tp +++ b/sklearn/linear_model/_sag_fast.pyx.tp @@ -85,7 +85,7 @@ cdef {{c_type}} _logsumexp{{name_suffix}}({{c_type}}* arr, int n_classes) noexce {{for name_suffix, c_type, np_type in dtypes}} cdef class MultinomialLogLoss{{name_suffix}}: - cdef {{c_type}} _loss(self, {{c_type}}* prediction, {{c_type}} y, int n_classes, + cdef {{c_type}} _loss(self, {{c_type}} y, {{c_type}}* prediction, int n_classes, {{c_type}} sample_weight) noexcept nogil: r"""Multinomial Logistic 
regression loss. @@ -100,12 +100,12 @@ cdef class MultinomialLogLoss{{name_suffix}}: Parameters ---------- - prediction : pointer to a np.ndarray[{{c_type}}] of shape (n_classes,) - Prediction of the multinomial classifier, for current sample. - y : {{c_type}}, between 0 and n_classes - 1 Indice of the correct class for current sample (i.e. label encoded). + prediction : pointer to a np.ndarray[{{c_type}}] of shape (n_classes,) + Prediction of the multinomial classifier, for current sample. + n_classes : integer Total number of classes. @@ -129,7 +129,7 @@ cdef class MultinomialLogLoss{{name_suffix}}: loss = (logsumexp_prediction - prediction[int(y)]) * sample_weight return loss - cdef void dloss(self, {{c_type}}* prediction, {{c_type}} y, int n_classes, + cdef void dloss(self, {{c_type}} y, {{c_type}}* prediction, int n_classes, {{c_type}} sample_weight, {{c_type}}* gradient_ptr) noexcept nogil: r"""Multinomial Logistic regression gradient of the loss. @@ -414,9 +414,9 @@ def sag{{name_suffix}}( # compute the gradient for this sample, given the prediction if multinomial: - multiloss.dloss(&prediction[0], y, n_classes, sample_weight, &gradient[0]) + multiloss.dloss(y, &prediction[0], n_classes, sample_weight, &gradient[0]) else: - gradient[0] = loss.dloss(prediction[0], y) * sample_weight + gradient[0] = loss.dloss(y, prediction[0]) * sample_weight # L2 regularization by simply rescaling the weights wscale *= wscale_update @@ -835,10 +835,10 @@ def _multinomial_grad_loss_all_samples( ) # compute the gradient for this sample, given the prediction - multiloss.dloss(&prediction[0], y, n_classes, sample_weight, &gradient[0]) + multiloss.dloss(y, &prediction[0], n_classes, sample_weight, &gradient[0]) # compute the loss for this sample, given the prediction - sum_loss += multiloss._loss(&prediction[0], y, n_classes, sample_weight) + sum_loss += multiloss._loss(y, &prediction[0], n_classes, sample_weight) # update the sum of the gradient for j in range(xnnz): diff --git a/sklearn/linear_model/_sgd_fast.pxd b/sklearn/linear_model/_sgd_fast.pxd index 7ae704eee18db..da7f155c6fa6e 100644 --- a/sklearn/linear_model/_sgd_fast.pxd +++ b/sklearn/linear_model/_sgd_fast.pxd @@ -2,25 +2,25 @@ """Helper to load LossFunction from sgd_fast.pyx to sag_fast.pyx""" cdef class LossFunction: - cdef double loss(self, double p, double y) noexcept nogil - cdef double dloss(self, double p, double y) noexcept nogil + cdef double loss(self, double y, double p) noexcept nogil + cdef double dloss(self, double y, double p) noexcept nogil cdef class Regression(LossFunction): - cdef double loss(self, double p, double y) noexcept nogil - cdef double dloss(self, double p, double y) noexcept nogil + cdef double loss(self, double y, double p) noexcept nogil + cdef double dloss(self, double y, double p) noexcept nogil cdef class Classification(LossFunction): - cdef double loss(self, double p, double y) noexcept nogil - cdef double dloss(self, double p, double y) noexcept nogil + cdef double loss(self, double y, double p) noexcept nogil + cdef double dloss(self, double y, double p) noexcept nogil cdef class Log(Classification): - cdef double loss(self, double p, double y) noexcept nogil - cdef double dloss(self, double p, double y) noexcept nogil + cdef double loss(self, double y, double p) noexcept nogil + cdef double dloss(self, double y, double p) noexcept nogil cdef class SquaredLoss(Regression): - cdef double loss(self, double p, double y) noexcept nogil - cdef double dloss(self, double p, double y) noexcept nogil + cdef 
double loss(self, double y, double p) noexcept nogil + cdef double dloss(self, double y, double p) noexcept nogil diff --git a/sklearn/linear_model/_sgd_fast.pyx.tp b/sklearn/linear_model/_sgd_fast.pyx.tp index bcd2bd7e5576e..b92d983a1b4b8 100644 --- a/sklearn/linear_model/_sgd_fast.pyx.tp +++ b/sklearn/linear_model/_sgd_fast.pyx.tp @@ -77,15 +77,15 @@ cdef extern from *: cdef class LossFunction: """Base class for convex loss functions""" - cdef double loss(self, double p, double y) noexcept nogil: + cdef double loss(self, double y, double p) noexcept nogil: """Evaluate the loss function. Parameters ---------- - p : double - The prediction, `p = w^T x + intercept`. y : double The true value (aka target). + p : double + The prediction, `p = w^T x + intercept`. Returns ------- @@ -111,7 +111,7 @@ cdef class LossFunction: double The derivative of the loss function with regards to `p`. """ - return self.dloss(p, y) + return self.dloss(y, p) def py_loss(self, double p, double y): """Python version of `loss` for testing. @@ -130,18 +130,18 @@ cdef class LossFunction: double The loss evaluated at `p` and `y`. """ - return self.loss(p, y) + return self.loss(y, p) - cdef double dloss(self, double p, double y) noexcept nogil: + cdef double dloss(self, double y, double p) noexcept nogil: """Evaluate the derivative of the loss function with respect to the prediction `p`. Parameters ---------- - p : double - The prediction, `p = w^T x`. y : double The true value (aka target). + p : double + The prediction, `p = w^T x`. Returns ------- @@ -154,20 +154,20 @@ cdef class LossFunction: cdef class Regression(LossFunction): """Base class for loss functions for regression""" - cdef double loss(self, double p, double y) noexcept nogil: + cdef double loss(self, double y, double p) noexcept nogil: return 0. - cdef double dloss(self, double p, double y) noexcept nogil: + cdef double dloss(self, double y, double p) noexcept nogil: return 0. cdef class Classification(LossFunction): """Base class for loss functions for classification""" - cdef double loss(self, double p, double y) noexcept nogil: + cdef double loss(self, double y, double p) noexcept nogil: return 0. - cdef double dloss(self, double p, double y) noexcept nogil: + cdef double dloss(self, double y, double p) noexcept nogil: return 0. @@ -179,7 +179,7 @@ cdef class ModifiedHuber(Classification): See T. Zhang 'Solving Large Scale Linear Prediction Problems Using Stochastic Gradient Descent', ICML'04. 
""" - cdef double loss(self, double p, double y) noexcept nogil: + cdef double loss(self, double y, double p) noexcept nogil: cdef double z = p * y if z >= 1.0: return 0.0 @@ -188,7 +188,7 @@ cdef class ModifiedHuber(Classification): else: return -4.0 * z - cdef double dloss(self, double p, double y) noexcept nogil: + cdef double dloss(self, double y, double p) noexcept nogil: cdef double z = p * y if z >= 1.0: return 0.0 @@ -217,13 +217,13 @@ cdef class Hinge(Classification): def __init__(self, double threshold=1.0): self.threshold = threshold - cdef double loss(self, double p, double y) noexcept nogil: + cdef double loss(self, double y, double p) noexcept nogil: cdef double z = p * y if z <= self.threshold: return self.threshold - z return 0.0 - cdef double dloss(self, double p, double y) noexcept nogil: + cdef double dloss(self, double y, double p) noexcept nogil: cdef double z = p * y if z <= self.threshold: return -y @@ -249,13 +249,13 @@ cdef class SquaredHinge(Classification): def __init__(self, double threshold=1.0): self.threshold = threshold - cdef double loss(self, double p, double y) noexcept nogil: + cdef double loss(self, double y, double p) noexcept nogil: cdef double z = self.threshold - p * y if z > 0: return z * z return 0.0 - cdef double dloss(self, double p, double y) noexcept nogil: + cdef double dloss(self, double y, double p) noexcept nogil: cdef double z = self.threshold - p * y if z > 0: return -2 * y * z @@ -268,7 +268,7 @@ cdef class SquaredHinge(Classification): cdef class Log(Classification): """Logistic regression loss for binary classification with y in {-1, 1}""" - cdef double loss(self, double p, double y) noexcept nogil: + cdef double loss(self, double y, double p) noexcept nogil: cdef double z = p * y # approximately equal and saves the computation of the log if z > 18: @@ -277,7 +277,7 @@ cdef class Log(Classification): return -z return log(1.0 + exp(-z)) - cdef double dloss(self, double p, double y) noexcept nogil: + cdef double dloss(self, double y, double p) noexcept nogil: cdef double z = p * y # approximately equal and saves the computation of the log if z > 18.0: @@ -292,10 +292,10 @@ cdef class Log(Classification): cdef class SquaredLoss(Regression): """Squared loss traditional used in linear regression.""" - cdef double loss(self, double p, double y) noexcept nogil: + cdef double loss(self, double y, double p) noexcept nogil: return 0.5 * (p - y) * (p - y) - cdef double dloss(self, double p, double y) noexcept nogil: + cdef double dloss(self, double y, double p) noexcept nogil: return p - y def __reduce__(self): @@ -316,7 +316,7 @@ cdef class Huber(Regression): def __init__(self, double c): self.c = c - cdef double loss(self, double p, double y) noexcept nogil: + cdef double loss(self, double y, double p) noexcept nogil: cdef double r = p - y cdef double abs_r = fabs(r) if abs_r <= self.c: @@ -324,7 +324,7 @@ cdef class Huber(Regression): else: return self.c * abs_r - (0.5 * self.c * self.c) - cdef double dloss(self, double p, double y) noexcept nogil: + cdef double dloss(self, double y, double p) noexcept nogil: cdef double r = p - y cdef double abs_r = fabs(r) if abs_r <= self.c: @@ -349,11 +349,11 @@ cdef class EpsilonInsensitive(Regression): def __init__(self, double epsilon): self.epsilon = epsilon - cdef double loss(self, double p, double y) noexcept nogil: + cdef double loss(self, double y, double p) noexcept nogil: cdef double ret = fabs(y - p) - self.epsilon return ret if ret > 0 else 0 - cdef double dloss(self, double p, double y) 
noexcept nogil: + cdef double dloss(self, double y, double p) noexcept nogil: if y - p > self.epsilon: return -1 elif p - y > self.epsilon: @@ -376,11 +376,11 @@ cdef class SquaredEpsilonInsensitive(Regression): def __init__(self, double epsilon): self.epsilon = epsilon - cdef double loss(self, double p, double y) noexcept nogil: + cdef double loss(self, double y, double p) noexcept nogil: cdef double ret = fabs(y - p) - self.epsilon return ret * ret if ret > 0 else 0 - cdef double dloss(self, double p, double y) noexcept nogil: + cdef double dloss(self, double y, double p) noexcept nogil: cdef double z z = y - p if z > self.epsilon: @@ -569,7 +569,7 @@ def _plain_sgd{{name_suffix}}( if learning_rate == OPTIMAL: typw = np.sqrt(1.0 / np.sqrt(alpha)) # computing eta0, the initial learning rate - initial_eta0 = typw / max(1.0, loss.dloss(-typw, 1.0)) + initial_eta0 = typw / max(1.0, loss.dloss(1.0, -typw)) # initialize t such that eta at first sample equals eta0 optimal_init = 1.0 / (initial_eta0 * alpha) @@ -598,7 +598,7 @@ def _plain_sgd{{name_suffix}}( eta = eta0 / pow(t, power_t) if verbose or not early_stopping: - sumloss += loss.loss(p, y) + sumloss += loss.loss(y, p) if y > 0.0: class_weight = weight_pos @@ -609,12 +609,12 @@ def _plain_sgd{{name_suffix}}( update = sqnorm(x_data_ptr, x_ind_ptr, xnnz) if update == 0: continue - update = min(C, loss.loss(p, y) / update) + update = min(C, loss.loss(y, p) / update) elif learning_rate == PA2: update = sqnorm(x_data_ptr, x_ind_ptr, xnnz) - update = loss.loss(p, y) / (update + 0.5 / C) + update = loss.loss(y, p) / (update + 0.5 / C) else: - dloss = loss.dloss(p, y) + dloss = loss.dloss(y, p) # clip dloss with large values to avoid numerical # instabilities if dloss < -MAX_DLOSS: From 783643525060c0988570c7400048e08b1e59826e Mon Sep 17 00:00:00 2001 From: Yao Xiao <108576690+Charlie-XIAO@users.noreply.github.com> Date: Thu, 18 Jan 2024 19:44:26 +0800 Subject: [PATCH 087/554] FIX `AffinityPropagation` assigning multiple clusters for equal points (#28121) --- doc/whats_new/v1.4.rst | 19 +++++++++++++++++++ sklearn/cluster/_affinity_propagation.py | 2 +- .../tests/test_affinity_propagation.py | 11 +++++++++++ 3 files changed, 31 insertions(+), 1 deletion(-) diff --git a/doc/whats_new/v1.4.rst b/doc/whats_new/v1.4.rst index ad3cc404f5930..c674a8619e076 100644 --- a/doc/whats_new/v1.4.rst +++ b/doc/whats_new/v1.4.rst @@ -2,6 +2,25 @@ .. currentmodule:: sklearn +.. _changes_1_4_1: + +Version 1.4.1 +============= + +**In Development** + +Changelog +--------- + +:mod:`sklearn.cluster` +...................... + +- |Fix| :class:`cluster.AffinityPropagation` now avoids assigning multiple different + clusters for equal points. + :pr:`28121` by :user:`Pietro Peterlongo ` and + :user:`Yao Xiao `. + + .. _changes_1_4: Version 1.4.0 diff --git a/sklearn/cluster/_affinity_propagation.py b/sklearn/cluster/_affinity_propagation.py index 5587a7fd5aa1f..f9ae3ae8cb1f4 100644 --- a/sklearn/cluster/_affinity_propagation.py +++ b/sklearn/cluster/_affinity_propagation.py @@ -53,7 +53,7 @@ def _affinity_propagation( "All samples have mutually equal similarities. " "Returning arbitrary cluster center(s)." 
) - if preference.flat[0] >= S.flat[n_samples - 1]: + if preference.flat[0] > S.flat[n_samples - 1]: return ( (np.arange(n_samples), np.arange(n_samples), 0) if return_n_iter diff --git a/sklearn/cluster/tests/test_affinity_propagation.py b/sklearn/cluster/tests/test_affinity_propagation.py index 319385635376e..c3138e59111ed 100644 --- a/sklearn/cluster/tests/test_affinity_propagation.py +++ b/sklearn/cluster/tests/test_affinity_propagation.py @@ -308,3 +308,14 @@ def test_sparse_input_for_fit_predict(csr_container): X = csr_container(rng.randint(0, 2, size=(5, 5))) labels = af.fit_predict(X) assert_array_equal(labels, (0, 1, 1, 2, 3)) + + +def test_affinity_propagation_equal_points(): + """Make sure we do not assign multiple clusters to equal points. + + Non-regression test for: + https://github.com/scikit-learn/scikit-learn/pull/20043 + """ + X = np.zeros((8, 1)) + af = AffinityPropagation(affinity="euclidean", damping=0.5, random_state=42).fit(X) + assert np.all(af.labels_ == 0) From 06e566eb86cfd8c6107cf3bc2b477c97b80002a3 Mon Sep 17 00:00:00 2001 From: Neto Date: Thu, 18 Jan 2024 12:48:28 +0100 Subject: [PATCH 088/554] ENH add n_jobs to mutual_info_regression and mutual_info_classif (#28085) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Co-authored-by: Guillaume Lemaitre Co-authored-by: Loïc Estève --- doc/whats_new/v1.5.rst | 8 ++ sklearn/feature_selection/_mutual_info.py | 80 +++++++++++++++++-- .../tests/test_mutual_info.py | 16 ++++ 3 files changed, 97 insertions(+), 7 deletions(-) diff --git a/doc/whats_new/v1.5.rst b/doc/whats_new/v1.5.rst index 4a44bd6666615..6eec5591c440d 100644 --- a/doc/whats_new/v1.5.rst +++ b/doc/whats_new/v1.5.rst @@ -54,6 +54,14 @@ Changelog by passing a function in place of a strategy name. :pr:`28053` by :user:`Mark Elliot `. +:mod:`sklearn.feature_selection` +................................ + +- |Enhancement| :func:`feature_selection.mutual_info_regression` and + :func:`feature_selection.mutual_info_classif` now support `n_jobs` parameter. + :pr:`28085` by :user:`Neto Menoci ` and + :user:`Florin Andrei `. + :mod:`sklearn.metrics` ...................... diff --git a/sklearn/feature_selection/_mutual_info.py b/sklearn/feature_selection/_mutual_info.py index 821ef889e7ed9..f3808068f46a5 100644 --- a/sklearn/feature_selection/_mutual_info.py +++ b/sklearn/feature_selection/_mutual_info.py @@ -13,6 +13,7 @@ from ..utils import check_random_state from ..utils._param_validation import Interval, StrOptions, validate_params from ..utils.multiclass import check_classification_targets +from ..utils.parallel import Parallel, delayed from ..utils.validation import check_array, check_X_y @@ -201,11 +202,13 @@ def _iterate_columns(X, columns=None): def _estimate_mi( X, y, + *, discrete_features="auto", discrete_target=False, n_neighbors=3, copy=True, random_state=None, + n_jobs=None, ): """Estimate mutual information between the features and the target. @@ -242,6 +245,16 @@ def _estimate_mi( Pass an int for reproducible results across multiple function calls. See :term:`Glossary `. + n_jobs : int, default=None + The number of jobs to use for computing the mutual information. + The parallelization is done on the columns of `X`. + ``None`` means 1 unless in a :obj:`joblib.parallel_backend` context. + ``-1`` means using all processors. See :term:`Glossary ` + for more details. + + .. 
versionadded:: 1.5 + + Returns ------- mi : ndarray, shape (n_features,) @@ -301,10 +314,10 @@ def _estimate_mi( * rng.standard_normal(size=n_samples) ) - mi = [ - _compute_mi(x, y, discrete_feature, discrete_target, n_neighbors) + mi = Parallel(n_jobs=n_jobs)( + delayed(_compute_mi)(x, y, discrete_feature, discrete_target, n_neighbors) for x, discrete_feature in zip(_iterate_columns(X), discrete_mask) - ] + ) return np.array(mi) @@ -317,11 +330,19 @@ def _estimate_mi( "n_neighbors": [Interval(Integral, 1, None, closed="left")], "copy": ["boolean"], "random_state": ["random_state"], + "n_jobs": [Integral, None], }, prefer_skip_nested_validation=True, ) def mutual_info_regression( - X, y, *, discrete_features="auto", n_neighbors=3, copy=True, random_state=None + X, + y, + *, + discrete_features="auto", + n_neighbors=3, + copy=True, + random_state=None, + n_jobs=None, ): """Estimate mutual information for a continuous target variable. @@ -367,6 +388,16 @@ def mutual_info_regression( Pass an int for reproducible results across multiple function calls. See :term:`Glossary `. + n_jobs : int, default=None + The number of jobs to use for computing the mutual information. + The parallelization is done on the columns of `X`. + + ``None`` means 1 unless in a :obj:`joblib.parallel_backend` context. + ``-1`` means using all processors. See :term:`Glossary ` + for more details. + + .. versionadded:: 1.5 + Returns ------- mi : ndarray, shape (n_features,) @@ -407,7 +438,16 @@ def mutual_info_regression( >>> mutual_info_regression(X, y) array([0.1..., 2.6... , 0.0...]) """ - return _estimate_mi(X, y, discrete_features, False, n_neighbors, copy, random_state) + return _estimate_mi( + X, + y, + discrete_features=discrete_features, + discrete_target=False, + n_neighbors=n_neighbors, + copy=copy, + random_state=random_state, + n_jobs=n_jobs, + ) @validate_params( @@ -418,11 +458,19 @@ def mutual_info_regression( "n_neighbors": [Interval(Integral, 1, None, closed="left")], "copy": ["boolean"], "random_state": ["random_state"], + "n_jobs": [Integral, None], }, prefer_skip_nested_validation=True, ) def mutual_info_classif( - X, y, *, discrete_features="auto", n_neighbors=3, copy=True, random_state=None + X, + y, + *, + discrete_features="auto", + n_neighbors=3, + copy=True, + random_state=None, + n_jobs=None, ): """Estimate mutual information for a discrete target variable. @@ -468,6 +516,15 @@ def mutual_info_classif( Pass an int for reproducible results across multiple function calls. See :term:`Glossary `. + n_jobs : int, default=None + The number of jobs to use for computing the mutual information. + The parallelization is done on the columns of `X`. + ``None`` means 1 unless in a :obj:`joblib.parallel_backend` context. + ``-1`` means using all processors. See :term:`Glossary ` + for more details. + + .. versionadded:: 1.5 + Returns ------- mi : ndarray, shape (n_features,) @@ -511,4 +568,13 @@ def mutual_info_classif( 0. , 0. , 0. , 0. , 0. 
]) """ check_classification_targets(y) - return _estimate_mi(X, y, discrete_features, True, n_neighbors, copy, random_state) + return _estimate_mi( + X, + y, + discrete_features=discrete_features, + discrete_target=True, + n_neighbors=n_neighbors, + copy=copy, + random_state=random_state, + n_jobs=n_jobs, + ) diff --git a/sklearn/feature_selection/tests/test_mutual_info.py b/sklearn/feature_selection/tests/test_mutual_info.py index 26367544baa53..4922b7e4e57b3 100644 --- a/sklearn/feature_selection/tests/test_mutual_info.py +++ b/sklearn/feature_selection/tests/test_mutual_info.py @@ -1,6 +1,7 @@ import numpy as np import pytest +from sklearn.datasets import make_classification, make_regression from sklearn.feature_selection import mutual_info_classif, mutual_info_regression from sklearn.feature_selection._mutual_info import _compute_mi from sklearn.utils import check_random_state @@ -252,3 +253,18 @@ def test_mutual_info_regression_X_int_dtype(global_random_seed): expected = mutual_info_regression(X_float, y, random_state=global_random_seed) result = mutual_info_regression(X, y, random_state=global_random_seed) assert_allclose(result, expected) + + +@pytest.mark.parametrize( + "mutual_info_func, data_generator", + [ + (mutual_info_regression, make_regression), + (mutual_info_classif, make_classification), + ], +) +def test_mutual_info_n_jobs(global_random_seed, mutual_info_func, data_generator): + """Check that results are consistent with different `n_jobs`.""" + X, y = data_generator(random_state=global_random_seed) + single_job = mutual_info_func(X, y, random_state=global_random_seed, n_jobs=1) + multi_job = mutual_info_func(X, y, random_state=global_random_seed, n_jobs=2) + assert_allclose(single_job, multi_job) From d418e79a41d5d4de167114d88465a0b47b0d6a2c Mon Sep 17 00:00:00 2001 From: Miki Watanabe <105326591+MikiPWata@users.noreply.github.com> Date: Thu, 18 Jan 2024 08:39:36 -0500 Subject: [PATCH 089/554] DOC: Added dropdowns to 4.1 PDPs (#27187) --- doc/modules/partial_dependence.rst | 6 ++++++ 1 file changed, 6 insertions(+) diff --git a/doc/modules/partial_dependence.rst b/doc/modules/partial_dependence.rst index 7ce099f2342e9..6fe5a79b51f63 100644 --- a/doc/modules/partial_dependence.rst +++ b/doc/modules/partial_dependence.rst @@ -79,6 +79,10 @@ parameter takes a list of indices, names of the categorical features or a boolea mask. The graphical representation of partial dependence for categorical features is a bar plot or a 2D heatmap. +|details-start| +**PDPs for multi-class classification** +|details-split| + For multi-class classification, you need to set the class label for which the PDPs should be created via the ``target`` argument:: @@ -93,6 +97,8 @@ the PDPs should be created via the ``target`` argument:: The same parameter ``target`` is used to specify the target in multi-output regression settings. 
+|details-end|
+
 If you need the raw values of the partial dependence function rather than the
 plots, you can use the
 :func:`sklearn.inspection.partial_dependence` function::

From fe5ba6ff3bc2baac7a0a776db61c8e27c5094fb8 Mon Sep 17 00:00:00 2001
From: Guillaume Lemaitre
Date: Thu, 18 Jan 2024 15:07:59 +0100
Subject: [PATCH 090/554] ENH TfidfTransformer preserves np.float32 dtype (#28136)

---
 doc/whats_new/v1.5.rst             |  4 ++++
 sklearn/feature_extraction/text.py | 27 +++++++++++++++++----------
 2 files changed, 21 insertions(+), 10 deletions(-)

diff --git a/doc/whats_new/v1.5.rst b/doc/whats_new/v1.5.rst
index 6eec5591c440d..159b8029c9137 100644
--- a/doc/whats_new/v1.5.rst
+++ b/doc/whats_new/v1.5.rst
@@ -47,6 +47,10 @@ Changelog
   for storing the inverse document frequency.
   :pr:`18843` by :user:`Paolo Montesel `.
 
+- |Enhancement| :class:`feature_extraction.text.TfidfTransformer` now preserves
+  the data type of the input matrix if it is `np.float64` or `np.float32`.
+  :pr:`28136` by :user:`Guillaume Lemaitre `.
+
 :mod:`sklearn.impute`
 .....................
 
diff --git a/sklearn/feature_extraction/text.py b/sklearn/feature_extraction/text.py
index cef6f340e83c8..ea6686ef45eaa 100644
--- a/sklearn/feature_extraction/text.py
+++ b/sklearn/feature_extraction/text.py
@@ -1666,23 +1666,21 @@ def fit(self, X, y=None):
         )
         if not sp.issparse(X):
             X = sp.csr_matrix(X)
-        dtype = X.dtype if X.dtype in FLOAT_DTYPES else np.float64
+        dtype = X.dtype if X.dtype in (np.float64, np.float32) else np.float64
 
         if self.use_idf:
-            n_samples, n_features = X.shape
+            n_samples, _ = X.shape
             df = _document_frequency(X)
             df = df.astype(dtype, copy=False)
 
             # perform idf smoothing if required
-            df += int(self.smooth_idf)
+            df += float(self.smooth_idf)
             n_samples += int(self.smooth_idf)
 
             # log+1 instead of log makes sure terms with zero idf don't get
             # suppressed entirely.
+            # `np.log` preserves the dtype of `df` and thus `dtype`.
             self.idf_ = np.log(n_samples / df) + 1.0
-            # FIXME: for backward compatibility, we force idf_ to be np.float64
-            # In the future, we should preserve the `dtype` of `X`.
-            self.idf_ = self.idf_.astype(np.float64, copy=False)
 
         return self
 
@@ -1705,14 +1703,18 @@ def transform(self, X, copy=True):
         """
         check_is_fitted(self)
         X = self._validate_data(
-            X, accept_sparse="csr", dtype=FLOAT_DTYPES, copy=copy, reset=False
+            X,
+            accept_sparse="csr",
+            dtype=[np.float64, np.float32],
+            copy=copy,
+            reset=False,
         )
         if not sp.issparse(X):
-            X = sp.csr_matrix(X, dtype=np.float64)
+            X = sp.csr_matrix(X, dtype=X.dtype)
 
         if self.sublinear_tf:
             np.log(X.data, X.data)
-            X.data += 1
+            X.data += 1.0
 
         if hasattr(self, "idf_"):
             # the columns of X (CSR matrix) can be accessed with `X.indices `and
@@ -1725,7 +1727,12 @@ def transform(self, X, copy=True):
             return X
 
     def _more_tags(self):
-        return {"X_types": ["2darray", "sparse"]}
+        return {
+            "X_types": ["2darray", "sparse"],
+            # FIXME: np.float16 could be preserved if _inplace_csr_row_normalize_l2
+ "preserves_dtype": [np.float64, np.float32], + } class TfidfVectorizer(CountVectorizer): From 95b2c9de28ab9c9f5895d5c5115b99f039de0105 Mon Sep 17 00:00:00 2001 From: Arturo Amor <86408019+ArturoAmorQ@users.noreply.github.com> Date: Thu, 18 Jan 2024 16:19:09 +0100 Subject: [PATCH 091/554] DOC Fix blank space in dropdown (#28166) Co-authored-by: ArturoAmorQ --- doc/model_persistence.rst | 3 +-- 1 file changed, 1 insertion(+), 2 deletions(-) diff --git a/doc/model_persistence.rst b/doc/model_persistence.rst index 1e0cc36be534d..0f775c774465a 100644 --- a/doc/model_persistence.rst +++ b/doc/model_persistence.rst @@ -118,11 +118,10 @@ A more secure format: `skops` `skops `__ provides a more secure format via the :mod:`skops.io` module. It avoids using :mod:`pickle` and only loads files which have types and references to functions which are trusted -either by default or by the user. +either by default or by the user. |details-start| **Using skops** - |details-split| The API is very similar to ``pickle``, and From d43d7d61c159a63fd2c8ffebca505b9b3ae41a4e Mon Sep 17 00:00:00 2001 From: Linus Sommer <95619282+linus-md@users.noreply.github.com> Date: Thu, 18 Jan 2024 23:22:43 +0100 Subject: [PATCH 092/554] DOC: Added drop down menus to `1.8` Cross Decomposition (#27916) --- doc/modules/cross_decomposition.rst | 21 ++++++++++++++------- 1 file changed, 14 insertions(+), 7 deletions(-) diff --git a/doc/modules/cross_decomposition.rst b/doc/modules/cross_decomposition.rst index 337a7bcd250bb..8f8d217f87144 100644 --- a/doc/modules/cross_decomposition.rst +++ b/doc/modules/cross_decomposition.rst @@ -92,9 +92,9 @@ Step *a)* may be performed in two ways: either by computing the whole SVD of values, or by directly computing the singular vectors using the power method (cf section 11.3 in [1]_), which corresponds to the `'nipals'` option of the `algorithm` parameter. - -Transforming data -^^^^^^^^^^^^^^^^^ +|details-start| +**Transforming data** +|details-split| To transform :math:`X` into :math:`\bar{X}`, we need to find a projection matrix :math:`P` such that :math:`\bar{X} = XP`. We know that for the @@ -106,9 +106,11 @@ training data, :math:`\Xi = XP`, and :math:`X = \Xi \Gamma^T`. Setting Similarly, :math:`Y` can be transformed using the rotation matrix :math:`V(\Delta^T V)^{-1}`, accessed via the `y_rotations_` attribute. +|details-end| -Predicting the targets Y -^^^^^^^^^^^^^^^^^^^^^^^^ +|details-start| +**Predicting the targets Y** +|details-split| To predict the targets of some data :math:`X`, we are looking for a coefficient matrix :math:`\beta \in R^{d \times t}` such that :math:`Y = @@ -125,6 +127,8 @@ P \Delta^T`, and as a result the coefficient matrix :math:`\beta = \alpha P :math:`\beta` can be accessed through the `coef_` attribute. +|details-end| + PLSSVD ------ @@ -180,14 +184,17 @@ Since :class:`CCA` involves the inversion of :math:`X_k^TX_k` and :math:`Y_k^TY_k`, this estimator can be unstable if the number of features or targets is greater than the number of samples. - -.. topic:: Reference: +|details-start| +**Reference** +|details-split| .. [1] `A survey of Partial Least Squares (PLS) methods, with emphasis on the two-block case `_ JA Wegelin +|details-end| + .. 
topic:: Examples: * :ref:`sphx_glr_auto_examples_cross_decomposition_plot_compare_cross_decomposition.py` From 26dfe833aa5122997a6b66197df0e03629a45e3a Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?J=C3=A9r=C3=A9mie=20du=20Boisberranger?= <34657725+jeremiedbb@users.noreply.github.com> Date: Fri, 19 Jan 2024 06:51:23 +0100 Subject: [PATCH 093/554] Fix prevent infinite loop in KMeans (#28165) --- doc/whats_new/v1.4.rst | 3 +++ sklearn/cluster/_k_means_common.pyx | 16 ++++++++++++++++ sklearn/cluster/tests/test_k_means.py | 18 ++++++++++++++++++ 3 files changed, 37 insertions(+) diff --git a/doc/whats_new/v1.4.rst b/doc/whats_new/v1.4.rst index c674a8619e076..ee47bae7b1f5b 100644 --- a/doc/whats_new/v1.4.rst +++ b/doc/whats_new/v1.4.rst @@ -20,6 +20,9 @@ Changelog :pr:`28121` by :user:`Pietro Peterlongo ` and :user:`Yao Xiao `. +- |Fix| Avoid infinite loop in :class:`cluster.KMeans` when the number of clusters is + larger than the number of non-duplicate samples. + :pr:`28165` by :user:`Jérémie du Boisberranger `. .. _changes_1_4: diff --git a/sklearn/cluster/_k_means_common.pyx b/sklearn/cluster/_k_means_common.pyx index 151af55076b7b..7c9c1bb54eaae 100644 --- a/sklearn/cluster/_k_means_common.pyx +++ b/sklearn/cluster/_k_means_common.pyx @@ -192,6 +192,11 @@ cpdef void _relocate_empty_clusters_dense( int new_cluster_id, old_cluster_id, far_idx, idx, k floating weight + if np.max(distances) == 0: + # Happens when there are more clusters than non-duplicate samples. Relocating + # is pointless in this case. + return + for idx in range(n_empty): new_cluster_id = empty_clusters[idx] @@ -241,6 +246,11 @@ cpdef void _relocate_empty_clusters_sparse( X_indices[X_indptr[i]: X_indptr[i + 1]], centers_old[j], centers_squared_norms[j], True) + if np.max(distances) == 0: + # Happens when there are more clusters than non-duplicate samples. Relocating + # is pointless in this case. + return + cdef: int[::1] far_from_centers = np.argpartition(distances, -n_empty)[:-n_empty-1:-1].astype(np.int32) @@ -274,12 +284,18 @@ cdef void _average_centers( int n_features = centers.shape[1] int j, k floating alpha + int argmax_weight = np.argmax(weight_in_clusters) for j in range(n_clusters): if weight_in_clusters[j] > 0: alpha = 1.0 / weight_in_clusters[j] for k in range(n_features): centers[j, k] *= alpha + else: + # For convenience, we avoid setting empty clusters at the origin but place + # them at the location of the biggest cluster. + for k in range(n_features): + centers[j, k] = centers[argmax_weight, k] cdef void _center_shift( diff --git a/sklearn/cluster/tests/test_k_means.py b/sklearn/cluster/tests/test_k_means.py index 5b0c7ab9aace8..4a112a30b29ed 100644 --- a/sklearn/cluster/tests/test_k_means.py +++ b/sklearn/cluster/tests/test_k_means.py @@ -1352,3 +1352,21 @@ def test_sample_weight_zero(init, global_random_seed): # (i.e. 
be at a distance=0 from it) d = euclidean_distances(X[::2], clusters_weighted) assert not np.any(np.isclose(d, 0)) + + +@pytest.mark.parametrize("array_constr", data_containers, ids=data_containers_ids) +@pytest.mark.parametrize("algorithm", ["lloyd", "elkan"]) +def test_relocating_with_duplicates(algorithm, array_constr): + """Check that kmeans stops when there are more centers than non-duplicate samples + + Non-regression test for issue: + https://github.com/scikit-learn/scikit-learn/issues/28055 + """ + X = np.array([[0, 0], [1, 1], [1, 1], [1, 0], [0, 1]]) + km = KMeans(n_clusters=5, init=X, algorithm=algorithm) + + msg = r"Number of distinct clusters \(4\) found smaller than n_clusters \(5\)" + with pytest.warns(ConvergenceWarning, match=msg): + km.fit(array_constr(X)) + + assert km.n_iter_ == 1 From 2da6d17bb472524b883d81afa4a85bd7a1c89d60 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Lo=C3=AFc=20Est=C3=A8ve?= Date: Fri, 19 Jan 2024 07:32:04 +0100 Subject: [PATCH 094/554] CI Remove temporary work-around related to scipy and pandas development wheel installing numpy<2 (#28163) --- build_tools/azure/install.sh | 18 +++++++++++++----- 1 file changed, 13 insertions(+), 5 deletions(-) diff --git a/build_tools/azure/install.sh b/build_tools/azure/install.sh index 5bd4112a1820b..df20e27b3c068 100755 --- a/build_tools/azure/install.sh +++ b/build_tools/azure/install.sh @@ -47,6 +47,16 @@ pre_python_environment_install() { } +check_packages_dev_version() { + for package in $@; do + package_version=$(python -c "import $package; print($package.__version__)") + if ! [[ $package_version =~ "dev" ]]; then + echo "$package is not a development version: $package_version" + exit 1 + fi + done +} + python_environment_install_and_activate() { if [[ "$DISTRIB" == "conda"* ]]; then # Install/update conda with the libmamba solver because the legacy @@ -71,12 +81,10 @@ python_environment_install_and_activate() { if [[ "$DISTRIB" == "conda-pip-scipy-dev" ]]; then echo "Installing development dependency wheels" dev_anaconda_url=https://pypi.anaconda.org/scientific-python-nightly-wheels/simple - pip install --pre --upgrade --timeout=60 --extra-index $dev_anaconda_url numpy pandas scipy + dev_packages="numpy scipy pandas" + pip install --pre --upgrade --timeout=60 --extra-index $dev_anaconda_url $dev_packages - # XXX: at the time of writing, installing scipy or pandas from the dev - # wheels forces the numpy dependency to be < 2.0.0. Let's force the - # installation of numpy dev wheels instead. - pip install --pre --upgrade --timeout=60 --extra-index $dev_anaconda_url numpy + check_packages_dev_version $dev_packages echo "Installing Cython from latest sources" pip install https://github.com/cython/cython/archive/master.zip From 21fcab7223257d01dab5397424de9057128d5467 Mon Sep 17 00:00:00 2001 From: Andrei Dzis Date: Fri, 19 Jan 2024 13:11:23 +0300 Subject: [PATCH 095/554] DOC Added relation between ROC-AUC and Gini in docstring of roc_auc_score (#28156) Co-authored-by: Guillaume Lemaitre --- sklearn/metrics/_ranking.py | 17 +++++++++++++++++ 1 file changed, 17 insertions(+) diff --git a/sklearn/metrics/_ranking.py b/sklearn/metrics/_ranking.py index 4a2e7aa1b78a3..a117a5427a996 100644 --- a/sklearn/metrics/_ranking.py +++ b/sklearn/metrics/_ranking.py @@ -538,6 +538,21 @@ class scores must correspond to the order of ``labels``, RocCurveDisplay.from_predictions : Plot Receiver Operating Characteristic (ROC) curve given the true and predicted values. 
+    Notes
+    -----
+    The Gini Coefficient is a summary measure of the ranking ability of binary
+    classifiers. It is expressed using the area under the ROC as follows:
+
+    G = 2 * AUC - 1
+
+    Where G is the Gini coefficient and AUC is the ROC-AUC score. This normalisation
+    will ensure that random guessing will yield a score of 0 in expectation, and it is
+    upper bounded by 1.
+
+    Note that there is another version of the Gini coefficient for regressors of a
+    continuous positive target variable. In this case, AUC is taken over the Lorenz
+    curve instead of the ROC [6]_.
+
     References
     ----------
     .. [1] `Wikipedia entry for the Receiver operating characteristic
@@ -558,6 +573,8 @@ class scores must correspond to the order of ``labels``,
            Under the ROC Curve for Multiple Class Classification Problems.
            Machine Learning, 45(2), 171-186.
            `_
+    .. [6] `Wikipedia entry for the Gini coefficient
+           `_
 
     Examples
     --------

From a3c8da18af46da0d0e32027dacb20501647b078a Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?J=C3=A9r=C3=A9mie=20du=20Boisberranger?=
 <34657725+jeremiedbb@users.noreply.github.com>
Date: Fri, 19 Jan 2024 13:01:11 +0100
Subject: [PATCH 096/554] MAINT Update SECURITY.md for 1.4.0 (#28182)

---
 SECURITY.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/SECURITY.md b/SECURITY.md
index 721f2041c2b85..3f291e7a566f8 100644
--- a/SECURITY.md
+++ b/SECURITY.md
@@ -4,8 +4,8 @@
 
 | Version   | Supported          |
 | --------- | ------------------ |
-| 1.3.2     | :white_check_mark: |
-| < 1.3.2   | :x:                |
+| 1.4.0     | :white_check_mark: |
+| < 1.4.0   | :x:                |
 
 ## Reporting a Vulnerability

From 5c7e831306e0a087c2b6af6913fa5b3c402f6d67 Mon Sep 17 00:00:00 2001
From: Guillaume Lemaitre
Date: Fri, 19 Jan 2024 13:58:02 +0100
Subject: [PATCH 097/554] DOC use list for the ridge_regression docstring (#28168)

---
 sklearn/linear_model/_ridge.py | 11 ++++++-----
 1 file changed, 6 insertions(+), 5 deletions(-)

diff --git a/sklearn/linear_model/_ridge.py b/sklearn/linear_model/_ridge.py
index c4f52c68e697e..5ce4a8c2fd3b8 100644
--- a/sklearn/linear_model/_ridge.py
+++ b/sklearn/linear_model/_ridge.py
@@ -552,14 +552,15 @@ def ridge_regression(
 
     Examples
     --------
+    >>> import numpy as np
     >>> from sklearn.datasets import make_regression
     >>> from sklearn.linear_model import ridge_regression
-    >>> X, y = make_regression(
-    ...     n_features=4, n_informative=2, shuffle=False, random_state=0
-    ... )
+    >>> rng = np.random.RandomState(0)
+    >>> X = rng.randn(100, 4)
+    >>> y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.1 * rng.standard_normal(100)
     >>> coef, intercept = ridge_regression(X, y, alpha=1.0, return_intercept=True)
-    >>> coef
-    array([20.2..., 33.7..., 0.1..., 0.0...])
+    >>> list(coef)
+    [1.97..., -1.00..., -0.0..., -0.0...]
     >>> intercept
     -0.0...

From 66a6551786c3d257a7b4f0b23a705f52f868c235 Mon Sep 17 00:00:00 2001
From: Andrei Dzis
Date: Fri, 19 Jan 2024 23:15:37 +0300
Subject: [PATCH 098/554] DOC Fix for roc_auc_score documentation (#28190)

---
 sklearn/metrics/_ranking.py | 4 ----
 1 file changed, 4 deletions(-)

diff --git a/sklearn/metrics/_ranking.py b/sklearn/metrics/_ranking.py
index a117a5427a996..4a960a2f4402a 100644
--- a/sklearn/metrics/_ranking.py
+++ b/sklearn/metrics/_ranking.py
@@ -549,10 +549,6 @@ class scores must correspond to the order of ``labels``,
     will ensure that random guessing will yield a score of 0 in expectation, and it is
     upper bounded by 1.
 
-    Note that there is another version of the Gini coefficient for regressors of a
-    continuous positive target variable.
In this case, AUC is taken over the Lorenz - curve instead of the ROC [6]_. - References ---------- .. [1] `Wikipedia entry for the Receiver operating characteristic From 2020648edfdbdeb4797465434ed4afd6e79ce2ed Mon Sep 17 00:00:00 2001 From: 101AlexMartin <101071686+101AlexMartin@users.noreply.github.com> Date: Sat, 20 Jan 2024 10:53:07 +0100 Subject: [PATCH 099/554] MNT changed order pre-commits hooks following ruff recommendation (#28062) Co-authored-by: Alejandro Martin --- .pre-commit-config.yaml | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml index abffbbe149f2c..506e3ab4fe64e 100644 --- a/.pre-commit-config.yaml +++ b/.pre-commit-config.yaml @@ -5,16 +5,16 @@ repos: - id: check-yaml - id: end-of-file-fixer - id: trailing-whitespace -- repo: https://github.com/psf/black - rev: 23.3.0 - hooks: - - id: black - repo: https://github.com/astral-sh/ruff-pre-commit # Ruff version. rev: v0.0.272 hooks: - id: ruff args: ["--fix", "--show-source"] +- repo: https://github.com/psf/black + rev: 23.3.0 + hooks: + - id: black - repo: https://github.com/pre-commit/mirrors-mypy rev: v1.3.0 hooks: From 6a1022353103cefb93258f503b087d821262a1b6 Mon Sep 17 00:00:00 2001 From: Rodrigo Romero <69991220+rromer07@users.noreply.github.com> Date: Sat, 20 Jan 2024 06:48:55 -0500 Subject: [PATCH 100/554] DOC add docstring example to `sklearn.metrics.consensus_score` (#28193) --- sklearn/metrics/cluster/_bicluster.py | 8 ++++++++ 1 file changed, 8 insertions(+) diff --git a/sklearn/metrics/cluster/_bicluster.py b/sklearn/metrics/cluster/_bicluster.py index b9ca47c9b91aa..713d0bee8fa2e 100644 --- a/sklearn/metrics/cluster/_bicluster.py +++ b/sklearn/metrics/cluster/_bicluster.py @@ -89,6 +89,14 @@ def consensus_score(a, b, *, similarity="jaccard"): * Hochreiter, Bodenhofer, et. al., 2010. `FABIA: factor analysis for bicluster acquisition `__. + + Examples + -------- + >>> from sklearn.metrics import consensus_score + >>> a = ([[True, False], [False, True]], [[False, True], [True, False]]) + >>> b = ([[False, True], [True, False]], [[True, False], [False, True]]) + >>> consensus_score(a, b, similarity='jaccard') + 1.0 """ if similarity == "jaccard": similarity = _jaccard From 836690a401057572ef7d3478a9a3aa78dfa1447b Mon Sep 17 00:00:00 2001 From: Rodrigo Romero <69991220+rromer07@users.noreply.github.com> Date: Sat, 20 Jan 2024 14:42:16 -0500 Subject: [PATCH 101/554] DOC add docstring example to `sklearn.metrics.coverage_error` (#28196) Co-authored-by: Guillaume Lemaitre --- sklearn/metrics/_ranking.py | 8 ++++++++ 1 file changed, 8 insertions(+) diff --git a/sklearn/metrics/_ranking.py b/sklearn/metrics/_ranking.py index 4a960a2f4402a..74ae6dcf04299 100644 --- a/sklearn/metrics/_ranking.py +++ b/sklearn/metrics/_ranking.py @@ -1300,6 +1300,14 @@ def coverage_error(y_true, y_score, *, sample_weight=None): .. [1] Tsoumakas, G., Katakis, I., & Vlahavas, I. (2010). Mining multi-label data. In Data mining and knowledge discovery handbook (pp. 667-685). Springer US. 
+
+    Examples
+    --------
+    >>> from sklearn.metrics import coverage_error
+    >>> y_true = [[1, 0, 0], [0, 1, 1]]
+    >>> y_score = [[1, 0, 0], [0, 1, 1]]
+    >>> coverage_error(y_true, y_score)
+    1.5
     """
     y_true = check_array(y_true, ensure_2d=True)
     y_score = check_array(y_score, ensure_2d=True)

From 897c0c570511be4b7912a335052ed479ac5ca1f3 Mon Sep 17 00:00:00 2001
From: Christian Lorentzen
Date: Sat, 20 Jan 2024 21:08:36 +0100
Subject: [PATCH 102/554] ENH improve HGBT predict classes (#27844)

Co-authored-by: Guillaume Lemaitre
---
 doc/whats_new/v1.4.rst | 4 ++++
 .../_hist_gradient_boosting/gradient_boosting.py | 16 +++++++++++++---
 2 files changed, 17 insertions(+), 3 deletions(-)

diff --git a/doc/whats_new/v1.4.rst b/doc/whats_new/v1.4.rst
index ee47bae7b1f5b..d832e4b508359 100644
--- a/doc/whats_new/v1.4.rst
+++ b/doc/whats_new/v1.4.rst
@@ -494,6 +494,10 @@ Changelog
   which allows to retrieve the training sample indices used for each tree estimator.
   :pr:`26736` by :user:`Adam Li `.

+- |Efficiency| Improves runtime of `predict` of
+  :class:`ensemble.HistGradientBoostingClassifier` by avoiding a call to `predict_proba`.
+  :pr:`27844` by :user:`Christian Lorentzen `.
+
 - |Fix| Fixes :class:`ensemble.IsolationForest` when the input is a sparse matrix
   and `contamination` is set to a float value.
   :pr:`27645` by :user:`Guillaume Lemaitre `.

diff --git a/sklearn/ensemble/_hist_gradient_boosting/gradient_boosting.py b/sklearn/ensemble/_hist_gradient_boosting/gradient_boosting.py
index 0837d19407030..698fd0629d02e 100644
--- a/sklearn/ensemble/_hist_gradient_boosting/gradient_boosting.py
+++ b/sklearn/ensemble/_hist_gradient_boosting/gradient_boosting.py
@@ -2137,7 +2137,13 @@ def predict(self, X):
             The predicted classes.
         """
         # TODO: This could be done in parallel
-        encoded_classes = np.argmax(self.predict_proba(X), axis=1)
+        raw_predictions = self._raw_predict(X)
+        if raw_predictions.shape[1] == 1:
+            # np.argmax([0.5, 0.5]) is 0, not 1. Therefore "> 0" not ">= 0" to be
+            # consistent with the multiclass case.
+            encoded_classes = (raw_predictions.ravel() > 0).astype(int)
+        else:
+            encoded_classes = np.argmax(raw_predictions, axis=1)
         return self.classes_[encoded_classes]

     def staged_predict(self, X):
@@ -2158,8 +2164,12 @@ def staged_predict(self, X):
         y : generator of ndarray of shape (n_samples,)
             The predicted classes of the input samples, for each iteration.
         """
-        for proba in self.staged_predict_proba(X):
-            encoded_classes = np.argmax(proba, axis=1)
+        for raw_predictions in self._staged_raw_predict(X):
+            if raw_predictions.shape[1] == 1:
+                # np.argmax([0, 0]) is 0, not 1, therefore "> 0" not ">= 0"
+                encoded_classes = (raw_predictions.ravel() > 0).astype(int)
+            else:
+                encoded_classes = np.argmax(raw_predictions, axis=1)
             yield self.classes_.take(encoded_classes, axis=0)

     def predict_proba(self, X):

From b4754ba7eeacf1519fb827392d99207d38011627 Mon Sep 17 00:00:00 2001
From: "Thomas J. Fan"
Date: Mon, 22 Jan 2024 02:31:13 -0500
Subject: [PATCH 103/554] ENH Checks pandas and polars directly (#28195)

---
 doc/whats_new/v1.4.rst      |  3 +++
 sklearn/utils/validation.py | 26 ++++++++++----------------
 2 files changed, 13 insertions(+), 16 deletions(-)

diff --git a/doc/whats_new/v1.4.rst b/doc/whats_new/v1.4.rst
index d832e4b508359..98bfcd2d96f54 100644
--- a/doc/whats_new/v1.4.rst
+++ b/doc/whats_new/v1.4.rst
@@ -24,6 +24,9 @@ Changelog
   larger than the number of non-duplicate samples.
   :pr:`28165` by :user:`Jérémie du Boisberranger `.

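A note on the binary branch introduced in the `predict` patch above: with a single
column of raw scores, the strict `> 0` test is what keeps a tie at zero consistent
with `np.argmax`, which resolves ties in favour of the first class. A minimal NumPy
sketch of that equivalence (illustrative only, not the estimator's actual code path):

    import numpy as np

    # One raw score per sample, as in the binary branch above.
    raw = np.array([[-1.2], [0.0], [0.7]])
    classes = (raw.ravel() > 0).astype(int)  # -> array([0, 0, 1])

    # A two-column view of the same scores: np.argmax resolves the tie at 0.0
    # towards column 0, which is why the shortcut uses "> 0" and not ">= 0".
    assert np.array_equal(classes, np.argmax(np.hstack([-raw, raw]), axis=1))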
+- |Enhancement| Pandas and Polars dataframes are validated directly without ducktyping
+  checks. :pr:`28195` by `Thomas Fan`_.
+
 .. _changes_1_4:

 Version 1.4.0

diff --git a/sklearn/utils/validation.py b/sklearn/utils/validation.py
index 6531a9da3404b..43f553eb2d2d5 100644
--- a/sklearn/utils/validation.py
+++ b/sklearn/utils/validation.py
@@ -2070,26 +2070,20 @@ def _check_method_params(X, params, indices=None):

 def _is_pandas_df(X):
     """Return True if the X is a pandas dataframe."""
-    if hasattr(X, "columns") and hasattr(X, "iloc"):
-        # Likely a pandas DataFrame, we explicitly check the type to confirm.
-        try:
-            pd = sys.modules["pandas"]
-        except KeyError:
-            return False
-        return isinstance(X, pd.DataFrame)
-    return False
+    try:
+        pd = sys.modules["pandas"]
+    except KeyError:
+        return False
+    return isinstance(X, pd.DataFrame)


 def _is_polars_df(X):
     """Return True if the X is a polars dataframe."""
-    if hasattr(X, "columns") and hasattr(X, "schema"):
-        # Likely a polars DataFrame, we explicitly check the type to confirm.
-        try:
-            pl = sys.modules["polars"]
-        except KeyError:
-            return False
-        return isinstance(X, pl.DataFrame)
-    return False
+    try:
+        pl = sys.modules["polars"]
+    except KeyError:
+        return False
+    return isinstance(X, pl.DataFrame)


 def _get_feature_names(X):

From 69cef4adc1d689828958328598712e8b2937971d Mon Sep 17 00:00:00 2001
From: Guillaume Lemaitre
Date: Mon, 22 Jan 2024 10:53:04 +0100
Subject: [PATCH 104/554] FIX _convert_container should be able to convert
 from sparse to sparse (#28185)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

Co-authored-by: Loïc Estève
---
 sklearn/utils/_testing.py           | 40 ++++++++++++++++-------------
 sklearn/utils/tests/test_testing.py | 29 +++++++++++++++++++++
 2 files changed, 51 insertions(+), 18 deletions(-)

diff --git a/sklearn/utils/_testing.py b/sklearn/utils/_testing.py
index b49622627c7ae..bb4da452712d2 100644
--- a/sklearn/utils/_testing.py
+++ b/sklearn/utils/_testing.py
@@ -775,8 +775,6 @@ def _convert_container(
         return tuple(np.asarray(container, dtype=dtype).tolist())
     elif constructor_name == "array":
         return np.asarray(container, dtype=dtype)
-    elif constructor_name == "sparse":
-        return sp.sparse.csr_matrix(np.atleast_2d(container), dtype=dtype)
     elif constructor_name in ("pandas", "dataframe"):
         pd = pytest.importorskip("pandas", minversion=minversion)
         result = pd.DataFrame(container, columns=columns_name, dtype=dtype, copy=False)
@@ -813,22 +811,28 @@ def _convert_container(
         return pd.Index(container, dtype=dtype)
     elif constructor_name == "slice":
         return slice(container[0], container[1])
-    elif constructor_name == "sparse_csr":
-        return sp.sparse.csr_matrix(np.atleast_2d(container), dtype=dtype)
-    elif constructor_name == "sparse_csr_array":
-        if sp_version >= parse_version("1.8"):
-            return sp.sparse.csr_array(np.atleast_2d(container), dtype=dtype)
-        raise ValueError(
-            f"sparse_csr_array is only available with scipy>=1.8.0, got {sp_version}"
-        )
-    elif constructor_name == "sparse_csc":
-        return sp.sparse.csc_matrix(np.atleast_2d(container), dtype=dtype)
-    elif constructor_name == "sparse_csc_array":
-        if sp_version >= parse_version("1.8"):
-            return sp.sparse.csc_array(np.atleast_2d(container), dtype=dtype)
-        raise ValueError(
-            f"sparse_csc_array is only available with scipy>=1.8.0, got {sp_version}"
-        )
+    elif "sparse" in constructor_name:
+        if not sp.sparse.issparse(container):
+            # For scipy >= 1.13, sparse array constructed from 1d array may be
+            # 1d or raise an exception. 
To avoid this, we make sure that the + # input container is 2d. For more details, see + # https://github.com/scipy/scipy/pull/18530#issuecomment-1878005149 + container = np.atleast_2d(container) + + if "array" in constructor_name and sp_version < parse_version("1.8"): + raise ValueError( + f"{constructor_name} is only available with scipy>=1.8.0, got " + f"{sp_version}" + ) + if constructor_name in ("sparse", "sparse_csr"): + # sparse and sparse_csr are equivalent for legacy reasons + return sp.sparse.csr_matrix(container, dtype=dtype) + elif constructor_name == "sparse_csr_array": + return sp.sparse.csr_array(container, dtype=dtype) + elif constructor_name == "sparse_csc": + return sp.sparse.csc_matrix(container, dtype=dtype) + elif constructor_name == "sparse_csc_array": + return sp.sparse.csc_array(container, dtype=dtype) def raises(expected_exc_type, match=None, may_pass=False, err_msg=None): diff --git a/sklearn/utils/tests/test_testing.py b/sklearn/utils/tests/test_testing.py index f24b4de928201..c6132afd0c1d4 100644 --- a/sklearn/utils/tests/test_testing.py +++ b/sklearn/utils/tests/test_testing.py @@ -845,3 +845,32 @@ def test_assert_run_python_script_without_output(): match="output was not supposed to match.+got.+something to stderr", ): assert_run_python_script_without_output(code, pattern="to.+stderr") + + +@pytest.mark.parametrize( + "constructor_name", + [ + "sparse_csr", + "sparse_csc", + pytest.param( + "sparse_csr_array", + marks=pytest.mark.skipif( + sp_version < parse_version("1.8"), + reason="sparse arrays are available as of scipy 1.8.0", + ), + ), + pytest.param( + "sparse_csc_array", + marks=pytest.mark.skipif( + sp_version < parse_version("1.8"), + reason="sparse arrays are available as of scipy 1.8.0", + ), + ), + ], +) +def test_convert_container_sparse_to_sparse(constructor_name): + """Non-regression test to check that we can still convert a sparse container + from a given format to another format. + """ + X_sparse = sparse.random(10, 10, density=0.1, format="csr") + _convert_container(X_sparse, constructor_name) From 1df773fe12d54beaed1136d7b040571e51f17205 Mon Sep 17 00:00:00 2001 From: Anderson Nelson Date: Mon, 22 Jan 2024 05:16:30 -0500 Subject: [PATCH 105/554] DOC Add docstring examples for covariance module (#28192) Co-authored-by: Guillaume Lemaitre --- sklearn/covariance/_shrunk_covariance.py | 37 ++++++++++++++++++++++++ 1 file changed, 37 insertions(+) diff --git a/sklearn/covariance/_shrunk_covariance.py b/sklearn/covariance/_shrunk_covariance.py index 3a79afa30729f..5df229260b03c 100644 --- a/sklearn/covariance/_shrunk_covariance.py +++ b/sklearn/covariance/_shrunk_covariance.py @@ -134,6 +134,18 @@ def shrunk_covariance(emp_cov, shrinkage=0.1): (1 - shrinkage) * cov + shrinkage * mu * np.identity(n_features) where `mu = trace(cov) / n_features`. 
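The convex combination quoted above, `(1 - shrinkage) * cov + shrinkage * mu *
np.identity(n_features)` with `mu = trace(cov) / n_features`, can be checked directly
against the public function; a short sketch using the same data as the docstring
example that follows:

    import numpy as np
    from sklearn.covariance import empirical_covariance, shrunk_covariance

    rng = np.random.RandomState(0)
    X = rng.multivariate_normal(mean=[0, 0], cov=[[0.8, 0.3], [0.3, 0.4]], size=500)

    emp_cov = empirical_covariance(X)
    shrinkage = 0.1  # the function's default
    mu = np.trace(emp_cov) / emp_cov.shape[0]
    by_hand = (1 - shrinkage) * emp_cov + shrinkage * mu * np.identity(2)
    np.testing.assert_allclose(shrunk_covariance(emp_cov, shrinkage=shrinkage), by_hand)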
+ + Examples + -------- + >>> import numpy as np + >>> from sklearn.datasets import make_gaussian_quantiles + >>> from sklearn.covariance import empirical_covariance, shrunk_covariance + >>> real_cov = np.array([[.8, .3], [.3, .4]]) + >>> rng = np.random.RandomState(0) + >>> X = rng.multivariate_normal(mean=[0, 0], cov=real_cov, size=500) + >>> shrunk_covariance(empirical_covariance(X)) + array([[0.73..., 0.25...], + [0.25..., 0.41...]]) """ emp_cov = check_array(emp_cov, allow_nd=True) n_features = emp_cov.shape[-1] @@ -316,6 +328,17 @@ def ledoit_wolf_shrinkage(X, assume_centered=False, block_size=1000): (1 - shrinkage) * cov + shrinkage * mu * np.identity(n_features) where mu = trace(cov) / n_features + + Examples + -------- + >>> import numpy as np + >>> from sklearn.covariance import ledoit_wolf_shrinkage + >>> real_cov = np.array([[.4, .2], [.2, .8]]) + >>> rng = np.random.RandomState(0) + >>> X = rng.multivariate_normal(mean=[0, 0], cov=real_cov, size=50) + >>> shrinkage_coefficient = ledoit_wolf_shrinkage(X) + >>> shrinkage_coefficient + 0.23... """ X = check_array(X) # for only one feature, the result is the same whatever the shrinkage @@ -419,6 +442,20 @@ def ledoit_wolf(X, *, assume_centered=False, block_size=1000): (1 - shrinkage) * cov + shrinkage * mu * np.identity(n_features) where mu = trace(cov) / n_features + + Examples + -------- + >>> import numpy as np + >>> from sklearn.covariance import empirical_covariance, ledoit_wolf + >>> real_cov = np.array([[.4, .2], [.2, .8]]) + >>> rng = np.random.RandomState(0) + >>> X = rng.multivariate_normal(mean=[0, 0], cov=real_cov, size=50) + >>> covariance, shrinkage = ledoit_wolf(X) + >>> covariance + array([[0.44..., 0.16...], + [0.16..., 0.80...]]) + >>> shrinkage + 0.23... """ estimator = LedoitWolf( assume_centered=assume_centered, From 55eb8900b44d62cf665444258adf4a3ae29926a1 Mon Sep 17 00:00:00 2001 From: Shubham <134207725+shubhamparmar1@users.noreply.github.com> Date: Mon, 22 Jan 2024 15:51:08 +0530 Subject: [PATCH 106/554] DOC Add a docstring examples for utils functions (#28181) Co-authored-by: Guillaume Lemaitre --- sklearn/utils/_estimator_html_repr.py | 7 ++++++ sklearn/utils/estimator_checks.py | 7 ++++++ sklearn/utils/extmath.py | 33 +++++++++++++++++++++++++-- 3 files changed, 45 insertions(+), 2 deletions(-) diff --git a/sklearn/utils/_estimator_html_repr.py b/sklearn/utils/_estimator_html_repr.py index dd51a8bbb71de..5e465234f516b 100644 --- a/sklearn/utils/_estimator_html_repr.py +++ b/sklearn/utils/_estimator_html_repr.py @@ -329,6 +329,13 @@ def estimator_html_repr(estimator): ------- html: str HTML representation of estimator. + + Examples + -------- + >>> from sklearn.utils._estimator_html_repr import estimator_html_repr + >>> from sklearn.linear_model import LogisticRegression + >>> estimator_html_repr(LogisticRegression()) + ' + + + + + + + + + + + + + + + From 21bc79201e4b2f57752bc7f148fa585993321216 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?J=C3=A9r=C3=A9mie=20du=20Boisberranger?= Date: Tue, 7 May 2024 13:49:31 +0200 Subject: [PATCH 521/554] DOC Mention the renaming of check_estimator_sparse_data in 1.5 changelog (#28968) --- doc/whats_new/v1.5.rst | 5 +++++ 1 file changed, 5 insertions(+) diff --git a/doc/whats_new/v1.5.rst b/doc/whats_new/v1.5.rst index ede5d5dcbf1ec..e50309a330e39 100644 --- a/doc/whats_new/v1.5.rst +++ b/doc/whats_new/v1.5.rst @@ -527,6 +527,11 @@ Changelog `axis=0` and supports indexing polars Series. :pr:`28521` by :user:`Yao Xiao `. 
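Since the HTML string returned by `estimator_html_repr` above is long, a compact way
to exercise it is to assert on its shape rather than print it; a hedged sketch (the
exact markup varies across versions):

    from sklearn.linear_model import LogisticRegression
    from sklearn.utils._estimator_html_repr import estimator_html_repr

    html = estimator_html_repr(LogisticRegression())
    # The snippet bundles its own CSS and embeds the estimator's repr.
    assert "<style>" in html and "LogisticRegression" in html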
+- |API| :func:`utils.estimator_checks.check_estimator_sparse_data` was split into two + functions: :func:`utils.estimator_checks.check_estimator_sparse_matrix` and + :func:`utils.estimator_checks.check_estimator_sparse_array`. + :pr:`27576` by :user:`Stefanie Senger `. + .. rubric:: Code and documentation contributors Thanks to everyone who has contributed to the maintenance and improvement of From 92a315c837611ca22f1b8607d212420bf018262d Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?J=C3=A9r=C3=A9mie=20du=20Boisberranger?= Date: Tue, 7 May 2024 13:50:46 +0200 Subject: [PATCH 522/554] DOC Update release docs (#28965) --- doc/developers/maintainer.rst | 19 +++++++++---------- 1 file changed, 9 insertions(+), 10 deletions(-) diff --git a/doc/developers/maintainer.rst b/doc/developers/maintainer.rst index e82a7993997b2..70d132d2af604 100644 --- a/doc/developers/maintainer.rst +++ b/doc/developers/maintainer.rst @@ -105,14 +105,13 @@ in the description of the Pull Request to track progress. This PR will be used to push commits related to the release as explained in :ref:`making_a_release`. -You can also create a second PR from main and targeting main to increment -the ``__version__`` variable in `sklearn/__init__.py` to increment the dev -version. This means while we're in the release candidate period, the latest -stable is two versions behind the main branch, instead of one. In this PR -targeting main you should also include a new file for the matching version -under the ``doc/whats_new/`` folder so PRs that target the next version can -contribute their changelog entries to this file in parallel to the release -process. +You can also create a second PR from main and targeting main to increment the +``__version__`` variable in `sklearn/__init__.py` and in `pyproject.toml` to increment +the dev version. This means while we're in the release candidate period, the latest +stable is two versions behind the main branch, instead of one. In this PR targeting +main you should also include a new file for the matching version under the +``doc/whats_new/`` folder so PRs that target the next version can contribute their +changelog entries to this file in parallel to the release process. Minor version release (also known as bug-fix release) ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ @@ -212,8 +211,8 @@ Making a release the old entries (two years or three releases are typically good enough) and to update the on-going development entry. -2. On the branch for releasing, update the version number in - ``sklearn/__init__.py``, the ``__version__``. +2. On the branch for releasing, update the version number in ``sklearn/__init__.py``, + the ``__version__`` variable, and in `pyproject.toml`. For major releases, please add a 0 at the end: `0.99.0` instead of `0.99`. From 1f364173b8affa02080f3019cadfe71f629029e1 Mon Sep 17 00:00:00 2001 From: Omar Salman Date: Tue, 7 May 2024 19:16:11 +0500 Subject: [PATCH 523/554] DOC updates for d2_log_loss_score (#28969) --- sklearn/metrics/_classification.py | 6 +++--- sklearn/metrics/tests/test_classification.py | 3 ++- 2 files changed, 5 insertions(+), 4 deletions(-) diff --git a/sklearn/metrics/_classification.py b/sklearn/metrics/_classification.py index 04894a4d7a7e7..b68f1593e317e 100644 --- a/sklearn/metrics/_classification.py +++ b/sklearn/metrics/_classification.py @@ -3277,10 +3277,10 @@ def d2_log_loss_score(y_true, y_pred, *, sample_weight=None, labels=None): :math:`D^2` score function, fraction of log loss explained. 
Best possible score is 1.0 and it can be negative (because the model can be
-    arbitrarily worse). A model that always uses the empirical mean of `y_true` as
-    constant prediction, disregarding the input features, gets a D^2 score of 0.0.
+    arbitrarily worse). A model that always predicts the per-class proportions
+    of `y_true`, disregarding the input features, gets a D^2 score of 0.0.

-    Read more in the :ref:`User Guide `.
+    Read more in the :ref:`User Guide `.

     .. versionadded:: 1.5

diff --git a/sklearn/metrics/tests/test_classification.py b/sklearn/metrics/tests/test_classification.py
index 40b762bfa7308..b87e76ba2fb42 100644
--- a/sklearn/metrics/tests/test_classification.py
+++ b/sklearn/metrics/tests/test_classification.py
@@ -3048,7 +3048,8 @@ def test_d2_log_loss_score():


 def test_d2_log_loss_score_raises():
-    """Test that d2_log_loss raises error on invalid input."""
+    """Test that d2_log_loss_score raises the appropriate errors on
+    invalid inputs."""
     y_true = [0, 1, 2]
     y_pred = [[0.2, 0.8], [0.5, 0.5], [0.4, 0.6]]
     err = "contain different number of classes"

From 21e8a2486e9d6accf2a44191052f53878f172194 Mon Sep 17 00:00:00 2001
From: Abdulaziz Aloqeely <52792999+Aloqeely@users.noreply.github.com>
Date: Fri, 10 May 2024 01:11:16 +0300
Subject: [PATCH 524/554] Update supported python versions in docs (#28986)

---
 doc/install.rst | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/doc/install.rst b/doc/install.rst
index c4a3548016021..89851171f4588 100644
--- a/doc/install.rst
+++ b/doc/install.rst
@@ -166,7 +166,8 @@ purpose.
     Scikit-learn 0.22 supported Python 3.5-3.8.
     Scikit-learn 0.23 - 0.24 require Python 3.6 or newer.
     Scikit-learn 1.0 supported Python 3.7-3.10.
-    Scikit-learn 1.1 and later requires Python 3.8 or newer.
+    Scikit-learn 1.1, 1.2 and 1.3 support Python 3.8-3.12.
+    Scikit-learn 1.4 requires Python 3.9 or newer.

From f53fd43f180d8ecc4d1711bc02a3a7934bcb30a3 Mon Sep 17 00:00:00 2001
From: Conrad Stevens
Date: Sun, 12 May 2024 07:29:55 +1000
Subject: [PATCH 525/554] DOC fix gp predic doc typo (#28987)

Co-authored-by: Conrad
---
 sklearn/gaussian_process/_gpr.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/sklearn/gaussian_process/_gpr.py b/sklearn/gaussian_process/_gpr.py
index 67bba2e29c857..829c1e2fad2d8 100644
--- a/sklearn/gaussian_process/_gpr.py
+++ b/sklearn/gaussian_process/_gpr.py
@@ -384,7 +384,7 @@ def predict(self, X, return_std=False, return_cov=False):
         Returns
         -------
         y_mean : ndarray of shape (n_samples,) or (n_samples, n_targets)
-            Mean of predictive distribution a query points.
+            Mean of predictive distribution at query points.

         y_std : ndarray of shape (n_samples,) or (n_samples, n_targets), optional
             Standard deviation of predictive distribution at query points.
@@ -392,7 +392,7 @@ def predict(self, X, return_std=False, return_cov=False):

         y_cov : ndarray of shape (n_samples, n_samples) or \
                 (n_samples, n_samples, n_targets), optional
-            Covariance of joint predictive distribution a query points.
+            Covariance of joint predictive distribution at query points.
             Only returned when `return_cov` is True.
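Back on the `d2_log_loss_score` wording change earlier in this group: the new
sentence can be demonstrated numerically, since predicting the observed class
proportions for every sample reproduces the null model. A hedged sketch (values
chosen purely for illustration):

    from sklearn.metrics import d2_log_loss_score

    y_true = [0, 0, 1, 1]
    # The per-class proportions are [0.5, 0.5]; predicting them everywhere
    # matches the null model, hence a D^2 of exactly 0.0.
    print(d2_log_loss_score(y_true, [[0.5, 0.5]] * 4))  # 0.0
    # Confident, correct predictions push the score towards 1.0.
    print(d2_log_loss_score(y_true, [[0.9, 0.1]] * 2 + [[0.1, 0.9]] * 2))  # ~0.85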
""" if return_std and return_cov: From a660d89038434b9fd10a775baf86865e2f9c7310 Mon Sep 17 00:00:00 2001 From: Nathan Goldbaum Date: Mon, 13 May 2024 01:53:27 -0600 Subject: [PATCH 526/554] MAINT: specify C17 as C standard in meson.build (#28980) --- meson.build | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/meson.build b/meson.build index 3835a5099abb0..52c7deb962277 100644 --- a/meson.build +++ b/meson.build @@ -6,7 +6,7 @@ project( meson_version: '>= 1.1.0', default_options: [ 'buildtype=debugoptimized', - 'c_std=c99', + 'c_std=c17', 'cpp_std=c++14', ], ) From 47476144a4063813442eaf25908747e1a9c2dcc7 Mon Sep 17 00:00:00 2001 From: Christian Lorentzen Date: Mon, 13 May 2024 09:58:57 +0200 Subject: [PATCH 527/554] MNT remove author and license in GLM files (#28799) --- .../plot_poisson_regression_non_normal_loss.py | 7 ++----- .../plot_tweedie_regression_insurance_claims.py | 7 ++----- sklearn/linear_model/_glm/__init__.py | 4 ++-- sklearn/linear_model/_glm/_newton_solver.py | 5 ++--- sklearn/linear_model/_glm/glm.py | 6 ++---- sklearn/linear_model/_glm/tests/__init__.py | 3 ++- sklearn/linear_model/_glm/tests/test_glm.py | 6 ++---- 7 files changed, 14 insertions(+), 24 deletions(-) diff --git a/examples/linear_model/plot_poisson_regression_non_normal_loss.py b/examples/linear_model/plot_poisson_regression_non_normal_loss.py index 2a80c3db0ff40..180ee3b70671c 100644 --- a/examples/linear_model/plot_poisson_regression_non_normal_loss.py +++ b/examples/linear_model/plot_poisson_regression_non_normal_loss.py @@ -1,3 +1,5 @@ +# Authors: The scikit-learn developers +# SPDX-License-Identifier: BSD-3-Clause """ ====================================== Poisson regression and non-normal loss @@ -36,11 +38,6 @@ """ -# Authors: Christian Lorentzen -# Roman Yurchak -# Olivier Grisel -# License: BSD 3 clause - import matplotlib.pyplot as plt import numpy as np import pandas as pd diff --git a/examples/linear_model/plot_tweedie_regression_insurance_claims.py b/examples/linear_model/plot_tweedie_regression_insurance_claims.py index 96e32ee031190..31a91fb37c766 100644 --- a/examples/linear_model/plot_tweedie_regression_insurance_claims.py +++ b/examples/linear_model/plot_tweedie_regression_insurance_claims.py @@ -1,3 +1,5 @@ +# Authors: The scikit-learn developers +# SPDX-License-Identifier: BSD-3-Clause """ ====================================== Tweedie regression on insurance claims @@ -37,11 +39,6 @@ `_ """ -# Authors: Christian Lorentzen -# Roman Yurchak -# Olivier Grisel -# License: BSD 3 clause - # %% from functools import partial diff --git a/sklearn/linear_model/_glm/__init__.py b/sklearn/linear_model/_glm/__init__.py index 1b82bbd77bcf9..199b938b023d0 100644 --- a/sklearn/linear_model/_glm/__init__.py +++ b/sklearn/linear_model/_glm/__init__.py @@ -1,5 +1,5 @@ -# License: BSD 3 clause - +# Authors: The scikit-learn developers +# SPDX-License-Identifier: BSD-3-Clause from .glm import ( GammaRegressor, PoissonRegressor, diff --git a/sklearn/linear_model/_glm/_newton_solver.py b/sklearn/linear_model/_glm/_newton_solver.py index 20df35e6b48c2..b2be604d931c5 100644 --- a/sklearn/linear_model/_glm/_newton_solver.py +++ b/sklearn/linear_model/_glm/_newton_solver.py @@ -1,10 +1,9 @@ +# Authors: The scikit-learn developers +# SPDX-License-Identifier: BSD-3-Clause """ Newton solver for Generalized Linear Models """ -# Author: Christian Lorentzen -# License: BSD 3 clause - import warnings from abc import ABC, abstractmethod diff --git a/sklearn/linear_model/_glm/glm.py 
b/sklearn/linear_model/_glm/glm.py index 4cac889a4da51..14caa4fd733c2 100644 --- a/sklearn/linear_model/_glm/glm.py +++ b/sklearn/linear_model/_glm/glm.py @@ -1,11 +1,9 @@ +# Authors: The scikit-learn developers +# SPDX-License-Identifier: BSD-3-Clause """ Generalized Linear Models with Exponential Dispersion Family """ -# Author: Christian Lorentzen -# some parts and tricks stolen from other sklearn files. -# License: BSD 3 clause - from numbers import Integral, Real import numpy as np diff --git a/sklearn/linear_model/_glm/tests/__init__.py b/sklearn/linear_model/_glm/tests/__init__.py index 588cf7e93eef0..67dd18fb94b59 100644 --- a/sklearn/linear_model/_glm/tests/__init__.py +++ b/sklearn/linear_model/_glm/tests/__init__.py @@ -1 +1,2 @@ -# License: BSD 3 clause +# Authors: The scikit-learn developers +# SPDX-License-Identifier: BSD-3-Clause diff --git a/sklearn/linear_model/_glm/tests/test_glm.py b/sklearn/linear_model/_glm/tests/test_glm.py index 26f6bdc08d254..7f6ec64c15ad4 100644 --- a/sklearn/linear_model/_glm/tests/test_glm.py +++ b/sklearn/linear_model/_glm/tests/test_glm.py @@ -1,7 +1,5 @@ -# Authors: Christian Lorentzen -# -# License: BSD 3 clause - +# Authors: The scikit-learn developers +# SPDX-License-Identifier: BSD-3-Clause import itertools import warnings from functools import partial From 431f158e4f147075d4ecfc5c4239953ed267d66d Mon Sep 17 00:00:00 2001 From: Lucy Liu Date: Mon, 13 May 2024 18:49:23 +1000 Subject: [PATCH 528/554] DOC Update warm start example in ensemble user guide (#28998) --- doc/modules/ensemble.rst | 17 ++++++++++++++++- 1 file changed, 16 insertions(+), 1 deletion(-) diff --git a/doc/modules/ensemble.rst b/doc/modules/ensemble.rst index 4237d023973f7..d18dd2f65009e 100644 --- a/doc/modules/ensemble.rst +++ b/doc/modules/ensemble.rst @@ -603,7 +603,22 @@ fitted model. :: - >>> _ = est.set_params(n_estimators=200, warm_start=True) # set warm_start and new nr of trees + >>> import numpy as np + >>> from sklearn.metrics import mean_squared_error + >>> from sklearn.datasets import make_friedman1 + >>> from sklearn.ensemble import GradientBoostingRegressor + + >>> X, y = make_friedman1(n_samples=1200, random_state=0, noise=1.0) + >>> X_train, X_test = X[:200], X[200:] + >>> y_train, y_test = y[:200], y[200:] + >>> est = GradientBoostingRegressor( + ... n_estimators=100, learning_rate=0.1, max_depth=1, random_state=0, + ... loss='squared_error' + ... ) + >>> est = est.fit(X_train, y_train) # fit with 100 trees + >>> mean_squared_error(y_test, est.predict(X_test)) + 5.00... + >>> _ = est.set_params(n_estimators=200, warm_start=True) # set warm_start and increase num of trees >>> _ = est.fit(X_train, y_train) # fit additional 100 trees to est >>> mean_squared_error(y_test, est.predict(X_test)) 3.84... From 6ef4ef9aaed7f8d52ea06612e2cb6a88066c0a4a Mon Sep 17 00:00:00 2001 From: Christian Veenhuis <124370897+ChVeen@users.noreply.github.com> Date: Mon, 13 May 2024 11:43:52 +0200 Subject: [PATCH 529/554] MAINT fix redirected link for `Matthews Correlation Coefficient` (#28991) --- sklearn/metrics/_classification.py | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/sklearn/metrics/_classification.py b/sklearn/metrics/_classification.py index b68f1593e317e..1fb4c1d694be0 100644 --- a/sklearn/metrics/_classification.py +++ b/sklearn/metrics/_classification.py @@ -967,8 +967,8 @@ def matthews_corrcoef(y_true, y_pred, *, sample_weight=None): accuracy of prediction algorithms for classification: an overview. <10.1093/bioinformatics/16.5.412>` - .. 
[2] `Wikipedia entry for the Matthews Correlation Coefficient
-       `_.
+    .. [2] `Wikipedia entry for the Matthews Correlation Coefficient (phi coefficient)
+       `_.

     .. [3] `Gorodkin, (2004). Comparing two K-category assignments by a
        K-category correlation coefficient

From 0b380137919cbc572d8aac096dae9e8a79627dd0 Mon Sep 17 00:00:00 2001
From: Ivan Wiryadi <44887783+strivn@users.noreply.github.com>
Date: Mon, 13 May 2024 17:01:45 +0700
Subject: [PATCH 530/554] DOC Add links to digit denoising examples in docs
 and the user guide (#28929)

---
 doc/modules/decomposition.rst        | 2 ++
 sklearn/decomposition/_kernel_pca.py | 8 ++++++--
 2 files changed, 8 insertions(+), 2 deletions(-)

diff --git a/doc/modules/decomposition.rst b/doc/modules/decomposition.rst
index e8241a92cfc3b..e34818a322c7d 100644
--- a/doc/modules/decomposition.rst
+++ b/doc/modules/decomposition.rst
@@ -291,6 +291,8 @@ prediction (kernel dependency estimation). :class:`KernelPCA` supports both
 .. topic:: Examples:

     * :ref:`sphx_glr_auto_examples_decomposition_plot_kernel_pca.py`
+    * :ref:`sphx_glr_auto_examples_applications_plot_digits_denoising.py`
+

 .. topic:: References:

diff --git a/sklearn/decomposition/_kernel_pca.py b/sklearn/decomposition/_kernel_pca.py
index edfd49c2e87a0..0f45bc7c9239c 100644
--- a/sklearn/decomposition/_kernel_pca.py
+++ b/sklearn/decomposition/_kernel_pca.py
@@ -30,7 +30,7 @@

 class KernelPCA(ClassNamePrefixFeaturesOutMixin, TransformerMixin, BaseEstimator):
-    """Kernel Principal component analysis (KPCA) [1]_.
+    """Kernel Principal Component Analysis (KPCA) [1]_.

     Non-linear dimensionality reduction through the use of kernels (see
     :ref:`metrics`).
@@ -41,9 +41,13 @@ class KernelPCA(ClassNamePrefixFeaturesOutMixin, TransformerMixin, BaseEstimator
     components to extract. It can also use a randomized truncated SVD by the
     method proposed in [3]_, see `eigen_solver`.

-    For a usage example, see
+    For a usage example and comparison between
+    Principal Component Analysis (PCA) and its kernelized version (KPCA), see
     :ref:`sphx_glr_auto_examples_decomposition_plot_kernel_pca.py`.

+    For a usage example of denoising images using KPCA, see
+    :ref:`sphx_glr_auto_examples_applications_plot_digits_denoising.py`.
+
     Read more in the :ref:`User Guide `.

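As a concrete companion to the usage notes above, a small sketch of the classic
concentric-circles case where KPCA with an RBF kernel succeeds and linear PCA cannot;
the parameter values here are illustrative, not prescriptive:

    from sklearn.datasets import make_circles
    from sklearn.decomposition import KernelPCA

    X, _ = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

    # fit_inverse_transform=True enables reconstruction back to input space,
    # which is what the denoising example linked above relies on.
    kpca = KernelPCA(n_components=2, kernel="rbf", gamma=10, fit_inverse_transform=True)
    X_kpca = kpca.fit_transform(X)
    X_back = kpca.inverse_transform(X_kpca)
    print(X_kpca.shape, X_back.shape)  # (200, 2) (200, 2)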
Parameters From 39191c93d389acdddab651d1d1df096a2f58477d Mon Sep 17 00:00:00 2001 From: scikit-learn-bot Date: Mon, 13 May 2024 12:22:35 +0200 Subject: [PATCH 531/554] :lock: :robot: CI Update lock files for cirrus-arm CI build(s) :lock: :robot: (#29003) --- .../pymin_conda_forge_linux-aarch64_conda.lock | 14 +++++++------- 1 file changed, 7 insertions(+), 7 deletions(-) diff --git a/build_tools/cirrus/pymin_conda_forge_linux-aarch64_conda.lock b/build_tools/cirrus/pymin_conda_forge_linux-aarch64_conda.lock index 585a75c078d8c..660bc9de9ecda 100644 --- a/build_tools/cirrus/pymin_conda_forge_linux-aarch64_conda.lock +++ b/build_tools/cirrus/pymin_conda_forge_linux-aarch64_conda.lock @@ -4,25 +4,25 @@ @EXPLICIT https://conda.anaconda.org/conda-forge/linux-aarch64/ca-certificates-2024.2.2-hcefe29a_0.conda#57c226edb90c4e973b9b7503537dd339 https://conda.anaconda.org/conda-forge/linux-aarch64/ld_impl_linux-aarch64-2.40-hba4e955_0.conda#b55c1cb33c63d23b542fa53f24541e56 -https://conda.anaconda.org/conda-forge/linux-aarch64/libstdcxx-ng-13.2.0-h3f4de04_6.conda#dfe2ae16945dc08f163307a6bb3e70e0 +https://conda.anaconda.org/conda-forge/linux-aarch64/libstdcxx-ng-13.2.0-h3f4de04_7.conda#2a54872c7fab2db99b0074212d8efe64 https://conda.anaconda.org/conda-forge/linux-aarch64/python_abi-3.9-4_cp39.conda#c191905a08694e4a5cb1238e90233878 https://conda.anaconda.org/conda-forge/noarch/tzdata-2024a-h0c530f3_0.conda#161081fc7cec0bfda0d86d7cb595f8d8 https://conda.anaconda.org/conda-forge/linux-aarch64/_openmp_mutex-4.5-2_kmp_llvm.tar.bz2#98a1185182fec3c434069fa74e6473d6 -https://conda.anaconda.org/conda-forge/linux-aarch64/libgcc-ng-13.2.0-he277a41_6.conda#5ca8651e635390d41004c847f03c2d3c +https://conda.anaconda.org/conda-forge/linux-aarch64/libgcc-ng-13.2.0-he277a41_7.conda#01c5b27ce46f50abab2dc8454842c792 https://conda.anaconda.org/conda-forge/linux-aarch64/bzip2-1.0.8-h31becfc_5.conda#a64e35f01e0b7a2a152eca87d33b9c87 https://conda.anaconda.org/conda-forge/linux-aarch64/lerc-4.0.0-h4de3ea5_0.tar.bz2#1a0ffc65e03ce81559dbcb0695ad1476 https://conda.anaconda.org/conda-forge/linux-aarch64/libbrotlicommon-1.1.0-h31becfc_1.conda#1b219fd801eddb7a94df5bd001053ad9 https://conda.anaconda.org/conda-forge/linux-aarch64/libdeflate-1.20-h31becfc_0.conda#018592a3d691662f451f89d0de474a20 https://conda.anaconda.org/conda-forge/linux-aarch64/libffi-3.4.2-h3557bc0_5.tar.bz2#dddd85f4d52121fab0a8b099c5e06501 -https://conda.anaconda.org/conda-forge/linux-aarch64/libgfortran5-13.2.0-h87d9d71_6.conda#a3fdb6378e561e73c735ec30207daa15 +https://conda.anaconda.org/conda-forge/linux-aarch64/libgfortran5-13.2.0-h87d9d71_7.conda#423eb7de085dd6b46928723edf5f8767 https://conda.anaconda.org/conda-forge/linux-aarch64/libjpeg-turbo-3.0.0-h31becfc_1.conda#ed24e702928be089d9ba3f05618515c6 https://conda.anaconda.org/conda-forge/linux-aarch64/libnsl-2.0.1-h31becfc_0.conda#c14f32510f694e3185704d89967ec422 https://conda.anaconda.org/conda-forge/linux-aarch64/libuuid-2.38.1-hb4cce97_0.conda#000e30b09db0b7c775b21695dff30969 https://conda.anaconda.org/conda-forge/linux-aarch64/libwebp-base-1.4.0-h31becfc_0.conda#5fd7ab3e5f382c70607fbac6335e6e19 https://conda.anaconda.org/conda-forge/linux-aarch64/libxcrypt-4.4.36-h31becfc_1.conda#b4df5d7d4b63579d081fd3a4cf99740e https://conda.anaconda.org/conda-forge/linux-aarch64/libzlib-1.2.13-h31becfc_5.conda#b213aa87eea9491ef7b129179322e955 -https://conda.anaconda.org/conda-forge/linux-aarch64/ncurses-6.4.20240210-h0425590_0.conda#c1a1612ddaee95c83abfa0b2ec858626 
-https://conda.anaconda.org/conda-forge/linux-aarch64/ninja-1.12.0-h2a328a1_0.conda#c0f3f508baf69c8db8142466beaa0ccc +https://conda.anaconda.org/conda-forge/linux-aarch64/ncurses-6.5-h0425590_0.conda#38362af7bfac0efef69675acee564458 +https://conda.anaconda.org/conda-forge/linux-aarch64/ninja-1.12.1-h70be974_0.conda#216635cea46498d8045c7cf0f03eaf72 https://conda.anaconda.org/conda-forge/linux-aarch64/openssl-3.3.0-h31becfc_0.conda#36ca60a3afaf2ea2c460daeebd67430e https://conda.anaconda.org/conda-forge/linux-aarch64/pthread-stubs-0.4-hb9de7d4_1001.tar.bz2#d0183ec6ce0b5aaa3486df25fa5f0ded https://conda.anaconda.org/conda-forge/linux-aarch64/xorg-libxau-1.0.11-h31becfc_0.conda#13de34f69cb73165dbe08c1e9148bedb @@ -30,7 +30,7 @@ https://conda.anaconda.org/conda-forge/linux-aarch64/xorg-libxdmcp-1.1.3-h3557bc https://conda.anaconda.org/conda-forge/linux-aarch64/xz-5.2.6-h9cdd2b7_0.tar.bz2#83baad393a31d59c20b63ba4da6592df https://conda.anaconda.org/conda-forge/linux-aarch64/libbrotlidec-1.1.0-h31becfc_1.conda#8db7cff89510bec0b863a0a8ee6a7bce https://conda.anaconda.org/conda-forge/linux-aarch64/libbrotlienc-1.1.0-h31becfc_1.conda#ad3d3a826b5848d99936e4466ebbaa26 -https://conda.anaconda.org/conda-forge/linux-aarch64/libgfortran-ng-13.2.0-he9431aa_6.conda#c8ab19934c000ea8cc9cf1fc6c2aa83d +https://conda.anaconda.org/conda-forge/linux-aarch64/libgfortran-ng-13.2.0-he9431aa_7.conda#d714db6ba9d67d55d21cf96316714ec8 https://conda.anaconda.org/conda-forge/linux-aarch64/libpng-1.6.43-h194ca79_0.conda#1123e504d9254dd9494267ab9aba95f0 https://conda.anaconda.org/conda-forge/linux-aarch64/libsqlite-3.45.3-h194ca79_0.conda#fb35b8afbe9e92467ac7b5608d60b775 https://conda.anaconda.org/conda-forge/linux-aarch64/libxcb-1.15-h2a766a3_0.conda#eb3d8c8170e3d03f2564ed2024aa00c8 @@ -42,7 +42,7 @@ https://conda.anaconda.org/conda-forge/linux-aarch64/freetype-2.12.1-hf0a5ef3_2. 
https://conda.anaconda.org/conda-forge/linux-aarch64/libhiredis-1.0.2-h05efe27_0.tar.bz2#a87f068744fd20334cd41489eb163bee https://conda.anaconda.org/conda-forge/linux-aarch64/libopenblas-0.3.27-pthreads_h5a5ec62_0.conda#ffecca8f4f31cd50b92c0e6e6bfe4416 https://conda.anaconda.org/conda-forge/linux-aarch64/libtiff-4.6.0-hf980d43_3.conda#b6f3abf5726ae33094bee238b4eb492f -https://conda.anaconda.org/conda-forge/linux-aarch64/llvm-openmp-18.1.4-h767c9be_0.conda#2572130272fb725d825c9b52e5ce096b +https://conda.anaconda.org/conda-forge/linux-aarch64/llvm-openmp-18.1.5-h767c9be_0.conda#a9c2771c36671707f1992e4d0c32aa54 https://conda.anaconda.org/conda-forge/linux-aarch64/python-3.9.19-h4ac3b42_0_cpython.conda#1501507cd9451472ec8900d587ce872f https://conda.anaconda.org/conda-forge/linux-aarch64/brotli-1.1.0-h31becfc_1.conda#e41f5862ac746428407f3fd44d2ed01f https://conda.anaconda.org/conda-forge/linux-aarch64/ccache-4.9.1-h6552966_0.conda#758b202f61f6bbfd2c6adf0fde043276 From cad4b59e2b5cd02acbc2d58a2d2655b8cf265c23 Mon Sep 17 00:00:00 2001 From: scikit-learn-bot Date: Mon, 13 May 2024 12:23:28 +0200 Subject: [PATCH 532/554] :lock: :robot: CI Update lock files for main CI build(s) :lock: :robot: (#29005) --- ...latest_conda_forge_mkl_linux-64_conda.lock | 26 ++++----- ...pylatest_conda_forge_mkl_osx-64_conda.lock | 38 ++++++------ ...test_conda_mkl_no_openmp_osx-64_conda.lock | 10 ++-- ...st_pip_openblas_pandas_linux-64_conda.lock | 10 ++-- ...onda_defaults_openblas_linux-64_conda.lock | 14 ++--- .../pymin_conda_forge_mkl_win-64_conda.lock | 8 +-- ...e_openblas_ubuntu_2204_linux-64_conda.lock | 26 ++++----- build_tools/circle/doc_linux-64_conda.lock | 58 +++++++++---------- .../doc_min_dependencies_linux-64_conda.lock | 50 ++++++++-------- 9 files changed, 120 insertions(+), 120 deletions(-) diff --git a/build_tools/azure/pylatest_conda_forge_mkl_linux-64_conda.lock b/build_tools/azure/pylatest_conda_forge_mkl_linux-64_conda.lock index 932fc6ad670f7..3d895fda71bc3 100644 --- a/build_tools/azure/pylatest_conda_forge_mkl_linux-64_conda.lock +++ b/build_tools/azure/pylatest_conda_forge_mkl_linux-64_conda.lock @@ -9,13 +9,13 @@ https://conda.anaconda.org/conda-forge/noarch/font-ttf-inconsolata-3.000-h77eed3 https://conda.anaconda.org/conda-forge/noarch/font-ttf-source-code-pro-2.038-h77eed37_0.tar.bz2#4d59c254e01d9cde7957100457e2d5fb https://conda.anaconda.org/conda-forge/noarch/font-ttf-ubuntu-0.83-h77eed37_2.conda#cbbe59391138ea5ad3658c76912e147f https://conda.anaconda.org/conda-forge/linux-64/ld_impl_linux-64-2.40-h55db66e_0.conda#10569984e7db886e4f1abc2b47ad79a1 -https://conda.anaconda.org/conda-forge/linux-64/libstdcxx-ng-13.2.0-hc0a3c3a_6.conda#2f18345bbc433c8a1ed887d7161e86a6 +https://conda.anaconda.org/conda-forge/linux-64/libstdcxx-ng-13.2.0-hc0a3c3a_7.conda#53ebd4c833fa01cb2c6353e99f905406 https://conda.anaconda.org/conda-forge/linux-64/python_abi-3.11-4_cp311.conda#d786502c97404c94d7d58d258a445a65 https://conda.anaconda.org/conda-forge/noarch/tzdata-2024a-h0c530f3_0.conda#161081fc7cec0bfda0d86d7cb595f8d8 https://conda.anaconda.org/conda-forge/noarch/fonts-conda-forge-1-0.tar.bz2#f766549260d6815b0c52253f1fb1bb29 https://conda.anaconda.org/conda-forge/noarch/fonts-conda-ecosystem-1-0.tar.bz2#fee5683a3f04bd15cbd8318b096a27ab https://conda.anaconda.org/conda-forge/linux-64/_openmp_mutex-4.5-2_kmp_llvm.tar.bz2#562b26ba2e19059551a811e72ab7f793 -https://conda.anaconda.org/conda-forge/linux-64/libgcc-ng-13.2.0-h77fa898_6.conda#4398809ac84d0b8c28beebaaa83277f5 
+https://conda.anaconda.org/conda-forge/linux-64/libgcc-ng-13.2.0-h77fa898_7.conda#72ec1b1b04c4d15d4204ece1ecea5978 https://conda.anaconda.org/conda-forge/linux-64/alsa-lib-1.2.11-hd590300_1.conda#0bb492cca54017ea314b809b1ee3a176 https://conda.anaconda.org/conda-forge/linux-64/attr-2.5.1-h166bdaf_1.tar.bz2#d9c69a24ad678ffce24c6543a0176b00 https://conda.anaconda.org/conda-forge/linux-64/aws-c-common-0.9.0-hd590300_0.conda#71b89db63b5b504e7afc8ad901172e1e @@ -37,7 +37,7 @@ https://conda.anaconda.org/conda-forge/linux-64/libev-4.33-hd590300_2.conda#172b https://conda.anaconda.org/conda-forge/linux-64/libexpat-2.6.2-h59595ed_0.conda#e7ba12deb7020dd080c6c70e7b6f6a3d https://conda.anaconda.org/conda-forge/linux-64/libffi-3.4.2-h7f98852_5.tar.bz2#d645c6d2ac96843a2bfaccd2d62b3ac3 https://conda.anaconda.org/conda-forge/linux-64/libgettextpo-0.22.5-h59595ed_2.conda#172bcc51059416e7ce99e7b528cede83 -https://conda.anaconda.org/conda-forge/linux-64/libgfortran5-13.2.0-h43f5ff8_6.conda#e54a5ddc67e673f9105cf2a2e9c070b0 +https://conda.anaconda.org/conda-forge/linux-64/libgfortran5-13.2.0-hca663fb_7.conda#c0bd771f09a326fdcd95a60b617795bf https://conda.anaconda.org/conda-forge/linux-64/libiconv-1.17-hd590300_2.conda#d66573916ffcf376178462f1b61c941e https://conda.anaconda.org/conda-forge/linux-64/libjpeg-turbo-3.0.0-hd590300_1.conda#ea25936bb4080d843790b586850f82b8 https://conda.anaconda.org/conda-forge/linux-64/libnsl-2.0.1-hd590300_0.conda#30fd6e37fe21f86f4bd26d6ee73eeec7 @@ -51,8 +51,8 @@ https://conda.anaconda.org/conda-forge/linux-64/libxcrypt-4.4.36-hd590300_1.cond https://conda.anaconda.org/conda-forge/linux-64/libzlib-1.2.13-hd590300_5.conda#f36c115f1ee199da648e0597ec2047ad https://conda.anaconda.org/conda-forge/linux-64/lz4-c-1.9.4-hcb278e6_0.conda#318b08df404f9c9be5712aaa5a6f0bb0 https://conda.anaconda.org/conda-forge/linux-64/mpg123-1.32.6-h59595ed_0.conda#9160cdeb523a1b20cf8d2a0bf821f45d -https://conda.anaconda.org/conda-forge/linux-64/ncurses-6.4.20240210-h59595ed_0.conda#97da8860a0da5413c7c98a3b3838a645 -https://conda.anaconda.org/conda-forge/linux-64/ninja-1.12.0-h00ab1b0_0.conda#b048701d52e7cbb5f59ddd4d3b17bbf5 +https://conda.anaconda.org/conda-forge/linux-64/ncurses-6.5-h59595ed_0.conda#fcea371545eda051b6deafb24889fc69 +https://conda.anaconda.org/conda-forge/linux-64/ninja-1.12.1-h297d8ca_0.conda#3aa1c7e292afeff25a0091ddd7c69b72 https://conda.anaconda.org/conda-forge/linux-64/nspr-4.35-h27087fc_0.conda#da0ec11a6454ae19bff5b02ed881a2b1 https://conda.anaconda.org/conda-forge/linux-64/openssl-3.3.0-hd590300_0.conda#c0f3abb4a16477208bbd43a39bd56f18 https://conda.anaconda.org/conda-forge/linux-64/pixman-0.43.2-h59595ed_0.conda#71004cbf7924e19c02746ccde9fd7123 @@ -83,7 +83,7 @@ https://conda.anaconda.org/conda-forge/linux-64/libcap-2.69-h0f662aa_0.conda#25c https://conda.anaconda.org/conda-forge/linux-64/libedit-3.1.20191231-he28a2e2_2.tar.bz2#4d331e44109e3f0e19b4cb8f9b82f3e1 https://conda.anaconda.org/conda-forge/linux-64/libevent-2.1.12-hf998b51_1.conda#a1cfcc585f0c42bf8d5546bb1dfb668d https://conda.anaconda.org/conda-forge/linux-64/libgettextpo-devel-0.22.5-h59595ed_2.conda#b63d9b6da3653179a278077f0de20014 -https://conda.anaconda.org/conda-forge/linux-64/libgfortran-ng-13.2.0-h69a702a_6.conda#3666a850342f8f3be88f9a93d948d027 +https://conda.anaconda.org/conda-forge/linux-64/libgfortran-ng-13.2.0-h69a702a_7.conda#1b84f26d9f4f6026e179e7805d5a15cd https://conda.anaconda.org/conda-forge/linux-64/libnghttp2-1.58.0-h47da74e_1.conda#700ac6ea6d53d5510591c4344d5c989a 
https://conda.anaconda.org/conda-forge/linux-64/libpng-1.6.43-h2797004_0.conda#009981dd9cfcaa4dbfa25ffaed86bcae https://conda.anaconda.org/conda-forge/linux-64/libprotobuf-3.21.12-hfc55251_2.conda#e3a7d4ba09b8dc939b98fef55f539220 @@ -106,7 +106,7 @@ https://conda.anaconda.org/conda-forge/linux-64/brotli-bin-1.0.9-h166bdaf_9.cond https://conda.anaconda.org/conda-forge/linux-64/freetype-2.12.1-h267a509_2.conda#9ae35c3d96db2c94ce0cef86efdfa2cb https://conda.anaconda.org/conda-forge/linux-64/gettext-0.22.5-h59595ed_2.conda#219ba82e95d7614cf7140d2a4afc0926 https://conda.anaconda.org/conda-forge/linux-64/krb5-1.21.2-h659d440_0.conda#cd95826dbd331ed1be26bdf401432844 -https://conda.anaconda.org/conda-forge/linux-64/libglib-2.80.0-hf2295e7_6.conda#9342e7c44c38bea649490f72d92c382d +https://conda.anaconda.org/conda-forge/linux-64/libglib-2.80.2-hf974151_0.conda#72724f6a78ecb15559396966226d5838 https://conda.anaconda.org/conda-forge/linux-64/libgrpc-1.54.3-hb20ce57_0.conda#7af7c59ab24db007dfd82e0a3a343f66 https://conda.anaconda.org/conda-forge/linux-64/libhiredis-1.0.2-h2cc385e_0.tar.bz2#b34907d3a81a3cd8095ee83d174c074a https://conda.anaconda.org/conda-forge/linux-64/libhwloc-2.10.0-default_h2fb2949_1000.conda#7e3726e647a619c6ce5939014dfde86d @@ -114,9 +114,9 @@ https://conda.anaconda.org/conda-forge/linux-64/libllvm15-15.0.7-hb3ce162_4.cond https://conda.anaconda.org/conda-forge/linux-64/libllvm18-18.1.5-hb77312f_0.conda#efd221d3668077ca067a206269418dec https://conda.anaconda.org/conda-forge/linux-64/libthrift-0.18.1-h8fd135c_2.conda#bbf65f7688512872f063810623b755dc https://conda.anaconda.org/conda-forge/linux-64/libtiff-4.6.0-h1dd3fc0_3.conda#66f03896ffbe1a110ffda05c7a856504 -https://conda.anaconda.org/conda-forge/linux-64/llvm-openmp-18.1.4-ha31de31_0.conda#48b9991e66abc186a7ad7975e97bd4d0 +https://conda.anaconda.org/conda-forge/linux-64/llvm-openmp-18.1.5-ha31de31_0.conda#b923cdb6e567ada84f991ffcc5848afb https://conda.anaconda.org/conda-forge/linux-64/mysql-libs-8.3.0-hca2cd23_4.conda#1b50eebe2a738a3146c154d2eceaa8b6 -https://conda.anaconda.org/conda-forge/linux-64/nss-3.98-h1d7d5a4_0.conda#54b56c2fdf973656b748e0378900ec13 +https://conda.anaconda.org/conda-forge/linux-64/nss-3.100-hca3bf56_0.conda#949c4a82290ee58b3c970cef4bcfd4ad https://conda.anaconda.org/conda-forge/linux-64/orc-1.9.0-h2f23424_1.conda#9571eb3eb0f7fe8b59956a7786babbcd https://conda.anaconda.org/conda-forge/linux-64/python-3.11.9-hb806964_0_cpython.conda#ac68acfa8b558ed406c75e98d3428d7b https://conda.anaconda.org/conda-forge/linux-64/xcb-util-0.4.0-hd590300_1.conda#9bfac7ccd94d54fd21a0501296d60424 @@ -137,7 +137,7 @@ https://conda.anaconda.org/conda-forge/linux-64/dbus-1.13.6-h5008d03_3.tar.bz2#e https://conda.anaconda.org/conda-forge/noarch/exceptiongroup-1.2.0-pyhd8ed1ab_2.conda#8d652ea2ee8eaee02ed8dc820bc794aa https://conda.anaconda.org/conda-forge/noarch/execnet-2.1.1-pyhd8ed1ab_0.conda#15dda3cdbf330abfe9f555d22f66db46 https://conda.anaconda.org/conda-forge/linux-64/fontconfig-2.14.2-h14ed4e7_0.conda#0f69b688f52ff6da70bccb7ff7001d1d -https://conda.anaconda.org/conda-forge/linux-64/glib-tools-2.80.0-hde27a5a_6.conda#a9d23c02485c5cf055f9ac90eb9c9c63 +https://conda.anaconda.org/conda-forge/linux-64/glib-tools-2.80.2-hb6ce0ca_0.conda#a965aeaf060289528a3fbe09326edae2 https://conda.anaconda.org/conda-forge/noarch/iniconfig-2.0.0-pyhd8ed1ab_0.conda#f800d2da156d08e289b14e87e43c1ae5 https://conda.anaconda.org/conda-forge/linux-64/kiwisolver-1.4.5-py311h9547e67_1.conda#2c65bdf442b0d37aad080c8a4e0d452f 
https://conda.anaconda.org/conda-forge/linux-64/lcms2-2.16-hb7c19ff_0.conda#51bb7010fc86f70eee639b4bb7a894f5 @@ -147,7 +147,7 @@ https://conda.anaconda.org/conda-forge/linux-64/libcups-2.3.3-h4637d8d_4.conda#d https://conda.anaconda.org/conda-forge/linux-64/libcurl-8.7.1-hca28451_0.conda#755c7f876815003337d2c61ff5d047e5 https://conda.anaconda.org/conda-forge/linux-64/libflac-1.4.3-h59595ed_0.conda#ee48bf17cc83a00f59ca1494d5646869 https://conda.anaconda.org/conda-forge/linux-64/libgpg-error-1.49-h4f305b6_0.conda#dfcfd72c7a430d3616763ecfbefe4ca9 -https://conda.anaconda.org/conda-forge/linux-64/libpq-16.2-h33b98f1_1.conda#9e49ec2a61d02623b379dc332eb6889d +https://conda.anaconda.org/conda-forge/linux-64/libpq-16.3-ha72fbe1_0.conda#bac737ae28b79cfbafd515258d97d29e https://conda.anaconda.org/conda-forge/noarch/munkres-1.1.4-pyh9f0ad1d_0.tar.bz2#2ba8498c1018c1e9c61eb99b973dfe19 https://conda.anaconda.org/conda-forge/linux-64/openjpeg-2.5.2-h488ebb8_0.conda#7f2e286780f072ed750df46dc2631138 https://conda.anaconda.org/conda-forge/noarch/packaging-24.0-pyhd8ed1ab_0.conda#248f521b64ce055e7feae3105e7abeb8 @@ -174,7 +174,7 @@ https://conda.anaconda.org/conda-forge/linux-64/aws-c-mqtt-0.9.3-hb447be9_1.cond https://conda.anaconda.org/conda-forge/linux-64/cairo-1.18.0-h3faef2a_0.conda#f907bb958910dc404647326ca80c263e https://conda.anaconda.org/conda-forge/linux-64/coverage-7.5.1-py311h331c9d8_0.conda#9f35e13e3b9e05e153b78f42662061f6 https://conda.anaconda.org/conda-forge/linux-64/fonttools-4.51.0-py311h459d7ec_0.conda#17e1997cc17c571d5ad27bd0159f616c -https://conda.anaconda.org/conda-forge/linux-64/glib-2.80.0-hf2295e7_6.conda#a1e026a82a562b443845db5614ca568a +https://conda.anaconda.org/conda-forge/linux-64/glib-2.80.2-hf974151_0.conda#d427988dc3dbd0a4c136f52db356cc6a https://conda.anaconda.org/conda-forge/noarch/joblib-1.4.2-pyhd8ed1ab_0.conda#25df261d4523d9f9783bcdb7208d872f https://conda.anaconda.org/conda-forge/linux-64/libgcrypt-1.10.3-hd590300_0.conda#32d16ad533c59bb0a3c5ffaf16110829 https://conda.anaconda.org/conda-forge/linux-64/libgoogle-cloud-2.12.0-hac9eb74_1.conda#0dee716254497604762957076ac76540 @@ -210,7 +210,7 @@ https://conda.anaconda.org/conda-forge/noarch/array-api-strict-1.1.1-pyhd8ed1ab_ https://conda.anaconda.org/conda-forge/linux-64/contourpy-1.2.1-py311h9547e67_0.conda#74ad0ae64f1ef565e27eda87fa749e84 https://conda.anaconda.org/conda-forge/linux-64/libarrow-12.0.1-hb87d912_8_cpu.conda#3f3b11398fe79b578e3c44dd00a44e4a https://conda.anaconda.org/conda-forge/linux-64/pandas-2.2.2-py311h320fe9a_0.conda#c79e96ece4110fdaf2657c9f8e16f749 -https://conda.anaconda.org/conda-forge/linux-64/polars-0.20.23-py311h00856b1_0.conda#c000e1629d890ad00bb8c20963028d9f +https://conda.anaconda.org/conda-forge/linux-64/polars-0.20.25-py311h00856b1_0.conda#84ad7fa8742f6d34784a961337622c55 https://conda.anaconda.org/conda-forge/linux-64/pyqt-5.15.9-py311hf0fb5b6_5.conda#ec7e45bc76d9d0b69a74a2075932b8e8 https://conda.anaconda.org/conda-forge/linux-64/pytorch-1.13.1-cpu_py311h410fd25_1.conda#ddd2fadddf89e3dc3d541a2537fce010 https://conda.anaconda.org/conda-forge/linux-64/scipy-1.13.0-py311h517d4fd_1.conda#a86b8bea39e292a23b2cf9a750f49ea1 diff --git a/build_tools/azure/pylatest_conda_forge_mkl_osx-64_conda.lock b/build_tools/azure/pylatest_conda_forge_mkl_osx-64_conda.lock index 7f3e749a5728d..ce2d5e2c383a3 100644 --- a/build_tools/azure/pylatest_conda_forge_mkl_osx-64_conda.lock +++ b/build_tools/azure/pylatest_conda_forge_mkl_osx-64_conda.lock @@ -6,7 +6,6 @@ 
https://conda.anaconda.org/conda-forge/osx-64/bzip2-1.0.8-h10d778d_5.conda#6097a https://conda.anaconda.org/conda-forge/osx-64/ca-certificates-2024.2.2-h8857fd0_0.conda#f2eacee8c33c43692f1ccfd33d0f50b1 https://conda.anaconda.org/conda-forge/osx-64/icu-73.2-hf5e326d_0.conda#5cc301d759ec03f28328428e28f65591 https://conda.anaconda.org/conda-forge/osx-64/libbrotlicommon-1.1.0-h0dc2134_1.conda#9e6c31441c9aa24e41ace40d6151aab6 -https://conda.anaconda.org/conda-forge/osx-64/libcxx-16.0.6-hd57cbcb_0.conda#7d6972792161077908b62971802f289a https://conda.anaconda.org/conda-forge/osx-64/libdeflate-1.20-h49d49c5_0.conda#d46104f6a896a0bc6a1d37b88b2edf5c https://conda.anaconda.org/conda-forge/osx-64/libexpat-2.6.2-h73e2aa4_0.conda#3d1d51c8f716d97c864d12f7af329526 https://conda.anaconda.org/conda-forge/osx-64/libffi-3.4.2-h0d85af4_5.tar.bz2#ccb34fb14960ad8b125962d3d79b31a9 @@ -16,39 +15,38 @@ https://conda.anaconda.org/conda-forge/osx-64/libjpeg-turbo-3.0.0-h0dc2134_1.con https://conda.anaconda.org/conda-forge/osx-64/libwebp-base-1.4.0-h10d778d_0.conda#b2c0047ea73819d992484faacbbe1c24 https://conda.anaconda.org/conda-forge/osx-64/libzlib-1.2.13-h8a1eda9_5.conda#4a3ad23f6e16f99c04e166767193d700 https://conda.anaconda.org/conda-forge/osx-64/mkl-include-2023.2.0-h6bab518_50500.conda#835abb8ded5e26f23ea6996259c7972e -https://conda.anaconda.org/conda-forge/osx-64/ncurses-6.4.20240210-h73e2aa4_0.conda#50f28c512e9ad78589e3eab34833f762 +https://conda.anaconda.org/conda-forge/osx-64/ncurses-6.5-h5846eda_0.conda#02a888433d165c99bf09784a7b14d900 https://conda.anaconda.org/conda-forge/osx-64/pthread-stubs-0.4-hc929b4f_1001.tar.bz2#addd19059de62181cd11ae8f4ef26084 https://conda.anaconda.org/conda-forge/osx-64/python_abi-3.12-4_cp312.conda#87201ac4314b911b74197e588cca3639 https://conda.anaconda.org/conda-forge/noarch/tzdata-2024a-h0c530f3_0.conda#161081fc7cec0bfda0d86d7cb595f8d8 https://conda.anaconda.org/conda-forge/osx-64/xorg-libxau-1.0.11-h0dc2134_0.conda#9566b4c29274125b0266d0177b5eb97b https://conda.anaconda.org/conda-forge/osx-64/xorg-libxdmcp-1.1.3-h35c211d_0.tar.bz2#86ac76d6bf1cbb9621943eb3bd9ae36e https://conda.anaconda.org/conda-forge/osx-64/xz-5.2.6-h775f41a_0.tar.bz2#a72f9d4ea13d55d745ff1ed594747f10 -https://conda.anaconda.org/conda-forge/osx-64/gmp-6.3.0-h73e2aa4_1.conda#92f8d748d95d97f92fc26cfac9bb5b6e -https://conda.anaconda.org/conda-forge/osx-64/isl-0.26-imath32_h2e86a7b_101.conda#d06222822a9144918333346f145b68c6 -https://conda.anaconda.org/conda-forge/osx-64/lerc-4.0.0-hb486fe8_0.tar.bz2#f9d6a4c82889d5ecedec1d90eb673c55 https://conda.anaconda.org/conda-forge/osx-64/libbrotlidec-1.1.0-h0dc2134_1.conda#9ee0bab91b2ca579e10353738be36063 https://conda.anaconda.org/conda-forge/osx-64/libbrotlienc-1.1.0-h0dc2134_1.conda#8a421fe09c6187f0eb5e2338a8a8be6d +https://conda.anaconda.org/conda-forge/osx-64/libcxx-17.0.6-h88467a6_0.conda#0fe355aecb8d24b8bc07c763209adbd9 https://conda.anaconda.org/conda-forge/osx-64/libpng-1.6.43-h92b6c6a_0.conda#65dcddb15965c9de2c0365cb14910532 https://conda.anaconda.org/conda-forge/osx-64/libsqlite-3.45.3-h92b6c6a_0.conda#68e462226209f35182ef66eda0f794ff https://conda.anaconda.org/conda-forge/osx-64/libxcb-1.15-hb7f2c08_0.conda#5513f57e0238c87c12dffedbcc9c1a4a https://conda.anaconda.org/conda-forge/osx-64/libxml2-2.12.6-hc0ae0f7_2.conda#50b997370584f2c83ca0c38e9028eab9 -https://conda.anaconda.org/conda-forge/osx-64/llvm-openmp-18.1.4-h2c61cee_0.conda#0619a2dda8b7e25b78abc0b3d872744f 
-https://conda.anaconda.org/conda-forge/osx-64/ninja-1.12.0-h7728843_0.conda#1ac079f6ecddd2c336f3acb7b371851f +https://conda.anaconda.org/conda-forge/osx-64/llvm-openmp-18.1.5-h39e0ece_0.conda#ee12a644568269838b91f901b2537425 https://conda.anaconda.org/conda-forge/osx-64/openssl-3.3.0-hd75f5a5_0.conda#eb8c33aa7929a7714eab8b90c1d88afe https://conda.anaconda.org/conda-forge/osx-64/readline-8.2-h9e318b2_1.conda#f17f77f2acf4d344734bda76829ce14e -https://conda.anaconda.org/conda-forge/osx-64/tapi-1100.0.11-h9ce4665_0.tar.bz2#f9ff42ccf809a21ba6f8607f8de36108 https://conda.anaconda.org/conda-forge/osx-64/tk-8.6.13-h1abcd95_1.conda#bf830ba5afc507c6232d4ef0fb1a882d https://conda.anaconda.org/conda-forge/osx-64/zlib-1.2.13-h8a1eda9_5.conda#75a8a98b1c4671c5d2897975731da42d https://conda.anaconda.org/conda-forge/osx-64/zstd-1.5.6-h915ae27_0.conda#4cb2cd56f039b129bb0e491c1164167e https://conda.anaconda.org/conda-forge/osx-64/brotli-bin-1.1.0-h0dc2134_1.conda#ece565c215adcc47fc1db4e651ee094b https://conda.anaconda.org/conda-forge/osx-64/freetype-2.12.1-h60636b9_2.conda#25152fce119320c980e5470e64834b50 +https://conda.anaconda.org/conda-forge/osx-64/gmp-6.3.0-h73e2aa4_1.conda#92f8d748d95d97f92fc26cfac9bb5b6e +https://conda.anaconda.org/conda-forge/osx-64/isl-0.26-imath32_h2e86a7b_101.conda#d06222822a9144918333346f145b68c6 +https://conda.anaconda.org/conda-forge/osx-64/lerc-4.0.0-hb486fe8_0.tar.bz2#f9d6a4c82889d5ecedec1d90eb673c55 https://conda.anaconda.org/conda-forge/osx-64/libgfortran5-13.2.0-h2873a65_3.conda#e4fb4d23ec2870ff3c40d10afe305aec https://conda.anaconda.org/conda-forge/osx-64/libhwloc-2.10.0-default_h1321489_1000.conda#6f5fe4374d1003e116e2573022178da6 https://conda.anaconda.org/conda-forge/osx-64/libllvm16-16.0.6-hbedff68_3.conda#8fd56c0adc07a37f93bd44aa61a97c90 -https://conda.anaconda.org/conda-forge/osx-64/libtiff-4.6.0-h129831d_3.conda#568593071d2e6cea7b5fc1f75bfa10ca -https://conda.anaconda.org/conda-forge/osx-64/mpfr-4.2.1-h4f6b447_1.conda#b90df08f0deb2f58631447c1462c92a7 +https://conda.anaconda.org/conda-forge/osx-64/ninja-1.12.1-h3c5361c_0.conda#a0ebabd021c8191aeb82793fe43cfdcb https://conda.anaconda.org/conda-forge/osx-64/python-3.12.3-h1411813_0_cpython.conda#df1448ec6cbf8eceb03d29003cf72ae6 https://conda.anaconda.org/conda-forge/osx-64/sigtool-0.1.3-h88f4db0_0.tar.bz2#fbfb84b9de9a6939cb165c02c69b1865 +https://conda.anaconda.org/conda-forge/osx-64/tapi-1100.0.11-h9ce4665_0.tar.bz2#f9ff42ccf809a21ba6f8607f8de36108 https://conda.anaconda.org/conda-forge/osx-64/brotli-1.1.0-h0dc2134_1.conda#9272dd3b19c4e8212f8542cefd5c3d67 https://conda.anaconda.org/conda-forge/noarch/certifi-2024.2.2-pyhd8ed1ab_0.conda#0876280e409658fc6f9e75d035960333 https://conda.anaconda.org/conda-forge/noarch/colorama-0.4.6-pyhd8ed1ab_0.tar.bz2#3faab06a954c2a04039983f2c4a50d99 @@ -58,14 +56,13 @@ https://conda.anaconda.org/conda-forge/noarch/exceptiongroup-1.2.0-pyhd8ed1ab_2. 
https://conda.anaconda.org/conda-forge/noarch/execnet-2.1.1-pyhd8ed1ab_0.conda#15dda3cdbf330abfe9f555d22f66db46 https://conda.anaconda.org/conda-forge/noarch/iniconfig-2.0.0-pyhd8ed1ab_0.conda#f800d2da156d08e289b14e87e43c1ae5 https://conda.anaconda.org/conda-forge/osx-64/kiwisolver-1.4.5-py312h49ebfd2_1.conda#21f174a5cfb5964069c374171a979157 -https://conda.anaconda.org/conda-forge/osx-64/lcms2-2.16-ha2f27b4_0.conda#1442db8f03517834843666c422238c9b https://conda.anaconda.org/conda-forge/osx-64/ld64_osx-64-711-ha20a434_0.conda#a8b41eb97c8a9d618243a79ba78fdc3c https://conda.anaconda.org/conda-forge/osx-64/libclang-cpp16-16.0.6-default_h7151d67_6.conda#7eaad118ab797d1427f8745c861d1925 https://conda.anaconda.org/conda-forge/osx-64/libgfortran-5.0.0-13_2_0_h97931a8_3.conda#0b6e23a012ee7a9a5f6b244f5a92c1d5 +https://conda.anaconda.org/conda-forge/osx-64/libtiff-4.6.0-h129831d_3.conda#568593071d2e6cea7b5fc1f75bfa10ca https://conda.anaconda.org/conda-forge/osx-64/llvm-tools-16.0.6-hbedff68_3.conda#e9356b0807462e8f84c1384a8da539a5 -https://conda.anaconda.org/conda-forge/osx-64/mpc-1.3.1-h81bd1dd_0.conda#c752c0eb6c250919559172c011e5f65b +https://conda.anaconda.org/conda-forge/osx-64/mpfr-4.2.1-h4f6b447_1.conda#b90df08f0deb2f58631447c1462c92a7 https://conda.anaconda.org/conda-forge/noarch/munkres-1.1.4-pyh9f0ad1d_0.tar.bz2#2ba8498c1018c1e9c61eb99b973dfe19 -https://conda.anaconda.org/conda-forge/osx-64/openjpeg-2.5.2-h7310d3a_0.conda#05a14cc9d725dd74995927968d6547e3 https://conda.anaconda.org/conda-forge/noarch/packaging-24.0-pyhd8ed1ab_0.conda#248f521b64ce055e7feae3105e7abeb8 https://conda.anaconda.org/conda-forge/noarch/pluggy-1.5.0-pyhd8ed1ab_0.conda#d3483c8fc2dc2cc3f5cf43e26d60cabf https://conda.anaconda.org/conda-forge/noarch/pyparsing-3.1.2-pyhd8ed1ab_0.conda#b9a4dacf97241704529131a0dfc0494f @@ -83,13 +80,14 @@ https://conda.anaconda.org/conda-forge/osx-64/cctools_osx-64-986-ha1c5b94_0.cond https://conda.anaconda.org/conda-forge/osx-64/clang-16-16.0.6-default_h7151d67_6.conda#1c298568c30efe7d9369c7c15b748461 https://conda.anaconda.org/conda-forge/osx-64/coverage-7.5.1-py312h520dd33_0.conda#afc8c7b237683760a3c35e49bcc04deb https://conda.anaconda.org/conda-forge/osx-64/fonttools-4.51.0-py312h41838bb_0.conda#ebe40134b860cf704ddaf81f684f95a5 -https://conda.anaconda.org/conda-forge/osx-64/gfortran_impl_osx-64-12.3.0-hc328e78_3.conda#b3d751dc7073bbfdfa9d863e39b9685d https://conda.anaconda.org/conda-forge/noarch/joblib-1.4.2-pyhd8ed1ab_0.conda#25df261d4523d9f9783bcdb7208d872f +https://conda.anaconda.org/conda-forge/osx-64/lcms2-2.16-ha2f27b4_0.conda#1442db8f03517834843666c422238c9b https://conda.anaconda.org/conda-forge/osx-64/ld64-711-ha02d983_0.conda#3ae4930ec076735cce481e906f5192e0 https://conda.anaconda.org/conda-forge/osx-64/libhiredis-1.0.2-h2beb688_0.tar.bz2#524282b2c46c9dedf051b3bc2ae05494 https://conda.anaconda.org/conda-forge/noarch/meson-1.4.0-pyhd8ed1ab_0.conda#52a0660cfa40b45bf254ecc3374cb2e0 https://conda.anaconda.org/conda-forge/osx-64/mkl-2023.2.0-h54c2260_50500.conda#0a342ccdc79e4fcd359245ac51941e7b -https://conda.anaconda.org/conda-forge/osx-64/pillow-10.3.0-py312h0c923fa_0.conda#6f0591ae972e9b815739da3392fbb3c3 +https://conda.anaconda.org/conda-forge/osx-64/mpc-1.3.1-h81bd1dd_0.conda#c752c0eb6c250919559172c011e5f65b +https://conda.anaconda.org/conda-forge/osx-64/openjpeg-2.5.2-h7310d3a_0.conda#05a14cc9d725dd74995927968d6547e3 https://conda.anaconda.org/conda-forge/noarch/pip-24.0-pyhd8ed1ab_0.conda#f586ac1e56c8638b64f9c8122a7b8a67 
https://conda.anaconda.org/conda-forge/noarch/pyproject-metadata-0.8.0-pyhd8ed1ab_0.conda#573fe09d7bd0cd4bcc210d8369b5ca47 https://conda.anaconda.org/conda-forge/noarch/pytest-7.4.4-pyhd8ed1ab_0.conda#a9d145de8c5f064b5fa68fb34725d9f4 @@ -97,9 +95,11 @@ https://conda.anaconda.org/conda-forge/noarch/python-dateutil-2.9.0-pyhd8ed1ab_0 https://conda.anaconda.org/conda-forge/osx-64/ccache-4.9.1-h41adc32_0.conda#45aaf96b67840bd98a928de8679098fa https://conda.anaconda.org/conda-forge/osx-64/cctools-986-h40f6528_0.conda#b7a2ca0062a6ee8bc4e83ec887bef942 https://conda.anaconda.org/conda-forge/osx-64/clang-16.0.6-hdae98eb_6.conda#884e7b24306e4f21b7ee08dabadb2ecc +https://conda.anaconda.org/conda-forge/osx-64/gfortran_impl_osx-64-12.3.0-hc328e78_3.conda#b3d751dc7073bbfdfa9d863e39b9685d https://conda.anaconda.org/conda-forge/osx-64/libblas-3.9.0-20_osx64_mkl.conda#160fdc97a51d66d51dc782fb67d35205 https://conda.anaconda.org/conda-forge/noarch/meson-python-0.16.0-pyh0c530f3_0.conda#e16f0dbf502da873be9f9adb0dc52547 https://conda.anaconda.org/conda-forge/osx-64/mkl-devel-2023.2.0-h694c41f_50500.conda#1b4d0235ef253a1e19459351badf4f9f +https://conda.anaconda.org/conda-forge/osx-64/pillow-10.3.0-py312h0c923fa_0.conda#6f0591ae972e9b815739da3392fbb3c3 https://conda.anaconda.org/conda-forge/noarch/pytest-cov-5.0.0-pyhd8ed1ab_0.conda#c54c0107057d67ddf077751339ec2c63 https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-3.5.0-pyhd8ed1ab_0.conda#d5f595da2daead898ca958ac62f0307b https://conda.anaconda.org/conda-forge/osx-64/clangxx-16.0.6-default_h7151d67_6.conda#cc8c007a529a7cfaa5d29d8599df3fe6 @@ -114,15 +114,15 @@ https://conda.anaconda.org/conda-forge/osx-64/contourpy-1.2.1-py312h9230928_0.co https://conda.anaconda.org/conda-forge/osx-64/pandas-2.2.2-py312h83c8a23_0.conda#b422a5d39ff0cd72923aef807f280145 https://conda.anaconda.org/conda-forge/osx-64/scipy-1.13.0-py312h741d2f9_1.conda#c416453a8ea3b38d823fe8dcecdb6a12 https://conda.anaconda.org/conda-forge/osx-64/blas-2.120-mkl.conda#b041a7677a412f3d925d8208936cb1e2 -https://conda.anaconda.org/conda-forge/osx-64/clang_impl_osx-64-16.0.6-h8787910_12.conda#fe1a78dddda2c0b32fac9fbd7fa05c5f +https://conda.anaconda.org/conda-forge/osx-64/clang_impl_osx-64-16.0.6-h8787910_14.conda#fc1a7d3f1bf236f63c58bab6e36844cb https://conda.anaconda.org/conda-forge/osx-64/matplotlib-base-3.8.4-py312h1fe5000_0.conda#3e3097734a5042cb6d2675e69bf1fc5a https://conda.anaconda.org/conda-forge/osx-64/pyamg-5.1.0-py312h3db3e91_0.conda#c6d6248b99fc11b15c9becea581a1462 -https://conda.anaconda.org/conda-forge/osx-64/clang_osx-64-16.0.6-hb91bd55_12.conda#4ef6f9a82654ad497e2334471832e774 +https://conda.anaconda.org/conda-forge/osx-64/clang_osx-64-16.0.6-hb91bd55_14.conda#3d0d9c725912bb0cb4cd301d2a5d31d7 https://conda.anaconda.org/conda-forge/osx-64/matplotlib-3.8.4-py312hb401068_0.conda#187ee42addd449b4899b55c304012436 https://conda.anaconda.org/conda-forge/osx-64/c-compiler-1.7.0-h282daa2_1.conda#d27411cb82bc1b76b9f487da6ae97f1d -https://conda.anaconda.org/conda-forge/osx-64/clangxx_impl_osx-64-16.0.6-h6d92fbe_12.conda#c1b8987b40123346ee3fe120c3b66b3d +https://conda.anaconda.org/conda-forge/osx-64/clangxx_impl_osx-64-16.0.6-h6d92fbe_14.conda#66b9f06d5f0d0ea47ffcb3a9ca65774a https://conda.anaconda.org/conda-forge/osx-64/gfortran_osx-64-12.3.0-h18f7dce_1.conda#436af2384c47aedb94af78a128e174f1 -https://conda.anaconda.org/conda-forge/osx-64/clangxx_osx-64-16.0.6-hb91bd55_12.conda#4e8cca2283e843a8df8b2e747d36226d 
+https://conda.anaconda.org/conda-forge/osx-64/clangxx_osx-64-16.0.6-hb91bd55_14.conda#a4504c1a7beab8875d6f765941e77248 https://conda.anaconda.org/conda-forge/osx-64/gfortran-12.3.0-h2c809b3_1.conda#c48adbaa8944234b80ef287c37e329b0 https://conda.anaconda.org/conda-forge/osx-64/cxx-compiler-1.7.0-h7728843_1.conda#e04cb15a20553b973dd068c2dc81d682 https://conda.anaconda.org/conda-forge/osx-64/fortran-compiler-1.7.0-h6c2ab21_1.conda#48319058089f492d5059e04494b81ed9 diff --git a/build_tools/azure/pylatest_conda_mkl_no_openmp_osx-64_conda.lock b/build_tools/azure/pylatest_conda_mkl_no_openmp_osx-64_conda.lock index c687f8fb76fb1..ec92612048448 100644 --- a/build_tools/azure/pylatest_conda_mkl_no_openmp_osx-64_conda.lock +++ b/build_tools/azure/pylatest_conda_mkl_no_openmp_osx-64_conda.lock @@ -29,7 +29,7 @@ https://repo.anaconda.com/pkgs/main/osx-64/ninja-base-1.10.2-haf03e11_5.conda#c8 https://repo.anaconda.com/pkgs/main/osx-64/openssl-3.0.13-hca72f7f_1.conda#e526d7e2e79132a11b4746cf305c45b5 https://repo.anaconda.com/pkgs/main/osx-64/readline-8.2-hca72f7f_0.conda#971667436260e523f6f7355fdfa238bf https://repo.anaconda.com/pkgs/main/osx-64/tbb-2021.8.0-ha357a0b_0.conda#fb48530a3eea681c11dafb95b3387c0f -https://repo.anaconda.com/pkgs/main/osx-64/tk-8.6.12-h5d9f67b_0.conda#047f0af5486d19163e37fd7f8ae3d29f +https://repo.anaconda.com/pkgs/main/osx-64/tk-8.6.14-h4d00af3_0.conda#a2c03940c2ae54614301ec82e6a98d75 https://repo.anaconda.com/pkgs/main/osx-64/brotli-bin-1.0.9-h6c40b1e_8.conda#11053f9c6b8d8a8348d0c33450c23ce9 https://repo.anaconda.com/pkgs/main/osx-64/freetype-2.12.1-hd8bbffd_0.conda#1f276af321375ee7fe8056843044fa76 https://repo.anaconda.com/pkgs/main/osx-64/libgfortran-5.0.0-11_3_0_hecd8cb5_28.conda#2eb13b680803f1064e53873ae0aaafb3 @@ -38,7 +38,7 @@ https://repo.anaconda.com/pkgs/main/osx-64/sqlite-3.45.3-h6c40b1e_0.conda#2edf90 https://repo.anaconda.com/pkgs/main/osx-64/zstd-1.5.5-hc035e20_2.conda#c033bf68c12f8c71fd916f000f3dc118 https://repo.anaconda.com/pkgs/main/osx-64/brotli-1.0.9-h6c40b1e_8.conda#10f89677a3898d0113dc354adf643df3 https://repo.anaconda.com/pkgs/main/osx-64/libtiff-4.5.1-hcec6c5f_0.conda#e127a800ffd9d300ed7d5e1b026944ec -https://repo.anaconda.com/pkgs/main/osx-64/python-3.12.3-hd58486a_0.conda#1a287cfa37c5a92972f5f527b6af7eed +https://repo.anaconda.com/pkgs/main/osx-64/python-3.12.3-hd58486a_1.conda#cdc61e8f6c2d77b3b263e720048c4b54 https://repo.anaconda.com/pkgs/main/osx-64/coverage-7.2.2-py312h6c40b1e_0.conda#b6e4b9fba325047c07f3c9211ae91d1c https://repo.anaconda.com/pkgs/main/noarch/cycler-0.11.0-pyhd3eb1b0_0.conda#f5e365d2cdb66d547eb8c3ab93843aab https://repo.anaconda.com/pkgs/main/noarch/execnet-1.9.0-pyhd3eb1b0_0.conda#f895937671af67cebb8af617494b3513 @@ -54,7 +54,7 @@ https://repo.anaconda.com/pkgs/main/osx-64/pluggy-1.0.0-py312hecd8cb5_1.conda#64 https://repo.anaconda.com/pkgs/main/osx-64/pyparsing-3.0.9-py312hecd8cb5_0.conda#d85cf2b81c6d9326a57a6418e14db258 https://repo.anaconda.com/pkgs/main/noarch/python-tzdata-2023.3-pyhd3eb1b0_0.conda#479c037de0186d114b9911158427624e https://repo.anaconda.com/pkgs/main/osx-64/pytz-2024.1-py312hecd8cb5_0.conda#2b28ec0e0d07f5c0c701f75200b1e8b6 -https://repo.anaconda.com/pkgs/main/osx-64/setuptools-68.2.2-py312hecd8cb5_0.conda#64235f0c451427d86808c70c1c31cb8b +https://repo.anaconda.com/pkgs/main/osx-64/setuptools-69.5.1-py312hecd8cb5_0.conda#5c7c7ef1e0762e3ca1f543d28310946f https://repo.anaconda.com/pkgs/main/noarch/six-1.16.0-pyhd3eb1b0_1.conda#34586824d411d36af2fa40e799c172d0 
https://repo.anaconda.com/pkgs/main/noarch/toml-0.10.2-pyhd3eb1b0_0.conda#cda05f5f6d8509529d1a2743288d197a https://repo.anaconda.com/pkgs/main/osx-64/tornado-6.3.3-py312h6c40b1e_0.conda#49173b5a36c9134865221f29d4a73fb6 @@ -64,10 +64,10 @@ https://repo.anaconda.com/pkgs/main/osx-64/fonttools-4.51.0-py312h6c40b1e_0.cond https://repo.anaconda.com/pkgs/main/osx-64/meson-1.3.1-py312hecd8cb5_0.conda#43963a2b38becce4caa95434b8c96837 https://repo.anaconda.com/pkgs/main/osx-64/numpy-base-1.26.4-py312h6f81483_0.conda#87f73efbf26ab2e2ea7c32481a71bd47 https://repo.anaconda.com/pkgs/main/osx-64/pillow-10.3.0-py312h6c40b1e_0.conda#fe883fa4247d35fe6de49f713529ca02 -https://repo.anaconda.com/pkgs/main/osx-64/pip-23.3.1-py312hecd8cb5_0.conda#efc3db40cac09f74bb480d28d3a0b260 +https://repo.anaconda.com/pkgs/main/osx-64/pip-24.0-py312hecd8cb5_0.conda#7a8e0b1d3742ddf1c8aa97fbaa158039 https://repo.anaconda.com/pkgs/main/osx-64/pyproject-metadata-0.7.1-py312hecd8cb5_0.conda#e91ce37477d24dcdf7e0a8b93c5e72fd https://repo.anaconda.com/pkgs/main/osx-64/pytest-7.4.0-py312hecd8cb5_0.conda#b816a2439ba9b87524aec74d58e55b0a -https://repo.anaconda.com/pkgs/main/noarch/python-dateutil-2.8.2-pyhd3eb1b0_0.conda#211ee00320b08a1ac9fea6677649f6c9 +https://repo.anaconda.com/pkgs/main/osx-64/python-dateutil-2.9.0post0-py312hecd8cb5_0.conda#b3ed54eb118325785284dd18bfceca19 https://repo.anaconda.com/pkgs/main/osx-64/meson-python-0.15.0-py312h6c40b1e_0.conda#688ab56b9d8e5a2e3f018ca3ce34e061 https://repo.anaconda.com/pkgs/main/osx-64/pytest-cov-4.1.0-py312hecd8cb5_1.conda#a33a24eb20359f464938e75b2f57e23a https://repo.anaconda.com/pkgs/main/osx-64/pytest-xdist-3.5.0-py312hecd8cb5_0.conda#d1ecfb3691cceecb1f16bcfdf0b67bb5 diff --git a/build_tools/azure/pylatest_pip_openblas_pandas_linux-64_conda.lock b/build_tools/azure/pylatest_pip_openblas_pandas_linux-64_conda.lock index c497709ca347e..46fd0d308eaa2 100644 --- a/build_tools/azure/pylatest_pip_openblas_pandas_linux-64_conda.lock +++ b/build_tools/azure/pylatest_pip_openblas_pandas_linux-64_conda.lock @@ -17,12 +17,12 @@ https://repo.anaconda.com/pkgs/main/linux-64/xz-5.4.6-h5eee18b_1.conda#1562802f8 https://repo.anaconda.com/pkgs/main/linux-64/zlib-1.2.13-h5eee18b_1.conda#92e42d8310108b0a440fb2e60b2b2a25 https://repo.anaconda.com/pkgs/main/linux-64/ccache-3.7.9-hfe4627d_0.conda#bef6fc681c273bb7bd0c67d1a591365e https://repo.anaconda.com/pkgs/main/linux-64/readline-8.2-h5eee18b_0.conda#be42180685cce6e6b0329201d9f48efb -https://repo.anaconda.com/pkgs/main/linux-64/tk-8.6.12-h1ccaba5_0.conda#fa10ff4aa631fa4aa090a6234d7770b9 +https://repo.anaconda.com/pkgs/main/linux-64/tk-8.6.14-h39e8969_0.conda#78dbc5e3c69143ebc037fc5d5b22e597 https://repo.anaconda.com/pkgs/main/linux-64/sqlite-3.45.3-h5eee18b_0.conda#acf93d6aceb74d6110e20b44cc45939e -https://repo.anaconda.com/pkgs/main/linux-64/python-3.9.19-h955ad1f_0.conda#33cb019c40e3409df392c99e3c34f352 -https://repo.anaconda.com/pkgs/main/linux-64/setuptools-68.2.2-py39h06a4308_0.conda#5b42cae5548732ae5c167bb1066085de +https://repo.anaconda.com/pkgs/main/linux-64/python-3.9.19-h955ad1f_1.conda#4b453281859c293c9d577271f3b18a0d +https://repo.anaconda.com/pkgs/main/linux-64/setuptools-69.5.1-py39h06a4308_0.conda#3eb144d481b39c0fbbced789dd9b76b3 https://repo.anaconda.com/pkgs/main/linux-64/wheel-0.43.0-py39h06a4308_0.conda#40bb60408c7433d767fd8c65b35bc4a0 -https://repo.anaconda.com/pkgs/main/linux-64/pip-23.3.1-py39h06a4308_0.conda#685007e3dae59d211620f19926577bd6 
+https://repo.anaconda.com/pkgs/main/linux-64/pip-24.0-py39h06a4308_0.conda#7f8ce3af15cfecd12e4dda8c5cef5fb7 # pip alabaster @ https://files.pythonhosted.org/packages/32/34/d4e1c02d3bee589efb5dfa17f88ea08bdb3e3eac12bc475462aec52ed223/alabaster-0.7.16-py3-none-any.whl#sha256=b46733c07dce03ae4e150330b975c75737fa60f0a7c591b6c8bf4928a28e2c92 # pip babel @ https://files.pythonhosted.org/packages/27/45/377f7e32a5c93d94cd56542349b34efab5ca3f9e2fd5a68c5e93169aa32d/Babel-2.15.0-py3-none-any.whl#sha256=08706bdad8d0a3413266ab61bd6c34d0c28d6e1e7badf40a2cebe67644e2e1fb # pip certifi @ https://files.pythonhosted.org/packages/ba/06/a07f096c664aeb9f01624f858c3add0a4e913d6c96257acb4fce61e7de14/certifi-2024.2.2-py3-none-any.whl#sha256=dc383c07b76109f368f6106eee2b593b04a011ea4d55f652c6ca24a754d1cdd1 @@ -75,7 +75,7 @@ https://repo.anaconda.com/pkgs/main/linux-64/pip-23.3.1-py39h06a4308_0.conda#685 # pip python-dateutil @ https://files.pythonhosted.org/packages/ec/57/56b9bcc3c9c6a792fcbaf139543cee77261f3651ca9da0c93f5c1221264b/python_dateutil-2.9.0.post0-py2.py3-none-any.whl#sha256=a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427 # pip requests @ https://files.pythonhosted.org/packages/70/8e/0e2d847013cb52cd35b38c009bb167a1a26b2ce6cd6965bf26b47bc0bf44/requests-2.31.0-py3-none-any.whl#sha256=58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f # pip scipy @ https://files.pythonhosted.org/packages/c6/ba/a778e6c0020d728c119b0379805a357135fe8c9bc87fdb7e0750ca11319f/scipy-1.13.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=28e286bf9ac422d6beb559bc61312c348ca9b0f0dae0d7c5afde7f722d6ea13d -# pip tifffile @ https://files.pythonhosted.org/packages/c1/cf/dd1cdf85db58c811816377afd6ba8a240f4611e16f4085201598fb2d5578/tifffile-2024.5.3-py3-none-any.whl#sha256=cac4d939156ff7f16d65fd689637808a7b5b3ad58f9c73327fc009b0aa32c7d5 +# pip tifffile @ https://files.pythonhosted.org/packages/c1/79/29d0fa40017f7b749ce344759dcc21e2ec9bbb81fc69ca2ce06e261f83f0/tifffile-2024.5.10-py3-none-any.whl#sha256=4154f091aa24d4e75bfad9ab2d5424a68c70e67b8220188066dc61946d4551bd # pip lightgbm @ https://files.pythonhosted.org/packages/ba/11/cb8b67f3cbdca05b59a032bb57963d4fe8c8d18c3870f30bed005b7f174d/lightgbm-4.3.0-py3-none-manylinux_2_28_x86_64.whl#sha256=104496a3404cb2452d3412cbddcfbfadbef9c372ea91e3a9b8794bcc5183bf07 # pip matplotlib @ https://files.pythonhosted.org/packages/5e/2c/513395a63a9e1124a5648addbf73be23cc603f955af026b04416da98dc96/matplotlib-3.8.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=606e3b90897554c989b1e38a258c626d46c873523de432b1462f295db13de6f9 # pip meson-python @ https://files.pythonhosted.org/packages/91/c0/104cb6244c83fe6bc3886f144cc433db0c0c78efac5dc00e409a5a08c87d/meson_python-0.16.0-py3-none-any.whl#sha256=842dc9f5dc29e55fc769ff1b6fe328412fe6c870220fc321060a1d2d395e69e8 diff --git a/build_tools/azure/pymin_conda_defaults_openblas_linux-64_conda.lock b/build_tools/azure/pymin_conda_defaults_openblas_linux-64_conda.lock index ff7bcd028c7f6..6e46719df47c4 100644 --- a/build_tools/azure/pymin_conda_defaults_openblas_linux-64_conda.lock +++ b/build_tools/azure/pymin_conda_defaults_openblas_linux-64_conda.lock @@ -39,7 +39,7 @@ https://repo.anaconda.com/pkgs/main/linux-64/libpng-1.6.39-h5eee18b_0.conda#f6ae https://repo.anaconda.com/pkgs/main/linux-64/libxml2-2.10.4-hfdd30dd_2.conda#ff7a0e3b92afb3c99b82c9f0ba8b5670 https://repo.anaconda.com/pkgs/main/linux-64/pcre2-10.42-hebb0a14_1.conda#727e15c3cfa02b032da4eb0c1123e977 
https://repo.anaconda.com/pkgs/main/linux-64/readline-8.2-h5eee18b_0.conda#be42180685cce6e6b0329201d9f48efb -https://repo.anaconda.com/pkgs/main/linux-64/tk-8.6.12-h1ccaba5_0.conda#fa10ff4aa631fa4aa090a6234d7770b9 +https://repo.anaconda.com/pkgs/main/linux-64/tk-8.6.14-h39e8969_0.conda#78dbc5e3c69143ebc037fc5d5b22e597 https://repo.anaconda.com/pkgs/main/linux-64/zstd-1.5.5-hc292b87_2.conda#3b7fe809e5b429b4f90fe064842a2370 https://repo.anaconda.com/pkgs/main/linux-64/freetype-2.12.1-h4a9f257_0.conda#bdc7b5952e9c5dca01bc2f4ccef2f974 https://repo.anaconda.com/pkgs/main/linux-64/krb5-1.20.1-h143b758_1.conda#cf1accc86321fa25d6b978cc748039ae @@ -55,7 +55,7 @@ https://repo.anaconda.com/pkgs/main/linux-64/lcms2-2.12-h3be6417_0.conda#719db47 https://repo.anaconda.com/pkgs/main/linux-64/libclang-14.0.6-default_hc6dbbc7_1.conda#8f12583c4027b2861cff470f6b8837c4 https://repo.anaconda.com/pkgs/main/linux-64/libpq-12.17-hdbd6064_0.conda#6bed363e25859faff66bf546a11c10e8 https://repo.anaconda.com/pkgs/main/linux-64/openjpeg-2.4.0-h3ad879b_0.conda#86baecb47ecaa7f7ff2657a1f03b90c9 -https://repo.anaconda.com/pkgs/main/linux-64/python-3.9.19-h955ad1f_0.conda#33cb019c40e3409df392c99e3c34f352 +https://repo.anaconda.com/pkgs/main/linux-64/python-3.9.19-h955ad1f_1.conda#4b453281859c293c9d577271f3b18a0d https://repo.anaconda.com/pkgs/main/linux-64/certifi-2024.2.2-py39h06a4308_0.conda#2bc1db9166ecbb968f61252e6f08c2ce https://repo.anaconda.com/pkgs/main/noarch/cycler-0.11.0-pyhd3eb1b0_0.conda#f5e365d2cdb66d547eb8c3ab93843aab https://repo.anaconda.com/pkgs/main/linux-64/cython-3.0.10-py39h5eee18b_0.conda#1419a658ed2b4d5c3ac1964f33143b64 @@ -66,14 +66,14 @@ https://repo.anaconda.com/pkgs/main/noarch/iniconfig-1.1.1-pyhd3eb1b0_0.tar.bz2# https://repo.anaconda.com/pkgs/main/linux-64/joblib-1.2.0-py39h06a4308_0.conda#ac1f5687d70aa1128cbecb26bc9e559d https://repo.anaconda.com/pkgs/main/linux-64/kiwisolver-1.4.4-py39h6a678d5_0.conda#3d57aedbfbd054ce57fb3c1e4448828c https://repo.anaconda.com/pkgs/main/linux-64/mysql-5.7.24-h721c034_2.conda#dfc19ca2466d275c4c1f73b62c57f37b -https://repo.anaconda.com/pkgs/main/linux-64/numpy-base-1.21.6-py39h375b286_0.conda#4ceaa5d6e6307fe06961d555f78b266f +https://repo.anaconda.com/pkgs/main/linux-64/numpy-base-1.21.6-py39h375b286_1.conda#0061d9193658774ab79fc85d143a94fc https://repo.anaconda.com/pkgs/main/linux-64/packaging-23.2-py39h06a4308_0.conda#b3f88f45f31bde016e49be3e941e5272 https://repo.anaconda.com/pkgs/main/linux-64/pillow-10.3.0-py39h5eee18b_0.conda#b346d6c71267c1553b6c18d3db5fdf6d https://repo.anaconda.com/pkgs/main/linux-64/pluggy-1.0.0-py39h06a4308_1.conda#fb4fed11ed43cf727dbd51883cc1d9fa https://repo.anaconda.com/pkgs/main/linux-64/ply-3.11-py39h06a4308_0.conda#6c89bf6d2fdf6d24126e34cb83fd10f1 https://repo.anaconda.com/pkgs/main/linux-64/pyparsing-3.0.9-py39h06a4308_0.conda#3a0537468e59760404f63b4f04369828 https://repo.anaconda.com/pkgs/main/linux-64/pyqt5-sip-12.13.0-py39h5eee18b_0.conda#256840c3841b52346ea5743be8490ede -https://repo.anaconda.com/pkgs/main/linux-64/setuptools-68.2.2-py39h06a4308_0.conda#5b42cae5548732ae5c167bb1066085de +https://repo.anaconda.com/pkgs/main/linux-64/setuptools-69.5.1-py39h06a4308_0.conda#3eb144d481b39c0fbbced789dd9b76b3 https://repo.anaconda.com/pkgs/main/noarch/six-1.16.0-pyhd3eb1b0_1.conda#34586824d411d36af2fa40e799c172d0 https://repo.anaconda.com/pkgs/main/noarch/toml-0.10.2-pyhd3eb1b0_0.conda#cda05f5f6d8509529d1a2743288d197a https://repo.anaconda.com/pkgs/main/linux-64/tomli-2.0.1-py39h06a4308_0.conda#b06dffe7ddca2645ed72f5116f0a087d @@ 
-82,10 +82,10 @@ https://repo.anaconda.com/pkgs/main/linux-64/wheel-0.43.0-py39h06a4308_0.conda#4 https://repo.anaconda.com/pkgs/main/linux-64/coverage-7.2.2-py39h5eee18b_0.conda#e9da151b7e1f56be2cb569c65949a1d2 https://repo.anaconda.com/pkgs/main/linux-64/dbus-1.13.18-hb2f20db_0.conda#6a6a6f1391f807847404344489ef6cf4 https://repo.anaconda.com/pkgs/main/linux-64/gstreamer-1.14.1-h5eee18b_1.conda#f2f26e6f869b5d87f41bd059fae47c3e -https://repo.anaconda.com/pkgs/main/linux-64/numpy-1.21.6-py39hac523dd_0.conda#a03c1fe16cf2558bca3838062c334d7d -https://repo.anaconda.com/pkgs/main/linux-64/pip-23.3.1-py39h06a4308_0.conda#685007e3dae59d211620f19926577bd6 +https://repo.anaconda.com/pkgs/main/linux-64/numpy-1.21.6-py39hac523dd_1.conda#f379f92039f666828a193fadd18c9819 +https://repo.anaconda.com/pkgs/main/linux-64/pip-24.0-py39h06a4308_0.conda#7f8ce3af15cfecd12e4dda8c5cef5fb7 https://repo.anaconda.com/pkgs/main/linux-64/pytest-7.4.0-py39h06a4308_0.conda#99d92a7a39f7e615de84f8cc5606c49a -https://repo.anaconda.com/pkgs/main/noarch/python-dateutil-2.8.2-pyhd3eb1b0_0.conda#211ee00320b08a1ac9fea6677649f6c9 +https://repo.anaconda.com/pkgs/main/linux-64/python-dateutil-2.9.0post0-py39h06a4308_0.conda#bb2c65e53e610ec258e03771cd79ad17 https://repo.anaconda.com/pkgs/main/linux-64/sip-6.7.12-py39h6a678d5_0.conda#6988a3e12fcacfedcac523c1e4c3167c https://repo.anaconda.com/pkgs/main/linux-64/gst-plugins-base-1.14.1-h6a678d5_1.conda#afd9cbe949d670d24cc0a007aaec1fe1 https://repo.anaconda.com/pkgs/main/linux-64/matplotlib-base-3.3.4-py39h62a2d02_0.conda#dbab28222c740af8e21a3e5e2882c178 diff --git a/build_tools/azure/pymin_conda_forge_mkl_win-64_conda.lock b/build_tools/azure/pymin_conda_forge_mkl_win-64_conda.lock index 88bc53dd94e1a..d95e56378ae56 100644 --- a/build_tools/azure/pymin_conda_forge_mkl_win-64_conda.lock +++ b/build_tools/azure/pymin_conda_forge_mkl_win-64_conda.lock @@ -28,7 +28,7 @@ https://conda.anaconda.org/conda-forge/win-64/libsqlite-3.45.3-hcfcfb64_0.conda# https://conda.anaconda.org/conda-forge/win-64/libwebp-base-1.4.0-hcfcfb64_0.conda#abd61d0ab127ec5cd68f62c2969e6f34 https://conda.anaconda.org/conda-forge/win-64/libzlib-1.2.13-hcfcfb64_5.conda#5fdb9c6a113b6b6cb5e517fd972d5f41 https://conda.anaconda.org/conda-forge/win-64/m2w64-gcc-libgfortran-5.3.0-6.tar.bz2#066552ac6b907ec6d72c0ddab29050dc -https://conda.anaconda.org/conda-forge/win-64/ninja-1.12.0-h91493d7_0.conda#e67ab00f4d2c089864c2b8dcccf4dc58 +https://conda.anaconda.org/conda-forge/win-64/ninja-1.12.1-hc790b64_0.conda#a557dde55343e03c68cd7e29e7f87279 https://conda.anaconda.org/conda-forge/win-64/openssl-3.3.0-hcfcfb64_0.conda#a6c544c9f060740c625dbf6d92cf3495 https://conda.anaconda.org/conda-forge/win-64/pthreads-win32-2.9.1-hfa6e2cd_3.tar.bz2#e2da8758d7d51ff6aa78a14dfb9dbed4 https://conda.anaconda.org/conda-forge/win-64/tk-8.6.13-h5226925_1.conda#fc048363eb8f03cd1737600a5d08aafe @@ -55,7 +55,7 @@ https://conda.anaconda.org/conda-forge/win-64/freetype-2.12.1-hdaf720e_2.conda#3 https://conda.anaconda.org/conda-forge/noarch/iniconfig-2.0.0-pyhd8ed1ab_0.conda#f800d2da156d08e289b14e87e43c1ae5 https://conda.anaconda.org/conda-forge/win-64/kiwisolver-1.4.5-py39h1f6ef14_1.conda#4fc5bd0a7b535252028c647cc27d6c87 https://conda.anaconda.org/conda-forge/win-64/libclang13-18.1.5-default_hf64faad_0.conda#8a662434c6be1f40e2d5d2506d05a41d -https://conda.anaconda.org/conda-forge/win-64/libglib-2.80.0-h39d0aa6_6.conda#cd5c6efbe213c089f78575c98ab9a0ed 
+https://conda.anaconda.org/conda-forge/win-64/libglib-2.80.2-h0df6a38_0.conda#ef9ae80bb2a15aee7a30180c057678ea https://conda.anaconda.org/conda-forge/win-64/libhwloc-2.10.0-default_h2fffb23_1000.conda#ee944f0d41d9e2048f9d7492c1623ca3 https://conda.anaconda.org/conda-forge/win-64/libintl-devel-0.22.5-h5728263_2.conda#a2ad82fae23975e4ccbfab2847d31d48 https://conda.anaconda.org/conda-forge/win-64/libtiff-4.6.0-hddb2be6_3.conda#6d1828c9039929e2f185c5fa9d133018 @@ -78,7 +78,7 @@ https://conda.anaconda.org/conda-forge/win-64/xorg-libxdmcp-1.1.3-hcd874cb_0.tar https://conda.anaconda.org/conda-forge/noarch/zipp-3.17.0-pyhd8ed1ab_0.conda#2e4d6bc0b14e10f895fc6791a7d9b26a https://conda.anaconda.org/conda-forge/win-64/brotli-1.1.0-hcfcfb64_1.conda#f47f6db2528e38321fb00ae31674c133 https://conda.anaconda.org/conda-forge/win-64/coverage-7.5.1-py39ha55e580_0.conda#e8f43ea91f0f17d92d5575cfab41a42f -https://conda.anaconda.org/conda-forge/win-64/glib-tools-2.80.0-h0a98069_6.conda#40d452e4012c00f644b1dd6319fcdbcf +https://conda.anaconda.org/conda-forge/win-64/glib-tools-2.80.2-h2f9d560_0.conda#42fc785d9db7ab051a206fbf882ecf2e https://conda.anaconda.org/conda-forge/noarch/importlib_resources-6.4.0-pyhd8ed1ab_0.conda#c5d3907ad8bd7bf557521a1833cf7e6d https://conda.anaconda.org/conda-forge/noarch/joblib-1.4.2-pyhd8ed1ab_0.conda#25df261d4523d9f9783bcdb7208d872f https://conda.anaconda.org/conda-forge/win-64/lcms2-2.16-h67d730c_0.conda#d3592435917b62a8becff3a60db674f6 @@ -92,7 +92,7 @@ https://conda.anaconda.org/conda-forge/noarch/python-dateutil-2.9.0-pyhd8ed1ab_0 https://conda.anaconda.org/conda-forge/win-64/sip-6.7.12-py39h99910a6_0.conda#0cc5774390ada632ed7975203057c91c https://conda.anaconda.org/conda-forge/win-64/tbb-2021.12.0-h91493d7_0.conda#21745fdd12f01b41178596143cbecffd https://conda.anaconda.org/conda-forge/win-64/fonttools-4.51.0-py39ha55989b_0.conda#5d19302bab29e347116b743e793aa7d6 -https://conda.anaconda.org/conda-forge/win-64/glib-2.80.0-h39d0aa6_6.conda#a4036d0bc6f499ebe9fef7b887f3ca0f +https://conda.anaconda.org/conda-forge/win-64/glib-2.80.2-h0df6a38_0.conda#a728ca6f04c33ecb0f39eeda5fbd0e23 https://conda.anaconda.org/conda-forge/noarch/importlib-resources-6.4.0-pyhd8ed1ab_0.conda#dcbadab7a68738a028e195ab68ab2d2e https://conda.anaconda.org/conda-forge/noarch/meson-python-0.16.0-pyh0c530f3_0.conda#e16f0dbf502da873be9f9adb0dc52547 https://conda.anaconda.org/conda-forge/win-64/mkl-2024.1.0-h66d3029_692.conda#b43ec7ed045323edeff31e348eea8652 diff --git a/build_tools/azure/pymin_conda_forge_openblas_ubuntu_2204_linux-64_conda.lock b/build_tools/azure/pymin_conda_forge_openblas_ubuntu_2204_linux-64_conda.lock index abdaeaee81527..231cd528ecd0e 100644 --- a/build_tools/azure/pymin_conda_forge_openblas_ubuntu_2204_linux-64_conda.lock +++ b/build_tools/azure/pymin_conda_forge_openblas_ubuntu_2204_linux-64_conda.lock @@ -9,13 +9,13 @@ https://conda.anaconda.org/conda-forge/noarch/font-ttf-inconsolata-3.000-h77eed3 https://conda.anaconda.org/conda-forge/noarch/font-ttf-source-code-pro-2.038-h77eed37_0.tar.bz2#4d59c254e01d9cde7957100457e2d5fb https://conda.anaconda.org/conda-forge/noarch/font-ttf-ubuntu-0.83-h77eed37_2.conda#cbbe59391138ea5ad3658c76912e147f https://conda.anaconda.org/conda-forge/linux-64/ld_impl_linux-64-2.40-h55db66e_0.conda#10569984e7db886e4f1abc2b47ad79a1 -https://conda.anaconda.org/conda-forge/linux-64/libstdcxx-ng-13.2.0-hc0a3c3a_6.conda#2f18345bbc433c8a1ed887d7161e86a6 +https://conda.anaconda.org/conda-forge/linux-64/libstdcxx-ng-13.2.0-hc0a3c3a_7.conda#53ebd4c833fa01cb2c6353e99f905406 
https://conda.anaconda.org/conda-forge/linux-64/python_abi-3.9-4_cp39.conda#bfe4b3259a8ac6cdf0037752904da6a7 https://conda.anaconda.org/conda-forge/noarch/tzdata-2024a-h0c530f3_0.conda#161081fc7cec0bfda0d86d7cb595f8d8 https://conda.anaconda.org/conda-forge/noarch/fonts-conda-forge-1-0.tar.bz2#f766549260d6815b0c52253f1fb1bb29 https://conda.anaconda.org/conda-forge/noarch/fonts-conda-ecosystem-1-0.tar.bz2#fee5683a3f04bd15cbd8318b096a27ab https://conda.anaconda.org/conda-forge/linux-64/_openmp_mutex-4.5-2_kmp_llvm.tar.bz2#562b26ba2e19059551a811e72ab7f793 -https://conda.anaconda.org/conda-forge/linux-64/libgcc-ng-13.2.0-h77fa898_6.conda#4398809ac84d0b8c28beebaaa83277f5 +https://conda.anaconda.org/conda-forge/linux-64/libgcc-ng-13.2.0-h77fa898_7.conda#72ec1b1b04c4d15d4204ece1ecea5978 https://conda.anaconda.org/conda-forge/linux-64/alsa-lib-1.2.11-hd590300_1.conda#0bb492cca54017ea314b809b1ee3a176 https://conda.anaconda.org/conda-forge/linux-64/attr-2.5.1-h166bdaf_1.tar.bz2#d9c69a24ad678ffce24c6543a0176b00 https://conda.anaconda.org/conda-forge/linux-64/bzip2-1.0.8-hd590300_5.conda#69b8b6202a07720f448be700e300ccf4 @@ -31,7 +31,7 @@ https://conda.anaconda.org/conda-forge/linux-64/libdeflate-1.20-hd590300_0.conda https://conda.anaconda.org/conda-forge/linux-64/libexpat-2.6.2-h59595ed_0.conda#e7ba12deb7020dd080c6c70e7b6f6a3d https://conda.anaconda.org/conda-forge/linux-64/libffi-3.4.2-h7f98852_5.tar.bz2#d645c6d2ac96843a2bfaccd2d62b3ac3 https://conda.anaconda.org/conda-forge/linux-64/libgettextpo-0.22.5-h59595ed_2.conda#172bcc51059416e7ce99e7b528cede83 -https://conda.anaconda.org/conda-forge/linux-64/libgfortran5-13.2.0-h43f5ff8_6.conda#e54a5ddc67e673f9105cf2a2e9c070b0 +https://conda.anaconda.org/conda-forge/linux-64/libgfortran5-13.2.0-hca663fb_7.conda#c0bd771f09a326fdcd95a60b617795bf https://conda.anaconda.org/conda-forge/linux-64/libiconv-1.17-hd590300_2.conda#d66573916ffcf376178462f1b61c941e https://conda.anaconda.org/conda-forge/linux-64/libjpeg-turbo-3.0.0-hd590300_1.conda#ea25936bb4080d843790b586850f82b8 https://conda.anaconda.org/conda-forge/linux-64/libnsl-2.0.1-hd590300_0.conda#30fd6e37fe21f86f4bd26d6ee73eeec7 @@ -43,8 +43,8 @@ https://conda.anaconda.org/conda-forge/linux-64/libxcrypt-4.4.36-hd590300_1.cond https://conda.anaconda.org/conda-forge/linux-64/libzlib-1.2.13-hd590300_5.conda#f36c115f1ee199da648e0597ec2047ad https://conda.anaconda.org/conda-forge/linux-64/lz4-c-1.9.4-hcb278e6_0.conda#318b08df404f9c9be5712aaa5a6f0bb0 https://conda.anaconda.org/conda-forge/linux-64/mpg123-1.32.6-h59595ed_0.conda#9160cdeb523a1b20cf8d2a0bf821f45d -https://conda.anaconda.org/conda-forge/linux-64/ncurses-6.4.20240210-h59595ed_0.conda#97da8860a0da5413c7c98a3b3838a645 -https://conda.anaconda.org/conda-forge/linux-64/ninja-1.12.0-h00ab1b0_0.conda#b048701d52e7cbb5f59ddd4d3b17bbf5 +https://conda.anaconda.org/conda-forge/linux-64/ncurses-6.5-h59595ed_0.conda#fcea371545eda051b6deafb24889fc69 +https://conda.anaconda.org/conda-forge/linux-64/ninja-1.12.1-h297d8ca_0.conda#3aa1c7e292afeff25a0091ddd7c69b72 https://conda.anaconda.org/conda-forge/linux-64/nspr-4.35-h27087fc_0.conda#da0ec11a6454ae19bff5b02ed881a2b1 https://conda.anaconda.org/conda-forge/linux-64/openssl-3.3.0-hd590300_0.conda#c0f3abb4a16477208bbd43a39bd56f18 https://conda.anaconda.org/conda-forge/linux-64/pixman-0.43.2-h59595ed_0.conda#71004cbf7924e19c02746ccde9fd7123 @@ -66,7 +66,7 @@ https://conda.anaconda.org/conda-forge/linux-64/libcap-2.69-h0f662aa_0.conda#25c 
https://conda.anaconda.org/conda-forge/linux-64/libedit-3.1.20191231-he28a2e2_2.tar.bz2#4d331e44109e3f0e19b4cb8f9b82f3e1 https://conda.anaconda.org/conda-forge/linux-64/libevent-2.1.12-hf998b51_1.conda#a1cfcc585f0c42bf8d5546bb1dfb668d https://conda.anaconda.org/conda-forge/linux-64/libgettextpo-devel-0.22.5-h59595ed_2.conda#b63d9b6da3653179a278077f0de20014 -https://conda.anaconda.org/conda-forge/linux-64/libgfortran-ng-13.2.0-h69a702a_6.conda#3666a850342f8f3be88f9a93d948d027 +https://conda.anaconda.org/conda-forge/linux-64/libgfortran-ng-13.2.0-h69a702a_7.conda#1b84f26d9f4f6026e179e7805d5a15cd https://conda.anaconda.org/conda-forge/linux-64/libpng-1.6.43-h2797004_0.conda#009981dd9cfcaa4dbfa25ffaed86bcae https://conda.anaconda.org/conda-forge/linux-64/libsqlite-3.45.3-h2797004_0.conda#b3316cbe90249da4f8e84cd66e1cc55b https://conda.anaconda.org/conda-forge/linux-64/libvorbis-1.3.7-h9c3ff4c_0.tar.bz2#309dec04b70a3cc0f1e84a4013683bc0 @@ -83,15 +83,15 @@ https://conda.anaconda.org/conda-forge/linux-64/brotli-bin-1.1.0-hd590300_1.cond https://conda.anaconda.org/conda-forge/linux-64/freetype-2.12.1-h267a509_2.conda#9ae35c3d96db2c94ce0cef86efdfa2cb https://conda.anaconda.org/conda-forge/linux-64/gettext-0.22.5-h59595ed_2.conda#219ba82e95d7614cf7140d2a4afc0926 https://conda.anaconda.org/conda-forge/linux-64/krb5-1.21.2-h659d440_0.conda#cd95826dbd331ed1be26bdf401432844 -https://conda.anaconda.org/conda-forge/linux-64/libglib-2.80.0-hf2295e7_6.conda#9342e7c44c38bea649490f72d92c382d +https://conda.anaconda.org/conda-forge/linux-64/libglib-2.80.2-hf974151_0.conda#72724f6a78ecb15559396966226d5838 https://conda.anaconda.org/conda-forge/linux-64/libhiredis-1.0.2-h2cc385e_0.tar.bz2#b34907d3a81a3cd8095ee83d174c074a https://conda.anaconda.org/conda-forge/linux-64/libllvm15-15.0.7-hb3ce162_4.conda#8a35df3cbc0c8b12cc8af9473ae75eef https://conda.anaconda.org/conda-forge/linux-64/libllvm18-18.1.5-hb77312f_0.conda#efd221d3668077ca067a206269418dec https://conda.anaconda.org/conda-forge/linux-64/libopenblas-0.3.27-pthreads_h413a1c8_0.conda#a356024784da6dfd4683dc5ecf45b155 https://conda.anaconda.org/conda-forge/linux-64/libtiff-4.6.0-h1dd3fc0_3.conda#66f03896ffbe1a110ffda05c7a856504 -https://conda.anaconda.org/conda-forge/linux-64/llvm-openmp-18.1.4-ha31de31_0.conda#48b9991e66abc186a7ad7975e97bd4d0 +https://conda.anaconda.org/conda-forge/linux-64/llvm-openmp-18.1.5-ha31de31_0.conda#b923cdb6e567ada84f991ffcc5848afb https://conda.anaconda.org/conda-forge/linux-64/mysql-libs-8.3.0-hca2cd23_4.conda#1b50eebe2a738a3146c154d2eceaa8b6 -https://conda.anaconda.org/conda-forge/linux-64/nss-3.98-h1d7d5a4_0.conda#54b56c2fdf973656b748e0378900ec13 +https://conda.anaconda.org/conda-forge/linux-64/nss-3.100-hca3bf56_0.conda#949c4a82290ee58b3c970cef4bcfd4ad https://conda.anaconda.org/conda-forge/linux-64/python-3.9.19-h0755675_0_cpython.conda#d9ee3647fbd9e8595b8df759b2bbefb8 https://conda.anaconda.org/conda-forge/linux-64/xcb-util-0.4.0-hd590300_1.conda#9bfac7ccd94d54fd21a0501296d60424 https://conda.anaconda.org/conda-forge/linux-64/xcb-util-keysyms-0.4.0-h8ee46fc_1.conda#632413adcd8bc16b515cab87a2932913 @@ -112,7 +112,7 @@ https://conda.anaconda.org/conda-forge/noarch/docutils-0.21.2-pyhd8ed1ab_0.conda https://conda.anaconda.org/conda-forge/noarch/exceptiongroup-1.2.0-pyhd8ed1ab_2.conda#8d652ea2ee8eaee02ed8dc820bc794aa https://conda.anaconda.org/conda-forge/noarch/execnet-2.1.1-pyhd8ed1ab_0.conda#15dda3cdbf330abfe9f555d22f66db46 
https://conda.anaconda.org/conda-forge/linux-64/fontconfig-2.14.2-h14ed4e7_0.conda#0f69b688f52ff6da70bccb7ff7001d1d -https://conda.anaconda.org/conda-forge/linux-64/glib-tools-2.80.0-hde27a5a_6.conda#a9d23c02485c5cf055f9ac90eb9c9c63 +https://conda.anaconda.org/conda-forge/linux-64/glib-tools-2.80.2-hb6ce0ca_0.conda#a965aeaf060289528a3fbe09326edae2 https://conda.anaconda.org/conda-forge/noarch/idna-3.7-pyhd8ed1ab_0.conda#c0cc1420498b17414d8617d0b9f506ca https://conda.anaconda.org/conda-forge/noarch/imagesize-1.4.1-pyhd8ed1ab_0.tar.bz2#7de5386c8fea29e76b303f37dde4c352 https://conda.anaconda.org/conda-forge/noarch/iniconfig-2.0.0-pyhd8ed1ab_0.conda#f800d2da156d08e289b14e87e43c1ae5 @@ -124,7 +124,7 @@ https://conda.anaconda.org/conda-forge/linux-64/libclang13-18.1.5-default_h5d682 https://conda.anaconda.org/conda-forge/linux-64/libcups-2.3.3-h4637d8d_4.conda#d4529f4dff3057982a7617c7ac58fde3 https://conda.anaconda.org/conda-forge/linux-64/libflac-1.4.3-h59595ed_0.conda#ee48bf17cc83a00f59ca1494d5646869 https://conda.anaconda.org/conda-forge/linux-64/libgpg-error-1.49-h4f305b6_0.conda#dfcfd72c7a430d3616763ecfbefe4ca9 -https://conda.anaconda.org/conda-forge/linux-64/libpq-16.2-h33b98f1_1.conda#9e49ec2a61d02623b379dc332eb6889d +https://conda.anaconda.org/conda-forge/linux-64/libpq-16.3-ha72fbe1_0.conda#bac737ae28b79cfbafd515258d97d29e https://conda.anaconda.org/conda-forge/linux-64/markupsafe-2.1.5-py39hd1e30aa_0.conda#9a9a22eb1f83c44953319ee3b027769f https://conda.anaconda.org/conda-forge/noarch/munkres-1.1.4-pyh9f0ad1d_0.tar.bz2#2ba8498c1018c1e9c61eb99b973dfe19 https://conda.anaconda.org/conda-forge/linux-64/openblas-0.3.27-pthreads_h7a3da1a_0.conda#4b422ebe8fc6a5320d0c1c22e5a46032 @@ -156,10 +156,10 @@ https://conda.anaconda.org/conda-forge/noarch/zipp-3.17.0-pyhd8ed1ab_0.conda#2e4 https://conda.anaconda.org/conda-forge/noarch/babel-2.14.0-pyhd8ed1ab_0.conda#9669586875baeced8fc30c0826c3270e https://conda.anaconda.org/conda-forge/linux-64/cairo-1.18.0-h3faef2a_0.conda#f907bb958910dc404647326ca80c263e https://conda.anaconda.org/conda-forge/linux-64/fonttools-4.51.0-py39hd1e30aa_0.conda#79f5dd8778873faa54e8f7b2729fe8a6 -https://conda.anaconda.org/conda-forge/linux-64/glib-2.80.0-hf2295e7_6.conda#a1e026a82a562b443845db5614ca568a +https://conda.anaconda.org/conda-forge/linux-64/glib-2.80.2-hf974151_0.conda#d427988dc3dbd0a4c136f52db356cc6a https://conda.anaconda.org/conda-forge/noarch/importlib-metadata-7.1.0-pyha770c72_0.conda#0896606848b2dc5cebdf111b6543aa04 https://conda.anaconda.org/conda-forge/noarch/importlib_resources-6.4.0-pyhd8ed1ab_0.conda#c5d3907ad8bd7bf557521a1833cf7e6d -https://conda.anaconda.org/conda-forge/noarch/jinja2-3.1.3-pyhd8ed1ab_0.conda#e7d8df6509ba635247ff9aea31134262 +https://conda.anaconda.org/conda-forge/noarch/jinja2-3.1.4-pyhd8ed1ab_0.conda#7b86ecb7d3557821c649b3c31e3eb9f2 https://conda.anaconda.org/conda-forge/noarch/joblib-1.4.2-pyhd8ed1ab_0.conda#25df261d4523d9f9783bcdb7208d872f https://conda.anaconda.org/conda-forge/linux-64/libcblas-3.9.0-22_linux64_openblas.conda#4b31699e0ec5de64d5896e580389c9a1 https://conda.anaconda.org/conda-forge/linux-64/libgcrypt-1.10.3-hd590300_0.conda#32d16ad533c59bb0a3c5ffaf16110829 diff --git a/build_tools/circle/doc_linux-64_conda.lock b/build_tools/circle/doc_linux-64_conda.lock index 7ca02c7cdb159..e2584c2d27333 100644 --- a/build_tools/circle/doc_linux-64_conda.lock +++ b/build_tools/circle/doc_linux-64_conda.lock @@ -10,22 +10,22 @@ https://conda.anaconda.org/conda-forge/noarch/font-ttf-source-code-pro-2.038-h77 
https://conda.anaconda.org/conda-forge/noarch/font-ttf-ubuntu-0.83-h77eed37_2.conda#cbbe59391138ea5ad3658c76912e147f https://conda.anaconda.org/conda-forge/noarch/kernel-headers_linux-64-2.6.32-he073ed8_17.conda#d731b543793afc0433c4fd593e693fce https://conda.anaconda.org/conda-forge/linux-64/ld_impl_linux-64-2.40-h55db66e_0.conda#10569984e7db886e4f1abc2b47ad79a1 -https://conda.anaconda.org/conda-forge/noarch/libgcc-devel_linux-64-12.3.0-h0223996_106.conda#304f58c690e7ba23b67a4b5c8e99a062 -https://conda.anaconda.org/conda-forge/noarch/libstdcxx-devel_linux-64-12.3.0-h0223996_106.conda#dfb9aac785d6b25b46be7850d974a72e -https://conda.anaconda.org/conda-forge/linux-64/libstdcxx-ng-13.2.0-hc0a3c3a_6.conda#2f18345bbc433c8a1ed887d7161e86a6 +https://conda.anaconda.org/conda-forge/noarch/libgcc-devel_linux-64-12.3.0-h0223996_107.conda#851e9651c9e4cd5dc19f80398eba9a1c +https://conda.anaconda.org/conda-forge/noarch/libstdcxx-devel_linux-64-12.3.0-h0223996_107.conda#167a1f5d77d8f3c2a638f7eb418429f1 +https://conda.anaconda.org/conda-forge/linux-64/libstdcxx-ng-13.2.0-hc0a3c3a_7.conda#53ebd4c833fa01cb2c6353e99f905406 https://conda.anaconda.org/conda-forge/linux-64/python_abi-3.9-4_cp39.conda#bfe4b3259a8ac6cdf0037752904da6a7 https://conda.anaconda.org/conda-forge/noarch/tzdata-2024a-h0c530f3_0.conda#161081fc7cec0bfda0d86d7cb595f8d8 https://conda.anaconda.org/conda-forge/noarch/fonts-conda-forge-1-0.tar.bz2#f766549260d6815b0c52253f1fb1bb29 -https://conda.anaconda.org/conda-forge/linux-64/libgomp-13.2.0-h77fa898_6.conda#e733e0573651a1f0639fa8ce066a286e +https://conda.anaconda.org/conda-forge/linux-64/libgomp-13.2.0-h77fa898_7.conda#abf3fec87c2563697defa759dec3d639 https://conda.anaconda.org/conda-forge/noarch/sysroot_linux-64-2.12-he073ed8_17.conda#595db67e32b276298ff3d94d07d47fbf https://conda.anaconda.org/conda-forge/linux-64/binutils_impl_linux-64-2.40-ha885e6a_0.conda#800a4c872b5bc06fa83888d112fe6c4f https://conda.anaconda.org/conda-forge/noarch/fonts-conda-ecosystem-1-0.tar.bz2#fee5683a3f04bd15cbd8318b096a27ab https://conda.anaconda.org/conda-forge/linux-64/binutils-2.40-h4852527_0.conda#a05c7712be80622934f7011e0a1d43fc https://conda.anaconda.org/conda-forge/linux-64/binutils_linux-64-2.40-hdade7a5_3.conda#2d9a60578bc28469d9aeef9aea5520c3 https://conda.anaconda.org/conda-forge/linux-64/_openmp_mutex-4.5-2_kmp_llvm.tar.bz2#562b26ba2e19059551a811e72ab7f793 -https://conda.anaconda.org/conda-forge/linux-64/libgcc-ng-13.2.0-h77fa898_6.conda#4398809ac84d0b8c28beebaaa83277f5 +https://conda.anaconda.org/conda-forge/linux-64/libgcc-ng-13.2.0-h77fa898_7.conda#72ec1b1b04c4d15d4204ece1ecea5978 https://conda.anaconda.org/conda-forge/linux-64/alsa-lib-1.2.11-hd590300_1.conda#0bb492cca54017ea314b809b1ee3a176 -https://conda.anaconda.org/conda-forge/linux-64/aom-3.8.2-h59595ed_0.conda#625e1fed28a5139aed71b3a76117ef84 +https://conda.anaconda.org/conda-forge/linux-64/aom-3.9.0-hac33072_0.conda#93a3bf248e5bc729807db198a9c89f07 https://conda.anaconda.org/conda-forge/linux-64/attr-2.5.1-h166bdaf_1.tar.bz2#d9c69a24ad678ffce24c6543a0176b00 https://conda.anaconda.org/conda-forge/linux-64/bzip2-1.0.8-hd590300_5.conda#69b8b6202a07720f448be700e300ccf4 https://conda.anaconda.org/conda-forge/linux-64/charls-2.4.2-h59595ed_0.conda#4336bd67920dd504cd8c6761d6a99645 @@ -45,14 +45,14 @@ https://conda.anaconda.org/conda-forge/linux-64/libdeflate-1.20-hd590300_0.conda https://conda.anaconda.org/conda-forge/linux-64/libexpat-2.6.2-h59595ed_0.conda#e7ba12deb7020dd080c6c70e7b6f6a3d 
https://conda.anaconda.org/conda-forge/linux-64/libffi-3.4.2-h7f98852_5.tar.bz2#d645c6d2ac96843a2bfaccd2d62b3ac3 https://conda.anaconda.org/conda-forge/linux-64/libgettextpo-0.22.5-h59595ed_2.conda#172bcc51059416e7ce99e7b528cede83 -https://conda.anaconda.org/conda-forge/linux-64/libgfortran5-13.2.0-h43f5ff8_6.conda#e54a5ddc67e673f9105cf2a2e9c070b0 +https://conda.anaconda.org/conda-forge/linux-64/libgfortran5-13.2.0-hca663fb_7.conda#c0bd771f09a326fdcd95a60b617795bf https://conda.anaconda.org/conda-forge/linux-64/libhwy-1.1.0-h00ab1b0_0.conda#88928158ccfe797eac29ef5e03f7d23d https://conda.anaconda.org/conda-forge/linux-64/libiconv-1.17-hd590300_2.conda#d66573916ffcf376178462f1b61c941e https://conda.anaconda.org/conda-forge/linux-64/libjpeg-turbo-3.0.0-hd590300_1.conda#ea25936bb4080d843790b586850f82b8 https://conda.anaconda.org/conda-forge/linux-64/libnsl-2.0.1-hd590300_0.conda#30fd6e37fe21f86f4bd26d6ee73eeec7 https://conda.anaconda.org/conda-forge/linux-64/libogg-1.3.4-h7f98852_1.tar.bz2#6e8cc2173440d77708196c5b93771680 https://conda.anaconda.org/conda-forge/linux-64/libopus-1.3.1-h7f98852_1.tar.bz2#15345e56d527b330e1cacbdf58676e8f -https://conda.anaconda.org/conda-forge/linux-64/libsanitizer-12.3.0-hb8811af_6.conda#a9a764e2e753ed038da59343560d8a66 +https://conda.anaconda.org/conda-forge/linux-64/libsanitizer-12.3.0-hb8811af_7.conda#ee573415c47ce17f65101d0b3fba396d https://conda.anaconda.org/conda-forge/linux-64/libuuid-2.38.1-h0b41bf4_0.conda#40b61aab5c7ba9ff276c41cfffe6b80b https://conda.anaconda.org/conda-forge/linux-64/libwebp-base-1.4.0-hd590300_0.conda#b26e8aa824079e1be0294e7152ca4559 https://conda.anaconda.org/conda-forge/linux-64/libxcrypt-4.4.36-hd590300_1.conda#5aa797f8787fe7a17d1b0821485b5adc @@ -60,8 +60,8 @@ https://conda.anaconda.org/conda-forge/linux-64/libzlib-1.2.13-hd590300_5.conda# https://conda.anaconda.org/conda-forge/linux-64/libzopfli-1.0.3-h9c3ff4c_0.tar.bz2#c66fe2d123249af7651ebde8984c51c2 https://conda.anaconda.org/conda-forge/linux-64/lz4-c-1.9.4-hcb278e6_0.conda#318b08df404f9c9be5712aaa5a6f0bb0 https://conda.anaconda.org/conda-forge/linux-64/mpg123-1.32.6-h59595ed_0.conda#9160cdeb523a1b20cf8d2a0bf821f45d -https://conda.anaconda.org/conda-forge/linux-64/ncurses-6.4.20240210-h59595ed_0.conda#97da8860a0da5413c7c98a3b3838a645 -https://conda.anaconda.org/conda-forge/linux-64/ninja-1.12.0-h00ab1b0_0.conda#b048701d52e7cbb5f59ddd4d3b17bbf5 +https://conda.anaconda.org/conda-forge/linux-64/ncurses-6.5-h59595ed_0.conda#fcea371545eda051b6deafb24889fc69 +https://conda.anaconda.org/conda-forge/linux-64/ninja-1.12.1-h297d8ca_0.conda#3aa1c7e292afeff25a0091ddd7c69b72 https://conda.anaconda.org/conda-forge/linux-64/nspr-4.35-h27087fc_0.conda#da0ec11a6454ae19bff5b02ed881a2b1 https://conda.anaconda.org/conda-forge/linux-64/openssl-3.3.0-hd590300_0.conda#c0f3abb4a16477208bbd43a39bd56f18 https://conda.anaconda.org/conda-forge/linux-64/pixman-0.43.2-h59595ed_0.conda#71004cbf7924e19c02746ccde9fd7123 @@ -81,16 +81,16 @@ https://conda.anaconda.org/conda-forge/linux-64/xz-5.2.6-h166bdaf_0.tar.bz2#2161 https://conda.anaconda.org/conda-forge/linux-64/zfp-1.0.1-h59595ed_0.conda#fd486bffbf0d6841cf1456a8f2e3a995 https://conda.anaconda.org/conda-forge/linux-64/zlib-ng-2.0.7-h0b41bf4_0.conda#49e8329110001f04923fe7e864990b0c https://conda.anaconda.org/conda-forge/linux-64/expat-2.6.2-h59595ed_0.conda#53fb86322bdb89496d7579fe3f02fd61 -https://conda.anaconda.org/conda-forge/linux-64/gcc_impl_linux-64-12.3.0-h58ffeeb_6.conda#53914a98926ce169b83726cb78366a6c 
+https://conda.anaconda.org/conda-forge/linux-64/gcc_impl_linux-64-12.3.0-h58ffeeb_7.conda#95f78565a09852783d3e90e0389cfa5f https://conda.anaconda.org/conda-forge/linux-64/libasprintf-devel-0.22.5-h661eb56_2.conda#02e41ab5834dcdcc8590cf29d9526f50 -https://conda.anaconda.org/conda-forge/linux-64/libavif16-1.0.4-hd9d6309_2.conda#a8c65cba5f77abc1f2e85ab9a0e614aa +https://conda.anaconda.org/conda-forge/linux-64/libavif16-1.0.4-hfa3d5b6_3.conda#3518d00de414c39b46d87dcc1ff65661 https://conda.anaconda.org/conda-forge/linux-64/libbrotlidec-1.1.0-hd590300_1.conda#f07002e225d7a60a694d42a7bf5ff53f https://conda.anaconda.org/conda-forge/linux-64/libbrotlienc-1.1.0-hd590300_1.conda#5fc11c6020d421960607d821310fcd4d https://conda.anaconda.org/conda-forge/linux-64/libcap-2.69-h0f662aa_0.conda#25cb5999faa414e5ccb2c1388f62d3d5 https://conda.anaconda.org/conda-forge/linux-64/libedit-3.1.20191231-he28a2e2_2.tar.bz2#4d331e44109e3f0e19b4cb8f9b82f3e1 https://conda.anaconda.org/conda-forge/linux-64/libevent-2.1.12-hf998b51_1.conda#a1cfcc585f0c42bf8d5546bb1dfb668d https://conda.anaconda.org/conda-forge/linux-64/libgettextpo-devel-0.22.5-h59595ed_2.conda#b63d9b6da3653179a278077f0de20014 -https://conda.anaconda.org/conda-forge/linux-64/libgfortran-ng-13.2.0-h69a702a_6.conda#3666a850342f8f3be88f9a93d948d027 +https://conda.anaconda.org/conda-forge/linux-64/libgfortran-ng-13.2.0-h69a702a_7.conda#1b84f26d9f4f6026e179e7805d5a15cd https://conda.anaconda.org/conda-forge/linux-64/libpng-1.6.43-h2797004_0.conda#009981dd9cfcaa4dbfa25ffaed86bcae https://conda.anaconda.org/conda-forge/linux-64/libsqlite-3.45.3-h2797004_0.conda#b3316cbe90249da4f8e84cd66e1cc55b https://conda.anaconda.org/conda-forge/linux-64/libvorbis-1.3.7-h9c3ff4c_0.tar.bz2#309dec04b70a3cc0f1e84a4013683bc0 @@ -107,21 +107,21 @@ https://conda.anaconda.org/conda-forge/linux-64/blosc-1.21.5-hc2324a3_1.conda#11 https://conda.anaconda.org/conda-forge/linux-64/brotli-bin-1.1.0-hd590300_1.conda#39f910d205726805a958da408ca194ba https://conda.anaconda.org/conda-forge/linux-64/c-blosc2-2.14.4-hb4ffafa_1.conda#84eb54e92644c328e087e1c725773317 https://conda.anaconda.org/conda-forge/linux-64/freetype-2.12.1-h267a509_2.conda#9ae35c3d96db2c94ce0cef86efdfa2cb -https://conda.anaconda.org/conda-forge/linux-64/gcc-12.3.0-h915e2ae_6.conda#ec683e084ea08ef94528f15d30fa1e03 +https://conda.anaconda.org/conda-forge/linux-64/gcc-12.3.0-h915e2ae_7.conda#84b1c5cebd0a0443f3d7f90a4be93fc6 https://conda.anaconda.org/conda-forge/linux-64/gcc_linux-64-12.3.0-h6477408_3.conda#7a53f84c45bdf4656ba27b9e9ed68b3d https://conda.anaconda.org/conda-forge/linux-64/gettext-0.22.5-h59595ed_2.conda#219ba82e95d7614cf7140d2a4afc0926 -https://conda.anaconda.org/conda-forge/linux-64/gfortran_impl_linux-64-12.3.0-h1645026_6.conda#664d4e904674f1173752580ffdc24d46 -https://conda.anaconda.org/conda-forge/linux-64/gxx_impl_linux-64-12.3.0-h2a574ab_6.conda#aab48c86452d78a416992deeee901a52 +https://conda.anaconda.org/conda-forge/linux-64/gfortran_impl_linux-64-12.3.0-h1645026_7.conda#2d9d4058c433c9ce2a811c76658c4efd +https://conda.anaconda.org/conda-forge/linux-64/gxx_impl_linux-64-12.3.0-h2a574ab_7.conda#265caa78b979f112fc241cecd0015c91 https://conda.anaconda.org/conda-forge/linux-64/krb5-1.21.2-h659d440_0.conda#cd95826dbd331ed1be26bdf401432844 -https://conda.anaconda.org/conda-forge/linux-64/libglib-2.80.0-hf2295e7_6.conda#9342e7c44c38bea649490f72d92c382d +https://conda.anaconda.org/conda-forge/linux-64/libglib-2.80.2-hf974151_0.conda#72724f6a78ecb15559396966226d5838 
https://conda.anaconda.org/conda-forge/linux-64/libjxl-0.10.2-hcae5a98_0.conda#901db891e1e21afd8524cd636a8c8e3b https://conda.anaconda.org/conda-forge/linux-64/libllvm15-15.0.7-hb3ce162_4.conda#8a35df3cbc0c8b12cc8af9473ae75eef https://conda.anaconda.org/conda-forge/linux-64/libllvm18-18.1.5-hb77312f_0.conda#efd221d3668077ca067a206269418dec https://conda.anaconda.org/conda-forge/linux-64/libopenblas-0.3.27-pthreads_h413a1c8_0.conda#a356024784da6dfd4683dc5ecf45b155 https://conda.anaconda.org/conda-forge/linux-64/libtiff-4.6.0-h1dd3fc0_3.conda#66f03896ffbe1a110ffda05c7a856504 -https://conda.anaconda.org/conda-forge/linux-64/llvm-openmp-18.1.4-ha31de31_0.conda#48b9991e66abc186a7ad7975e97bd4d0 +https://conda.anaconda.org/conda-forge/linux-64/llvm-openmp-18.1.5-ha31de31_0.conda#b923cdb6e567ada84f991ffcc5848afb https://conda.anaconda.org/conda-forge/linux-64/mysql-libs-8.3.0-hca2cd23_4.conda#1b50eebe2a738a3146c154d2eceaa8b6 -https://conda.anaconda.org/conda-forge/linux-64/nss-3.98-h1d7d5a4_0.conda#54b56c2fdf973656b748e0378900ec13 +https://conda.anaconda.org/conda-forge/linux-64/nss-3.100-hca3bf56_0.conda#949c4a82290ee58b3c970cef4bcfd4ad https://conda.anaconda.org/conda-forge/linux-64/python-3.9.19-h0755675_0_cpython.conda#d9ee3647fbd9e8595b8df759b2bbefb8 https://conda.anaconda.org/conda-forge/linux-64/xcb-util-0.4.0-hd590300_1.conda#9bfac7ccd94d54fd21a0501296d60424 https://conda.anaconda.org/conda-forge/linux-64/xcb-util-keysyms-0.4.0-h8ee46fc_1.conda#632413adcd8bc16b515cab87a2932913 @@ -142,10 +142,10 @@ https://conda.anaconda.org/conda-forge/noarch/docutils-0.21.2-pyhd8ed1ab_0.conda https://conda.anaconda.org/conda-forge/noarch/exceptiongroup-1.2.0-pyhd8ed1ab_2.conda#8d652ea2ee8eaee02ed8dc820bc794aa https://conda.anaconda.org/conda-forge/noarch/execnet-2.1.1-pyhd8ed1ab_0.conda#15dda3cdbf330abfe9f555d22f66db46 https://conda.anaconda.org/conda-forge/linux-64/fontconfig-2.14.2-h14ed4e7_0.conda#0f69b688f52ff6da70bccb7ff7001d1d -https://conda.anaconda.org/conda-forge/linux-64/gfortran-12.3.0-h915e2ae_6.conda#84b517f4f53e56256dbd65133aae04ac +https://conda.anaconda.org/conda-forge/linux-64/gfortran-12.3.0-h915e2ae_7.conda#8efa768f7f74085629f3e1090e7f0569 https://conda.anaconda.org/conda-forge/linux-64/gfortran_linux-64-12.3.0-h617cb40_3.conda#3a9e5b8a6f651ff14e74d896d8f04ab6 -https://conda.anaconda.org/conda-forge/linux-64/glib-tools-2.80.0-hde27a5a_6.conda#a9d23c02485c5cf055f9ac90eb9c9c63 -https://conda.anaconda.org/conda-forge/linux-64/gxx-12.3.0-h915e2ae_6.conda#0d977804df65082e17c860600ca2894b +https://conda.anaconda.org/conda-forge/linux-64/glib-tools-2.80.2-hb6ce0ca_0.conda#a965aeaf060289528a3fbe09326edae2 +https://conda.anaconda.org/conda-forge/linux-64/gxx-12.3.0-h915e2ae_7.conda#721c5433122a02bf3a081db10a2e68e2 https://conda.anaconda.org/conda-forge/linux-64/gxx_linux-64-12.3.0-h4a1b8e8_3.conda#9ec22c7c544f4a4f6d660f0a3b0fd15c https://conda.anaconda.org/conda-forge/noarch/idna-3.7-pyhd8ed1ab_0.conda#c0cc1420498b17414d8617d0b9f506ca https://conda.anaconda.org/conda-forge/noarch/imagesize-1.4.1-pyhd8ed1ab_0.tar.bz2#7de5386c8fea29e76b303f37dde4c352 @@ -158,7 +158,7 @@ https://conda.anaconda.org/conda-forge/linux-64/libclang13-18.1.5-default_h5d682 https://conda.anaconda.org/conda-forge/linux-64/libcups-2.3.3-h4637d8d_4.conda#d4529f4dff3057982a7617c7ac58fde3 https://conda.anaconda.org/conda-forge/linux-64/libflac-1.4.3-h59595ed_0.conda#ee48bf17cc83a00f59ca1494d5646869 https://conda.anaconda.org/conda-forge/linux-64/libgpg-error-1.49-h4f305b6_0.conda#dfcfd72c7a430d3616763ecfbefe4ca9 
-https://conda.anaconda.org/conda-forge/linux-64/libpq-16.2-h33b98f1_1.conda#9e49ec2a61d02623b379dc332eb6889d +https://conda.anaconda.org/conda-forge/linux-64/libpq-16.3-ha72fbe1_0.conda#bac737ae28b79cfbafd515258d97d29e https://conda.anaconda.org/conda-forge/linux-64/markupsafe-2.1.5-py39hd1e30aa_0.conda#9a9a22eb1f83c44953319ee3b027769f https://conda.anaconda.org/conda-forge/noarch/munkres-1.1.4-pyh9f0ad1d_0.tar.bz2#2ba8498c1018c1e9c61eb99b973dfe19 https://conda.anaconda.org/conda-forge/noarch/networkx-3.2.1-pyhd8ed1ab_0.conda#425fce3b531bed6ec3c74fab3e5f0a1c @@ -179,7 +179,7 @@ https://conda.anaconda.org/conda-forge/noarch/six-1.16.0-pyh6c4a22f_0.tar.bz2#e5 https://conda.anaconda.org/conda-forge/noarch/snowballstemmer-2.2.0-pyhd8ed1ab_0.tar.bz2#4d22a9315e78c6827f806065957d566e https://conda.anaconda.org/conda-forge/noarch/sphinxcontrib-jsmath-1.0.1-pyhd8ed1ab_0.conda#da1d979339e2714c30a8e806a33ec087 https://conda.anaconda.org/conda-forge/noarch/tabulate-0.9.0-pyhd8ed1ab_1.tar.bz2#4759805cce2d914c38472f70bf4d8bcb -https://conda.anaconda.org/conda-forge/noarch/tenacity-8.2.3-pyhd8ed1ab_0.conda#1482e77f87c6a702a7e05ef22c9b197b +https://conda.anaconda.org/conda-forge/noarch/tenacity-8.3.0-pyhd8ed1ab_0.conda#216cfa8e32bcd1447646768351df6059 https://conda.anaconda.org/conda-forge/noarch/threadpoolctl-3.5.0-pyhc1e730c_0.conda#df68d78237980a159bd7149f33c0e8fd https://conda.anaconda.org/conda-forge/noarch/toml-0.10.2-pyhd8ed1ab_0.tar.bz2#f832c45a477c78bebd107098db465095 https://conda.anaconda.org/conda-forge/noarch/tomli-2.0.1-pyhd8ed1ab_0.tar.bz2#5844808ffab9ebdb694585b50ba02a96 @@ -198,10 +198,10 @@ https://conda.anaconda.org/conda-forge/linux-64/cairo-1.18.0-h3faef2a_0.conda#f9 https://conda.anaconda.org/conda-forge/linux-64/cxx-compiler-1.7.0-h00ab1b0_1.conda#28de2e073db9ca9b72858bee9fb6f571 https://conda.anaconda.org/conda-forge/linux-64/fonttools-4.51.0-py39hd1e30aa_0.conda#79f5dd8778873faa54e8f7b2729fe8a6 https://conda.anaconda.org/conda-forge/linux-64/fortran-compiler-1.7.0-heb67821_1.conda#cf4b0e7c4c78bb0662aed9b27c414a3c -https://conda.anaconda.org/conda-forge/linux-64/glib-2.80.0-hf2295e7_6.conda#a1e026a82a562b443845db5614ca568a +https://conda.anaconda.org/conda-forge/linux-64/glib-2.80.2-hf974151_0.conda#d427988dc3dbd0a4c136f52db356cc6a https://conda.anaconda.org/conda-forge/noarch/importlib-metadata-7.1.0-pyha770c72_0.conda#0896606848b2dc5cebdf111b6543aa04 https://conda.anaconda.org/conda-forge/noarch/importlib_resources-6.4.0-pyhd8ed1ab_0.conda#c5d3907ad8bd7bf557521a1833cf7e6d -https://conda.anaconda.org/conda-forge/noarch/jinja2-3.1.3-pyhd8ed1ab_0.conda#e7d8df6509ba635247ff9aea31134262 +https://conda.anaconda.org/conda-forge/noarch/jinja2-3.1.4-pyhd8ed1ab_0.conda#7b86ecb7d3557821c649b3c31e3eb9f2 https://conda.anaconda.org/conda-forge/noarch/joblib-1.4.2-pyhd8ed1ab_0.conda#25df261d4523d9f9783bcdb7208d872f https://conda.anaconda.org/conda-forge/linux-64/libcblas-3.9.0-22_linux64_openblas.conda#4b31699e0ec5de64d5896e580389c9a1 https://conda.anaconda.org/conda-forge/linux-64/libgcrypt-1.10.3-hd590300_0.conda#32d16ad533c59bb0a3c5ffaf16110829 @@ -237,7 +237,7 @@ https://conda.anaconda.org/conda-forge/linux-64/imagecodecs-2024.1.1-py39ha98d97 https://conda.anaconda.org/conda-forge/noarch/imageio-2.34.1-pyh4b66e23_0.conda#bcf6a6f4c6889ca083e8d33afbafb8d5 https://conda.anaconda.org/conda-forge/linux-64/pandas-2.2.2-py39hddac248_0.conda#259c4e76e6bda8888aefc098ae1ba749 https://conda.anaconda.org/conda-forge/noarch/patsy-0.5.6-pyhd8ed1ab_0.conda#a5b55d1cb110cdcedc748b5c3e16e687 
-https://conda.anaconda.org/conda-forge/linux-64/polars-0.20.23-py39ha963410_0.conda#4871f09d653e979d598d2d4cd5fa868d +https://conda.anaconda.org/conda-forge/linux-64/polars-0.20.25-py39ha963410_0.conda#d14227f0e141af743374d845fd4f5ccd https://conda.anaconda.org/conda-forge/noarch/pooch-1.8.1-pyhd8ed1ab_0.conda#d15917f33140f8d2ac9ca44db7ec8a25 https://conda.anaconda.org/conda-forge/linux-64/pulseaudio-client-17.0-hb77b528_0.conda#07f45f1be1c25345faddb8db0de8039b https://conda.anaconda.org/conda-forge/linux-64/pywavelets-1.4.1-py39h44dd56e_1.conda#d037c20e3da2e85f03ebd20ad480c359 @@ -247,7 +247,7 @@ https://conda.anaconda.org/conda-forge/linux-64/matplotlib-base-3.8.4-py39he9076 https://conda.anaconda.org/conda-forge/linux-64/pyamg-5.1.0-py39hda80f44_0.conda#f225666c47726329201b604060f1436c https://conda.anaconda.org/conda-forge/linux-64/qt-main-5.15.8-hc9dc06e_21.conda#b325046180590c868ce0dbf267b82eb8 https://conda.anaconda.org/conda-forge/linux-64/statsmodels-0.14.1-py39h44dd56e_0.conda#dc565186b972bd87e49b9c35390ddd8c -https://conda.anaconda.org/conda-forge/noarch/tifffile-2024.5.3-pyhd8ed1ab_0.conda#0658fd78a808b6f3508917ba66b20f75 +https://conda.anaconda.org/conda-forge/noarch/tifffile-2024.5.10-pyhd8ed1ab_0.conda#125438a8b679e4c08ee8f244177216c9 https://conda.anaconda.org/conda-forge/linux-64/pyqt-5.15.9-py39h52134e7_5.conda#e1f148e57d071b09187719df86f513c1 https://conda.anaconda.org/conda-forge/linux-64/scikit-image-0.22.0-py39hddac248_2.conda#8d502a4d2cbe5a45ff35ca8af8cbec0a https://conda.anaconda.org/conda-forge/noarch/seaborn-base-0.13.2-pyhd8ed1ab_2.conda#b713b116feaf98acdba93ad4d7f90ca1 @@ -282,7 +282,7 @@ https://conda.anaconda.org/conda-forge/noarch/sphinxext-opengraph-0.9.1-pyhd8ed1 # pip python-json-logger @ https://files.pythonhosted.org/packages/35/a6/145655273568ee78a581e734cf35beb9e33a370b29c5d3c8fee3744de29f/python_json_logger-2.0.7-py3-none-any.whl#sha256=f380b826a991ebbe3de4d897aeec42760035ac760345e57b812938dc8b35e2bd # pip pyyaml @ https://files.pythonhosted.org/packages/7d/39/472f2554a0f1e825bd7c5afc11c817cd7a2f3657460f7159f691fbb37c51/PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c # pip rfc3986-validator @ https://files.pythonhosted.org/packages/9e/51/17023c0f8f1869d8806b979a2bffa3f861f26a3f1a66b094288323fba52f/rfc3986_validator-0.1.1-py2.py3-none-any.whl#sha256=2f235c432ef459970b4306369336b9d5dbdda31b510ca1e327636e01f528bfa9 -# pip rpds-py @ https://files.pythonhosted.org/packages/fd/ea/92231b62681961812e9fbd8ef9be7137856784406bf6a384976bb7b46472/rpds_py-0.18.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=ddc2f4dfd396c7bfa18e6ce371cba60e4cf9d2e5cdb71376aa2da264605b60b9 +# pip rpds-py @ https://files.pythonhosted.org/packages/97/b1/12238bd8cdf3cef71e85188af133399bfde1bddf319007361cc869d6f6a7/rpds_py-0.18.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=e4c39ad2f512b4041343ea3c7894339e4ca7839ac38ca83d68a832fc8b3748ab # pip send2trash @ https://files.pythonhosted.org/packages/40/b0/4562db6223154aa4e22f939003cb92514c79f3d4dccca3444253fd17f902/Send2Trash-1.8.3-py3-none-any.whl#sha256=0c31227e0bd08961c7665474a3d1ef7193929fedda4233843689baa056be46c9 # pip sniffio @ https://files.pythonhosted.org/packages/e9/44/75a9c9421471a6c4805dbf2356f7c181a29c1879239abab1ea2cc8f38b40/sniffio-1.3.1-py3-none-any.whl#sha256=2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2 # pip soupsieve @ 
https://files.pythonhosted.org/packages/4c/f3/038b302fdfbe3be7da016777069f26ceefe11a681055ea1f7817546508e3/soupsieve-2.5-py3-none-any.whl#sha256=eaa337ff55a1579b6549dc679565eac1e3d000563bcb1c8ab0d0fefbc0c2cdc7 diff --git a/build_tools/circle/doc_min_dependencies_linux-64_conda.lock b/build_tools/circle/doc_min_dependencies_linux-64_conda.lock index dd291f8882efb..e08a14c235079 100644 --- a/build_tools/circle/doc_min_dependencies_linux-64_conda.lock +++ b/build_tools/circle/doc_min_dependencies_linux-64_conda.lock @@ -10,21 +10,21 @@ https://conda.anaconda.org/conda-forge/noarch/font-ttf-source-code-pro-2.038-h77 https://conda.anaconda.org/conda-forge/noarch/font-ttf-ubuntu-0.83-h77eed37_2.conda#cbbe59391138ea5ad3658c76912e147f https://conda.anaconda.org/conda-forge/noarch/kernel-headers_linux-64-2.6.32-he073ed8_17.conda#d731b543793afc0433c4fd593e693fce https://conda.anaconda.org/conda-forge/linux-64/ld_impl_linux-64-2.40-h55db66e_0.conda#10569984e7db886e4f1abc2b47ad79a1 -https://conda.anaconda.org/conda-forge/noarch/libgcc-devel_linux-64-12.3.0-h0223996_106.conda#304f58c690e7ba23b67a4b5c8e99a062 -https://conda.anaconda.org/conda-forge/noarch/libstdcxx-devel_linux-64-12.3.0-h0223996_106.conda#dfb9aac785d6b25b46be7850d974a72e -https://conda.anaconda.org/conda-forge/linux-64/libstdcxx-ng-13.2.0-hc0a3c3a_6.conda#2f18345bbc433c8a1ed887d7161e86a6 +https://conda.anaconda.org/conda-forge/noarch/libgcc-devel_linux-64-12.3.0-h0223996_107.conda#851e9651c9e4cd5dc19f80398eba9a1c +https://conda.anaconda.org/conda-forge/noarch/libstdcxx-devel_linux-64-12.3.0-h0223996_107.conda#167a1f5d77d8f3c2a638f7eb418429f1 +https://conda.anaconda.org/conda-forge/linux-64/libstdcxx-ng-13.2.0-hc0a3c3a_7.conda#53ebd4c833fa01cb2c6353e99f905406 https://conda.anaconda.org/conda-forge/linux-64/mkl-include-2024.1.0-ha957f24_692.conda#b35af3f0f25498f4e9fc4c471910346c https://conda.anaconda.org/conda-forge/linux-64/python_abi-3.9-4_cp39.conda#bfe4b3259a8ac6cdf0037752904da6a7 https://conda.anaconda.org/conda-forge/noarch/tzdata-2024a-h0c530f3_0.conda#161081fc7cec0bfda0d86d7cb595f8d8 https://conda.anaconda.org/conda-forge/noarch/fonts-conda-forge-1-0.tar.bz2#f766549260d6815b0c52253f1fb1bb29 -https://conda.anaconda.org/conda-forge/linux-64/libgomp-13.2.0-h77fa898_6.conda#e733e0573651a1f0639fa8ce066a286e +https://conda.anaconda.org/conda-forge/linux-64/libgomp-13.2.0-h77fa898_7.conda#abf3fec87c2563697defa759dec3d639 https://conda.anaconda.org/conda-forge/noarch/sysroot_linux-64-2.12-he073ed8_17.conda#595db67e32b276298ff3d94d07d47fbf https://conda.anaconda.org/conda-forge/linux-64/binutils_impl_linux-64-2.40-ha885e6a_0.conda#800a4c872b5bc06fa83888d112fe6c4f https://conda.anaconda.org/conda-forge/noarch/fonts-conda-ecosystem-1-0.tar.bz2#fee5683a3f04bd15cbd8318b096a27ab https://conda.anaconda.org/conda-forge/linux-64/binutils-2.40-h4852527_0.conda#a05c7712be80622934f7011e0a1d43fc https://conda.anaconda.org/conda-forge/linux-64/binutils_linux-64-2.40-hdade7a5_3.conda#2d9a60578bc28469d9aeef9aea5520c3 https://conda.anaconda.org/conda-forge/linux-64/_openmp_mutex-4.5-2_kmp_llvm.tar.bz2#562b26ba2e19059551a811e72ab7f793 -https://conda.anaconda.org/conda-forge/linux-64/libgcc-ng-13.2.0-h77fa898_6.conda#4398809ac84d0b8c28beebaaa83277f5 +https://conda.anaconda.org/conda-forge/linux-64/libgcc-ng-13.2.0-h77fa898_7.conda#72ec1b1b04c4d15d4204ece1ecea5978 https://conda.anaconda.org/conda-forge/linux-64/alsa-lib-1.2.11-hd590300_1.conda#0bb492cca54017ea314b809b1ee3a176 
https://conda.anaconda.org/conda-forge/linux-64/attr-2.5.1-h166bdaf_1.tar.bz2#d9c69a24ad678ffce24c6543a0176b00 https://conda.anaconda.org/conda-forge/linux-64/bzip2-1.0.8-hd590300_5.conda#69b8b6202a07720f448be700e300ccf4 @@ -39,21 +39,21 @@ https://conda.anaconda.org/conda-forge/linux-64/libdeflate-1.20-hd590300_0.conda https://conda.anaconda.org/conda-forge/linux-64/libexpat-2.6.2-h59595ed_0.conda#e7ba12deb7020dd080c6c70e7b6f6a3d https://conda.anaconda.org/conda-forge/linux-64/libffi-3.4.2-h7f98852_5.tar.bz2#d645c6d2ac96843a2bfaccd2d62b3ac3 https://conda.anaconda.org/conda-forge/linux-64/libgettextpo-0.22.5-h59595ed_2.conda#172bcc51059416e7ce99e7b528cede83 -https://conda.anaconda.org/conda-forge/linux-64/libgfortran5-13.2.0-h43f5ff8_6.conda#e54a5ddc67e673f9105cf2a2e9c070b0 +https://conda.anaconda.org/conda-forge/linux-64/libgfortran5-13.2.0-hca663fb_7.conda#c0bd771f09a326fdcd95a60b617795bf https://conda.anaconda.org/conda-forge/linux-64/libiconv-1.17-hd590300_2.conda#d66573916ffcf376178462f1b61c941e https://conda.anaconda.org/conda-forge/linux-64/libjpeg-turbo-3.0.0-hd590300_1.conda#ea25936bb4080d843790b586850f82b8 https://conda.anaconda.org/conda-forge/linux-64/libnsl-2.0.1-hd590300_0.conda#30fd6e37fe21f86f4bd26d6ee73eeec7 https://conda.anaconda.org/conda-forge/linux-64/libogg-1.3.4-h7f98852_1.tar.bz2#6e8cc2173440d77708196c5b93771680 https://conda.anaconda.org/conda-forge/linux-64/libopus-1.3.1-h7f98852_1.tar.bz2#15345e56d527b330e1cacbdf58676e8f -https://conda.anaconda.org/conda-forge/linux-64/libsanitizer-12.3.0-hb8811af_6.conda#a9a764e2e753ed038da59343560d8a66 +https://conda.anaconda.org/conda-forge/linux-64/libsanitizer-12.3.0-hb8811af_7.conda#ee573415c47ce17f65101d0b3fba396d https://conda.anaconda.org/conda-forge/linux-64/libuuid-2.38.1-h0b41bf4_0.conda#40b61aab5c7ba9ff276c41cfffe6b80b https://conda.anaconda.org/conda-forge/linux-64/libwebp-base-1.4.0-hd590300_0.conda#b26e8aa824079e1be0294e7152ca4559 https://conda.anaconda.org/conda-forge/linux-64/libxcrypt-4.4.36-hd590300_1.conda#5aa797f8787fe7a17d1b0821485b5adc https://conda.anaconda.org/conda-forge/linux-64/libzlib-1.2.13-hd590300_5.conda#f36c115f1ee199da648e0597ec2047ad https://conda.anaconda.org/conda-forge/linux-64/lz4-c-1.9.4-hcb278e6_0.conda#318b08df404f9c9be5712aaa5a6f0bb0 https://conda.anaconda.org/conda-forge/linux-64/mpg123-1.32.6-h59595ed_0.conda#9160cdeb523a1b20cf8d2a0bf821f45d -https://conda.anaconda.org/conda-forge/linux-64/ncurses-6.4.20240210-h59595ed_0.conda#97da8860a0da5413c7c98a3b3838a645 -https://conda.anaconda.org/conda-forge/linux-64/ninja-1.12.0-h00ab1b0_0.conda#b048701d52e7cbb5f59ddd4d3b17bbf5 +https://conda.anaconda.org/conda-forge/linux-64/ncurses-6.5-h59595ed_0.conda#fcea371545eda051b6deafb24889fc69 +https://conda.anaconda.org/conda-forge/linux-64/ninja-1.12.1-h297d8ca_0.conda#3aa1c7e292afeff25a0091ddd7c69b72 https://conda.anaconda.org/conda-forge/linux-64/nspr-4.35-h27087fc_0.conda#da0ec11a6454ae19bff5b02ed881a2b1 https://conda.anaconda.org/conda-forge/linux-64/openssl-3.3.0-hd590300_0.conda#c0f3abb4a16477208bbd43a39bd56f18 https://conda.anaconda.org/conda-forge/linux-64/pixman-0.43.2-h59595ed_0.conda#71004cbf7924e19c02746ccde9fd7123 @@ -69,13 +69,13 @@ https://conda.anaconda.org/conda-forge/linux-64/xorg-xproto-7.0.31-h7f98852_1007 https://conda.anaconda.org/conda-forge/linux-64/xz-5.2.6-h166bdaf_0.tar.bz2#2161070d867d1b1204ea749c8eec4ef0 https://conda.anaconda.org/conda-forge/linux-64/yaml-0.2.5-h7f98852_2.tar.bz2#4cb3ad778ec2d5a7acbdf254eb1c42ae 
https://conda.anaconda.org/conda-forge/linux-64/expat-2.6.2-h59595ed_0.conda#53fb86322bdb89496d7579fe3f02fd61 -https://conda.anaconda.org/conda-forge/linux-64/gcc_impl_linux-64-12.3.0-h58ffeeb_6.conda#53914a98926ce169b83726cb78366a6c +https://conda.anaconda.org/conda-forge/linux-64/gcc_impl_linux-64-12.3.0-h58ffeeb_7.conda#95f78565a09852783d3e90e0389cfa5f https://conda.anaconda.org/conda-forge/linux-64/libasprintf-devel-0.22.5-h661eb56_2.conda#02e41ab5834dcdcc8590cf29d9526f50 https://conda.anaconda.org/conda-forge/linux-64/libcap-2.69-h0f662aa_0.conda#25cb5999faa414e5ccb2c1388f62d3d5 https://conda.anaconda.org/conda-forge/linux-64/libedit-3.1.20191231-he28a2e2_2.tar.bz2#4d331e44109e3f0e19b4cb8f9b82f3e1 https://conda.anaconda.org/conda-forge/linux-64/libevent-2.1.12-hf998b51_1.conda#a1cfcc585f0c42bf8d5546bb1dfb668d https://conda.anaconda.org/conda-forge/linux-64/libgettextpo-devel-0.22.5-h59595ed_2.conda#b63d9b6da3653179a278077f0de20014 -https://conda.anaconda.org/conda-forge/linux-64/libgfortran-ng-13.2.0-h69a702a_6.conda#3666a850342f8f3be88f9a93d948d027 +https://conda.anaconda.org/conda-forge/linux-64/libgfortran-ng-13.2.0-h69a702a_7.conda#1b84f26d9f4f6026e179e7805d5a15cd https://conda.anaconda.org/conda-forge/linux-64/libpng-1.6.43-h2797004_0.conda#009981dd9cfcaa4dbfa25ffaed86bcae https://conda.anaconda.org/conda-forge/linux-64/libsqlite-3.45.3-h2797004_0.conda#b3316cbe90249da4f8e84cd66e1cc55b https://conda.anaconda.org/conda-forge/linux-64/libvorbis-1.3.7-h9c3ff4c_0.tar.bz2#309dec04b70a3cc0f1e84a4013683bc0 @@ -89,20 +89,20 @@ https://conda.anaconda.org/conda-forge/linux-64/xorg-libsm-1.2.4-h7391055_0.cond https://conda.anaconda.org/conda-forge/linux-64/zlib-1.2.13-hd590300_5.conda#68c34ec6149623be41a1933ab996a209 https://conda.anaconda.org/conda-forge/linux-64/zstd-1.5.6-ha6fb4c9_0.conda#4d056880988120e29d75bfff282e0f45 https://conda.anaconda.org/conda-forge/linux-64/freetype-2.12.1-h267a509_2.conda#9ae35c3d96db2c94ce0cef86efdfa2cb -https://conda.anaconda.org/conda-forge/linux-64/gcc-12.3.0-h915e2ae_6.conda#ec683e084ea08ef94528f15d30fa1e03 +https://conda.anaconda.org/conda-forge/linux-64/gcc-12.3.0-h915e2ae_7.conda#84b1c5cebd0a0443f3d7f90a4be93fc6 https://conda.anaconda.org/conda-forge/linux-64/gcc_linux-64-12.3.0-h6477408_3.conda#7a53f84c45bdf4656ba27b9e9ed68b3d https://conda.anaconda.org/conda-forge/linux-64/gettext-0.22.5-h59595ed_2.conda#219ba82e95d7614cf7140d2a4afc0926 -https://conda.anaconda.org/conda-forge/linux-64/gfortran_impl_linux-64-12.3.0-h1645026_6.conda#664d4e904674f1173752580ffdc24d46 -https://conda.anaconda.org/conda-forge/linux-64/gxx_impl_linux-64-12.3.0-h2a574ab_6.conda#aab48c86452d78a416992deeee901a52 +https://conda.anaconda.org/conda-forge/linux-64/gfortran_impl_linux-64-12.3.0-h1645026_7.conda#2d9d4058c433c9ce2a811c76658c4efd +https://conda.anaconda.org/conda-forge/linux-64/gxx_impl_linux-64-12.3.0-h2a574ab_7.conda#265caa78b979f112fc241cecd0015c91 https://conda.anaconda.org/conda-forge/linux-64/krb5-1.21.2-h659d440_0.conda#cd95826dbd331ed1be26bdf401432844 -https://conda.anaconda.org/conda-forge/linux-64/libglib-2.80.0-hf2295e7_6.conda#9342e7c44c38bea649490f72d92c382d +https://conda.anaconda.org/conda-forge/linux-64/libglib-2.80.2-hf974151_0.conda#72724f6a78ecb15559396966226d5838 https://conda.anaconda.org/conda-forge/linux-64/libhwloc-2.10.0-default_h2fb2949_1000.conda#7e3726e647a619c6ce5939014dfde86d https://conda.anaconda.org/conda-forge/linux-64/libllvm15-15.0.7-hb3ce162_4.conda#8a35df3cbc0c8b12cc8af9473ae75eef 
https://conda.anaconda.org/conda-forge/linux-64/libllvm18-18.1.5-hb77312f_0.conda#efd221d3668077ca067a206269418dec https://conda.anaconda.org/conda-forge/linux-64/libtiff-4.6.0-h1dd3fc0_3.conda#66f03896ffbe1a110ffda05c7a856504 -https://conda.anaconda.org/conda-forge/linux-64/llvm-openmp-18.1.4-ha31de31_0.conda#48b9991e66abc186a7ad7975e97bd4d0 +https://conda.anaconda.org/conda-forge/linux-64/llvm-openmp-18.1.5-ha31de31_0.conda#b923cdb6e567ada84f991ffcc5848afb https://conda.anaconda.org/conda-forge/linux-64/mysql-libs-8.3.0-hca2cd23_4.conda#1b50eebe2a738a3146c154d2eceaa8b6 -https://conda.anaconda.org/conda-forge/linux-64/nss-3.98-h1d7d5a4_0.conda#54b56c2fdf973656b748e0378900ec13 +https://conda.anaconda.org/conda-forge/linux-64/nss-3.100-hca3bf56_0.conda#949c4a82290ee58b3c970cef4bcfd4ad https://conda.anaconda.org/conda-forge/linux-64/python-3.9.19-h0755675_0_cpython.conda#d9ee3647fbd9e8595b8df759b2bbefb8 https://conda.anaconda.org/conda-forge/linux-64/xcb-util-0.4.0-hd590300_1.conda#9bfac7ccd94d54fd21a0501296d60424 https://conda.anaconda.org/conda-forge/linux-64/xcb-util-keysyms-0.4.0-h8ee46fc_1.conda#632413adcd8bc16b515cab87a2932913 @@ -125,10 +125,10 @@ https://conda.anaconda.org/conda-forge/noarch/exceptiongroup-1.2.0-pyhd8ed1ab_2. https://conda.anaconda.org/conda-forge/noarch/execnet-2.1.1-pyhd8ed1ab_0.conda#15dda3cdbf330abfe9f555d22f66db46 https://conda.anaconda.org/conda-forge/linux-64/fontconfig-2.14.2-h14ed4e7_0.conda#0f69b688f52ff6da70bccb7ff7001d1d https://conda.anaconda.org/conda-forge/noarch/fsspec-2024.3.1-pyhca7485f_0.conda#b7f0662ef2c9d4404f0af9eef5ed2fde -https://conda.anaconda.org/conda-forge/linux-64/gfortran-12.3.0-h915e2ae_6.conda#84b517f4f53e56256dbd65133aae04ac +https://conda.anaconda.org/conda-forge/linux-64/gfortran-12.3.0-h915e2ae_7.conda#8efa768f7f74085629f3e1090e7f0569 https://conda.anaconda.org/conda-forge/linux-64/gfortran_linux-64-12.3.0-h617cb40_3.conda#3a9e5b8a6f651ff14e74d896d8f04ab6 -https://conda.anaconda.org/conda-forge/linux-64/glib-tools-2.80.0-hde27a5a_6.conda#a9d23c02485c5cf055f9ac90eb9c9c63 -https://conda.anaconda.org/conda-forge/linux-64/gxx-12.3.0-h915e2ae_6.conda#0d977804df65082e17c860600ca2894b +https://conda.anaconda.org/conda-forge/linux-64/glib-tools-2.80.2-hb6ce0ca_0.conda#a965aeaf060289528a3fbe09326edae2 +https://conda.anaconda.org/conda-forge/linux-64/gxx-12.3.0-h915e2ae_7.conda#721c5433122a02bf3a081db10a2e68e2 https://conda.anaconda.org/conda-forge/linux-64/gxx_linux-64-12.3.0-h4a1b8e8_3.conda#9ec22c7c544f4a4f6d660f0a3b0fd15c https://conda.anaconda.org/conda-forge/noarch/idna-3.7-pyhd8ed1ab_0.conda#c0cc1420498b17414d8617d0b9f506ca https://conda.anaconda.org/conda-forge/noarch/imagesize-1.4.1-pyhd8ed1ab_0.tar.bz2#7de5386c8fea29e76b303f37dde4c352 @@ -140,7 +140,7 @@ https://conda.anaconda.org/conda-forge/linux-64/libclang13-18.1.5-default_h5d682 https://conda.anaconda.org/conda-forge/linux-64/libcups-2.3.3-h4637d8d_4.conda#d4529f4dff3057982a7617c7ac58fde3 https://conda.anaconda.org/conda-forge/linux-64/libflac-1.4.3-h59595ed_0.conda#ee48bf17cc83a00f59ca1494d5646869 https://conda.anaconda.org/conda-forge/linux-64/libgpg-error-1.49-h4f305b6_0.conda#dfcfd72c7a430d3616763ecfbefe4ca9 -https://conda.anaconda.org/conda-forge/linux-64/libpq-16.2-h33b98f1_1.conda#9e49ec2a61d02623b379dc332eb6889d +https://conda.anaconda.org/conda-forge/linux-64/libpq-16.3-ha72fbe1_0.conda#bac737ae28b79cfbafd515258d97d29e https://conda.anaconda.org/conda-forge/noarch/locket-1.0.0-pyhd8ed1ab_0.tar.bz2#91e27ef3d05cc772ce627e51cff111c4 
https://conda.anaconda.org/conda-forge/linux-64/markupsafe-2.1.5-py39hd1e30aa_0.conda#9a9a22eb1f83c44953319ee3b027769f https://conda.anaconda.org/conda-forge/noarch/networkx-3.2-pyhd8ed1ab_0.conda#cec8cc498664cc00a070676aa89e69a7 @@ -160,7 +160,7 @@ https://conda.anaconda.org/conda-forge/noarch/six-1.16.0-pyh6c4a22f_0.tar.bz2#e5 https://conda.anaconda.org/conda-forge/noarch/snowballstemmer-2.2.0-pyhd8ed1ab_0.tar.bz2#4d22a9315e78c6827f806065957d566e https://conda.anaconda.org/conda-forge/noarch/sphinxcontrib-jsmath-1.0.1-pyhd8ed1ab_0.conda#da1d979339e2714c30a8e806a33ec087 https://conda.anaconda.org/conda-forge/linux-64/tbb-2021.12.0-h00ab1b0_0.conda#f1b776cff1b426e7e7461a8502a3b731 -https://conda.anaconda.org/conda-forge/noarch/tenacity-8.2.3-pyhd8ed1ab_0.conda#1482e77f87c6a702a7e05ef22c9b197b +https://conda.anaconda.org/conda-forge/noarch/tenacity-8.3.0-pyhd8ed1ab_0.conda#216cfa8e32bcd1447646768351df6059 https://conda.anaconda.org/conda-forge/noarch/threadpoolctl-3.5.0-pyhc1e730c_0.conda#df68d78237980a159bd7149f33c0e8fd https://conda.anaconda.org/conda-forge/noarch/toml-0.10.2-pyhd8ed1ab_0.tar.bz2#f832c45a477c78bebd107098db465095 https://conda.anaconda.org/conda-forge/noarch/tomli-2.0.1-pyhd8ed1ab_0.tar.bz2#5844808ffab9ebdb694585b50ba02a96 @@ -178,9 +178,9 @@ https://conda.anaconda.org/conda-forge/linux-64/cairo-1.18.0-h3faef2a_0.conda#f9 https://conda.anaconda.org/conda-forge/linux-64/cxx-compiler-1.7.0-h00ab1b0_1.conda#28de2e073db9ca9b72858bee9fb6f571 https://conda.anaconda.org/conda-forge/linux-64/cytoolz-0.12.3-py39hd1e30aa_0.conda#dc0fb8e157c7caba4c98f1e1f9d2e5f4 https://conda.anaconda.org/conda-forge/linux-64/fortran-compiler-1.7.0-heb67821_1.conda#cf4b0e7c4c78bb0662aed9b27c414a3c -https://conda.anaconda.org/conda-forge/linux-64/glib-2.80.0-hf2295e7_6.conda#a1e026a82a562b443845db5614ca568a +https://conda.anaconda.org/conda-forge/linux-64/glib-2.80.2-hf974151_0.conda#d427988dc3dbd0a4c136f52db356cc6a https://conda.anaconda.org/conda-forge/noarch/importlib-metadata-7.1.0-pyha770c72_0.conda#0896606848b2dc5cebdf111b6543aa04 -https://conda.anaconda.org/conda-forge/noarch/jinja2-3.1.3-pyhd8ed1ab_0.conda#e7d8df6509ba635247ff9aea31134262 +https://conda.anaconda.org/conda-forge/noarch/jinja2-3.1.4-pyhd8ed1ab_0.conda#7b86ecb7d3557821c649b3c31e3eb9f2 https://conda.anaconda.org/conda-forge/noarch/joblib-1.4.2-pyhd8ed1ab_0.conda#25df261d4523d9f9783bcdb7208d872f https://conda.anaconda.org/conda-forge/linux-64/libgcrypt-1.10.3-hd590300_0.conda#32d16ad533c59bb0a3c5ffaf16110829 https://conda.anaconda.org/conda-forge/linux-64/libsndfile-1.2.2-hc60ed4a_1.conda#ef1910918dd895516a769ed36b5b3a4e @@ -188,7 +188,7 @@ https://conda.anaconda.org/conda-forge/linux-64/libxkbcommon-1.7.0-h662e7e4_0.co https://conda.anaconda.org/conda-forge/noarch/memory_profiler-0.61.0-pyhd8ed1ab_0.tar.bz2#8b45f9f2b2f7a98b0ec179c8991a4a9b https://conda.anaconda.org/conda-forge/noarch/meson-1.4.0-pyhd8ed1ab_0.conda#52a0660cfa40b45bf254ecc3374cb2e0 https://conda.anaconda.org/conda-forge/linux-64/mkl-2024.1.0-ha957f24_692.conda#e7f5c5cda17c6f5047db27d44367c19d -https://conda.anaconda.org/conda-forge/noarch/partd-1.4.1-pyhd8ed1ab_0.conda#acf4b7c0bcd5fa3b0e05801c4d2accd6 +https://conda.anaconda.org/conda-forge/noarch/partd-1.4.2-pyhd8ed1ab_0.conda#0badf9c54e24cecfb0ad2f99d680c163 https://conda.anaconda.org/conda-forge/linux-64/pillow-10.3.0-py39h90c7501_0.conda#1e3b6af9592be71ce19f0a6aae05d97b https://conda.anaconda.org/conda-forge/noarch/pip-24.0-pyhd8ed1ab_0.conda#f586ac1e56c8638b64f9c8122a7b8a67 
https://conda.anaconda.org/conda-forge/noarch/plotly-5.14.0-pyhd8ed1ab_0.conda#6a7bcc42ef58dd6cf3da9333ea102433 From e0579ee40c75aa1a0842de60a8b026f53d5ec616 Mon Sep 17 00:00:00 2001 From: Olivier Grisel Date: Mon, 13 May 2024 13:28:17 +0200 Subject: [PATCH 533/554] FIX 1d sparse array validation (#28988) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Co-authored-by: Jérémie du Boisberranger Co-authored-by: Christian Lorentzen --- doc/whats_new/v1.5.rst | 4 ++++ sklearn/preprocessing/tests/test_data.py | 4 ++++ sklearn/utils/tests/test_validation.py | 8 ++++++++ sklearn/utils/validation.py | 7 +++++++ 4 files changed, 23 insertions(+) diff --git a/doc/whats_new/v1.5.rst b/doc/whats_new/v1.5.rst index e50309a330e39..55a5546453f5f 100644 --- a/doc/whats_new/v1.5.rst +++ b/doc/whats_new/v1.5.rst @@ -67,6 +67,10 @@ Changes impacting many modules :class:`pipeline.Pipeline` and :class:`preprocessing.KBinsDiscretizer`. :pr:`28756` by :user:`Will Dean `. +- |Fix| Raise `ValueError` with an informative error message when passing 1D + sparse arrays to methods that expect 2D sparse inputs. + :pr:`28988` by :user:`Olivier Grisel `. + Support for Array API --------------------- diff --git a/sklearn/preprocessing/tests/test_data.py b/sklearn/preprocessing/tests/test_data.py index b7e8e4e40686e..3810e485ae301 100644 --- a/sklearn/preprocessing/tests/test_data.py +++ b/sklearn/preprocessing/tests/test_data.py @@ -595,6 +595,10 @@ def test_standard_scaler_partial_fit_numerical_stability(sparse_container): scaler_incr = StandardScaler(with_mean=False) for chunk in X: + if chunk.ndim == 1: + # Sparse arrays can be 1D (in scipy 1.14 and later) while old + # sparse matrix instances are always 2D. + chunk = chunk.reshape(1, -1) scaler_incr = scaler_incr.partial_fit(chunk) # Regardless of magnitude, they must not differ more than of 6 digits diff --git a/sklearn/utils/tests/test_validation.py b/sklearn/utils/tests/test_validation.py index 4b4eed2522102..92fff950e875e 100644 --- a/sklearn/utils/tests/test_validation.py +++ b/sklearn/utils/tests/test_validation.py @@ -361,6 +361,14 @@ def test_check_array(): with pytest.raises(ValueError, match="Expected 2D array, got scalar array instead"): check_array(10, ensure_2d=True) + # ensure_2d=True with 1d sparse array + if hasattr(sp, "csr_array"): + sparse_row = next(iter(sp.csr_array(X))) + if sparse_row.ndim == 1: + # In scipy 1.14 and later, sparse row is 1D while it was 2D before. + with pytest.raises(ValueError, match="Expected 2D input, got"): + check_array(sparse_row, accept_sparse=True, ensure_2d=True) + # don't allow ndim > 3 X_ndim = np.arange(8).reshape(2, 2, 2) with pytest.raises(ValueError): diff --git a/sklearn/utils/validation.py b/sklearn/utils/validation.py index 5fac2ae6ae6c2..cdda749ec70a2 100644 --- a/sklearn/utils/validation.py +++ b/sklearn/utils/validation.py @@ -973,6 +973,13 @@ def is_sparse(dtype): estimator_name=estimator_name, input_name=input_name, ) + if ensure_2d and array.ndim < 2: + raise ValueError( + f"Expected 2D input, got input with shape {array.shape}.\n" + "Reshape your data either using array.reshape(-1, 1) if " + "your data has a single feature or array.reshape(1, -1) " + "if it contains a single sample." + ) else: # If np.array(..) gives ComplexWarning, then we convert the warning # to an error. 
This is needed because specifying a non complex From c201a0f1891746d28d27598838dfa16185365dec Mon Sep 17 00:00:00 2001 From: scikit-learn-bot Date: Mon, 13 May 2024 15:19:29 +0200 Subject: [PATCH 534/554] :lock: :robot: CI Update lock files for scipy-dev CI build(s) :lock: :robot: (#29004) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Co-authored-by: Jérémie du Boisberranger --- .../azure/pylatest_pip_scipy_dev_linux-64_conda.lock | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/build_tools/azure/pylatest_pip_scipy_dev_linux-64_conda.lock b/build_tools/azure/pylatest_pip_scipy_dev_linux-64_conda.lock index 8324d1edb36b7..c1a50c7c8c140 100644 --- a/build_tools/azure/pylatest_pip_scipy_dev_linux-64_conda.lock +++ b/build_tools/azure/pylatest_pip_scipy_dev_linux-64_conda.lock @@ -20,12 +20,12 @@ https://repo.anaconda.com/pkgs/main/linux-64/xz-5.4.6-h5eee18b_1.conda#1562802f8 https://repo.anaconda.com/pkgs/main/linux-64/zlib-1.2.13-h5eee18b_1.conda#92e42d8310108b0a440fb2e60b2b2a25 https://repo.anaconda.com/pkgs/main/linux-64/ccache-3.7.9-hfe4627d_0.conda#bef6fc681c273bb7bd0c67d1a591365e https://repo.anaconda.com/pkgs/main/linux-64/readline-8.2-h5eee18b_0.conda#be42180685cce6e6b0329201d9f48efb -https://repo.anaconda.com/pkgs/main/linux-64/tk-8.6.12-h1ccaba5_0.conda#fa10ff4aa631fa4aa090a6234d7770b9 +https://repo.anaconda.com/pkgs/main/linux-64/tk-8.6.14-h39e8969_0.conda#78dbc5e3c69143ebc037fc5d5b22e597 https://repo.anaconda.com/pkgs/main/linux-64/sqlite-3.45.3-h5eee18b_0.conda#acf93d6aceb74d6110e20b44cc45939e -https://repo.anaconda.com/pkgs/main/linux-64/python-3.12.3-h996f2a0_0.conda#77af2bd351a8311d1e780bcfa7819bb8 -https://repo.anaconda.com/pkgs/main/linux-64/setuptools-68.2.2-py312h06a4308_0.conda#83ba634cde4f30d9e0b88e4ac9716ca4 +https://repo.anaconda.com/pkgs/main/linux-64/python-3.12.3-h996f2a0_1.conda#0e22ed7e6df024e4f7467e75c8575301 +https://repo.anaconda.com/pkgs/main/linux-64/setuptools-69.5.1-py312h06a4308_0.conda#ce85d9a864a73e0b12d31a97733c9fca https://repo.anaconda.com/pkgs/main/linux-64/wheel-0.43.0-py312h06a4308_0.conda#18d5f3b68a175c72576876db4afc9e9e -https://repo.anaconda.com/pkgs/main/linux-64/pip-23.3.1-py312h06a4308_0.conda#e1d44bca4a257e84af33503233491107 +https://repo.anaconda.com/pkgs/main/linux-64/pip-24.0-py312h06a4308_0.conda#6d9697bb8b9f3212be10b3b8e01a12b9 # pip alabaster @ https://files.pythonhosted.org/packages/32/34/d4e1c02d3bee589efb5dfa17f88ea08bdb3e3eac12bc475462aec52ed223/alabaster-0.7.16-py3-none-any.whl#sha256=b46733c07dce03ae4e150330b975c75737fa60f0a7c591b6c8bf4928a28e2c92 # pip babel @ https://files.pythonhosted.org/packages/27/45/377f7e32a5c93d94cd56542349b34efab5ca3f9e2fd5a68c5e93169aa32d/Babel-2.15.0-py3-none-any.whl#sha256=08706bdad8d0a3413266ab61bd6c34d0c28d6e1e7badf40a2cebe67644e2e1fb # pip certifi @ https://files.pythonhosted.org/packages/ba/06/a07f096c664aeb9f01624f858c3add0a4e913d6c96257acb4fce61e7de14/certifi-2024.2.2-py3-none-any.whl#sha256=dc383c07b76109f368f6106eee2b593b04a011ea4d55f652c6ca24a754d1cdd1 From e81412776957629b2e5f058fe0bf6cf14f8dc41b Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?J=C3=A9r=C3=A9mie=20du=20Boisberranger?= Date: Mon, 13 May 2024 16:34:37 +0200 Subject: [PATCH 535/554] CI Fix wheel builder windows (#29006) --- .github/workflows/wheels.yml | 2 -- build_tools/github/repair_windows_wheels.sh | 1 + 2 files changed, 1 insertion(+), 2 deletions(-) diff --git a/.github/workflows/wheels.yml b/.github/workflows/wheels.yml index d30f85ff3d1e6..8bd7ffc17beca 
100644 --- a/.github/workflows/wheels.yml +++ b/.github/workflows/wheels.yml @@ -53,8 +53,6 @@ jobs: matrix: include: # Window 64 bit - # Note: windows-2019 is needed for older Python versions: - # https://github.com/scikit-learn/scikit-learn/issues/22530 - os: windows-latest python: 39 platform_id: win_amd64 diff --git a/build_tools/github/repair_windows_wheels.sh b/build_tools/github/repair_windows_wheels.sh index cdd0c0c79d8c4..8f51a34d4039b 100755 --- a/build_tools/github/repair_windows_wheels.sh +++ b/build_tools/github/repair_windows_wheels.sh @@ -8,6 +8,7 @@ DEST_DIR=$2 # By default, the Windows wheels are not repaired. # In this case, we need to vendor VCRUNTIME140.dll +pip install wheel wheel unpack "$WHEEL" WHEEL_DIRNAME=$(ls -d scikit_learn-*) python build_tools/github/vendor.py "$WHEEL_DIRNAME" From c828bb0e5953eec61315bc64f9fe748201a6608d Mon Sep 17 00:00:00 2001 From: Adrin Jalali Date: Mon, 13 May 2024 16:40:26 +0200 Subject: [PATCH 536/554] DOC persistence page revamp (#28889) --- doc/model_persistence.rst | 643 +++++++++++++++++++++----------------- 1 file changed, 349 insertions(+), 294 deletions(-) diff --git a/doc/model_persistence.rst b/doc/model_persistence.rst index afd492d805e58..0c11349a68e22 100644 --- a/doc/model_persistence.rst +++ b/doc/model_persistence.rst @@ -1,294 +1,349 @@ -.. Places parent toc into the sidebar - -:parenttoc: True - -.. _model_persistence: - -================= -Model persistence -================= - -After training a scikit-learn model, it is desirable to have a way to persist -the model for future use without having to retrain. This can be accomplished -using `pickle `_, `joblib -`_, `skops -`_, `ONNX `_, -or `PMML `_. In most cases -`pickle` can be used to persist a trained scikit-learn model. Once all -transitive scikit-learn dependencies have been pinned, the trained model can -then be loaded and executed under conditions similar to those in which it was -originally pinned. The following sections will give you some hints on how to -persist a scikit-learn model and will provide details on what each alternative -can offer. - -Workflow Overview ------------------ - -In this section we present a general workflow on how to persist a -scikit-learn model. We will demonstrate this with a simple example using -Python's built-in persistence module, namely `pickle -`_. - -Storing the model in an artifact -................................ - -Once the model training process in completed, the trained model can be stored -as an artifact with the help of `pickle`. The model can be saved using the -process of serialization, where the Python object hierarchy is converted into -a byte stream. We can persist a trained model in the following manner:: - - >>> from sklearn import svm - >>> from sklearn import datasets - >>> import pickle - >>> clf = svm.SVC() - >>> X, y = datasets.load_iris(return_X_y=True) - >>> clf.fit(X, y) - SVC() - >>> s = pickle.dumps(clf) - -Replicating the training environment in production -.................................................. - -The versions of the dependencies used may differ from training to production. -This may result in unexpected behaviour and errors while using the trained -model. To prevent such situations it is recommended to use the same -dependencies and versions in both the training and production environment. -These transitive dependencies can be pinned with the help of `pip`, `conda`, -`poetry`, `conda-lock`, `pixi`, etc. - -.. 
note:: - - To execute a pickled scikit-learn model in a reproducible environment it is - advisable to pin all transitive scikit-learn dependencies. This prevents - any incompatibility issues that may arise while trying to load the pickled - model. You can read more about persisting models with `pickle` over - :ref:`here `. - -Loading the model artifact -.......................... - -The saved scikit-learn model can be loaded using `pickle` for future use -without having to re-train the entire model from scratch. The saved model -artifact can be unpickled by converting the byte stream into an object -hierarchy. This can be done with the help of `pickle` as follows:: - - >>> clf2 = pickle.loads(s) # doctest:+SKIP - >>> clf2.predict(X[0:1]) # doctest:+SKIP - array([0]) - >>> y[0] # doctest:+SKIP - 0 - -Serving the model artifact -.......................... - -The last step after training a scikit-learn model is serving the model. -Once the trained model is successfully loaded it can be served to manage -different prediction requests. This can involve deploying the model as a -web service using containerization, or other model deployment strategies, -according to the specifications. In the next sections, we will explore -different approaches to persist a trained scikit-learn model. - -.. _persisting_models_with_pickle: - -Persisting models with pickle ------------------------------ - -As demonstrated in the previous section, `pickle` uses serialization and -deserialization to persist scikit-learn models. Instead of using `dumps` and -`loads`, `dump` and `load` can also be used in the following way:: - - >>> from sklearn.tree import DecisionTreeClassifier - >>> from sklearn import datasets - >>> clf = DecisionTreeClassifier() - >>> X, y = datasets.load_iris(return_X_y=True) - >>> clf.fit(X, y) - DecisionTreeClassifier() - >>> from pickle import dump, load - >>> with open('filename.pkl', 'wb') as f: dump(clf, f) # doctest:+SKIP - >>> with open('filename.pkl', 'rb') as f: clf2 = load(f) # doctest:+SKIP - >>> clf2.predict(X[0:1]) # doctest:+SKIP - array([0]) - >>> y[0] - 0 - -For applications that involve writing and loading the serialized object to or -from a file, `dump` and `load` can be used instead of `dumps` and `loads`. When -file operations are not required the pickled representation of the object can -be returned as a bytes object with the help of the `dumps` function. The -reconstituted object hierarchy of the pickled data can then be returned using -the `loads` function. - -Persisting models with joblib ------------------------------ - -In the specific case of scikit-learn, it may be better to use joblib's -replacement of pickle (``dump`` & ``load``), which is more efficient on -objects that carry large numpy arrays internally as is often the case for -fitted scikit-learn estimators, but can only pickle to the disk and not to a -string:: - - >>> from joblib import dump, load - >>> dump(clf, 'filename.joblib') # doctest:+SKIP - -Later you can load back the pickled model (possibly in another Python process) -with:: - - >>> clf = load('filename.joblib') # doctest:+SKIP - -.. note:: - - ``dump`` and ``load`` functions also accept file-like object - instead of filenames. More information on data persistence with Joblib is - available `here - `_. 
- -|details-start| -**InconsistentVersionWarning** -|details-split| - -When an estimator is unpickled with a scikit-learn version that is inconsistent -with the version the estimator was pickled with, a -:class:`~sklearn.exceptions.InconsistentVersionWarning` is raised. This warning -can be caught to obtain the original version the estimator was pickled with:: - - from sklearn.exceptions import InconsistentVersionWarning - warnings.simplefilter("error", InconsistentVersionWarning) - - try: - est = pickle.loads("model_from_prevision_version.pickle") - except InconsistentVersionWarning as w: - print(w.original_sklearn_version) - -|details-end| - -.. _persistence_limitations: - -Security & maintainability limitations for pickle and joblib ------------------------------------------------------------- - -pickle (and joblib by extension), has some issues regarding maintainability -and security. Because of this, - -* Never unpickle untrusted data as it could lead to malicious code being - executed upon loading. -* While models saved using one version of scikit-learn might load in - other versions, this is entirely unsupported and inadvisable. It should - also be kept in mind that operations performed on such data could give - different and unexpected results. - -In order to rebuild a similar model with future versions of scikit-learn, -additional metadata should be saved along the pickled model: - -* The training data, e.g. a reference to an immutable snapshot -* The python source code used to generate the model -* The versions of scikit-learn and its dependencies -* The cross validation score obtained on the training data - -This should make it possible to check that the cross-validation score is in the -same range as before. - -Aside for a few exceptions, pickled models should be portable across -architectures assuming the same versions of dependencies and Python are used. -If you encounter an estimator that is not portable please open an issue on -GitHub. Pickled models are often deployed in production using containers, like -Docker, in order to freeze the environment and dependencies. - -If you want to know more about these issues and explore other possible -serialization methods, please refer to this -`talk by Alex Gaynor -`_. - -Persisting models with a more secure format using skops -------------------------------------------------------- - -`skops `__ provides a more secure -format via the :mod:`skops.io` module. It avoids using :mod:`pickle` and only -loads files which have types and references to functions which are trusted -either by default or by the user. - -|details-start| -**Using skops** -|details-split| - -The API is very similar to ``pickle``, and -you can persist your models as explain in the `docs -`__ using -:func:`skops.io.dump` and :func:`skops.io.dumps`:: - - import skops.io as sio - obj = sio.dumps(clf) - -And you can load them back using :func:`skops.io.load` and -:func:`skops.io.loads`. However, you need to specify the types which are -trusted by you. You can get existing unknown types in a dumped object / file -using :func:`skops.io.get_untrusted_types`, and after checking its contents, -pass it to the load function:: - - unknown_types = sio.get_untrusted_types(data=obj) - clf = sio.loads(obj, trusted=unknown_types) - -If you trust the source of the file / object, you can pass ``trusted=True``:: - - clf = sio.loads(obj, trusted=True) - -Please report issues and feature requests related to this format on the `skops -issue tracker `__. 
- -|details-end| - -Persisting models with interoperable formats --------------------------------------------- - -For reproducibility and quality control needs, when different architectures -and environments should be taken into account, exporting the model in -`Open Neural Network -Exchange `_ format or `Predictive Model Markup Language -(PMML) `_ format -might be a better approach than using `pickle` alone. -These are helpful where you may want to use your model for prediction in a -different environment from where the model was trained. - -ONNX is a binary serialization of the model. It has been developed to improve -the usability of the interoperable representation of data models. -It aims to facilitate the conversion of the data -models between different machine learning frameworks, and to improve their -portability on different computing architectures. More details are available -from the `ONNX tutorial `_. -To convert scikit-learn model to ONNX a specific tool `sklearn-onnx -`_ has been developed. - -PMML is an implementation of the `XML -`_ document standard -defined to represent data models together with the data used to generate them. -Being human and machine readable, -PMML is a good option for model validation on different platforms and -long term archiving. On the other hand, as XML in general, its verbosity does -not help in production when performance is critical. -To convert scikit-learn model to PMML you can use for example `sklearn2pmml -`_ distributed under the Affero GPLv3 -license. - -Summarizing the keypoints -------------------------- - -Based on the different approaches for model persistence, the keypoints for each -approach can be summarized as follows: - -* `pickle`: It is native to Python and any Python object can be serialized and - deserialized using `pickle`, including custom Python classes and objects. - While `pickle` can be used to easily save and load scikit-learn models, - unpickling of untrusted data might lead to security issues. -* `joblib`: Efficient storage and memory mapping techniques make it faster - when working with large machine learning models or large numpy arrays. However, - it may trigger the execution of malicious code while loading untrusted data. -* `skops`: Trained scikit-learn models can be easily shared and put into - production using `skops`. It is more secure compared to alternate approaches - as it allows users to load data from trusted sources. It however, does not - allow for persistence of arbitrary Python code. -* `ONNX`: It provides a uniform format for persisting any machine learning - or deep learning model (other than scikit-learn) and is useful - for model inference. It can however, result in compatibility issues with - different frameworks. -* `PMML`: Platform independent format that can be used to persist models - and reduce the risk of vendor lock-ins. The complexity and verbosity of - this format might make it harder to use for larger models. \ No newline at end of file +.. Places parent toc into the sidebar + +:parenttoc: True + +.. _model_persistence: + +================= +Model persistence +================= + +After training a scikit-learn model, it is desirable to have a way to persist +the model for future use without having to retrain. Based on your use-case, +there are a few different ways to persist a scikit-learn model, and here we +help you decide which one suits you best. In order to make a decision, you need +to answer the following questions: + +1. 
Do you need the Python object after persistence, or do you only need to
+   persist in order to serve the model and get predictions out of it?
+
+If you only need to serve the model and no further investigation of the Python
+object itself is required, then :ref:`ONNX <onnx_persistence>` might be the
+best fit for you. Note that not all models are supported by ONNX.
+
+In case ONNX is not suitable for your use-case, the next question is:
+
+2. Do you absolutely trust the source of the model, or are there any security
+   concerns regarding where the persisted model comes from?
+
+If you have security concerns, then you should consider using :ref:`skops.io
+<skops_persistence>` which gives you back the Python object, but unlike
+`pickle` based persistence solutions, loading the persisted model doesn't
+automatically allow arbitrary code execution. Note that this requires manual
+investigation of the persisted file, which :mod:`skops.io` allows you to do.
+
+The other solutions assume you absolutely trust the source of the file to be
+loaded, as they are all susceptible to arbitrary code execution upon loading
+the persisted file since they all use the pickle protocol under the hood.
+
+3. Do you care about the performance of loading the model, and sharing it
+   between processes where a memory-mapped object on disk is beneficial?
+
+If yes, then you can consider using :ref:`joblib <pickle_persistence>`. If this
+is not a major concern for you, then you can use the built-in :mod:`pickle`
+module.
+
+4. Did you try :mod:`pickle` or :mod:`joblib` and find that the model cannot
+   be persisted? This can happen, for instance, when you have user-defined
+   functions in your model.
+
+If yes, then you can use `cloudpickle`_ which can serialize certain objects
+which cannot be serialized by :mod:`pickle` or :mod:`joblib`.
+
+
+Workflow Overview
+-----------------
+
+In a typical workflow, the first step is to train the model using scikit-learn
+and scikit-learn compatible libraries. Note that support for scikit-learn and
+third party estimators varies across the different persistence methods.
+
+Train and Persist the Model
+...........................
+
+Creating an appropriate model depends on your use-case. As an example, here we
+train a :class:`sklearn.ensemble.HistGradientBoostingClassifier` on the iris
+dataset::
+
+    >>> from sklearn import ensemble
+    >>> from sklearn import datasets
+    >>> clf = ensemble.HistGradientBoostingClassifier()
+    >>> X, y = datasets.load_iris(return_X_y=True)
+    >>> clf.fit(X, y)
+    HistGradientBoostingClassifier()
+
+Once the model is trained, you can persist it using your desired method, and
+then you can load the model in a separate environment and get predictions from
+it given input data. Here there are two major paths depending on how you
+persist and plan to serve the model:
+
+- :ref:`ONNX <onnx_persistence>`: You need an `ONNX` runtime and an environment
+  with appropriate dependencies installed to load the model and use the runtime
+  to get predictions. This environment can be minimal and does not necessarily
+  even require `python` to be installed.
+
+- :mod:`skops.io`, :mod:`pickle`, :mod:`joblib`, `cloudpickle`_: You need a
+  Python environment with the appropriate dependencies installed to load the
+  model and get predictions from it. This environment should have the same
+  **packages** and the same **versions** as the environment where the model was
+  trained.
Note that none of these methods support loading a model trained with
+  a different version of scikit-learn, and possibly different versions of other
+  dependencies such as `numpy` and `scipy`. Another concern is running
+  the persisted model on different hardware, and in most cases you should be
+  able to load your persisted model on different hardware.
+
+
+.. _onnx_persistence:
+
+ONNX
+----
+
+`ONNX`, or `Open Neural Network Exchange `__ format, is best
+suited to use-cases where one needs to persist the model and then use the
+persisted artifact to get predictions without the need to load the Python
+object itself. It is also useful in cases where the serving environment needs
+to be lean and minimal, since the `ONNX` runtime does not require `python`.
+
+`ONNX` is a binary serialization of the model. It has been developed to improve
+the usability of the interoperable representation of data models. It aims to
+facilitate the conversion of the data models between different machine learning
+frameworks, and to improve their portability on different computing
+architectures. More details are available from the `ONNX tutorial
+`__. To convert a scikit-learn model to `ONNX`,
+`sklearn-onnx `__ has been developed. However,
+not all scikit-learn models are supported, and it is limited to the core
+scikit-learn and does not support most third party estimators. One can write a
+custom converter for third party or custom estimators, but the documentation to
+do that is sparse and it might be challenging to do so.
+
+|details-start|
+**Using ONNX**
+|details-split|
+
+To convert the model to `ONNX` format, you need to give the converter some
+information about the input as well, about which you can read more `here
+`__::
+
+    import numpy
+    from skl2onnx import to_onnx
+    onx = to_onnx(clf, X[:1].astype(numpy.float32), target_opset=12)
+    with open("filename.onnx", "wb") as f:
+        f.write(onx.SerializeToString())
+
+You can load the model in Python and use the `ONNX` runtime to get
+predictions::
+
+    import numpy
+    from onnxruntime import InferenceSession
+    with open("filename.onnx", "rb") as f:
+        onx = f.read()
+    sess = InferenceSession(onx, providers=["CPUExecutionProvider"])
+    pred_ort = sess.run(None, {"X": X.astype(numpy.float32)})[0]
+
+
+
|details-end|
+
+.. _skops_persistence:
+
+`skops.io`
+----------
+
+:mod:`skops.io` avoids using :mod:`pickle` and only loads files which have types
+and references to functions which are trusted either by default or by the user.
+Therefore it provides a more secure format than :mod:`pickle`, :mod:`joblib`,
+and `cloudpickle`_.
+
+
+|details-start|
+**Using skops**
+|details-split|
+
+The API is very similar to :mod:`pickle`, and you can persist your models as
+explained in the `documentation
+`__ using
+:func:`skops.io.dump` and :func:`skops.io.dumps`::
+
+    import skops.io as sio
+    sio.dump(clf, "filename.skops")
+
+And you can load them back using :func:`skops.io.load` and
+:func:`skops.io.loads`. However, you need to specify the types which are
+trusted by you. You can get existing unknown types in a dumped object / file
+using :func:`skops.io.get_untrusted_types`, and after checking its contents,
+pass it to the load function::
+
+    unknown_types = sio.get_untrusted_types(file="filename.skops")
+    # investigate the contents of unknown_types, and only load if you trust
+    # everything you see.
+    clf = sio.load("filename.skops", trusted=unknown_types)
+
+Please report issues and feature requests related to this format on the `skops
+issue tracker `__.
+
+|details-end|
+
+.. _pickle_persistence:
+
+`pickle`, `joblib`, and `cloudpickle`
+-------------------------------------
+
+These three modules / packages use the `pickle` protocol under the hood, but
+come with slight variations:
+
+- :mod:`pickle` is a module from the Python Standard Library. It can serialize
+  and deserialize any Python object, including custom Python classes and
+  objects.
+- :mod:`joblib` is more efficient than `pickle` when working with large machine
+  learning models or large numpy arrays.
+- `cloudpickle`_ can serialize certain objects which cannot be serialized by
+  :mod:`pickle` or :mod:`joblib`, such as user-defined functions and lambda
+  functions. This can happen, for instance, when using a
+  :class:`~sklearn.preprocessing.FunctionTransformer` and using a custom
+  function to transform the data.
+
+|details-start|
+**Using** ``pickle``, ``joblib``, **or** ``cloudpickle``
+|details-split|
+
+Depending on your use-case, you can choose one of these three methods to
+persist and load your scikit-learn model, and they all follow the same API::
+
+    # Here you can replace pickle with joblib or cloudpickle
+    from pickle import dump
+    with open('filename.pkl', 'wb') as f: dump(clf, f)
+
+And later when needed, you can load the same object from the persisted file::
+
+    # Here you can replace pickle with joblib or cloudpickle
+    from pickle import load
+    with open('filename.pkl', 'rb') as f: clf = load(f)
+
+|details-end|
+
+.. _persistence_limitations:
+
+Security & Maintainability Limitations
+--------------------------------------
+
+:mod:`pickle` (and :mod:`joblib` and :mod:`cloudpickle` by extension) has
+many documented security vulnerabilities and should only be used if the
+artifact, i.e. the pickle-file, is coming from a trusted and verified source.
+
+Also note that arbitrary computations can be represented using the `ONNX`
+format, and therefore a sandbox used to serve models using `ONNX` also needs to
+safeguard against computational and memory exploits.
+
+Also note that there are no supported ways to load a model trained with a
+different version of scikit-learn. While using :mod:`skops.io`, :mod:`joblib`,
+:mod:`pickle`, or `cloudpickle`_, models saved using one version of
+scikit-learn might load in other versions; however, this is entirely
+unsupported and inadvisable. It should also be kept in mind that operations
+performed on such data could give different and unexpected results, or even
+crash your Python process.
+
+In order to rebuild a similar model with future versions of scikit-learn,
+additional metadata should be saved along the pickled model:
+
+* The training data, e.g. a reference to an immutable snapshot
+* The Python source code used to generate the model
+* The versions of scikit-learn and its dependencies
+* The cross validation score obtained on the training data
+
+This should make it possible to check that the cross-validation score is in the
+same range as before.
+
+Aside from a few exceptions, persisted models should be portable across
+operating systems and hardware architectures assuming the same versions of
+dependencies and Python are used. If you encounter an estimator that is not
+portable, please open an issue on GitHub. Persisted models are often deployed
+in production using containers like Docker, in order to freeze the environment
+and dependencies.
+
+If you want to know more about these issues, please refer to these talks:
+
+- `Adrin Jalali: Let's exploit pickle, and skops to the rescue!
| PyData
+  Amsterdam 2023 `__.
+- `Alex Gaynor: Pickles are for Delis, not Software - PyCon 2014
+  `__.
+
+
+.. _serving_environment:
+
+Replicating the training environment in production
+..................................................
+
+If the versions of the dependencies differ between training and
+production, it may result in unexpected behaviour and errors while using the
+trained model. To prevent such situations, it is recommended to use the same
+dependencies and versions in both the training and production environment.
+These transitive dependencies can be pinned with the help of package management
+tools like `pip`, `mamba`, `conda`, `poetry`, `conda-lock`, `pixi`, etc.
+
+It is not always possible to load a model trained with older versions of the
+scikit-learn library and its dependencies in an updated software environment.
+Instead, you might need to retrain the model with the new versions of all
+the libraries. So when training a model, it is important to record the training
+recipe (e.g. a Python script) and training set information, and metadata about
+all the dependencies to be able to automatically reconstruct the same training
+environment for the updated software.
+
+|details-start|
+**InconsistentVersionWarning**
+|details-split|
+
+When an estimator is loaded with a scikit-learn version that is inconsistent
+with the version the estimator was pickled with, a
+:class:`~sklearn.exceptions.InconsistentVersionWarning` is raised. This warning
+can be caught to obtain the original version the estimator was pickled with::
+
+    from sklearn.exceptions import InconsistentVersionWarning
+    warnings.simplefilter("error", InconsistentVersionWarning)
+
+    try:
+        est = pickle.loads("model_from_previous_version.pickle")
+    except InconsistentVersionWarning as w:
+        print(w.original_sklearn_version)
+
+|details-end|
+
+
+Serving the model artifact
+..........................
+
+The last step after training a scikit-learn model is serving the model.
+Once the trained model is successfully loaded, it can be served to manage
+different prediction requests. This can involve deploying the model as a
+web service using containerization, or other model deployment strategies,
+according to the specifications.
+
+
+Summarizing the key points
+--------------------------
+
+Based on the different approaches for model persistence, the key points for
+each approach can be summarized as follows:
+
+* `ONNX`: It provides a uniform format for persisting any machine learning or
+  deep learning model (other than scikit-learn) and is useful for model
+  inference (predictions). It can, however, result in compatibility issues with
+  different frameworks.
+* :mod:`skops.io`: Trained scikit-learn models can be easily shared and put
+  into production using :mod:`skops.io`. It is more secure compared to
+  alternate approaches based on :mod:`pickle` because it does not load
+  arbitrary code unless explicitly asked for by the user.
+* :mod:`joblib`: Efficient memory mapping techniques make it faster when using
+  the same persisted model in multiple Python processes. It also gives easy
+  shortcuts to compress and decompress the persisted object without the need
+  for extra code. However, it may trigger the execution of malicious code while
+  untrusted data as any other pickle-based persistence mechanism.
+* :mod:`pickle`: It is native to Python and any Python object can be serialized
+  and deserialized using :mod:`pickle`, including custom Python classes and
+  objects.
While :mod:`pickle` can be used to easily save and load scikit-learn
+  models, it may trigger the execution of malicious code while loading
+  untrusted data.
+* `cloudpickle`_: It is slower than :mod:`pickle` and :mod:`joblib`, and is
+  more insecure than :mod:`pickle` and :mod:`joblib` since it can serialize
+  arbitrary code. However, in certain cases it might be a last resort to
+  persist certain models. Note that this is discouraged by `cloudpickle`_
+  itself since there are no forward compatibility guarantees and you might need
+  the same version of `cloudpickle`_ to load the persisted model.
+
+.. _cloudpickle: https://github.com/cloudpipe/cloudpickle

From 3dc8b30fcad7831717a66d05881cbd31b980f7b7 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Lo=C3=AFc=20Est=C3=A8ve?=
Date: Mon, 13 May 2024 18:06:05 +0200
Subject: [PATCH 537/554] DOC Mention that Meson is the main supported way to
 build scikit-learn (#29008)

Co-authored-by: Tim Head
---
 doc/whats_new/v1.5.rst | 12 ++++++++----
 1 file changed, 8 insertions(+), 4 deletions(-)

diff --git a/doc/whats_new/v1.5.rst b/doc/whats_new/v1.5.rst
index 55a5546453f5f..5fdc0707ffbee 100644
--- a/doc/whats_new/v1.5.rst
+++ b/doc/whats_new/v1.5.rst
@@ -95,12 +95,16 @@ See :ref:`array_api` for more details.
 
 Support for building with Meson
 -------------------------------
 
-Meson is now supported as a build backend, see :ref:`Building from source
-` for more details.
+From scikit-learn 1.5 onwards, Meson is the main supported way to build
+scikit-learn, see :ref:`Building from source ` for more
+details.
 
-:pr:`28040` by :user:`Loïc Estève `
+Unless we discover a major blocker, setuptools support will be dropped in
+scikit-learn 1.6. The 1.5.x releases will support building scikit-learn with
+setuptools.
 
-TODO Fill more details before the 1.5 release, when the Meson story has settled down.
+Meson support for building scikit-learn was added in :pr:`28040` by :user:`Loïc
+Estève `
 
 Metadata Routing
 ----------------

From 94bbcc614ab35a52ff84781b647ada5f2a15e13f Mon Sep 17 00:00:00 2001
From: Olivier Grisel
Date: Tue, 14 May 2024 12:17:05 +0200
Subject: [PATCH 538/554] DOC More improvements to the documentation on model
 persistence (#29011)

Co-authored-by: Adrin Jalali
---
 doc/model_persistence.rst | 72 ++++++++++++++++++++++++++-------------
 1 file changed, 48 insertions(+), 24 deletions(-)

diff --git a/doc/model_persistence.rst b/doc/model_persistence.rst
index 0c11349a68e22..0bc7384ec3d46 100644
--- a/doc/model_persistence.rst
+++ b/doc/model_persistence.rst
@@ -80,7 +80,9 @@ persist and plan to serve the model:
 - :ref:`ONNX <onnx_persistence>`: You need an `ONNX` runtime and an environment
   with appropriate dependencies installed to load the model and use the runtime
   to get predictions. This environment can be minimal and does not necessarily
-  even require `python` to be installed.
+  even require Python to be installed to load the model and compute
+  predictions. Also note that `onnxruntime` typically requires much less RAM
+  than Python to compute predictions from small models.
 
- :mod:`skops.io`, :mod:`pickle`, :mod:`joblib`, `cloudpickle`_: You need a
   Python environment with the appropriate dependencies installed to load the
   model and get predictions from it. This environment should have the same
   **packages** and the same **versions** as the environment where the model was
   trained. Note that none of these methods support loading a model trained with
   a different version of scikit-learn, and possibly different versions of other
   dependencies such as `numpy` and `scipy`. Another concern is running
   the persisted model on different hardware, and in most cases you should be
   able to load your persisted model on different hardware.
@@ -208,13 +210,20 @@ persist and load your scikit-learn model, and they all follow the same API::
 
     # Here you can replace pickle with joblib or cloudpickle
     from pickle import dump
-    with open('filename.pkl', 'wb') as f: dump(clf, f)
+    with open("filename.pkl", "wb") as f:
+        dump(clf, f, protocol=5)
+
+Using `protocol=5` is recommended to reduce memory usage and make it faster to
+store and load any large NumPy array stored as a fitted attribute in the model.
+You can alternatively pass `protocol=pickle.HIGHEST_PROTOCOL` which is
+equivalent to `protocol=5` in Python 3.8 and later (at the time of writing).
 
 And later when needed, you can load the same object from the persisted file::
 
     # Here you can replace pickle with joblib or cloudpickle
     from pickle import load
-    with open('filename.pkl', 'rb') as f: clf = load(f)
+    with open("filename.pkl", "rb") as f:
+        clf = load(f)
 
 |details-end|
 
 .. _persistence_limitations:
 
 Security & Maintainability Limitations
 --------------------------------------
 
 :mod:`pickle` (and :mod:`joblib` and :mod:`cloudpickle` by extension) has
-many documented security vulnerabilities and should only be used if the
-artifact, i.e. the pickle-file, is coming from a trusted and verified source.
+many documented security vulnerabilities by design and should only be used if
+the artifact, i.e. the pickle-file, is coming from a trusted and verified
+source. You should never load a pickle file from an untrusted source, similarly
+to how you should never execute code from an untrusted source.
 
 Also note that arbitrary computations can be represented using the `ONNX`
-format, and therefore a sandbox used to serve models using `ONNX` also needs to
-safeguard against computational and memory exploits.
+format, and it is therefore recommended to serve models using `ONNX` in a
+sandboxed environment to safeguard against computational and memory exploits.
 
 Also note that there are no supported ways to load a model trained with a
 different version of scikit-learn. While using :mod:`skops.io`, :mod:`joblib`,
@@ -298,7 +309,8 @@ can be caught to obtain the original version the estimator was pickled with::
     warnings.simplefilter("error", InconsistentVersionWarning)
 
     try:
-        est = pickle.loads("model_from_previous_version.pickle")
+        with open("model_from_previous_version.pickle", "rb") as f:
+            est = pickle.load(f)
     except InconsistentVersionWarning as w:
         print(w.original_sklearn_version)
 
@@ -328,22 +340,34 @@ each approach can be summarized as follows:
 
 * `ONNX`: It provides a uniform format for persisting any machine learning or
   deep learning model (other than scikit-learn) and is useful for model
   inference (predictions). It can, however, result in compatibility issues with
   different frameworks.
 * :mod:`skops.io`: Trained scikit-learn models can be easily shared and put
   into production using :mod:`skops.io`. It is more secure compared to
   alternate approaches based on :mod:`pickle` because it does not load
-  arbitrary code unless explicitly asked for by the user.
+  arbitrary code unless explicitly asked for by the user. Such code needs to be
+  packaged and importable in the target Python environment.
 * :mod:`joblib`: Efficient memory mapping techniques make it faster when using
-  the same persisted model in multiple Python processes. It also gives easy
-  shortcuts to compress and decompress the persisted object without the need
-  for extra code. However, it may trigger the execution of malicious code while
-  untrusted data as any other pickle-based persistence mechanism.
-* :mod:`pickle`: It is native to Python and any Python object can be serialized
-  and deserialized using :mod:`pickle`, including custom Python classes and
-  objects. While :mod:`pickle` can be used to easily save and load scikit-learn
-  models, it may trigger the execution of malicious code while loading
-  untrusted data.
-* `cloudpickle`_: It is slower than :mod:`pickle` and :mod:`joblib`, and is
-  more insecure than :mod:`pickle` and :mod:`joblib` since it can serialize
-  arbitrary code. However, in certain cases it might be a last resort to
-  persist certain models. Note that this is discouraged by `cloudpickle`_
-  itself since there are no forward compatibility guarantees and you might need
-  the same version of `cloudpickle`_ to load the persisted model.
+  the same persisted model in multiple Python processes when using
+  `mmap_mode="r"`. It also gives easy shortcuts to compress and decompress the
+  persisted object without the need for extra code. However, it may trigger the
+  execution of malicious code when loading a model from an untrusted source as
+  any other pickle-based persistence mechanism.
+* :mod:`pickle`: It is native to Python and most Python objects can be
+  serialized and deserialized using :mod:`pickle`, including custom Python
+  classes and functions as long as they are defined in a package that can be
+  imported in the target environment. While :mod:`pickle` can be used to easily
+  save and load scikit-learn models, it may trigger the execution of malicious
+  code while loading a model from an untrusted source. :mod:`pickle` can also
+  be very efficient memory-wise if the model was persisted with `protocol=5`
+  but it does not support memory mapping.
+* `cloudpickle`_: It has comparable loading efficiency to :mod:`pickle` and
+  :mod:`joblib` (without memory mapping), but offers additional flexibility to
+  serialize custom Python code such as lambda expressions and interactively
+  defined functions and classes. It might be a last resort to persist pipelines
+  with custom Python components such as a
+  :class:`sklearn.preprocessing.FunctionTransformer` that wraps a function
+  defined in the training script itself or more generally outside of any
+  importable Python package. Note that `cloudpickle`_ offers no forward
+  compatibility guarantees and you might need the same version of
+  `cloudpickle`_ to load the persisted model along with the same version of all
+  the libraries used to define the model. Like the other pickle-based
+  persistence mechanisms, it may trigger the execution of malicious code while
+  loading a model from an untrusted source.
 
 .. _cloudpickle: https://github.com/cloudpipe/cloudpickle

From 8e7eceedbdce9b8a12cebde0551fedd468655ee0 Mon Sep 17 00:00:00 2001
From: Lucy Liu
Date: Tue, 14 May 2024 22:30:17 +1000
Subject: [PATCH 539/554] DOC Add warm start section for tree ensembles (#29001)

---
 doc/modules/ensemble.rst    | 37 +++++++++++++++++++++++++++++++++
 sklearn/ensemble/_forest.py | 10 +++++-----
 2 files changed, 42 insertions(+), 5 deletions(-)

diff --git a/doc/modules/ensemble.rst b/doc/modules/ensemble.rst
index d18dd2f65009e..9120bd855fd01 100644
--- a/doc/modules/ensemble.rst
+++ b/doc/modules/ensemble.rst
@@ -1247,6 +1247,43 @@ estimation.
   representations of feature space, also these approaches focus also on
   dimensionality reduction.
 
+..
_tree_ensemble_warm_start:
+
+Fitting additional trees
+------------------------
+
+RandomForest, Extra-Trees and :class:`RandomTreesEmbedding` estimators all support
+``warm_start=True``, which allows you to add more trees to an already fitted model.
+
+::
+
+  >>> from sklearn.datasets import make_classification
+  >>> from sklearn.ensemble import RandomForestClassifier
+
+  >>> X, y = make_classification(n_samples=100, random_state=1)
+  >>> clf = RandomForestClassifier(n_estimators=10)
+  >>> clf = clf.fit(X, y)  # fit with 10 trees
+  >>> len(clf.estimators_)
+  10
+  >>> # set warm_start and increase the number of estimators
+  >>> _ = clf.set_params(n_estimators=20, warm_start=True)
+  >>> _ = clf.fit(X, y)  # fit 10 additional trees
+  >>> len(clf.estimators_)
+  20
+
+When ``random_state`` is also set, the internal random state is preserved
+between ``fit`` calls. This means that training a model once with ``n`` estimators is
+the same as building the model iteratively via multiple ``fit`` calls, where the
+final number of estimators is equal to ``n``.
+
+::
+
+  >>> clf = RandomForestClassifier(n_estimators=20)  # set `n_estimators` to 10 + 10
+  >>> _ = clf.fit(X, y)  # fit; `estimators_` will be the same as `clf` above
+
+Note that this differs from the usual behavior of :term:`random_state` in that it does
+*not* produce the same result across different calls.
+
 .. _bagging:
 
 Bagging meta-estimator
diff --git a/sklearn/ensemble/_forest.py b/sklearn/ensemble/_forest.py
index 6b1b842f5367b..28c404c3e406b 100644
--- a/sklearn/ensemble/_forest.py
+++ b/sklearn/ensemble/_forest.py
@@ -1308,7 +1308,7 @@ class RandomForestClassifier(ForestClassifier):
         When set to ``True``, reuse the solution of the previous call to fit
         and add more estimators to the ensemble, otherwise, just fit a whole
         new forest. See :term:`Glossary <warm_start>` and
-        :ref:`gradient_boosting_warm_start` for details.
+        :ref:`tree_ensemble_warm_start` for details.
 
     class_weight : {"balanced", "balanced_subsample"}, dict or list of dicts, \
             default=None
@@ -1710,7 +1710,7 @@ class RandomForestRegressor(ForestRegressor):
         When set to ``True``, reuse the solution of the previous call to fit
         and add more estimators to the ensemble, otherwise, just fit a whole
         new forest. See :term:`Glossary <warm_start>` and
-        :ref:`gradient_boosting_warm_start` for details.
+        :ref:`tree_ensemble_warm_start` for details.
 
     ccp_alpha : non-negative float, default=0.0
         Complexity parameter used for Minimal Cost-Complexity Pruning. The
@@ -2049,7 +2049,7 @@ class ExtraTreesClassifier(ForestClassifier):
         When set to ``True``, reuse the solution of the previous call to fit
         and add more estimators to the ensemble, otherwise, just fit a whole
         new forest. See :term:`Glossary <warm_start>` and
-        :ref:`gradient_boosting_warm_start` for details.
+        :ref:`tree_ensemble_warm_start` for details.
 
     class_weight : {"balanced", "balanced_subsample"}, dict or list of dicts, \
             default=None
@@ -2434,7 +2434,7 @@ class ExtraTreesRegressor(ForestRegressor):
         When set to ``True``, reuse the solution of the previous call to fit
         and add more estimators to the ensemble, otherwise, just fit a whole
         new forest. See :term:`Glossary <warm_start>` and
-        :ref:`gradient_boosting_warm_start` for details.
+        :ref:`tree_ensemble_warm_start` for details.
 
     ccp_alpha : non-negative float, default=0.0
         Complexity parameter used for Minimal Cost-Complexity Pruning.
The @@ -2727,7 +2727,7 @@ class RandomTreesEmbedding(TransformerMixin, BaseForest):
         When set to ``True``, reuse the solution of the previous call to fit
         and add more estimators to the ensemble, otherwise, just fit a whole
         new forest. See :term:`Glossary <warm_start>` and
-        :ref:`gradient_boosting_warm_start` for details.
+        :ref:`tree_ensemble_warm_start` for details.
 
     Attributes
     ----------
From 1434bb14ca52af94dd49658b7352cbf62d951c93 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Lo=C3=AFc=20Est=C3=A8ve?=
Date: Tue, 14 May 2024 17:26:53 +0200
Subject: [PATCH 540/554] MNT Use c11 rather than c17 in meson.build to
 work-around Pyodide issue (#29015)

---
 meson.build | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/meson.build b/meson.build
index 52c7deb962277..b6b3652a82268 100644
--- a/meson.build
+++ b/meson.build
@@ -6,7 +6,7 @@ project(
   meson_version: '>= 1.1.0',
   default_options: [
     'buildtype=debugoptimized',
-    'c_std=c17',
+    'c_std=c11',
     'cpp_std=c++14',
   ],
 )
From f8be06c50b402b840d1b3fa2bd92a16a73ef1f9a Mon Sep 17 00:00:00 2001
From: Adrin Jalali
Date: Tue, 14 May 2024 22:02:20 +0200
Subject: [PATCH 541/554] DOC fix dollar sign to euro sign (#29020)

Co-authored-by: Guillaume Lemaitre
---
 .../plot_cost_sensitive_learning.py | 22 +++++++++----------
 1 file changed, 11 insertions(+), 11 deletions(-)

diff --git a/examples/model_selection/plot_cost_sensitive_learning.py b/examples/model_selection/plot_cost_sensitive_learning.py
index 7b64af48139f2..be0900d50e4ba 100644
--- a/examples/model_selection/plot_cost_sensitive_learning.py
+++ b/examples/model_selection/plot_cost_sensitive_learning.py
@@ -489,7 +489,7 @@ def plot_roc_pr_curves(vanilla_model, tuned_model, *, title):
 _, ax = plt.subplots()
 ax.hist(amount_fraud, bins=100)
 ax.set_title("Amount of fraud transaction")
-_ = ax.set_xlabel("Amount ($)")
+_ = ax.set_xlabel("Amount (€)")
 
 # %%
 # Addressing the problem with a business metric
@@ -501,8 +501,8 @@ def plot_roc_pr_curves(vanilla_model, tuned_model, *, title):
 # transaction result in a loss of the amount of the transaction. As stated in [2]_, the
 # gain and loss related to refusals (of fraudulent and legitimate transactions) are not
 # trivial to define. Here, we define that a refusal of a legitimate transaction is
-# estimated to a loss of $5 while the refusal of a fraudulent transaction is estimated
-# to a gain of $50 dollars and the amount of the transaction. Therefore, we define the
+# estimated to a loss of 5€ while the refusal of a fraudulent transaction is estimated
+# to a gain of 50€ and the amount of the transaction. Therefore, we define the
 # following function to compute the total benefit of a given decision:
@@ -557,22 +557,22 @@ def business_metric(y_true, y_pred, amount):
 benefit_cost = business_scorer(
     easy_going_classifier, data_test, target_test, amount=amount_test
 )
-print(f"Benefit/cost of our easy-going classifier: ${benefit_cost:,.2f}")
+print(f"Benefit/cost of our easy-going classifier: {benefit_cost:,.2f}€")
 
 # %%
 # A classifier that predict all transactions as legitimate would create a profit of
-# around $220,000. We make the same evaluation for a classifier that predicts all
+# around 220,000 €. We make the same evaluation for a classifier that predicts all
 # transactions as fraudulent.
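As a rough, hedged sketch of the kind of ``business_metric`` function described
above (built only from the figures quoted in this patch, not the example's
verbatim code; the function name and the zero gain assumed for accepted
legitimate transactions are illustrative assumptions)::

    import numpy as np

    def business_gain(y_true, y_pred, amount):
        # 0 = legitimate, 1 = fraudulent; `amount` holds transaction amounts.
        y_true, y_pred, amount = map(np.asarray, (y_true, y_pred, amount))
        accepted_fraud = (y_true == 1) & (y_pred == 0)
        refused_legit = (y_true == 0) & (y_pred == 1)
        refused_fraud = (y_true == 1) & (y_pred == 1)
        return (
            -amount[accepted_fraud].sum()         # fraud slips through: lose the amount
            - 5 * refused_legit.sum()             # refused legitimate: -5€ each
            + (50 + amount[refused_fraud]).sum()  # refused fraud: +50€ plus the amount
        )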
intolerant_classifier = DummyClassifier(strategy="constant", constant=1)
 intolerant_classifier.fit(data_train, target_train)
 benefit_cost = business_scorer(
     intolerant_classifier, data_test, target_test, amount=amount_test
 )
-print(f"Benefit/cost of our intolerant classifier: ${benefit_cost:,.2f}")
+print(f"Benefit/cost of our intolerant classifier: {benefit_cost:,.2f}€")
 
 # %%
-# Such a classifier create a loss of around $670,000. A predictive model should allow
-# us to make a profit larger than $220,000. It is interesting to compare this business
+# Such a classifier creates a loss of around 670,000 €. A predictive model should allow
+# us to make a profit larger than 220,000 €. It is interesting to compare this business
 # metric with another "standard" statistical metric such as the balanced accuracy.
 
 from sklearn.metrics import get_scorer
@@ -607,7 +607,7 @@ def business_metric(y_true, y_pred, amount):
 
 print(
     "Benefit/cost of our logistic regression: "
-    f"${business_scorer(model, data_test, target_test, amount=amount_test):,.2f}"
+    f"{business_scorer(model, data_test, target_test, amount=amount_test):,.2f}€"
 )
 print(
     "Balanced accuracy of our logistic regression: "
@@ -645,7 +645,7 @@ def business_metric(y_true, y_pred, amount):
 # %%
 print(
     "Benefit/cost of our logistic regression: "
-    f"${business_scorer(tuned_model, data_test, target_test, amount=amount_test):,.2f}"
+    f"{business_scorer(tuned_model, data_test, target_test, amount=amount_test):,.2f}€"
 )
 print(
     "Balanced accuracy of our logistic regression: "
@@ -691,7 +691,7 @@ def business_metric(y_true, y_pred, amount):
 business_score = business_scorer(
     model_fixed_threshold, data_test, target_test, amount=amount_test
 )
-print(f"Benefit/cost of our logistic regression: ${business_score:,.2f}")
+print(f"Benefit/cost of our logistic regression: {business_score:,.2f}€")
 print(
     "Balanced accuracy of our logistic regression: "
     f"{balanced_accuracy_scorer(model_fixed_threshold, data_test, target_test):.3f}"
From b9bdb973f505f574700f6e6bc59da26cb91bee88 Mon Sep 17 00:00:00 2001
From: Guillaume Lemaitre
Date: Wed, 15 May 2024 14:01:27 +0200
Subject: [PATCH 542/554] TST check compatibility with metadata routing for
 *ThresholdClassifier* (#29021)

---
 .../_classification_threshold.py            | 19 +++++-----
 sklearn/tests/metadata_routing_common.py    | 29 ++++++++++++-------
 .../test_metaestimators_metadata_routing.py | 21 ++++++++++++++
 3 files changed, 51 insertions(+), 18 deletions(-)

diff --git a/sklearn/model_selection/_classification_threshold.py b/sklearn/model_selection/_classification_threshold.py
index d5a864da10653..1f891577b4680 100644
--- a/sklearn/model_selection/_classification_threshold.py
+++ b/sklearn/model_selection/_classification_threshold.py
@@ -106,6 +106,14 @@ def __init__(self, estimator, *, response_method="auto"):
         self.estimator = estimator
         self.response_method = response_method
 
+    def _get_response_method(self):
+        """Define the response method."""
+        if self.response_method == "auto":
+            response_method = ["predict_proba", "decision_function"]
+        else:
+            response_method = self.response_method
+        return response_method
+
     @_fit_context(
         # *ThresholdClassifier*.estimator is not validated yet
         prefer_skip_nested_validation=False
@@ -140,11 +148,6 @@ def fit(self, X, y, **params):
                 f"Only binary classification is supported. 
Unknown label type: {y_type}" ) - if self.response_method == "auto": - self._response_method = ["predict_proba", "decision_function"] - else: - self._response_method = self.response_method - self._fit(X, y, **params) if hasattr(self.estimator_, "n_features_in_"): @@ -374,7 +377,7 @@ def predict(self, X): y_score, _, response_method_used = _get_response_values_binary( self.estimator_, X, - self._response_method, + self._get_response_method(), pos_label=self.pos_label, return_response_method_used=True, ) @@ -954,7 +957,7 @@ def predict(self, X): y_score, _ = _get_response_values_binary( self.estimator_, X, - self._response_method, + self._get_response_method(), pos_label=pos_label, ) @@ -995,6 +998,6 @@ def _get_curve_scorer(self): """Get the curve scorer based on the objective metric used.""" scoring = check_scoring(self.estimator, scoring=self.scoring) curve_scorer = _CurveScorer.from_scorer( - scoring, self._response_method, self.thresholds + scoring, self._get_response_method(), self.thresholds ) return curve_scorer diff --git a/sklearn/tests/metadata_routing_common.py b/sklearn/tests/metadata_routing_common.py index 889524bc05ddb..6fba2f037fd15 100644 --- a/sklearn/tests/metadata_routing_common.py +++ b/sklearn/tests/metadata_routing_common.py @@ -194,7 +194,10 @@ def decision_function(self, X): return self.predict(X) def predict(self, X): - return np.ones(len(X)) + y_pred = np.empty(shape=(len(X),)) + y_pred[: len(X) // 2] = 0 + y_pred[len(X) // 2 :] = 1 + return y_pred class NonConsumingRegressor(RegressorMixin, BaseEstimator): @@ -257,16 +260,19 @@ def predict(self, X, sample_weight="default", metadata="default"): record_metadata_not_default( self, "predict", sample_weight=sample_weight, metadata=metadata ) - return np.zeros(shape=(len(X),)) + y_score = np.empty(shape=(len(X),), dtype="int8") + y_score[len(X) // 2 :] = 0 + y_score[: len(X) // 2] = 1 + return y_score def predict_proba(self, X, sample_weight="default", metadata="default"): - pass # pragma: no cover - - # uncomment when needed - # record_metadata_not_default( - # self, "predict_proba", sample_weight=sample_weight, metadata=metadata - # ) - # return np.asarray([[0.0, 1.0]] * len(X)) + record_metadata_not_default( + self, "predict_proba", sample_weight=sample_weight, metadata=metadata + ) + y_proba = np.empty(shape=(len(X), 2)) + y_proba[: len(X) // 2, :] = np.asarray([1.0, 0.0]) + y_proba[len(X) // 2 :, :] = np.asarray([0.0, 1.0]) + return y_proba def predict_log_proba(self, X, sample_weight="default", metadata="default"): pass # pragma: no cover @@ -281,7 +287,10 @@ def decision_function(self, X, sample_weight="default", metadata="default"): record_metadata_not_default( self, "predict_proba", sample_weight=sample_weight, metadata=metadata ) - return np.zeros(shape=(len(X),)) + y_score = np.empty(shape=(len(X),)) + y_score[len(X) // 2 :] = 0 + y_score[: len(X) // 2] = 1 + return y_score # uncomment when needed # def score(self, X, y, sample_weight="default", metadata="default"): diff --git a/sklearn/tests/test_metaestimators_metadata_routing.py b/sklearn/tests/test_metaestimators_metadata_routing.py index aa6af5bd09aac..d9a7d6c9e5952 100644 --- a/sklearn/tests/test_metaestimators_metadata_routing.py +++ b/sklearn/tests/test_metaestimators_metadata_routing.py @@ -43,10 +43,12 @@ RidgeCV, ) from sklearn.model_selection import ( + FixedThresholdClassifier, GridSearchCV, HalvingGridSearchCV, HalvingRandomSearchCV, RandomizedSearchCV, + TunedThresholdClassifierCV, ) from sklearn.multiclass import ( OneVsOneClassifier, @@ 
-77,6 +79,7 @@
 N, M = 100, 4
 X = rng.rand(N, M)
 y = rng.randint(0, 3, size=N)
+y_binary = (y >= 1).astype(int)
 classes = np.unique(y)
 y_multi = rng.randint(0, 3, size=(N, 3))
 classes_multi = [np.unique(y_multi[:, i]) for i in range(y_multi.shape[1])]
@@ -200,6 +203,24 @@ def enable_slep006():
         "cv_name": "cv",
         "cv_routing_methods": ["fit"],
     },
+    {
+        "metaestimator": FixedThresholdClassifier,
+        "estimator_name": "estimator",
+        "estimator": "classifier",
+        "X": X,
+        "y": y_binary,
+        "estimator_routing_methods": ["fit"],
+        "preserves_metadata": "subset",
+    },
+    {
+        "metaestimator": TunedThresholdClassifierCV,
+        "estimator_name": "estimator",
+        "estimator": "classifier",
+        "X": X,
+        "y": y_binary,
+        "estimator_routing_methods": ["fit"],
+        "preserves_metadata": "subset",
+    },
     {
         "metaestimator": OneVsRestClassifier,
         "estimator_name": "estimator",
From 12ea35979e552f5fc938ac56573accd5ba894b69 Mon Sep 17 00:00:00 2001
From: Aswathavicky
Date: Thu, 16 May 2024 10:01:37 +0200
Subject: [PATCH 543/554] DOC add link to
 sklearn_example_ensemble_plot_adaboost_twoclass (#29023)

---
 sklearn/ensemble/_weight_boosting.py | 4 ++++
 1 file changed, 4 insertions(+)

diff --git a/sklearn/ensemble/_weight_boosting.py b/sklearn/ensemble/_weight_boosting.py
index 0461a397983be..6bbac0613de71 100644
--- a/sklearn/ensemble/_weight_boosting.py
+++ b/sklearn/ensemble/_weight_boosting.py
@@ -482,6 +482,10 @@ class AdaBoostClassifier(
     For a detailed example of using AdaBoost to fit a sequence of DecisionTrees
     as weaklearners, please refer to
     :ref:`sphx_glr_auto_examples_ensemble_plot_adaboost_multiclass.py`.
+
+    For a detailed example of using AdaBoost to fit a non-linearly separable
+    classification dataset composed of two Gaussian quantiles clusters, please
+    refer to :ref:`sphx_glr_auto_examples_ensemble_plot_adaboost_twoclass.py`.
     """
 
     # TODO(1.6): Modify _parameter_constraints for "algorithm" to only check
From 43b02ec6479728080f6751f5d1a1a97f99df2bbf Mon Sep 17 00:00:00 2001
From: Tialo <65392801+Tialo@users.noreply.github.com>
Date: Thu, 16 May 2024 11:11:54 +0300
Subject: [PATCH 544/554] DOC Fix default value of cv in check_cv (#29024)

---
 sklearn/model_selection/_split.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/sklearn/model_selection/_split.py b/sklearn/model_selection/_split.py
index 53c11a665ccf4..1f9d78d3e4cbd 100644
--- a/sklearn/model_selection/_split.py
+++ b/sklearn/model_selection/_split.py
@@ -2596,7 +2596,7 @@ def check_cv(cv=5, y=None, *, classifier=False):
 
     Parameters
     ----------
-    cv : int, cross-validation generator or an iterable, default=None
+    cv : int, cross-validation generator, iterable or None, default=5
        Determines the cross-validation splitting strategy.
Possible inputs for cv are: - None, to use the default 5-fold cross validation, From 37f544db78503ed1a50da02cbb4f1a4e466fb0a7 Mon Sep 17 00:00:00 2001 From: raisadz <34237447+raisadz@users.noreply.github.com> Date: Fri, 17 May 2024 09:13:51 +0100 Subject: [PATCH 545/554] DOC replace pandas with Polars in examples/gaussian_process/plot_gpr_co2.py (#28804) Co-authored-by: raisa <> Co-authored-by: Adrin Jalali --- ...latest_conda_forge_mkl_linux-64_conda.lock | 6 +-- ...pylatest_conda_forge_mkl_osx-64_conda.lock | 2 +- ...pylatest_pip_scipy_dev_linux-64_conda.lock | 2 +- .../pymin_conda_forge_mkl_win-64_conda.lock | 2 +- ...e_openblas_ubuntu_2204_linux-64_conda.lock | 4 +- build_tools/azure/pypy3_linux-64_conda.lock | 34 +++++++-------- build_tools/circle/doc_linux-64_conda.lock | 8 ++-- .../doc_min_dependencies_environment.yml | 2 +- .../doc_min_dependencies_linux-64_conda.lock | 10 ++--- examples/gaussian_process/plot_gpr_co2.py | 41 ++++++++++--------- pyproject.toml | 4 +- sklearn/_min_dependencies.py | 2 +- sklearn/tests/test_base.py | 2 +- 13 files changed, 61 insertions(+), 58 deletions(-) diff --git a/build_tools/azure/pylatest_conda_forge_mkl_linux-64_conda.lock b/build_tools/azure/pylatest_conda_forge_mkl_linux-64_conda.lock index 3d895fda71bc3..bf5bcd3daff08 100644 --- a/build_tools/azure/pylatest_conda_forge_mkl_linux-64_conda.lock +++ b/build_tools/azure/pylatest_conda_forge_mkl_linux-64_conda.lock @@ -91,7 +91,7 @@ https://conda.anaconda.org/conda-forge/linux-64/libsqlite-3.45.3-h2797004_0.cond https://conda.anaconda.org/conda-forge/linux-64/libssh2-1.11.0-h0841786_0.conda#1f5a58e686b13bcfde88b93f547d23fe https://conda.anaconda.org/conda-forge/linux-64/libvorbis-1.3.7-h9c3ff4c_0.tar.bz2#309dec04b70a3cc0f1e84a4013683bc0 https://conda.anaconda.org/conda-forge/linux-64/libxcb-1.15-h0b41bf4_0.conda#33277193f5b92bad9fdd230eb700929c -https://conda.anaconda.org/conda-forge/linux-64/libxml2-2.12.6-h232c23b_2.conda#9a3a42df8a95f65334dfc7b80da1195d +https://conda.anaconda.org/conda-forge/linux-64/libxml2-2.12.7-hc051c1a_0.conda#5d801a4906adc712d480afc362623b59 https://conda.anaconda.org/conda-forge/linux-64/mysql-common-8.3.0-hf1915f5_4.conda#784a4df6676c581ca624fbe460703a6d https://conda.anaconda.org/conda-forge/linux-64/pcre2-10.43-hcad00b1_0.conda#8292dea9e022d9610a11fce5e0896ed8 https://conda.anaconda.org/conda-forge/linux-64/readline-8.2-h8228510_1.conda#47d31b792659ce70f470b5c82fdfb7a4 @@ -191,7 +191,7 @@ https://conda.anaconda.org/conda-forge/linux-64/sip-6.7.12-py311hb755f60_0.conda https://conda.anaconda.org/conda-forge/linux-64/aws-c-s3-0.3.14-hf3aad02_1.conda#a968ffa7e9fe0c257628033d393e512f https://conda.anaconda.org/conda-forge/linux-64/blas-1.0-mkl.tar.bz2#349aef876b1d8c9dccae01de20d5b385 https://conda.anaconda.org/conda-forge/linux-64/gstreamer-1.24.3-haf2f30d_0.conda#f3df87cc9ef0b5113bff55aefcbcafd5 -https://conda.anaconda.org/conda-forge/linux-64/harfbuzz-8.4.0-h3d44ed6_0.conda#27f46291a6aaa3c2a4f798ebd35a7ddb +https://conda.anaconda.org/conda-forge/linux-64/harfbuzz-8.5.0-hfac3d4d_0.conda#f5126317dd0ce0ba26945e411ecc6960 https://conda.anaconda.org/conda-forge/linux-64/libblas-3.9.0-16_linux64_mkl.tar.bz2#85f61af03fd291dae33150ffe89dc09a https://conda.anaconda.org/conda-forge/linux-64/libsystemd0-255-h3516f8a_1.conda#3366af27f0b593544a6cd453c7932ac5 https://conda.anaconda.org/conda-forge/noarch/meson-python-0.16.0-pyh0c530f3_0.conda#e16f0dbf502da873be9f9adb0dc52547 @@ -210,7 +210,7 @@ https://conda.anaconda.org/conda-forge/noarch/array-api-strict-1.1.1-pyhd8ed1ab_ 
https://conda.anaconda.org/conda-forge/linux-64/contourpy-1.2.1-py311h9547e67_0.conda#74ad0ae64f1ef565e27eda87fa749e84 https://conda.anaconda.org/conda-forge/linux-64/libarrow-12.0.1-hb87d912_8_cpu.conda#3f3b11398fe79b578e3c44dd00a44e4a https://conda.anaconda.org/conda-forge/linux-64/pandas-2.2.2-py311h320fe9a_0.conda#c79e96ece4110fdaf2657c9f8e16f749 -https://conda.anaconda.org/conda-forge/linux-64/polars-0.20.25-py311h00856b1_0.conda#84ad7fa8742f6d34784a961337622c55 +https://conda.anaconda.org/conda-forge/linux-64/polars-0.20.26-py311h00856b1_0.conda#d9002441c9b75b188f9cdc51bf4f22c7 https://conda.anaconda.org/conda-forge/linux-64/pyqt-5.15.9-py311hf0fb5b6_5.conda#ec7e45bc76d9d0b69a74a2075932b8e8 https://conda.anaconda.org/conda-forge/linux-64/pytorch-1.13.1-cpu_py311h410fd25_1.conda#ddd2fadddf89e3dc3d541a2537fce010 https://conda.anaconda.org/conda-forge/linux-64/scipy-1.13.0-py311h517d4fd_1.conda#a86b8bea39e292a23b2cf9a750f49ea1 diff --git a/build_tools/azure/pylatest_conda_forge_mkl_osx-64_conda.lock b/build_tools/azure/pylatest_conda_forge_mkl_osx-64_conda.lock index ce2d5e2c383a3..c0e54faa37bc6 100644 --- a/build_tools/azure/pylatest_conda_forge_mkl_osx-64_conda.lock +++ b/build_tools/azure/pylatest_conda_forge_mkl_osx-64_conda.lock @@ -28,7 +28,7 @@ https://conda.anaconda.org/conda-forge/osx-64/libcxx-17.0.6-h88467a6_0.conda#0fe https://conda.anaconda.org/conda-forge/osx-64/libpng-1.6.43-h92b6c6a_0.conda#65dcddb15965c9de2c0365cb14910532 https://conda.anaconda.org/conda-forge/osx-64/libsqlite-3.45.3-h92b6c6a_0.conda#68e462226209f35182ef66eda0f794ff https://conda.anaconda.org/conda-forge/osx-64/libxcb-1.15-hb7f2c08_0.conda#5513f57e0238c87c12dffedbcc9c1a4a -https://conda.anaconda.org/conda-forge/osx-64/libxml2-2.12.6-hc0ae0f7_2.conda#50b997370584f2c83ca0c38e9028eab9 +https://conda.anaconda.org/conda-forge/osx-64/libxml2-2.12.7-h3e169fe_0.conda#4c04ba47fdd2ebecc1d3b6a77534d9ef https://conda.anaconda.org/conda-forge/osx-64/llvm-openmp-18.1.5-h39e0ece_0.conda#ee12a644568269838b91f901b2537425 https://conda.anaconda.org/conda-forge/osx-64/openssl-3.3.0-hd75f5a5_0.conda#eb8c33aa7929a7714eab8b90c1d88afe https://conda.anaconda.org/conda-forge/osx-64/readline-8.2-h9e318b2_1.conda#f17f77f2acf4d344734bda76829ce14e diff --git a/build_tools/azure/pylatest_pip_scipy_dev_linux-64_conda.lock b/build_tools/azure/pylatest_pip_scipy_dev_linux-64_conda.lock index c1a50c7c8c140..e4305c97b76bc 100644 --- a/build_tools/azure/pylatest_pip_scipy_dev_linux-64_conda.lock +++ b/build_tools/azure/pylatest_pip_scipy_dev_linux-64_conda.lock @@ -40,7 +40,7 @@ https://repo.anaconda.com/pkgs/main/linux-64/pip-24.0-py312h06a4308_0.conda#6d96 # pip meson @ https://files.pythonhosted.org/packages/33/75/b1a37fa7b2dbca8c0dbb04d5cdd7e2720c8ef6febe41b4a74866350e041c/meson-1.4.0-py3-none-any.whl#sha256=476a458d51fcfa322a6bdc64da5138997c542d08e6b2e49b9fa68c46fd7c4475 # pip ninja @ https://files.pythonhosted.org/packages/6d/92/8d7aebd4430ab5ff65df2bfee6d5745f95c004284db2d8ca76dcbfd9de47/ninja-1.11.1.1-py2.py3-none-manylinux1_x86_64.manylinux_2_5_x86_64.whl#sha256=84502ec98f02a037a169c4b0d5d86075eaf6afc55e1879003d6cab51ced2ea4b # pip packaging @ https://files.pythonhosted.org/packages/49/df/1fceb2f8900f8639e278b056416d49134fb8d84c5942ffaa01ad34782422/packaging-24.0-py3-none-any.whl#sha256=2ddfb553fdf02fb784c234c7ba6ccc288296ceabec964ad2eae3777778130bc5 -# pip platformdirs @ 
https://files.pythonhosted.org/packages/b0/15/1691fa5aaddc0c4ea4901c26f6137c29d5f6673596fe960a0340e8c308e1/platformdirs-4.2.1-py3-none-any.whl#sha256=17d5a1161b3fd67b390023cb2d3b026bbd40abde6fdb052dfbd3a29c3ba22ee1 +# pip platformdirs @ https://files.pythonhosted.org/packages/68/13/2aa1f0e1364feb2c9ef45302f387ac0bd81484e9c9a4c5688a322fbdfd08/platformdirs-4.2.2-py3-none-any.whl#sha256=2d7a1657e36a80ea911db832a8a6ece5ee53d8de21edd5cc5879af6530b1bfee # pip pluggy @ https://files.pythonhosted.org/packages/88/5f/e351af9a41f866ac3f1fac4ca0613908d9a41741cfcf2228f4ad853b697d/pluggy-1.5.0-py3-none-any.whl#sha256=44e1ad92c8ca002de6377e165f3e0f1be63266ab4d554740532335b9d75ea669 # pip pygments @ https://files.pythonhosted.org/packages/f7/3f/01c8b82017c199075f8f788d0d906b9ffbbc5a47dc9918a945e13d5a2bda/pygments-2.18.0-py3-none-any.whl#sha256=b8e6aca0523f3ab76fee51799c488e38782ac06eafcf95e7ba832985c8e7b13a # pip six @ https://files.pythonhosted.org/packages/d9/5a/e7c31adbe875f2abbb91bd84cf2dc52d792b5a01506781dbcf25c91daf11/six-1.16.0-py2.py3-none-any.whl#sha256=8abb2f1d86890a2dfb989f9a77cfcfd3e47c2a354b01111771326f8aa26e0254 diff --git a/build_tools/azure/pymin_conda_forge_mkl_win-64_conda.lock b/build_tools/azure/pymin_conda_forge_mkl_win-64_conda.lock index d95e56378ae56..8f0a473c031ca 100644 --- a/build_tools/azure/pymin_conda_forge_mkl_win-64_conda.lock +++ b/build_tools/azure/pymin_conda_forge_mkl_win-64_conda.lock @@ -39,7 +39,7 @@ https://conda.anaconda.org/conda-forge/win-64/libbrotlienc-1.1.0-hcfcfb64_1.cond https://conda.anaconda.org/conda-forge/win-64/libintl-0.22.5-h5728263_2.conda#aa622c938af057adc119f8b8eecada01 https://conda.anaconda.org/conda-forge/win-64/libpng-1.6.43-h19919ed_0.conda#77e398acc32617a0384553aea29e866b https://conda.anaconda.org/conda-forge/win-64/libvorbis-1.3.7-h0e60522_0.tar.bz2#e1a22282de0169c93e4ffe6ce6acc212 -https://conda.anaconda.org/conda-forge/win-64/libxml2-2.12.6-hc3477c8_2.conda#ac7af7a949db01dae61ddc48f4a93d79 +https://conda.anaconda.org/conda-forge/win-64/libxml2-2.12.7-h283a6d9_0.conda#1451be68a5549561979125c1827b79ed https://conda.anaconda.org/conda-forge/win-64/m2w64-gcc-libs-5.3.0-7.tar.bz2#fe759119b8b3bfa720b8762c6fdc35de https://conda.anaconda.org/conda-forge/win-64/pcre2-10.43-h17e33f8_0.conda#d0485b8aa2cedb141a7bd27b4efa4c9c https://conda.anaconda.org/conda-forge/win-64/python-3.9.19-h4de0772_0_cpython.conda#b6999bc275e0e6beae7b1c8ea0be1e85 diff --git a/build_tools/azure/pymin_conda_forge_openblas_ubuntu_2204_linux-64_conda.lock b/build_tools/azure/pymin_conda_forge_openblas_ubuntu_2204_linux-64_conda.lock index 231cd528ecd0e..1a4d0feae1773 100644 --- a/build_tools/azure/pymin_conda_forge_openblas_ubuntu_2204_linux-64_conda.lock +++ b/build_tools/azure/pymin_conda_forge_openblas_ubuntu_2204_linux-64_conda.lock @@ -71,7 +71,7 @@ https://conda.anaconda.org/conda-forge/linux-64/libpng-1.6.43-h2797004_0.conda#0 https://conda.anaconda.org/conda-forge/linux-64/libsqlite-3.45.3-h2797004_0.conda#b3316cbe90249da4f8e84cd66e1cc55b https://conda.anaconda.org/conda-forge/linux-64/libvorbis-1.3.7-h9c3ff4c_0.tar.bz2#309dec04b70a3cc0f1e84a4013683bc0 https://conda.anaconda.org/conda-forge/linux-64/libxcb-1.15-h0b41bf4_0.conda#33277193f5b92bad9fdd230eb700929c -https://conda.anaconda.org/conda-forge/linux-64/libxml2-2.12.6-h232c23b_2.conda#9a3a42df8a95f65334dfc7b80da1195d +https://conda.anaconda.org/conda-forge/linux-64/libxml2-2.12.7-hc051c1a_0.conda#5d801a4906adc712d480afc362623b59 
https://conda.anaconda.org/conda-forge/linux-64/mysql-common-8.3.0-hf1915f5_4.conda#784a4df6676c581ca624fbe460703a6d https://conda.anaconda.org/conda-forge/linux-64/pcre2-10.43-hcad00b1_0.conda#8292dea9e022d9610a11fce5e0896ed8 https://conda.anaconda.org/conda-forge/linux-64/readline-8.2-h8228510_1.conda#47d31b792659ce70f470b5c82fdfb7a4 @@ -175,7 +175,7 @@ https://conda.anaconda.org/conda-forge/noarch/python-dateutil-2.9.0-pyhd8ed1ab_0 https://conda.anaconda.org/conda-forge/linux-64/sip-6.7.12-py39h3d6467e_0.conda#e667a3ab0df62c54e60e1843d2e6defb https://conda.anaconda.org/conda-forge/noarch/urllib3-2.2.1-pyhd8ed1ab_0.conda#08807a87fa7af10754d46f63b368e016 https://conda.anaconda.org/conda-forge/linux-64/gstreamer-1.24.3-haf2f30d_0.conda#f3df87cc9ef0b5113bff55aefcbcafd5 -https://conda.anaconda.org/conda-forge/linux-64/harfbuzz-8.4.0-h3d44ed6_0.conda#27f46291a6aaa3c2a4f798ebd35a7ddb +https://conda.anaconda.org/conda-forge/linux-64/harfbuzz-8.5.0-hfac3d4d_0.conda#f5126317dd0ce0ba26945e411ecc6960 https://conda.anaconda.org/conda-forge/noarch/importlib-resources-6.4.0-pyhd8ed1ab_0.conda#dcbadab7a68738a028e195ab68ab2d2e https://conda.anaconda.org/conda-forge/linux-64/liblapacke-3.9.0-22_linux64_openblas.conda#1fd156abd41a4992835952f6f4d951d0 https://conda.anaconda.org/conda-forge/linux-64/libsystemd0-255-h3516f8a_1.conda#3366af27f0b593544a6cd453c7932ac5 diff --git a/build_tools/azure/pypy3_linux-64_conda.lock b/build_tools/azure/pypy3_linux-64_conda.lock index 23710cfe35cb8..ab6a908edf340 100644 --- a/build_tools/azure/pypy3_linux-64_conda.lock +++ b/build_tools/azure/pypy3_linux-64_conda.lock @@ -4,24 +4,24 @@ @EXPLICIT https://conda.anaconda.org/conda-forge/linux-64/_libgcc_mutex-0.1-conda_forge.tar.bz2#d7c89558ba9fa0495403155b64376d81 https://conda.anaconda.org/conda-forge/linux-64/ca-certificates-2024.2.2-hbcca054_0.conda#2f4327a1cbe7f022401b236e915a5fef -https://conda.anaconda.org/conda-forge/linux-64/libstdcxx-ng-13.2.0-h7e041cc_5.conda#f6f6600d18a4047b54f803cf708b868a +https://conda.anaconda.org/conda-forge/linux-64/libstdcxx-ng-13.2.0-hc0a3c3a_7.conda#53ebd4c833fa01cb2c6353e99f905406 https://conda.anaconda.org/conda-forge/linux-64/python_abi-3.9-4_pypy39_pp73.conda#c1b2f29111681a4036ed21eaa3f44620 https://conda.anaconda.org/conda-forge/noarch/tzdata-2024a-h0c530f3_0.conda#161081fc7cec0bfda0d86d7cb595f8d8 https://conda.anaconda.org/conda-forge/linux-64/_openmp_mutex-4.5-2_kmp_llvm.tar.bz2#562b26ba2e19059551a811e72ab7f793 -https://conda.anaconda.org/conda-forge/linux-64/libgcc-ng-13.2.0-h807b86a_5.conda#d4ff227c46917d3b4565302a2bbb276b +https://conda.anaconda.org/conda-forge/linux-64/libgcc-ng-13.2.0-h77fa898_7.conda#72ec1b1b04c4d15d4204ece1ecea5978 https://conda.anaconda.org/conda-forge/linux-64/bzip2-1.0.8-hd590300_5.conda#69b8b6202a07720f448be700e300ccf4 https://conda.anaconda.org/conda-forge/linux-64/lerc-4.0.0-h27087fc_0.tar.bz2#76bbff344f0134279f225174e9064c8f https://conda.anaconda.org/conda-forge/linux-64/libbrotlicommon-1.1.0-hd590300_1.conda#aec6c91c7371c26392a06708a73c70e5 https://conda.anaconda.org/conda-forge/linux-64/libdeflate-1.20-hd590300_0.conda#8e88f9389f1165d7c0936fe40d9a9a79 https://conda.anaconda.org/conda-forge/linux-64/libexpat-2.6.2-h59595ed_0.conda#e7ba12deb7020dd080c6c70e7b6f6a3d https://conda.anaconda.org/conda-forge/linux-64/libffi-3.4.2-h7f98852_5.tar.bz2#d645c6d2ac96843a2bfaccd2d62b3ac3 -https://conda.anaconda.org/conda-forge/linux-64/libgfortran5-13.2.0-ha4646dd_5.conda#7a6bd7a12a4bd359e2afe6c0fa1acace 
+https://conda.anaconda.org/conda-forge/linux-64/libgfortran5-13.2.0-hca663fb_7.conda#c0bd771f09a326fdcd95a60b617795bf https://conda.anaconda.org/conda-forge/linux-64/libjpeg-turbo-3.0.0-hd590300_1.conda#ea25936bb4080d843790b586850f82b8 -https://conda.anaconda.org/conda-forge/linux-64/libwebp-base-1.3.2-hd590300_1.conda#049b7df8bae5e184d1de42cdf64855f8 +https://conda.anaconda.org/conda-forge/linux-64/libwebp-base-1.4.0-hd590300_0.conda#b26e8aa824079e1be0294e7152ca4559 https://conda.anaconda.org/conda-forge/linux-64/libzlib-1.2.13-hd590300_5.conda#f36c115f1ee199da648e0597ec2047ad -https://conda.anaconda.org/conda-forge/linux-64/ncurses-6.4.20240210-h59595ed_0.conda#97da8860a0da5413c7c98a3b3838a645 -https://conda.anaconda.org/conda-forge/linux-64/ninja-1.11.1-h924138e_0.conda#73a4953a2d9c115bdc10ff30a52f675f -https://conda.anaconda.org/conda-forge/linux-64/openssl-3.2.1-hd590300_1.conda#9d731343cff6ee2e5a25c4a091bf8e2a +https://conda.anaconda.org/conda-forge/linux-64/ncurses-6.5-h59595ed_0.conda#fcea371545eda051b6deafb24889fc69 +https://conda.anaconda.org/conda-forge/linux-64/ninja-1.12.1-h297d8ca_0.conda#3aa1c7e292afeff25a0091ddd7c69b72 +https://conda.anaconda.org/conda-forge/linux-64/openssl-3.3.0-hd590300_0.conda#c0f3abb4a16477208bbd43a39bd56f18 https://conda.anaconda.org/conda-forge/linux-64/pthread-stubs-0.4-h36c2ea0_1001.tar.bz2#22dad4df6e8630e8dff2428f6f6a7036 https://conda.anaconda.org/conda-forge/linux-64/xorg-kbproto-1.0.7-h7f98852_1002.tar.bz2#4b230e8381279d76131116660f5a241a https://conda.anaconda.org/conda-forge/linux-64/xorg-libxau-1.0.11-hd590300_0.conda#2c80dc38fface310c9bd81b17037fee5 @@ -32,22 +32,22 @@ https://conda.anaconda.org/conda-forge/linux-64/xz-5.2.6-h166bdaf_0.tar.bz2#2161 https://conda.anaconda.org/conda-forge/linux-64/expat-2.6.2-h59595ed_0.conda#53fb86322bdb89496d7579fe3f02fd61 https://conda.anaconda.org/conda-forge/linux-64/libbrotlidec-1.1.0-hd590300_1.conda#f07002e225d7a60a694d42a7bf5ff53f https://conda.anaconda.org/conda-forge/linux-64/libbrotlienc-1.1.0-hd590300_1.conda#5fc11c6020d421960607d821310fcd4d -https://conda.anaconda.org/conda-forge/linux-64/libgfortran-ng-13.2.0-h69a702a_5.conda#e73e9cfd1191783392131e6238bdb3e9 +https://conda.anaconda.org/conda-forge/linux-64/libgfortran-ng-13.2.0-h69a702a_7.conda#1b84f26d9f4f6026e179e7805d5a15cd https://conda.anaconda.org/conda-forge/linux-64/libpng-1.6.43-h2797004_0.conda#009981dd9cfcaa4dbfa25ffaed86bcae -https://conda.anaconda.org/conda-forge/linux-64/libsqlite-3.45.2-h2797004_0.conda#866983a220e27a80cb75e85cb30466a1 +https://conda.anaconda.org/conda-forge/linux-64/libsqlite-3.45.3-h2797004_0.conda#b3316cbe90249da4f8e84cd66e1cc55b https://conda.anaconda.org/conda-forge/linux-64/libxcb-1.15-h0b41bf4_0.conda#33277193f5b92bad9fdd230eb700929c https://conda.anaconda.org/conda-forge/linux-64/readline-8.2-h8228510_1.conda#47d31b792659ce70f470b5c82fdfb7a4 https://conda.anaconda.org/conda-forge/linux-64/tk-8.6.13-noxft_h4845f30_101.conda#d453b98d9c83e71da0741bb0ff4d76bc https://conda.anaconda.org/conda-forge/linux-64/zlib-1.2.13-hd590300_5.conda#68c34ec6149623be41a1933ab996a209 -https://conda.anaconda.org/conda-forge/linux-64/zstd-1.5.5-hfc55251_0.conda#04b88013080254850d6c01ed54810589 +https://conda.anaconda.org/conda-forge/linux-64/zstd-1.5.6-ha6fb4c9_0.conda#4d056880988120e29d75bfff282e0f45 https://conda.anaconda.org/conda-forge/linux-64/brotli-bin-1.1.0-hd590300_1.conda#39f910d205726805a958da408ca194ba https://conda.anaconda.org/conda-forge/linux-64/freetype-2.12.1-h267a509_2.conda#9ae35c3d96db2c94ce0cef86efdfa2cb 
https://conda.anaconda.org/conda-forge/linux-64/gdbm-1.18-h0a1914f_2.tar.bz2#b77bc399b07a19c00fe12fdc95ee0297 https://conda.anaconda.org/conda-forge/linux-64/libhiredis-1.0.2-h2cc385e_0.tar.bz2#b34907d3a81a3cd8095ee83d174c074a https://conda.anaconda.org/conda-forge/linux-64/libopenblas-0.3.27-pthreads_h413a1c8_0.conda#a356024784da6dfd4683dc5ecf45b155 https://conda.anaconda.org/conda-forge/linux-64/libtiff-4.6.0-h1dd3fc0_3.conda#66f03896ffbe1a110ffda05c7a856504 -https://conda.anaconda.org/conda-forge/linux-64/llvm-openmp-18.1.3-h4dfa4b3_0.conda#d39965123dffcad4d750989be65bcb7c -https://conda.anaconda.org/conda-forge/linux-64/sqlite-3.45.2-h2c6b66d_0.conda#1423efca06ed343c1da0fc429bae0779 +https://conda.anaconda.org/conda-forge/linux-64/llvm-openmp-18.1.5-ha31de31_0.conda#b923cdb6e567ada84f991ffcc5848afb +https://conda.anaconda.org/conda-forge/linux-64/sqlite-3.45.3-h2c6b66d_0.conda#be7d70f2db41b674733667bdd69bd000 https://conda.anaconda.org/conda-forge/linux-64/xorg-libx11-1.8.9-h8ee46fc_0.conda#077b6e8ad6a3ddb741fce2496dd01bec https://conda.anaconda.org/conda-forge/linux-64/brotli-1.1.0-hd590300_1.conda#f27a24d46e3ea7b70a1f98e50c62508f https://conda.anaconda.org/conda-forge/linux-64/ccache-4.9.1-h1fcd64f_0.conda#3620f564bcf28c3524951b6f64f5c5ac @@ -72,12 +72,12 @@ https://conda.anaconda.org/conda-forge/noarch/munkres-1.1.4-pyh9f0ad1d_0.tar.bz2 https://conda.anaconda.org/conda-forge/linux-64/numpy-1.26.4-py39h6dedee3_0.conda#557d64563e84ff21b14f586c7f662b7f https://conda.anaconda.org/conda-forge/noarch/packaging-24.0-pyhd8ed1ab_0.conda#248f521b64ce055e7feae3105e7abeb8 https://conda.anaconda.org/conda-forge/linux-64/pillow-10.3.0-py39h90a76f3_0.conda#799e6519cfffe2784db27b1db2ef33f3 -https://conda.anaconda.org/conda-forge/noarch/pluggy-1.4.0-pyhd8ed1ab_0.conda#139e9feb65187e916162917bb2484976 +https://conda.anaconda.org/conda-forge/noarch/pluggy-1.5.0-pyhd8ed1ab_0.conda#d3483c8fc2dc2cc3f5cf43e26d60cabf https://conda.anaconda.org/conda-forge/noarch/pyparsing-3.1.2-pyhd8ed1ab_0.conda#b9a4dacf97241704529131a0dfc0494f https://conda.anaconda.org/conda-forge/noarch/pypy-7.3.15-1_pypy39.conda#a418a6c16bd6f7ed56b92194214791a0 https://conda.anaconda.org/conda-forge/noarch/setuptools-69.5.1-pyhd8ed1ab_0.conda#7462280d81f639363e6e63c81276bd9e https://conda.anaconda.org/conda-forge/noarch/six-1.16.0-pyh6c4a22f_0.tar.bz2#e5f25f8dbc060e9a8d912e432202afc2 -https://conda.anaconda.org/conda-forge/noarch/threadpoolctl-3.4.0-pyhc1e730c_0.conda#b296278eef667c673bf51de6535bad88 +https://conda.anaconda.org/conda-forge/noarch/threadpoolctl-3.5.0-pyhc1e730c_0.conda#df68d78237980a159bd7149f33c0e8fd https://conda.anaconda.org/conda-forge/noarch/tomli-2.0.1-pyhd8ed1ab_0.tar.bz2#5844808ffab9ebdb694585b50ba02a96 https://conda.anaconda.org/conda-forge/linux-64/tornado-6.4-py39hf860d4a_0.conda#e7fded713fb466e1e0670afce1761b47 https://conda.anaconda.org/conda-forge/linux-64/unicodedata2-15.1.0-py39hf860d4a_0.conda#f699157518d28d00c87542b4ec1273be @@ -87,16 +87,16 @@ https://conda.anaconda.org/conda-forge/linux-64/blas-devel-3.9.0-22_linux64_open https://conda.anaconda.org/conda-forge/linux-64/contourpy-1.2.1-py39ha90811c_0.conda#07ed14c8326da42356514bcbc0b04802 https://conda.anaconda.org/conda-forge/linux-64/fonttools-4.51.0-py39hf860d4a_0.conda#63421b4dd7222fad555e34ec9af015a1 https://conda.anaconda.org/conda-forge/noarch/importlib_resources-6.4.0-pyhd8ed1ab_0.conda#c5d3907ad8bd7bf557521a1833cf7e6d -https://conda.anaconda.org/conda-forge/noarch/joblib-1.4.0-pyhd8ed1ab_0.conda#e0ed1bf13ce3a440e022157bf4764465 
+https://conda.anaconda.org/conda-forge/noarch/joblib-1.4.2-pyhd8ed1ab_0.conda#25df261d4523d9f9783bcdb7208d872f https://conda.anaconda.org/conda-forge/noarch/meson-1.4.0-pyhd8ed1ab_0.conda#52a0660cfa40b45bf254ecc3374cb2e0 https://conda.anaconda.org/conda-forge/noarch/pip-24.0-pyhd8ed1ab_0.conda#f586ac1e56c8638b64f9c8122a7b8a67 -https://conda.anaconda.org/conda-forge/noarch/pyproject-metadata-0.7.1-pyhd8ed1ab_0.conda#dcb27826ffc94d5f04e241322239983b +https://conda.anaconda.org/conda-forge/noarch/pyproject-metadata-0.8.0-pyhd8ed1ab_0.conda#573fe09d7bd0cd4bcc210d8369b5ca47 https://conda.anaconda.org/conda-forge/noarch/pytest-7.4.4-pyhd8ed1ab_0.conda#a9d145de8c5f064b5fa68fb34725d9f4 https://conda.anaconda.org/conda-forge/noarch/python-dateutil-2.9.0-pyhd8ed1ab_0.conda#2cf4264fffb9e6eff6031c5b6884d61c https://conda.anaconda.org/conda-forge/linux-64/scipy-1.12.0-py39h6dedee3_2.conda#6c5d74bac41838f4377dfd45085e1fec https://conda.anaconda.org/conda-forge/linux-64/blas-2.122-openblas.conda#5065468105542a8b23ea47bd8b6fa55f https://conda.anaconda.org/conda-forge/noarch/importlib-resources-6.4.0-pyhd8ed1ab_0.conda#dcbadab7a68738a028e195ab68ab2d2e -https://conda.anaconda.org/conda-forge/noarch/meson-python-0.15.0-pyh0c530f3_0.conda#3bc64565ca78ce3bb80248d09926d8f9 +https://conda.anaconda.org/conda-forge/noarch/meson-python-0.16.0-pyh0c530f3_0.conda#e16f0dbf502da873be9f9adb0dc52547 https://conda.anaconda.org/conda-forge/linux-64/pyamg-5.1.0-py39h5fd064f_0.conda#04676d2a49da3cb608af77e04b796ce1 https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-3.5.0-pyhd8ed1ab_0.conda#d5f595da2daead898ca958ac62f0307b https://conda.anaconda.org/conda-forge/linux-64/matplotlib-base-3.8.4-py39h4e7d633_0.conda#58272019e595dde98d0844ae3ebf0cfe diff --git a/build_tools/circle/doc_linux-64_conda.lock b/build_tools/circle/doc_linux-64_conda.lock index e2584c2d27333..34ec64ad5863b 100644 --- a/build_tools/circle/doc_linux-64_conda.lock +++ b/build_tools/circle/doc_linux-64_conda.lock @@ -95,7 +95,7 @@ https://conda.anaconda.org/conda-forge/linux-64/libpng-1.6.43-h2797004_0.conda#0 https://conda.anaconda.org/conda-forge/linux-64/libsqlite-3.45.3-h2797004_0.conda#b3316cbe90249da4f8e84cd66e1cc55b https://conda.anaconda.org/conda-forge/linux-64/libvorbis-1.3.7-h9c3ff4c_0.tar.bz2#309dec04b70a3cc0f1e84a4013683bc0 https://conda.anaconda.org/conda-forge/linux-64/libxcb-1.15-h0b41bf4_0.conda#33277193f5b92bad9fdd230eb700929c -https://conda.anaconda.org/conda-forge/linux-64/libxml2-2.12.6-h232c23b_2.conda#9a3a42df8a95f65334dfc7b80da1195d +https://conda.anaconda.org/conda-forge/linux-64/libxml2-2.12.7-hc051c1a_0.conda#5d801a4906adc712d480afc362623b59 https://conda.anaconda.org/conda-forge/linux-64/mysql-common-8.3.0-hf1915f5_4.conda#784a4df6676c581ca624fbe460703a6d https://conda.anaconda.org/conda-forge/linux-64/pcre2-10.43-hcad00b1_0.conda#8292dea9e022d9610a11fce5e0896ed8 https://conda.anaconda.org/conda-forge/linux-64/readline-8.2-h8228510_1.conda#47d31b792659ce70f470b5c82fdfb7a4 @@ -165,7 +165,7 @@ https://conda.anaconda.org/conda-forge/noarch/networkx-3.2.1-pyhd8ed1ab_0.conda# https://conda.anaconda.org/conda-forge/linux-64/openblas-0.3.27-pthreads_h7a3da1a_0.conda#4b422ebe8fc6a5320d0c1c22e5a46032 https://conda.anaconda.org/conda-forge/linux-64/openjpeg-2.5.2-h488ebb8_0.conda#7f2e286780f072ed750df46dc2631138 https://conda.anaconda.org/conda-forge/noarch/packaging-24.0-pyhd8ed1ab_0.conda#248f521b64ce055e7feae3105e7abeb8 
-https://conda.anaconda.org/conda-forge/noarch/platformdirs-4.2.1-pyhd8ed1ab_0.conda#d478a8a3044cdff1aa6e62f9269cefe0 +https://conda.anaconda.org/conda-forge/noarch/platformdirs-4.2.2-pyhd8ed1ab_0.conda#6f6cf28bf8e021933869bae3f84b8fc9 https://conda.anaconda.org/conda-forge/noarch/pluggy-1.5.0-pyhd8ed1ab_0.conda#d3483c8fc2dc2cc3f5cf43e26d60cabf https://conda.anaconda.org/conda-forge/noarch/ply-3.11-pyhd8ed1ab_2.conda#18c6deb6f9602e32446398203c8f0e91 https://conda.anaconda.org/conda-forge/linux-64/psutil-5.9.8-py39hd1e30aa_0.conda#ec86403fde8793ac1c36f8afa3d15902 @@ -220,7 +220,7 @@ https://conda.anaconda.org/conda-forge/linux-64/sip-6.7.12-py39h3d6467e_0.conda# https://conda.anaconda.org/conda-forge/noarch/urllib3-2.2.1-pyhd8ed1ab_0.conda#08807a87fa7af10754d46f63b368e016 https://conda.anaconda.org/conda-forge/linux-64/compilers-1.7.0-ha770c72_1.conda#d8d07866ac3b5b6937213c89a1874f08 https://conda.anaconda.org/conda-forge/linux-64/gstreamer-1.24.3-haf2f30d_0.conda#f3df87cc9ef0b5113bff55aefcbcafd5 -https://conda.anaconda.org/conda-forge/linux-64/harfbuzz-8.4.0-h3d44ed6_0.conda#27f46291a6aaa3c2a4f798ebd35a7ddb +https://conda.anaconda.org/conda-forge/linux-64/harfbuzz-8.5.0-hfac3d4d_0.conda#f5126317dd0ce0ba26945e411ecc6960 https://conda.anaconda.org/conda-forge/noarch/importlib-resources-6.4.0-pyhd8ed1ab_0.conda#dcbadab7a68738a028e195ab68ab2d2e https://conda.anaconda.org/conda-forge/noarch/lazy_loader-0.4-pyhd8ed1ab_0.conda#a284ff318fbdb0dd83928275b4b6087c https://conda.anaconda.org/conda-forge/linux-64/liblapacke-3.9.0-22_linux64_openblas.conda#1fd156abd41a4992835952f6f4d951d0 @@ -237,7 +237,7 @@ https://conda.anaconda.org/conda-forge/linux-64/imagecodecs-2024.1.1-py39ha98d97 https://conda.anaconda.org/conda-forge/noarch/imageio-2.34.1-pyh4b66e23_0.conda#bcf6a6f4c6889ca083e8d33afbafb8d5 https://conda.anaconda.org/conda-forge/linux-64/pandas-2.2.2-py39hddac248_0.conda#259c4e76e6bda8888aefc098ae1ba749 https://conda.anaconda.org/conda-forge/noarch/patsy-0.5.6-pyhd8ed1ab_0.conda#a5b55d1cb110cdcedc748b5c3e16e687 -https://conda.anaconda.org/conda-forge/linux-64/polars-0.20.25-py39ha963410_0.conda#d14227f0e141af743374d845fd4f5ccd +https://conda.anaconda.org/conda-forge/linux-64/polars-0.20.26-py39ha963410_0.conda#d138679a254e4e0918cfc1114c928bb8 https://conda.anaconda.org/conda-forge/noarch/pooch-1.8.1-pyhd8ed1ab_0.conda#d15917f33140f8d2ac9ca44db7ec8a25 https://conda.anaconda.org/conda-forge/linux-64/pulseaudio-client-17.0-hb77b528_0.conda#07f45f1be1c25345faddb8db0de8039b https://conda.anaconda.org/conda-forge/linux-64/pywavelets-1.4.1-py39h44dd56e_1.conda#d037c20e3da2e85f03ebd20ad480c359 diff --git a/build_tools/circle/doc_min_dependencies_environment.yml b/build_tools/circle/doc_min_dependencies_environment.yml index 298a60e8ec4ff..14f4485295455 100644 --- a/build_tools/circle/doc_min_dependencies_environment.yml +++ b/build_tools/circle/doc_min_dependencies_environment.yml @@ -30,7 +30,7 @@ dependencies: - numpydoc=1.2.0 # min - sphinx-prompt=1.3.0 # min - plotly=5.14.0 # min - - polars=0.19.12 # min + - polars=0.20.23 # min - pooch - pip - pip: diff --git a/build_tools/circle/doc_min_dependencies_linux-64_conda.lock b/build_tools/circle/doc_min_dependencies_linux-64_conda.lock index e08a14c235079..043587152c63b 100644 --- a/build_tools/circle/doc_min_dependencies_linux-64_conda.lock +++ b/build_tools/circle/doc_min_dependencies_linux-64_conda.lock @@ -1,6 +1,6 @@ # Generated by conda-lock. 
# platform: linux-64 -# input_hash: 32601810330a8200864f7908d07d870a3a58931be4f833691b2b5c7937f2d330 +# input_hash: 08b61aae27c59a8d35d008fa2f947440f3cbcbc41622112e33e68f90d69b621c @EXPLICIT https://conda.anaconda.org/conda-forge/linux-64/_libgcc_mutex-0.1-conda_forge.tar.bz2#d7c89558ba9fa0495403155b64376d81 https://conda.anaconda.org/conda-forge/linux-64/ca-certificates-2024.2.2-hbcca054_0.conda#2f4327a1cbe7f022401b236e915a5fef @@ -80,7 +80,7 @@ https://conda.anaconda.org/conda-forge/linux-64/libpng-1.6.43-h2797004_0.conda#0 https://conda.anaconda.org/conda-forge/linux-64/libsqlite-3.45.3-h2797004_0.conda#b3316cbe90249da4f8e84cd66e1cc55b https://conda.anaconda.org/conda-forge/linux-64/libvorbis-1.3.7-h9c3ff4c_0.tar.bz2#309dec04b70a3cc0f1e84a4013683bc0 https://conda.anaconda.org/conda-forge/linux-64/libxcb-1.15-h0b41bf4_0.conda#33277193f5b92bad9fdd230eb700929c -https://conda.anaconda.org/conda-forge/linux-64/libxml2-2.12.6-h232c23b_2.conda#9a3a42df8a95f65334dfc7b80da1195d +https://conda.anaconda.org/conda-forge/linux-64/libxml2-2.12.7-hc051c1a_0.conda#5d801a4906adc712d480afc362623b59 https://conda.anaconda.org/conda-forge/linux-64/mysql-common-8.3.0-hf1915f5_4.conda#784a4df6676c581ca624fbe460703a6d https://conda.anaconda.org/conda-forge/linux-64/pcre2-10.43-hcad00b1_0.conda#8292dea9e022d9610a11fce5e0896ed8 https://conda.anaconda.org/conda-forge/linux-64/readline-8.2-h8228510_1.conda#47d31b792659ce70f470b5c82fdfb7a4 @@ -146,7 +146,7 @@ https://conda.anaconda.org/conda-forge/linux-64/markupsafe-2.1.5-py39hd1e30aa_0. https://conda.anaconda.org/conda-forge/noarch/networkx-3.2-pyhd8ed1ab_0.conda#cec8cc498664cc00a070676aa89e69a7 https://conda.anaconda.org/conda-forge/linux-64/openjpeg-2.5.2-h488ebb8_0.conda#7f2e286780f072ed750df46dc2631138 https://conda.anaconda.org/conda-forge/noarch/packaging-24.0-pyhd8ed1ab_0.conda#248f521b64ce055e7feae3105e7abeb8 -https://conda.anaconda.org/conda-forge/noarch/platformdirs-4.2.1-pyhd8ed1ab_0.conda#d478a8a3044cdff1aa6e62f9269cefe0 +https://conda.anaconda.org/conda-forge/noarch/platformdirs-4.2.2-pyhd8ed1ab_0.conda#6f6cf28bf8e021933869bae3f84b8fc9 https://conda.anaconda.org/conda-forge/noarch/pluggy-1.5.0-pyhd8ed1ab_0.conda#d3483c8fc2dc2cc3f5cf43e26d60cabf https://conda.anaconda.org/conda-forge/noarch/ply-3.11-pyhd8ed1ab_2.conda#18c6deb6f9602e32446398203c8f0e91 https://conda.anaconda.org/conda-forge/linux-64/psutil-5.9.8-py39hd1e30aa_0.conda#ec86403fde8793ac1c36f8afa3d15902 @@ -199,7 +199,7 @@ https://conda.anaconda.org/conda-forge/linux-64/sip-6.7.12-py39h3d6467e_0.conda# https://conda.anaconda.org/conda-forge/noarch/urllib3-2.2.1-pyhd8ed1ab_0.conda#08807a87fa7af10754d46f63b368e016 https://conda.anaconda.org/conda-forge/linux-64/compilers-1.7.0-ha770c72_1.conda#d8d07866ac3b5b6937213c89a1874f08 https://conda.anaconda.org/conda-forge/linux-64/gstreamer-1.24.3-haf2f30d_0.conda#f3df87cc9ef0b5113bff55aefcbcafd5 -https://conda.anaconda.org/conda-forge/linux-64/harfbuzz-8.4.0-h3d44ed6_0.conda#27f46291a6aaa3c2a4f798ebd35a7ddb +https://conda.anaconda.org/conda-forge/linux-64/harfbuzz-8.5.0-hfac3d4d_0.conda#f5126317dd0ce0ba26945e411ecc6960 https://conda.anaconda.org/conda-forge/noarch/importlib_metadata-7.1.0-hd8ed1ab_0.conda#6ef2b72d291b39e479d7694efa2b2b98 https://conda.anaconda.org/conda-forge/linux-64/libblas-3.9.0-22_linux64_mkl.conda#eb6deb4ba6f92ea3f31c09cb8b764738 https://conda.anaconda.org/conda-forge/linux-64/libsystemd0-255-h3516f8a_1.conda#3366af27f0b593544a6cd453c7932ac5 @@ -223,7 +223,7 @@ 
https://conda.anaconda.org/conda-forge/noarch/imageio-2.34.1-pyh4b66e23_0.conda# https://conda.anaconda.org/conda-forge/linux-64/matplotlib-base-3.3.4-py39h2fa2bec_0.tar.bz2#9ec0b2186fab9121c54f4844f93ee5b7 https://conda.anaconda.org/conda-forge/linux-64/pandas-1.1.5-py39hde0f152_0.tar.bz2#79fc4b5b3a865b90dd3701cecf1ad33c https://conda.anaconda.org/conda-forge/noarch/patsy-0.5.6-pyhd8ed1ab_0.conda#a5b55d1cb110cdcedc748b5c3e16e687 -https://conda.anaconda.org/conda-forge/linux-64/polars-0.19.12-py39h90d8ae4_0.conda#191828961c95f8d59fa2b86a590f9905 +https://conda.anaconda.org/conda-forge/linux-64/polars-0.20.23-py39ha963410_0.conda#4871f09d653e979d598d2d4cd5fa868d https://conda.anaconda.org/conda-forge/linux-64/pyqt-5.15.9-py39h52134e7_5.conda#e1f148e57d071b09187719df86f513c1 https://conda.anaconda.org/conda-forge/linux-64/pywavelets-1.3.0-py39hd257fcd_1.tar.bz2#c4b698994b2d8d2e659ae02202e6abe4 https://conda.anaconda.org/conda-forge/linux-64/scipy-1.6.0-py39hee8e79c_0.tar.bz2#3afcb78281836e61351a2924f3230060 diff --git a/examples/gaussian_process/plot_gpr_co2.py b/examples/gaussian_process/plot_gpr_co2.py index 33b0ab7271549..b3da30daa0f6d 100644 --- a/examples/gaussian_process/plot_gpr_co2.py +++ b/examples/gaussian_process/plot_gpr_co2.py @@ -33,24 +33,25 @@ # We will derive a dataset from the Mauna Loa Observatory that collected air # samples. We are interested in estimating the concentration of CO2 and # extrapolate it for further year. First, we load the original dataset available -# in OpenML. +# in OpenML as a pandas dataframe. This will be replaced with Polars +# once `fetch_openml` adds a native support for it. from sklearn.datasets import fetch_openml co2 = fetch_openml(data_id=41187, as_frame=True) co2.frame.head() # %% -# First, we process the original dataframe to create a date index and select -# only the CO2 column. -import pandas as pd +# First, we process the original dataframe to create a date column and select +# it along with the CO2 column. +import polars as pl -co2_data = co2.frame -co2_data["date"] = pd.to_datetime(co2_data[["year", "month", "day"]]) -co2_data = co2_data[["date", "co2"]].set_index("date") +co2_data = pl.DataFrame(co2.frame[["year", "month", "day", "co2"]]).select( + pl.date("year", "month", "day"), "co2" +) co2_data.head() # %% -co2_data.index.min(), co2_data.index.max() +co2_data["date"].min(), co2_data["date"].max() # %% # We see that we get CO2 concentration for some days from March, 1958 to @@ -58,7 +59,8 @@ # understanding. import matplotlib.pyplot as plt -co2_data.plot() +plt.plot(co2_data["date"], co2_data["co2"]) +plt.xlabel("date") plt.ylabel("CO$_2$ concentration (ppm)") _ = plt.title("Raw air samples measurements from the Mauna Loa Observatory") @@ -67,15 +69,14 @@ # for which no measurements were collected. Such a processing will have an # smoothing effect on the data. 
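As a brief aside for readers new to the Polars idiom this patch introduces, a
tiny standalone sketch of the monthly resampling shown just below, with
made-up values rather than the example's data::

    import polars as pl
    from datetime import date

    df = pl.DataFrame(
        {
            "date": [date(1958, 3, 29), date(1958, 3, 30), date(1958, 4, 5)],
            "co2": [316.1, 317.0, 317.5],
        }
    )
    # Mean per calendar month via a dynamic group-by, mirroring the diff below.
    monthly = (
        df.sort("date")
        .group_by_dynamic("date", every="1mo")
        .agg(pl.col("co2").mean())
    )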
-try: - co2_data_resampled_monthly = co2_data.resample("ME") -except ValueError: - # pandas < 2.2 uses M instead of ME - co2_data_resampled_monthly = co2_data.resample("M") - - -co2_data = co2_data_resampled_monthly.mean().dropna(axis="index", how="any") -co2_data.plot() +co2_data = ( + co2_data.sort(by="date") + .group_by_dynamic("date", every="1mo") + .agg(pl.col("co2").mean()) + .drop_nulls() +) +plt.plot(co2_data["date"], co2_data["co2"]) +plt.xlabel("date") plt.ylabel("Monthly average of CO$_2$ concentration (ppm)") _ = plt.title( "Monthly average of air samples measurements\nfrom the Mauna Loa Observatory" @@ -88,7 +89,9 @@ # # As a first step, we will divide the data and the target to estimate. The data # being a date, we will convert it into a numeric. -X = (co2_data.index.year + co2_data.index.month / 12).to_numpy().reshape(-1, 1) +X = co2_data.select( + pl.col("date").dt.year() + pl.col("date").dt.month() / 12 +).to_numpy() y = co2_data["co2"].to_numpy() # %% diff --git a/pyproject.toml b/pyproject.toml index d9b95422e7ee5..f4bed8f20fa4a 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -62,7 +62,7 @@ docs = [ "sphinx-prompt>=1.3.0", "sphinxext-opengraph>=0.4.2", "plotly>=5.14.0", - "polars>=0.19.12" + "polars>=0.20.23" ] examples = [ "matplotlib>=3.3.4", @@ -82,7 +82,7 @@ tests = [ "black>=24.3.0", "mypy>=1.9", "pyamg>=4.0.0", - "polars>=0.19.12", + "polars>=0.20.23", "pyarrow>=12.0.0", "numpydoc>=1.2.0", "pooch>=1.6.0", diff --git a/sklearn/_min_dependencies.py b/sklearn/_min_dependencies.py index 00315f31d4c3f..0b1a96748a588 100644 --- a/sklearn/_min_dependencies.py +++ b/sklearn/_min_dependencies.py @@ -33,7 +33,7 @@ "black": ("24.3.0", "tests"), "mypy": ("1.9", "tests"), "pyamg": ("4.0.0", "tests"), - "polars": ("0.19.12", "docs, tests"), + "polars": ("0.20.23", "docs, tests"), "pyarrow": ("12.0.0", "tests"), "sphinx": ("6.0.0", "docs"), "sphinx-copybutton": ("0.5.2", "docs"), diff --git a/sklearn/tests/test_base.py b/sklearn/tests/test_base.py index 3bbc236e703df..a1cd3b8fc8c7b 100644 --- a/sklearn/tests/test_base.py +++ b/sklearn/tests/test_base.py @@ -834,7 +834,7 @@ class Estimator(BaseEstimator, WithSlots): [ ("dataframe", "1.5.0"), ("pyarrow", "12.0.0"), - ("polars", "0.19.12"), + ("polars", "0.20.23"), ], ) def test_dataframe_protocol(constructor_name, minversion): From b79420f1c2e82d814dec8026e96421751bfc9c96 Mon Sep 17 00:00:00 2001 From: Guillaume Lemaitre Date: Fri, 17 May 2024 15:14:33 +0200 Subject: [PATCH 546/554] FIX add long long for int32/int64 windows compat in NumPy 2.0 (#29029) --- sklearn/utils/arrayfuncs.pyx | 1 + sklearn/utils/tests/test_arrayfuncs.py | 4 +++- 2 files changed, 4 insertions(+), 1 deletion(-) diff --git a/sklearn/utils/arrayfuncs.pyx b/sklearn/utils/arrayfuncs.pyx index 346531d325ca5..1ad5804770358 100644 --- a/sklearn/utils/arrayfuncs.pyx +++ b/sklearn/utils/arrayfuncs.pyx @@ -16,6 +16,7 @@ ctypedef fused real_numeric: short int long + long long float double diff --git a/sklearn/utils/tests/test_arrayfuncs.py b/sklearn/utils/tests/test_arrayfuncs.py index 4a80a4c1edefd..a5c99427cbd00 100644 --- a/sklearn/utils/tests/test_arrayfuncs.py +++ b/sklearn/utils/tests/test_arrayfuncs.py @@ -26,7 +26,9 @@ def test_min_pos_no_positive(dtype): assert min_pos(X) == np.finfo(dtype).max -@pytest.mark.parametrize("dtype", [np.int16, np.int32, np.float32, np.float64]) +@pytest.mark.parametrize( + "dtype", [np.int16, np.int32, np.int64, np.float32, np.float64] +) @pytest.mark.parametrize("value", [0, 1.5, -1]) def 
test_all_with_any_reduction_axis_1(dtype, value): # Check that return value is False when there is no row equal to `value` From 9bd7047b4a6c673bcfd2911997f124e265f8ad57 Mon Sep 17 00:00:00 2001 From: Akihiro Kuno Date: Fri, 17 May 2024 23:57:47 +0900 Subject: [PATCH 547/554] FIX convergence criterion of MeanShift (#28951) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Co-authored-by: Olivier Grisel Co-authored-by: Jérémie du Boisberranger --- doc/whats_new/v1.5.rst | 3 +++ sklearn/cluster/_mean_shift.py | 2 +- sklearn/cluster/tests/test_mean_shift.py | 9 +++++++++ 3 files changed, 13 insertions(+), 1 deletion(-) diff --git a/doc/whats_new/v1.5.rst b/doc/whats_new/v1.5.rst index 5fdc0707ffbee..6dc76ceefaf5f 100644 --- a/doc/whats_new/v1.5.rst +++ b/doc/whats_new/v1.5.rst @@ -183,6 +183,9 @@ Changelog :mod:`sklearn.cluster` ...................... +- |Fix| The :class:`cluster.MeanShift` class now properly converges for constant data. + :pr:`28951` by :user:`Akihiro Kuno `. + - |FIX| Create copy of precomputed sparse matrix within the `fit` method of :class:`~cluster.OPTICS` to avoid in-place modification of the sparse matrix. :pr:`28491` by :user:`Thanh Lam Dang `. diff --git a/sklearn/cluster/_mean_shift.py b/sklearn/cluster/_mean_shift.py index fae11cca7df23..a99a607f3cf0d 100644 --- a/sklearn/cluster/_mean_shift.py +++ b/sklearn/cluster/_mean_shift.py @@ -122,7 +122,7 @@ def _mean_shift_single_seed(my_mean, X, nbrs, max_iter): my_mean = np.mean(points_within, axis=0) # If converged or at max_iter, adds the cluster if ( - np.linalg.norm(my_mean - my_old_mean) < stop_thresh + np.linalg.norm(my_mean - my_old_mean) <= stop_thresh or completed_iterations == max_iter ): break diff --git a/sklearn/cluster/tests/test_mean_shift.py b/sklearn/cluster/tests/test_mean_shift.py index 265c72d0c4ce1..d2d73ba11a3ec 100644 --- a/sklearn/cluster/tests/test_mean_shift.py +++ b/sklearn/cluster/tests/test_mean_shift.py @@ -25,6 +25,15 @@ ) +def test_convergence_of_1d_constant_data(): + # Test convergence using 1D constant data + # Non-regression test for: + # https://github.com/scikit-learn/scikit-learn/issues/28926 + model = MeanShift() + n_iter = model.fit(np.ones(10).reshape(-1, 1)).n_iter_ + assert n_iter < model.max_iter + + def test_estimate_bandwidth(): # Test estimate_bandwidth bandwidth = estimate_bandwidth(X, n_samples=200) From 4647729e5ee8c46e4fedace2d3c50c37f0a6693d Mon Sep 17 00:00:00 2001 From: jpienaar-tuks <112702520+jpienaar-tuks@users.noreply.github.com> Date: Sat, 18 May 2024 14:46:56 +0200 Subject: [PATCH 548/554] DOC Fix time complexity of MLP (#28592) Co-authored-by: Johann --- doc/modules/neural_networks_supervised.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/doc/modules/neural_networks_supervised.rst b/doc/modules/neural_networks_supervised.rst index 95d0a1be38238..7ee2387068c81 100644 --- a/doc/modules/neural_networks_supervised.rst +++ b/doc/modules/neural_networks_supervised.rst @@ -229,7 +229,7 @@ Complexity Suppose there are :math:`n` training samples, :math:`m` features, :math:`k` hidden layers, each containing :math:`h` neurons - for simplicity, and :math:`o` output neurons. The time complexity of backpropagation is -:math:`O(n\cdot m \cdot h^k \cdot o \cdot i)`, where :math:`i` is the number +:math:`O(i \cdot n \cdot (m \cdot h + (k - 1) \cdot h \cdot h + h \cdot o))`, where :math:`i` is the number of iterations. 
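For intuition, a rough worked instance of this bound (illustrative numbers,
not taken from the patch): with :math:`n = 1000` samples, :math:`m = 20`
features, :math:`k = 2` hidden layers of :math:`h = 50` neurons each,
:math:`o = 1` output and :math:`i = 100` iterations, the per-iteration term is
:math:`1000 \cdot (20 \cdot 50 + 1 \cdot 50 \cdot 50 + 50 \cdot 1) \approx 3.6 \times 10^6`
operations, i.e. on the order of :math:`3.6 \times 10^8` operations overall.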
Since backpropagation has a high time complexity, it is advisable
 to start with smaller number of hidden neurons and few hidden layers for
 training.

From ffbe4ab45bd9a113737231721fa2f55a70f3d0ab Mon Sep 17 00:00:00 2001
From: Adrin Jalali
Date: Sun, 19 May 2024 21:14:50 +0200
Subject: [PATCH 549/554] DOC remove obsolete SVM example (#27108)

---
 doc/conf.py                        |  1 +
 examples/svm/plot_svm_kernels.py   | 59 +++++++++++++++++++++++-------
 examples/svm/plot_svm_nonlinear.py | 45 -----------------------
 3 files changed, 46 insertions(+), 59 deletions(-)
 delete mode 100644 examples/svm/plot_svm_nonlinear.py

diff --git a/doc/conf.py b/doc/conf.py
index 9d77fc68d0f71..0587e98130118 100644
--- a/doc/conf.py
+++ b/doc/conf.py
@@ -301,6 +301,7 @@
     "auto_examples/decomposition/plot_beta_divergence": (
         "auto_examples/applications/plot_topics_extraction_with_nmf_lda"
     ),
+    "auto_examples/svm/plot_svm_nonlinear": "auto_examples/svm/plot_svm_kernels",
     "auto_examples/ensemble/plot_adaboost_hastie_10_2": (
         "auto_examples/ensemble/plot_adaboost_multiclass"
     ),
diff --git a/examples/svm/plot_svm_kernels.py b/examples/svm/plot_svm_kernels.py
index d801e2477e682..a63de6765f083 100644
--- a/examples/svm/plot_svm_kernels.py
+++ b/examples/svm/plot_svm_kernels.py
@@ -110,12 +110,15 @@
 from sklearn.inspection import DecisionBoundaryDisplay
 
 
-def plot_training_data_with_decision_boundary(kernel):
+def plot_training_data_with_decision_boundary(
+    kernel, ax=None, long_title=True, support_vectors=True
+):
     # Train the SVC
     clf = svm.SVC(kernel=kernel, gamma=2).fit(X, y)
 
     # Settings for plotting
-    _, ax = plt.subplots(figsize=(4, 3))
+    if ax is None:
+        _, ax = plt.subplots(figsize=(4, 3))
     x_min, x_max, y_min, y_max = -3, 3, -3, 3
     ax.set(xlim=(x_min, x_max), ylim=(y_min, y_max))
 
@@ -136,20 +139,26 @@ def plot_training_data_with_decision_boundary(kernel):
         linestyles=["--", "-", "--"],
     )
 
-    # Plot bigger circles around samples that serve as support vectors
-    ax.scatter(
-        clf.support_vectors_[:, 0],
-        clf.support_vectors_[:, 1],
-        s=250,
-        facecolors="none",
-        edgecolors="k",
-    )
+    if support_vectors:
+        # Plot bigger circles around samples that serve as support vectors
+        ax.scatter(
+            clf.support_vectors_[:, 0],
+            clf.support_vectors_[:, 1],
+            s=150,
+            facecolors="none",
+            edgecolors="k",
+        )
+
     # Plot samples by color and add legend
-    scatter = ax.scatter(X[:, 0], X[:, 1], c=y, s=150, edgecolors="k")
+    scatter = ax.scatter(X[:, 0], X[:, 1], c=y, s=30, edgecolors="k")
     ax.legend(*scatter.legend_elements(), loc="upper right", title="Classes")
-    ax.set_title(f" Decision boundaries of {kernel} kernel in SVC")
+    if long_title:
+        ax.set_title(f" Decision boundaries of {kernel} kernel in SVC")
+    else:
+        ax.set_title(kernel)
 
-    _ = plt.show()
+    if ax is None:
+        plt.show()
 
 
 # %%
@@ -237,7 +246,6 @@ def plot_training_data_with_decision_boundary(kernel):
 # using the hyperbolic tangent function (:math:`\tanh`). The kernel function
 # scales and possibly shifts the dot product of the two points
 # (:math:`\mathbf{x}_1` and :math:`\mathbf{x}_2`).
-
 plot_training_data_with_decision_boundary("sigmoid")
 
 # %%
@@ -271,3 +279,26 @@ def plot_training_data_with_decision_boundary(kernel):
 # parameters using techniques such as
 # :class:`~sklearn.model_selection.GridSearchCV` is recommended to capture the
 # underlying structures within the data.
+
+# %%
+# XOR dataset
+# -----------
+# A classical example of a dataset which is not linearly separable is the XOR
+# pattern. Here we demonstrate how different kernels work on such a dataset.
+ +xx, yy = np.meshgrid(np.linspace(-3, 3, 500), np.linspace(-3, 3, 500)) +np.random.seed(0) +X = np.random.randn(300, 2) +y = np.logical_xor(X[:, 0] > 0, X[:, 1] > 0) + +_, ax = plt.subplots(2, 2, figsize=(8, 8)) +args = dict(long_title=False, support_vectors=False) +plot_training_data_with_decision_boundary("linear", ax[0, 0], **args) +plot_training_data_with_decision_boundary("poly", ax[0, 1], **args) +plot_training_data_with_decision_boundary("rbf", ax[1, 0], **args) +plot_training_data_with_decision_boundary("sigmoid", ax[1, 1], **args) +plt.show() + +# %% +# As you can see from the plots above, only the `rbf` kernel can find a +# reasonable decision boundary for the above dataset. diff --git a/examples/svm/plot_svm_nonlinear.py b/examples/svm/plot_svm_nonlinear.py deleted file mode 100644 index 4990e509661a1..0000000000000 --- a/examples/svm/plot_svm_nonlinear.py +++ /dev/null @@ -1,45 +0,0 @@ -""" -============== -Non-linear SVM -============== - -Perform binary classification using non-linear SVC -with RBF kernel. The target to predict is a XOR of the -inputs. - -The color map illustrates the decision function learned by the SVC. - -""" - -import matplotlib.pyplot as plt -import numpy as np - -from sklearn import svm - -xx, yy = np.meshgrid(np.linspace(-3, 3, 500), np.linspace(-3, 3, 500)) -np.random.seed(0) -X = np.random.randn(300, 2) -Y = np.logical_xor(X[:, 0] > 0, X[:, 1] > 0) - -# fit the model -clf = svm.NuSVC(gamma="auto") -clf.fit(X, Y) - -# plot the decision function for each datapoint on the grid -Z = clf.decision_function(np.c_[xx.ravel(), yy.ravel()]) -Z = Z.reshape(xx.shape) - -plt.imshow( - Z, - interpolation="nearest", - extent=(xx.min(), xx.max(), yy.min(), yy.max()), - aspect="auto", - origin="lower", - cmap=plt.cm.PuOr_r, -) -contours = plt.contour(xx, yy, Z, levels=[0], linewidths=2, linestyles="dashed") -plt.scatter(X[:, 0], X[:, 1], s=30, c=Y, cmap=plt.cm.Paired, edgecolors="k") -plt.xticks(()) -plt.yticks(()) -plt.axis([-3, 3, -3, 3]) -plt.show() From 1e50434f18275bb8727c2a2e24cb953db143d8a5 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?J=C3=A9r=C3=A9mie=20du=20Boisberranger?= Date: Mon, 20 May 2024 11:51:20 +0200 Subject: [PATCH 550/554] set version --- pyproject.toml | 2 +- sklearn/__init__.py | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/pyproject.toml b/pyproject.toml index f4bed8f20fa4a..e365ef454b21b 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -1,6 +1,6 @@ [project] name = "scikit-learn" -version = "1.5.0rc1" +version = "1.5.0" description = "A set of python modules for machine learning and data mining" readme = "README.rst" maintainers = [ diff --git a/sklearn/__init__.py b/sklearn/__init__.py index 63b08e022f23d..d794f2489b92b 100644 --- a/sklearn/__init__.py +++ b/sklearn/__init__.py @@ -42,7 +42,7 @@ # Dev branch marker is: 'X.Y.dev' or 'X.Y.devN' where N is an integer. 
# 'X.Y.dev0' is the canonical version of 'X.Y.dev' # -__version__ = "1.5.0rc1" +__version__ = "1.5.0" # On OSX, we can get a runtime error due to multiple OpenMP libraries loaded From 729b54d5af208432f788ae7945842f0cf597bd36 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?J=C3=A9r=C3=A9mie=20du=20Boisberranger?= Date: Mon, 20 May 2024 12:29:29 +0200 Subject: [PATCH 551/554] test py3.12 against numpy 2 [cd build] --- .github/workflows/wheels.yml | 2 +- pyproject.toml | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/.github/workflows/wheels.yml b/.github/workflows/wheels.yml index 8bd7ffc17beca..8e0073e67426b 100644 --- a/.github/workflows/wheels.yml +++ b/.github/workflows/wheels.yml @@ -167,7 +167,7 @@ jobs: CIBW_CONFIG_SETTINGS_WINDOWS: "setup-args=--vsenv" CIBW_REPAIR_WHEEL_COMMAND_WINDOWS: bash build_tools/github/repair_windows_wheels.sh {wheel} {dest_dir} CIBW_BEFORE_TEST_WINDOWS: bash build_tools/github/build_minimal_windows_image.sh ${{ matrix.python }} - CIBW_TEST_REQUIRES: pytest pandas + CIBW_TEST_REQUIRES: pytest pandas ${{ matrix.python == 312 && 'numpy>=2.0.0rc2' || '' }} CIBW_TEST_COMMAND: bash {project}/build_tools/wheels/test_wheels.sh CIBW_TEST_COMMAND_WINDOWS: bash {project}/build_tools/github/test_windows_wheels.sh ${{ matrix.python }} CIBW_BUILD_VERBOSITY: 1 diff --git a/pyproject.toml b/pyproject.toml index e365ef454b21b..f244745f37d30 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -95,7 +95,7 @@ build-backend = "mesonpy" requires = [ "meson-python>=0.15.0", "Cython>=3.0.10", - "numpy>=2.0.0rc1", + "numpy>=2.0.0rc2", "scipy>=1.6.0", ] From 0ac28ade871ca71a89a71c834a7b47829b075829 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?J=C3=A9r=C3=A9mie=20du=20Boisberranger?= Date: Mon, 20 May 2024 14:31:55 +0200 Subject: [PATCH 552/554] DOC Release highlights 1.5 (#29007) Co-authored-by: Tim Head Co-authored-by: Guillaume Lemaitre Co-authored-by: Christian Lorentzen --- .../plot_release_highlights_1_5_0.py | 183 ++++++++++++++++++ 1 file changed, 183 insertions(+) create mode 100644 examples/release_highlights/plot_release_highlights_1_5_0.py diff --git a/examples/release_highlights/plot_release_highlights_1_5_0.py b/examples/release_highlights/plot_release_highlights_1_5_0.py new file mode 100644 index 0000000000000..0acc6fda6589d --- /dev/null +++ b/examples/release_highlights/plot_release_highlights_1_5_0.py @@ -0,0 +1,183 @@ +# ruff: noqa +""" +======================================= +Release Highlights for scikit-learn 1.5 +======================================= + +.. currentmodule:: sklearn + +We are pleased to announce the release of scikit-learn 1.5! Many bug fixes +and improvements were added, as well as some key new features. Below we +detail the highlights of this release. **For an exhaustive list of +all the changes**, please refer to the :ref:`release notes `. + +To install the latest version (with pip):: + + pip install --upgrade scikit-learn + +or with conda:: + + conda install -c conda-forge scikit-learn + +""" + +# %% +# FixedThresholdClassifier: Setting the decision threshold of a binary classifier +# ------------------------------------------------------------------------------- +# All binary classifiers of scikit-learn use a fixed decision threshold of 0.5 to +# convert probability estimates (i.e. output of `predict_proba`) into class +# predictions. However, 0.5 is almost never the desired threshold for a given problem. 
+# :class:`~model_selection.FixedThresholdClassifier` allows wrapping any binary
+# classifier and setting a custom decision threshold.
+from sklearn.datasets import make_classification
+from sklearn.linear_model import LogisticRegression
+from sklearn.metrics import confusion_matrix
+
+X, y = make_classification(n_samples=1_000, weights=[0.9, 0.1], random_state=0)
+classifier = LogisticRegression(random_state=0).fit(X, y)
+
+print("confusion matrix:\n", confusion_matrix(y, classifier.predict(X)))
+
+# %%
+# Lowering the threshold, i.e. allowing more samples to be classified as the positive
+# class, increases the number of true positives at the cost of more false positives
+# (as is well known from the concavity of the ROC curve).
+from sklearn.model_selection import FixedThresholdClassifier
+
+wrapped_classifier = FixedThresholdClassifier(classifier, threshold=0.1).fit(X, y)
+
+print("confusion matrix:\n", confusion_matrix(y, wrapped_classifier.predict(X)))
+
+# %%
+# TunedThresholdClassifierCV: Tuning the decision threshold of a binary classifier
+# --------------------------------------------------------------------------------
+# The decision threshold of a binary classifier can be tuned to optimize a given
+# metric, using :class:`~model_selection.TunedThresholdClassifierCV`.
+from sklearn.metrics import balanced_accuracy_score
+
+# Due to the class imbalance, the balanced accuracy is not optimal for the default
+# threshold. The classifier tends to over-predict the majority class.
+print(f"balanced accuracy: {balanced_accuracy_score(y, classifier.predict(X)):.2f}")
+
+# %%
+# Tuning the threshold to optimize the balanced accuracy gives a smaller threshold
+# that allows more samples to be classified as the positive class.
+from sklearn.model_selection import TunedThresholdClassifierCV
+
+tuned_classifier = TunedThresholdClassifierCV(
+    classifier, cv=5, scoring="balanced_accuracy"
+).fit(X, y)
+
+print(f"new threshold: {tuned_classifier.best_threshold_:.4f}")
+print(
+    f"balanced accuracy: {balanced_accuracy_score(y, tuned_classifier.predict(X)):.2f}"
+)
+
+# %%
+# :class:`~model_selection.TunedThresholdClassifierCV` also benefits from the
+# metadata routing support (:ref:`Metadata Routing User Guide`)
+# allowing one to optimize complex business metrics, detailed
+# in :ref:`Post-tuning the decision threshold for cost-sensitive learning
+# `.
+
+# %%
+# Performance improvements in PCA
+# -------------------------------
+# :class:`~decomposition.PCA` has a new solver, "covariance_eigh", which is faster
+# and more memory efficient than the other solvers for datasets with a large number
+# of samples and a small number of features.
+from sklearn.datasets import make_low_rank_matrix
+from sklearn.decomposition import PCA
+
+X = make_low_rank_matrix(
+    n_samples=10_000, n_features=100, tail_strength=0.1, random_state=0
+)
+
+pca = PCA(n_components=10).fit(X)
+
+print(f"explained variance: {pca.explained_variance_ratio_.sum():.2f}")
+
+# %%
+# The "full" solver has also been improved to use less memory and to
+# transform faster. The "auto" option for the solver takes advantage of the
+# new solver and is now able to select an appropriate solver for sparse
+# datasets.
+from scipy.sparse import random
+
+X = random(10000, 100, format="csr", random_state=0)
+
+pca = PCA(n_components=10, svd_solver="auto").fit(X)
+
+# %%
+# ColumnTransformer is subscriptable
+# ----------------------------------
+# The transformers of a :class:`~compose.ColumnTransformer` can now be directly
+# accessed using indexing by name.
+import numpy as np
+from sklearn.compose import ColumnTransformer
+from sklearn.preprocessing import StandardScaler, OneHotEncoder
+
+X = np.array([[0, 1, 2], [3, 4, 5]])
+column_transformer = ColumnTransformer(
+    [("std_scaler", StandardScaler(), [0]), ("one_hot", OneHotEncoder(), [1, 2])]
+)
+
+column_transformer.fit(X)
+
+print(column_transformer["std_scaler"])
+print(column_transformer["one_hot"])
+
+# %%
+# Custom imputation strategies for the SimpleImputer
+# --------------------------------------------------
+# :class:`~impute.SimpleImputer` now supports custom strategies for imputation,
+# using a callable that computes a scalar value from the non-missing values of
+# a column vector.
+from sklearn.impute import SimpleImputer
+
+X = np.array(
+    [
+        [-1.1, 1.1, 1.1],
+        [3.9, -1.2, np.nan],
+        [np.nan, 1.3, np.nan],
+        [-0.1, -1.4, -1.4],
+        [-4.9, 1.5, -1.5],
+        [np.nan, 1.6, 1.6],
+    ]
+)
+
+
+def smallest_abs(arr):
+    """Return the smallest absolute value of a 1D array."""
+    return np.min(np.abs(arr))
+
+
+imputer = SimpleImputer(strategy=smallest_abs)
+
+imputer.fit_transform(X)
+
+# %%
+# Pairwise distances with non-numeric arrays
+# ------------------------------------------
+# :func:`~metrics.pairwise_distances` can now compute distances between
+# non-numeric arrays using a callable metric.
+from sklearn.metrics import pairwise_distances
+
+X = ["cat", "dog"]
+Y = ["cat", "fox"]
+
+
+def levenshtein_distance(x, y):
+    """Return the Levenshtein distance between two strings."""
+    if x == "" or y == "":
+        return max(len(x), len(y))
+    if x[0] == y[0]:
+        return levenshtein_distance(x[1:], y[1:])
+    return 1 + min(
+        levenshtein_distance(x[1:], y),
+        levenshtein_distance(x, y[1:]),
+        levenshtein_distance(x[1:], y[1:]),
+    )
+
+
+pairwise_distances(X, Y, metric=levenshtein_distance)

From 919ae9bf72554a180baa3d8f4537b49c730b7580 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?J=C3=A9r=C3=A9mie=20du=20Boisberranger?=
Date: Mon, 20 May 2024 16:13:08 +0200
Subject: [PATCH 553/554] MAINT Reorder what's new for 1.5 (#29039)

---
 doc/templates/index.html |  10 +--
 doc/whats_new/v1.5.rst   | 143 +++++++++++++++++++++++----------------
 2 files changed, 88 insertions(+), 65 deletions(-)

diff --git a/doc/templates/index.html b/doc/templates/index.html
index 5b3a61a5b98bb..74816a4b473d3 100644
--- a/doc/templates/index.html
+++ b/doc/templates/index.html
@@ -167,7 +167,9 @@ Machine Learning in

 News
 On-going development:
-  scikit-learn 1.5 (Changelog)
+  scikit-learn 1.6 (Changelog)
+May 2024. scikit-learn 1.5.0 is available for download (Changelog).
 April 2024. scikit-learn 1.4.2 is available for download (Changelog).
@@ ... @@ News
 January 2024. scikit-learn 1.4.0 is available for download (Changelog).
-October 2023. scikit-learn 1.3.2 is available for download (Changelog).
-September 2023. scikit-learn 1.3.1 is available for download (Changelog).
-June 2023. scikit-learn 1.3.0 is available for download (Changelog).
 All releases: What's new (Changelog)
diff --git a/doc/whats_new/v1.5.rst b/doc/whats_new/v1.5.rst
index 6dc76ceefaf5f..c2c64e24ba9e0 100644
--- a/doc/whats_new/v1.5.rst
+++ b/doc/whats_new/v1.5.rst
@@ -8,10 +8,8 @@
 Version 1.5
 ===========
 
-..
-  -- UNCOMMENT WHEN 1.5.0 IS RELEASED --
-  For a short description of the main highlights of the release, please refer to
-  :ref:`sphx_glr_auto_examples_release_highlights_plot_release_highlights_1_5_0.py`.
+For a short description of the main highlights of the release, please refer to
+:ref:`sphx_glr_auto_examples_release_highlights_plot_release_highlights_1_5_0.py`.
 
 .. include:: changelog_legend.inc
 
@@ -20,7 +18,7 @@ Version 1.5
 Version 1.5.0
 =============
 
-**In Development**
+**May 2024**
 
 Security
 --------
@@ -59,6 +57,10 @@ Changed models
 Changes impacting many modules
 ------------------------------
 
+- |Fix| Raise `ValueError` with an informative error message when passing 1D
+  sparse arrays to methods that expect 2D sparse inputs.
+  :pr:`28988` by :user:`Olivier Grisel `.
+
 - |API| The name of the input of the `inverse_transform` method of estimators has
   been standardized to `X`. As a consequence, `Xt` is deprecated and will be
   removed in version 1.7 in the following estimators: :class:`cluster.FeatureAgglomeration`,
@@ -67,10 +69,6 @@ Changes impacting many modules
   :class:`pipeline.Pipeline` and :class:`preprocessing.KBinsDiscretizer`.
   :pr:`28756` by :user:`Will Dean `.
 
-- |Fix| Raise `ValueError` with an informative error message when passing 1D
-  sparse arrays to methods that expect 2D sparse inputs.
-  :pr:`28988` by :user:`Olivier Grisel `.
-
 Support for Array API
 ---------------------
 
@@ -82,8 +80,8 @@ See :ref:`array_api` for more details.
 **Functions:**
 
 - :func:`sklearn.metrics.r2_score` now supports Array API compliant inputs.
-  :pr:`27904` by :user:`Eric Lindgren `, `Franck Charras `,
-  `Olivier Grisel ` and `Tim Head `.
+  :pr:`27904` by :user:`Eric Lindgren `, :user:`Franck Charras `,
+  :user:`Olivier Grisel ` and :user:`Tim Head `.
 
 **Classes:**
 
@@ -103,8 +101,8 @@ Unless we discover a major blocker, setuptools support will be dropped in
 scikit-learn 1.6. The 1.5.x releases will support building scikit-learn with
 setuptools.
 
-Meson support for building scikit-learn was added in :pr:`28040` by :user:`Loïc
-Estève `
+Meson support for building scikit-learn was added in :pr:`28040` by
+:user:`Loïc Estève `
 
 Metadata Routing
 ----------------
@@ -120,7 +118,8 @@ more details.
   now support metadata routing. The fit methods now
   accept ``**fit_params`` which are passed to the underlying estimators
   via their `fit` methods.
-  :pr:`28432` by :user:`Adam Li ` and :user:`Benjamin Bossan `.
+  :pr:`28432` by :user:`Adam Li ` and
+  :user:`Benjamin Bossan `.
 
 - |Feature| :class:`linear_model.RidgeCV` and
   :class:`linear_model.RidgeClassifierCV` now support metadata routing in
@@ -144,8 +143,8 @@ more details.
 
 - |Feature| :class:`pipeline.FeatureUnion` now supports metadata routing in its
   ``fit`` and ``fit_transform`` methods and route metadata to the underlying
-  transformers' ``fit`` and ``fit_transform``. :pr:`28205` by :user:`Stefanie
-  Senger `.
+  transformers' ``fit`` and ``fit_transform``.
+  :pr:`28205` by :user:`Stefanie Senger `.
 
 - |Fix| Fix an issue when resolving default routing requests set via class
   attributes.
@@ -156,8 +155,8 @@ more details.
   :pr:`28651` by `Adrin Jalali`_.
 
 - |FIX| Prevent a `RecursionError` when estimators with the default `scoring`
-  param (`None`) route metadata. :pr:`28712` by :user:`Stefanie Senger
-  `.
+  param (`None`) route metadata.
+ :pr:`28712` by :user:`Stefanie Senger `. Changelog --------- @@ -217,7 +216,13 @@ Changelog :mod:`sklearn.cross_decomposition` .................................. -- |API| Deprecates `Y` in favor of `y` in the methods fit, transform and inverse_transform of: +- |Fix| The `coef_` fitted attribute of :class:`cross_decomposition.PLSRegression` + now takes into account both the scale of `X` and `Y` when `scale=True`. Note that + the previous predicted values were not affected by this bug. + :pr:`28612` by :user:`Guillaume Lemaitre `. + +- |API| Deprecates `Y` in favor of `y` in the methods fit, transform and + inverse_transform of: :class:`cross_decomposition.PLSRegression`. :class:`cross_decomposition.PLSCanonical`, :class:`cross_decomposition.CCA`, @@ -225,11 +230,6 @@ Changelog `Y` will be removed in version 1.7. :pr:`28604` by :user:`David Leon `. -- |Fix| The `coef_` fitted attribute of :class:`cross_decomposition.PLSRegression` - now takes into account both the scale of `X` and `Y` when `scale=True`. Note that - the previous predicted values were not affected by this bug. - :pr:`28612` by :user:`Guillaume Lemaitre `. - :mod:`sklearn.datasets` ....................... @@ -245,7 +245,8 @@ Changelog :func:`datasets.fetch_rcv1`, and :func:`datasets.fetch_species_distributions`. By default, the functions will retry up to 3 times in case of network failures. - :pr:`28160` by :user:`Zhehao Liu ` and :user:`Filip Karlo Došilović `. + :pr:`28160` by :user:`Zhehao Liu ` and + :user:`Filip Karlo Došilović `. :mod:`sklearn.decomposition` ............................ @@ -350,13 +351,8 @@ Changelog - |Fix| :class:`linear_model.ElasticNet`, :class:`linear_model.ElasticNetCV`, :class:`linear_model.Lasso` and :class:`linear_model.LassoCV` now explicitly don't - accept large sparse data formats. :pr:`27576` by :user:`Stefanie Senger - `. - -- |API| :class:`linear_model.RidgeCV` and :class:`linear_model.RidgeClassifierCV` - will now allow `alpha=0` when `cv != None`, which is consistent with - :class:`linear_model.Ridge` and :class:`linear_model.RidgeClassifier`. - :pr:`28425` by :user:`Lucy Liu `. + accept large sparse data formats. + :pr:`27576` by :user:`Stefanie Senger `. - |Fix| :class:`linear_model.RidgeCV` and :class:`RidgeClassifierCV` correctly pass `sample_weight` to the underlying scorer when `cv` is None. @@ -366,6 +362,11 @@ Changelog will now always be `None` when `tol` is set, as `n_nonzero_coefs` is ignored in this case. :pr:`28557` by :user:`Lucy Liu `. +- |API| :class:`linear_model.RidgeCV` and :class:`linear_model.RidgeClassifierCV` + will now allow `alpha=0` when `cv != None`, which is consistent with + :class:`linear_model.Ridge` and :class:`linear_model.RidgeClassifier`. + :pr:`28425` by :user:`Lucy Liu `. + - |API| Passing `average=0` to disable averaging is deprecated in :class:`linear_model.PassiveAggressiveClassifier`, :class:`linear_model.PassiveAggressiveRegressor`, @@ -382,7 +383,8 @@ Changelog :pr:`28703` by :user:`Christian Lorentzen `. - |API| `store_cv_values` and `cv_values_` are deprecated in favor of - `store_cv_results` and `cv_results_` in `RidgeCV` and `RidgeClassifierCV`. + `store_cv_results` and `cv_results_` in `~linear_model.RidgeCV` and + `~linear_model.RidgeClassifierCV`. :pr:`28915` by :user:`Lucy Liu `. :mod:`sklearn.manifold` @@ -401,8 +403,15 @@ Changelog :pr:`27456` by :user:`Venkatachalam N `, :user:`Kshitij Mathur ` and :user:`Julian Libiseller-Egger `. 
+- |Feature| :func:`sklearn.metrics.check_scoring` now returns a multi-metric scorer + when `scoring` as a `dict`, `set`, `tuple`, or `list`. :pr:`28360` by `Thomas Fan`_. + +- |Feature| :func:`metrics.d2_log_loss_score` has been added which + calculates the D^2 score for the log loss. + :pr:`28351` by :user:`Omar Salman `. + - |Efficiency| Improve efficiency of functions :func:`~metrics.brier_score_loss`, - :func:`~metrics.calibration_curve`, :func:`~metrics.det_curve`, + :func:`~calibration.calibration_curve`, :func:`~metrics.det_curve`, :func:`~metrics.precision_recall_curve`, :func:`~metrics.roc_curve` when `pos_label` argument is specified. Also improve efficiency of methods `from_estimator` @@ -411,9 +420,6 @@ Changelog :class:`~calibration.CalibrationDisplay`. :pr:`28051` by :user:`Pierre de Fréminville `. -- |Feature| :func:`sklearn.metrics.check_scoring` now returns a multi-metric scorer - when `scoring` as a `dict`, `set`, `tuple`, or `list`. :pr:`28360` by `Thomas Fan`_. - - |Fix|:class:`metrics.classification_report` now shows only accuracy and not micro-average when input is a subset of labels. :pr:`28399` by :user:`Vineet Joshi `. @@ -422,8 +428,8 @@ Changelog computation. This is likely to affect neighbor-based algorithms. :pr:`28692` by :user:`Loïc Estève `. -- |API| :func:`metrics.precision_recall_curve` deprecated the keyword argument `probas_pred` - in favor of `y_score`. `probas_pred` will be removed in version 1.7. +- |API| :func:`metrics.precision_recall_curve` deprecated the keyword argument + `probas_pred` in favor of `y_score`. `probas_pred` will be removed in version 1.7. :pr:`28092` by :user:`Adam Li `. - |API| :func:`metrics.brier_score_loss` deprecated the keyword argument `y_prob` @@ -434,10 +440,6 @@ Changelog is deprecated and will raise an error in v1.7. :pr:`18555` by :user:`Kaushik Amar Das `. -- |Feature| :func:`metrics.d2_log_loss_score` has been added which - calculates the D^2 score for the log loss. - :pr:`28351` by :user:`Omar Salman `. - :mod:`sklearn.mixture` ...................... @@ -460,22 +462,22 @@ Changelog raises a warning when groups are passed in to :term:`split`. :pr:`28210` by `Thomas Fan`_. +- |Enhancement| The HTML diagram representation of + :class:`~model_selection.GridSearchCV`, + :class:`~model_selection.RandomizedSearchCV`, + :class:`~model_selection.HalvingGridSearchCV`, and + :class:`~model_selection.HalvingRandomSearchCV` will show the best estimator when + `refit=True`. :pr:`28722` by :user:`Yao Xiao ` and `Thomas Fan`_. + - |Fix| the ``cv_results_`` attribute (of :class:`model_selection.GridSearchCV`) now returns masked arrays of the appropriate NumPy dtype, as opposed to always returning dtype ``object``. :pr:`28352` by :user:`Marco Gorelli`. -- |Fix| :func:`sklearn.model_selection.train_test_score` works with Array API inputs. +- |Fix| :func:`model_selection.train_test_split` works with Array API inputs. Previously indexing was not handled correctly leading to exceptions when using strict implementations of the Array API like CuPY. :pr:`28407` by :user:`Tim Head `. -- |Enhancement| The HTML diagram representation of - :class:`~model_selection.GridSearchCV`, - :class:`~model_selection.RandomizedSearchCV`, - :class:`~model_selection.HalvingGridSearchCV`, and - :class:`~model_selection.HalvingRandomSearchCV` will show the best estimator when - `refit=True`. :pr:`28722` by :user:`Yao Xiao ` and `Thomas Fan`_. - :mod:`sklearn.multioutput` .......................... 
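As a usage sketch of the multi-metric ``check_scoring`` entry above (a
minimal illustration; it assumes the returned scorer is called like any
single-metric scorer and yields a dict keyed by metric name)::

    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import check_scoring

    X, y = make_classification(random_state=0)
    clf = LogisticRegression().fit(X, y)

    # Passing a list of metric names now yields one multi-metric scorer
    scorer = check_scoring(clf, scoring=["accuracy", "roc_auc"])
    print(scorer(clf, X, y))  # expected: {'accuracy': ..., 'roc_auc': ...}
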
@@ -518,6 +520,10 @@ Changelog :mod:`sklearn.utils` .................... +- |Fix| :func:`~utils._safe_indexing` now works correctly for polars DataFrame when + `axis=0` and supports indexing polars Series. + :pr:`28521` by :user:`Yao Xiao `. + - |API| :data:`utils.IS_PYPY` is deprecated and will be removed in version 1.7. :pr:`28768` by :user:`Jérémie du Boisberranger `. @@ -529,15 +535,11 @@ Changelog `joblib.register_parallel_backend` instead. :pr:`28847` by :user:`Jérémie du Boisberranger `. -- |API| Raise informative warning message in :func:`type_of_target` when - represented as bytes. For classifiers and classification metrics, labels encoded +- |API| Raise informative warning message in :func:`~utils.multiclass.type_of_target` + when represented as bytes. For classifiers and classification metrics, labels encoded as bytes is deprecated and will raise an error in v1.7. :pr:`18555` by :user:`Kaushik Amar Das `. -- |Fix| :func:`~utils._safe_indexing` now works correctly for polars DataFrame when - `axis=0` and supports indexing polars Series. - :pr:`28521` by :user:`Yao Xiao `. - - |API| :func:`utils.estimator_checks.check_estimator_sparse_data` was split into two functions: :func:`utils.estimator_checks.check_estimator_sparse_matrix` and :func:`utils.estimator_checks.check_estimator_sparse_array`. @@ -548,4 +550,29 @@ Changelog Thanks to everyone who has contributed to the maintenance and improvement of the project since version 1.4, including: -TODO: update at the time of the release. +101AlexMartin, Abdulaziz Aloqeely, Adam J. Stewart, Adam Li, Adarsh Wase, Adrin +Jalali, Advik Sinha, Akash Srivastava, Akihiro Kuno, Alan Guedes, Alexis +IMBERT, Ana Paula Gomes, Anderson Nelson, Andrei Dzis, Arnaud Capitaine, Arturo +Amor, Aswathavicky, Bharat Raghunathan, Brendan Lu, Bruno, Cemlyn, Christian +Lorentzen, Christian Veenhuis, Cindy Liang, Claudio Salvatore Arcidiacono, +Connor Boyle, Conrad Stevens, crispinlogan, davidleon123, DerWeh, Dipan Banik, +Duarte São José, DUONG, Eddie Bergman, Edoardo Abati, Egehan Gunduz, Emad +Izadifar, Erich Schubert, Filip Karlo Došilović, Franck Charras, Gael +Varoquaux, Gönül Aycı, Guillaume Lemaitre, Gyeongjae Choi, Harmanan Kohli, +Hong Xiang Yue, Ian Faust, itsaphel, Ivan Wiryadi, Jack Bowyer, Javier Marin +Tur, Jérémie du Boisberranger, Jérôme Dockès, Jiawei Zhang, Joel Nothman, +Johanna Bayer, John Cant, John Hopfensperger, jpcars, jpienaar-tuks, Julian +Libiseller-Egger, Julien Jerphanion, KanchiMoe, Kaushik Amar Das, keyber, +Koustav Ghosh, kraktus, Krsto Proroković, ldwy4, LeoGrin, lihaitao, Linus +Sommer, Loic Esteve, Lucy Liu, Lukas Geiger, manasimj, Manuel Labbé, Manuel +Morales, Marco Edward Gorelli, Maren Westermann, Marija Vlajic, Mark Elliot, +Mateusz Sokół, Mavs, Michael Higgins, Michael Mayer, miguelcsilva, Miki +Watanabe, Mohammed Hamdy, myenugula, Nathan Goldbaum, Naziya Mahimkar, Neto, +Olivier Grisel, Omar Salman, Patrick Wang, Pierre de Fréminville, Priyash +Shah, Puneeth K, Rahil Parikh, raisadz, Raj Pulapakura, Ralf Gommers, Ralph +Urlus, Randolf Scholz, Reshama Shaikh, Richard Barnes, Rodrigo Romero, Saad +Mahmood, Salim Dohri, Sandip Dutta, SarahRemus, scikit-learn-bot, Shaharyar +Choudhry, Shubham, sperret6, Stefanie Senger, Suha Siddiqui, Thanh Lam DANG, +thebabush, Thomas J. 
Fan, Thomas Lazarus, Thomas Li, Tialo, Tim Head, Tuhin
+Sharma, VarunChaduvula, Vineet Joshi, virchan, Waël Boukhobza, Weyb, Will
+Dean, Xavier Beltran, Xiao Yuan, Xuefeng Xu, Yao Xiao

From b51d0c9648241d1fd53dc9151689f62a61633a3d Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?J=C3=A9r=C3=A9mie=20du=20Boisberranger?=
Date: Tue, 21 May 2024 15:59:29 +0200
Subject: [PATCH 554/554] trigger wheel builder [cd build]
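As a closing illustration, the MeanShift convergence fix from patch 547 can
be exercised with the same constant-data setup as its regression test (a
sketch only; the exact iteration count is not guaranteed)::

    import numpy as np
    from sklearn.cluster import MeanShift

    # Constant 1D data: the mean shift is zero from the first step, so the
    # loop should now stop on the `<=` comparison instead of running to
    # max_iter.
    model = MeanShift().fit(np.ones(10).reshape(-1, 1))
    print(model.n_iter_ < model.max_iter)  # expected: True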