
[MRG] Partial dependence plots -- continued #12599

Merged 137 commits on Apr 24, 2019

Commits
57f9a6f
general partial dependence plots
trevorstephens Oct 19, 2015
9a09888
add init
trevorstephens Oct 21, 2015
e714e16
implement exact and estimated methods
trevorstephens Nov 1, 2015
19ed28e
support for Pipeline and GridSearchCV type estimators
trevorstephens Nov 6, 2015
6fafc5e
add multioutput support
trevorstephens Nov 16, 2015
152a190
rebase and catch up to #6762, #7673, #7846
trevorstephens Jul 22, 2017
2cdc5ea
catch up on #9434
trevorstephens Jul 26, 2017
ba1f8da
initial update of plot_partial_dependence
trevorstephens Jul 26, 2017
1b1d8f0
deprecate ensemble.partial_dependence
trevorstephens Aug 12, 2017
9095305
refactor estimated and exact functions to _predict
trevorstephens Aug 15, 2017
3fc1727
make "auto" the default rather than None for method
trevorstephens Aug 15, 2017
259ec99
some more refactoring
trevorstephens Aug 16, 2017
cbc20af
avoid namespace collision
trevorstephens Aug 21, 2017
63da115
fix output shapes of all estimators
trevorstephens Aug 22, 2017
8f7d2b0
add tests to ensure all estimators output same shape
trevorstephens Aug 22, 2017
6fc3a49
quick fixes
trevorstephens Aug 30, 2017
b1f8bfc
fix docstring, test fails
trevorstephens Sep 1, 2017
dc93b69
refactor tests for easier debugging
trevorstephens Sep 1, 2017
cd8f8de
speed up tests, add two-way plot test
trevorstephens Sep 2, 2017
4eb1a80
move input validation on X
trevorstephens Sep 2, 2017
21544ce
fix output shape for multi-label classification
trevorstephens Oct 28, 2017
610b5c5
update plot helper to support multi-output
trevorstephens Oct 28, 2017
dcbd0c6
update plot helper to pass-through output
trevorstephens Oct 28, 2017
08ef804
Merge branch 'master' into partial_dep
NicolasHug Nov 9, 2018
3f5c7f7
removed estimated method, small refactoring
NicolasHug Nov 9, 2018
45e648d
factorized some test
NicolasHug Nov 10, 2018
ba4868f
some more refactoring
NicolasHug Nov 12, 2018
414387b
test for _grid_from_X
NicolasHug Nov 12, 2018
f635693
few changes and comments
NicolasHug Nov 13, 2018
58bfbad
some test + removed multioutput logic for now
NicolasHug Nov 13, 2018
cabb7f1
some more tests
NicolasHug Nov 13, 2018
7e28cf5
removed support for multioutput multiclass and added back multioutput…
NicolasHug Nov 14, 2018
f84787d
better tests
NicolasHug Nov 14, 2018
545ca6f
Removed support for RandomForestRegressor with recursion (does not
NicolasHug Nov 14, 2018
e086051
merged label and output into target
NicolasHug Nov 14, 2018
137cd07
renamed exact into brute
NicolasHug Nov 14, 2018
b00a23d
renamin
NicolasHug Nov 14, 2018
787e07f
some refactoring and tests
NicolasHug Nov 14, 2018
39dffd7
some docs and tests
NicolasHug Nov 15, 2018
f9f7ee7
Added check for grid_resolution
NicolasHug Nov 15, 2018
15e824d
docs
NicolasHug Nov 15, 2018
2f34cc1
added deprecation in doc and used decorator
NicolasHug Nov 15, 2018
a3f6ed1
Merge branch 'master' into partial_dep
NicolasHug Nov 15, 2018
36db441
added whatsnew entry
NicolasHug Nov 15, 2018
ef40ede
pep8
NicolasHug Nov 15, 2018
a766cad
added PR number to whatsnew
NicolasHug Nov 15, 2018
828cca4
sorted dict keys for python2
NicolasHug Nov 15, 2018
e86bdab
trying to fix python37 issue
NicolasHug Nov 15, 2018
e26c0ac
removed use of dict for function dispatching
NicolasHug Nov 15, 2018
763d151
filtered out warnings in tests
NicolasHug Nov 15, 2018
f5ff519
added test for multioutput
NicolasHug Nov 15, 2018
32cafe8
fixed comment
NicolasHug Nov 15, 2018
784277d
Fixed doctest
NicolasHug Nov 16, 2018
2a58752
updated docstrings
NicolasHug Nov 16, 2018
a5285e1
put lazy imports in deprecated module
NicolasHug Nov 16, 2018
9bcc0ca
Finished removing old support for RandomForest
NicolasHug Nov 16, 2018
1c0b11d
fixed whatsnew
NicolasHug Nov 16, 2018
8f016c6
removed unrelated change
NicolasHug Nov 16, 2018
4bf6c90
small test refactoring
NicolasHug Nov 16, 2018
e628ae6
Merge branch 'master' into partial_dep
NicolasHug Nov 16, 2018
fa6eba7
pyt back ifmatplotlib dec
NicolasHug Nov 16, 2018
634dc33
pep8
NicolasHug Nov 16, 2018
2783b04
Merge branch 'master' into partial_dep
NicolasHug Nov 21, 2018
512b353
addressed some comments
NicolasHug Nov 24, 2018
736ba01
Merge branch 'master' into partial_dep
NicolasHug Nov 24, 2018
ef09e80
Added sanity check
NicolasHug Nov 28, 2018
60b69b8
Added warnings about non constant init estimators
NicolasHug Dec 7, 2018
34edf8f
Removed useless train_test_split from example
NicolasHug Dec 7, 2018
abd242b
Merge branch 'master' into partial_dep
NicolasHug Dec 10, 2018
4975dc9
put back old versions in ensemble/partial_dependence.py to remove gri…
NicolasHug Dec 17, 2018
995f4e9
Removed grid param from partial_dependence()
NicolasHug Dec 17, 2018
f817460
Merge branch 'master' into partial_dep
NicolasHug Dec 17, 2018
46d3da4
Merge branch 'master' into partial_dep
NicolasHug Dec 19, 2018
7a8fb44
Added MLPRegressor to example
NicolasHug Dec 19, 2018
2f67a35
Remoed ax param and used fig instead
NicolasHug Dec 19, 2018
2e1f926
Addressed comments
NicolasHug Jan 9, 2019
56ac79e
minor docstring change
NicolasHug Jan 10, 2019
4149a9c
Addressed comments
NicolasHug Jan 14, 2019
69b95e9
Addressed comments from Joel
NicolasHug Jan 15, 2019
c7c4614
Merge branch 'master' into partial_dep
NicolasHug Jan 16, 2019
4d7c062
Merge branch 'master' into partial_dep
NicolasHug Feb 3, 2019
18c8f32
Merge branch 'partial_dep' of github.com:NicolasHug/scikit-learn into…
NicolasHug Feb 3, 2019
01ab87c
rm blank line
NicolasHug Feb 3, 2019
ca8c0fd
moved into inspect module
NicolasHug Feb 4, 2019
d68ebc9
added sklearn/inspect/tests/__init__.py
NicolasHug Feb 4, 2019
95de133
Merge branch 'master' into partial_dep
NicolasHug Feb 4, 2019
5d09584
Hopefully fixes windows issue?
NicolasHug Feb 12, 2019
592a589
Using add_subpackage
NicolasHug Feb 13, 2019
5f4c317
Merge branch 'master' into partial_dep
NicolasHug Feb 26, 2019
38c1c54
wording
NicolasHug Feb 26, 2019
88ce5fa
Merge branch 'master' into partial_dep
NicolasHug Feb 27, 2019
54ece6b
Added response parameter
NicolasHug Feb 27, 2019
aaeab44
indent
NicolasHug Feb 28, 2019
28b936a
pep8
NicolasHug Feb 28, 2019
c40d472
Merge branch 'master' into partial_dep
NicolasHug Mar 1, 2019
2c65b03
changed proba into predict_proba, decision into decision_function and…
NicolasHug Mar 1, 2019
2bc0f0f
Updated references
NicolasHug Mar 12, 2019
600cf7e
link to glossary terms
NicolasHug Mar 12, 2019
91c7822
Merge remote-tracking branch 'upstream/master' into partial_dep
NicolasHug Mar 28, 2019
abb15b6
Addressed Joels comments
NicolasHug Mar 28, 2019
de2ebe5
Addressed Joels comments
NicolasHug Mar 28, 2019
fe8a026
Use pytest for exceptions and warnings
NicolasHug Mar 28, 2019
07e8de4
Renamed inspect into model_selection
NicolasHug Mar 28, 2019
d33a732
created plot module and put plot_partial_dependence() there
NicolasHug Mar 28, 2019
27a7eab
Merge branch 'master' of github.com:scikit-learn/scikit-learn into pa…
NicolasHug Apr 18, 2019
8262419
Apply suggestions from code review
glemaitre Apr 18, 2019
3464b7e
Addressed comments from Guillaume
NicolasHug Apr 18, 2019
591471f
put everything in sklearn.inspection
NicolasHug Apr 18, 2019
1c073ec
Merge branch 'partial_dep' of github.com:NicolasHug/scikit-learn into…
NicolasHug Apr 18, 2019
b89d5c4
removed model_inspection
NicolasHug Apr 18, 2019
f27f542
pep8
NicolasHug Apr 18, 2019
9bca47c
plot_partial_dep doesnt return anything
NicolasHug Apr 18, 2019
ad326af
Ignored dep warning for new tests in ensemble
NicolasHug Apr 19, 2019
3823fd2
ported sample_weight tests
NicolasHug Apr 19, 2019
ba06512
Merge remote-tracking branch 'upstream/master' into partial_dep
NicolasHug Apr 20, 2019
2be41ce
docstring
NicolasHug Apr 21, 2019
0b400c0
Merge branch 'master' of github.com:scikit-learn/scikit-learn into pa…
NicolasHug Apr 23, 2019
743c838
Addressed comments
NicolasHug Apr 23, 2019
cb5166a
Apply suggestions from code review
glemaitre Apr 23, 2019
b2e3be7
Merge branch 'partial_dep' of github.com:NicolasHug/scikit-learn into…
NicolasHug Apr 23, 2019
f9cb127
forgot some merging conflicts
NicolasHug Apr 23, 2019
a8d7991
put back old test
NicolasHug Apr 23, 2019
02e74ec
Update sklearn/utils/__init__.py
glemaitre Apr 23, 2019
bed53c1
Merge branch 'partial_dep' of github.com:NicolasHug/scikit-learn into…
NicolasHug Apr 23, 2019
2b52051
Apply suggestions from code review
glemaitre Apr 23, 2019
0a13f1b
comments
NicolasHug Apr 23, 2019
14dbd2b
comments
NicolasHug Apr 23, 2019
4a1b11d
Addressed comments
NicolasHug Apr 23, 2019
92795a9
pep8
NicolasHug Apr 23, 2019
30bffb3
Avoid re-computating quantiles
NicolasHug Apr 23, 2019
e0094b6
fixed example
NicolasHug Apr 23, 2019
d5b1559
removed warnings refs
NicolasHug Apr 23, 2019
3fce2c4
MAINT: install matplotlib in conda latest build
glemaitre Apr 23, 2019
d22d7b2
Addressed comments
NicolasHug Apr 24, 2019
80f53cb
Merge branch 'partial_dep' of github.com:NicolasHug/scikit-learn into…
NicolasHug Apr 24, 2019
5677050
Merge remote-tracking branch 'upstream/master' into partial_dep
NicolasHug Apr 24, 2019
27c261e
forgot if_mpl decorator
NicolasHug Apr 24, 2019
1 change: 1 addition & 0 deletions azure-pipelines.yml
@@ -38,6 +38,7 @@ jobs:
PYAMG_VERSION: '*'
PILLOW_VERSION: '*'
JOBLIB_VERSION: '*'
MATPLOTLIB_VERSION: '*'
COVERAGE: 'true'
CHECK_PYTEST_SOFT_DEPENDENCY: 'true'
TEST_DOCSTRINGS: 'true'
4 changes: 4 additions & 0 deletions build_tools/azure/install.sh
@@ -47,6 +47,10 @@ if [[ "$DISTRIB" == "conda" ]]; then
TO_INSTALL="$TO_INSTALL pillow=$PILLOW_VERSION"
fi

if [[ -n "$MATPLOTLIB_VERSION" ]]; then
TO_INSTALL="$TO_INSTALL matplotlib=$MATPLOTLIB_VERSION"
fi

make_conda $TO_INSTALL

elif [[ "$DISTRIB" == "ubuntu" ]]; then
10 changes: 10 additions & 0 deletions doc/inspection.rst
@@ -0,0 +1,10 @@
.. include:: includes/big_toc_css.rst

.. _inspection:

Inspection
----------

.. toctree::

modules/partial_dependence
43 changes: 26 additions & 17 deletions doc/modules/classes.rst
@@ -428,23 +428,6 @@ Samples generator
:template: function.rst


partial dependence
------------------

.. automodule:: sklearn.ensemble.partial_dependence
:no-members:
:no-inherited-members:

.. currentmodule:: sklearn

.. autosummary::
:toctree: generated/
:template: function.rst

ensemble.partial_dependence.partial_dependence
ensemble.partial_dependence.plot_partial_dependence


.. _exceptions_ref:

:mod:`sklearn.exceptions`: Exceptions and warnings
@@ -1229,6 +1212,25 @@ Model validation
pipeline.make_union


.. _inspection_ref:

:mod:`sklearn.inspection`: inspection
=====================================

.. automodule:: sklearn.inspection
:no-members:
:no-inherited-members:

.. currentmodule:: sklearn

.. autosummary::
:toctree: generated/
:template: function.rst

inspection.partial_dependence
inspection.plot_partial_dependence


.. _preprocessing_ref:

Review comment (Member): Please move ensemble.* to under Deprecated below.

:mod:`sklearn.preprocessing`: Preprocessing and Normalization
@@ -1509,6 +1511,13 @@ To be removed in 0.23
metrics.jaccard_similarity_score
linear_model.logistic_regression_path

.. autosummary::
:toctree: generated/
:template: function.rst

ensemble.partial_dependence.partial_dependence
ensemble.partial_dependence.plot_partial_dependence


To be removed in 0.22
---------------------
125 changes: 0 additions & 125 deletions doc/modules/ensemble.rst
@@ -797,131 +797,6 @@ accessed via the ``feature_importances_`` property::

* :ref:`sphx_glr_auto_examples_ensemble_plot_gradient_boosting_regression.py`

.. currentmodule:: sklearn.ensemble.partial_dependence

.. _partial_dependence:

Partial dependence
..................

Partial dependence plots (PDP) show the dependence between the target response
and a set of 'target' features, marginalizing over the
values of all other features (the 'complement' features).
Intuitively, we can interpret the partial dependence as the expected
target response [1]_ as a function of the 'target' features [2]_.

Due to the limits of human perception the size of the target feature
set must be small (usually, one or two) thus the target features are
usually chosen among the most important features.

The Figure below shows four one-way and one two-way partial dependence plots
for the California housing dataset:

.. figure:: ../auto_examples/ensemble/images/sphx_glr_plot_partial_dependence_001.png
:target: ../auto_examples/ensemble/plot_partial_dependence.html
:align: center
:scale: 70

One-way PDPs tell us about the interaction between the target
response and the target feature (e.g. linear, non-linear).
The upper left plot in the above Figure shows the effect of the
median income in a district on the median house price; we can
clearly see a linear relationship among them.

PDPs with two target features show the
interactions among the two features. For example, the two-variable PDP in the
above Figure shows the dependence of median house price on joint
values of house age and avg. occupants per household. We can clearly
see an interaction between the two features:
For an avg. occupancy greater than two, the house price is nearly independent
of the house age, whereas for values less than two there is a strong dependence
on age.

The module :mod:`partial_dependence` provides a convenience function
:func:`~sklearn.ensemble.partial_dependence.plot_partial_dependence`
to create one-way and two-way partial dependence plots. In the below example
we show how to create a grid of partial dependence plots: two one-way
PDPs for the features ``0`` and ``1`` and a two-way PDP between the two
features::

>>> from sklearn.datasets import make_hastie_10_2
>>> from sklearn.ensemble import GradientBoostingClassifier
>>> from sklearn.ensemble.partial_dependence import plot_partial_dependence

>>> X, y = make_hastie_10_2(random_state=0)
>>> clf = GradientBoostingClassifier(n_estimators=100, learning_rate=1.0,
... max_depth=1, random_state=0).fit(X, y)
>>> features = [0, 1, (0, 1)]
>>> fig, axs = plot_partial_dependence(clf, X, features) #doctest: +SKIP

For multi-class models, you need to set the class label for which the
PDPs should be created via the ``label`` argument::

>>> from sklearn.datasets import load_iris
>>> iris = load_iris()
>>> mc_clf = GradientBoostingClassifier(n_estimators=10,
... max_depth=1).fit(iris.data, iris.target)
>>> features = [3, 2, (3, 2)]
>>> fig, axs = plot_partial_dependence(mc_clf, X, features, label=0) #doctest: +SKIP

If you need the raw values of the partial dependence function rather
than the plots you can use the
:func:`~sklearn.ensemble.partial_dependence.partial_dependence` function::

>>> from sklearn.ensemble.partial_dependence import partial_dependence

>>> pdp, axes = partial_dependence(clf, [0], X=X)
>>> pdp # doctest: +ELLIPSIS
array([[ 2.46643157, 2.46643157, ...
>>> axes # doctest: +ELLIPSIS
[array([-1.62497054, -1.59201391, ...

The function requires either the argument ``grid`` which specifies the
values of the target features on which the partial dependence function
should be evaluated or the argument ``X`` which is a convenience mode
for automatically creating ``grid`` from the training data. If ``X``
is given, the ``axes`` value returned by the function gives the axis
for each target feature.

For each value of the 'target' features in the ``grid`` the partial
dependence function need to marginalize the predictions of a tree over
all possible values of the 'complement' features. In decision trees
this function can be evaluated efficiently without reference to the
training data. For each grid point a weighted tree traversal is
performed: if a split node involves a 'target' feature, the
corresponding left or right branch is followed, otherwise both
branches are followed, each branch is weighted by the fraction of
training samples that entered that branch. Finally, the partial
dependence is given by a weighted average of all visited leaves. For
tree ensembles the results of each individual tree are again
averaged.

.. rubric:: Footnotes

.. [1] For classification with ``loss='deviance'`` the target
response is logit(p).

.. [2] More precisely its the expectation of the target response after
accounting for the initial model; partial dependence plots
do not include the ``init`` model.

.. topic:: Examples:

* :ref:`sphx_glr_auto_examples_ensemble_plot_partial_dependence.py`


.. topic:: References

.. [F2001] J. Friedman, "Greedy Function Approximation: A Gradient Boosting Machine",
The Annals of Statistics, Vol. 29, No. 5, 2001.

.. [F1999] J. Friedman, "Stochastic Gradient Boosting", 1999

.. [HTF2009] T. Hastie, R. Tibshirani and J. Friedman, "Elements of Statistical Learning Ed. 2", Springer, 2009.

.. [R2007] G. Ridgeway, "Generalized Boosted Models: A guide to the gbm package", 2007


.. _voting_classifier:

Voting Classifier
129 changes: 129 additions & 0 deletions doc/modules/partial_dependence.rst
@@ -0,0 +1,129 @@

.. _partial_dependence:

========================
Partial dependence plots
========================

.. currentmodule:: sklearn.inspection

Partial dependence plots (PDP) show the dependence between the target
response [1]_ and a set of 'target' features, marginalizing over the values
of all other features (the 'complement' features). Intuitively, we can
interpret the partial dependence as the expected target response as a
function of the 'target' features.

Due to the limits of human perception the size of the target feature set
must be small (usually, one or two) thus the target features are usually
chosen among the most important features.

The figure below shows four one-way and one two-way partial dependence plots
for the California housing dataset, with a :class:`GradientBoostingRegressor
<sklearn.ensemble.GradientBoostingRegressor>`:

.. figure:: ../auto_examples/inspection/images/sphx_glr_plot_partial_dependence_002.png
:target: ../auto_examples/inspection/plot_partial_dependence.html
:align: center
:scale: 70

Review comment (Member): We need to check the documentation to see if this is rendered properly (for the moment the README.txt is missing).

One-way PDPs tell us about the interaction between the target response and
the target feature (e.g. linear, non-linear). The upper left plot in the
above figure shows the effect of the median income in a district on the
median house price; we can clearly see a linear relationship among them. Note
that PDPs assume that the target features are independent from the complement
features, and this assumption is often violated in practice.

PDPs with two target features show the interactions among the two features.
For example, the two-variable PDP in the above figure shows the dependence
of median house price on joint values of house age and average occupants per
household. We can clearly see an interaction between the two features: for
an average occupancy greater than two, the house price is nearly independent of
the house age, whereas for values less than 2 there is a strong dependence
on age.

The :mod:`sklearn.inspection` module provides a convenience function
:func:`plot_partial_dependence` to create one-way and two-way partial
dependence plots. In the below example we show how to create a grid of
partial dependence plots: two one-way PDPs for the features ``0`` and ``1``
and a two-way PDP between the two features::

>>> from sklearn.datasets import make_hastie_10_2
>>> from sklearn.ensemble import GradientBoostingClassifier
>>> from sklearn.inspection import plot_partial_dependence

>>> X, y = make_hastie_10_2(random_state=0)
>>> clf = GradientBoostingClassifier(n_estimators=100, learning_rate=1.0,
... max_depth=1, random_state=0).fit(X, y)
>>> features = [0, 1, (0, 1)]
>>> plot_partial_dependence(clf, X, features) #doctest: +SKIP

You can access the newly created figure and Axes objects using ``plt.gcf()``
and ``plt.gca()``.
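
For instance, to tweak the figure after it has been plotted (a usage sketch only; it assumes matplotlib is importable as ``plt``)::

>>> import matplotlib.pyplot as plt
>>> plot_partial_dependence(clf, X, features)   #doctest: +SKIP
>>> fig = plt.gcf()
>>> fig.suptitle('Partial dependence of the Hastie dataset')   #doctest: +SKIP
>>> fig.subplots_adjust(top=0.9)   #doctest: +SKIP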

For multi-class classification, you need to set the class label for which
the PDPs should be created via the ``target`` argument::

>>> from sklearn.datasets import load_iris
>>> iris = load_iris()
>>> mc_clf = GradientBoostingClassifier(n_estimators=10,
... max_depth=1).fit(iris.data, iris.target)
>>> features = [3, 2, (3, 2)]
>>> plot_partial_dependence(mc_clf, iris.data, features, target=0) #doctest: +SKIP

The same parameter ``target`` is used to specify the target in multi-output
regression settings.
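
A minimal sketch for the multi-output case (illustrative only, using a fabricated dataset; any multi-output regressor works the same way)::

>>> from sklearn.datasets import make_regression
>>> from sklearn.ensemble import RandomForestRegressor
>>> X_m, y_m = make_regression(n_samples=100, n_features=4, n_targets=2,
...                            random_state=0)
>>> multi_reg = RandomForestRegressor(n_estimators=10,
...                                   random_state=0).fit(X_m, y_m)
>>> plot_partial_dependence(multi_reg, X_m, [0, 1], target=1)   #doctest: +SKIP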

If you need the raw values of the partial dependence function rather than
the plots, you can use the
:func:`sklearn.inspection.partial_dependence` function::

>>> from sklearn.inspection import partial_dependence

>>> pdp, axes = partial_dependence(clf, X, [0])
>>> pdp # doctest: +ELLIPSIS
array([[ 2.466..., 2.466..., ...
>>> axes # doctest: +ELLIPSIS
[array([-1.624..., -1.592..., ...

The values at which the partial dependence should be evaluated are directly
generated from ``X``. For 2-way partial dependence, a 2D-grid of values is
generated. The ``values`` field returned by
:func:`sklearn.inspection.partial_dependence` gives the actual values
used in the grid for each target feature. They also correspond to the axis
of the plots.
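
A simplified sketch of how such a grid can be derived from ``X`` (illustrative only; the actual ``_grid_from_X`` helper in this PR also handles unique values and parameter validation, and its percentile defaults may differ)::

import numpy as np

def grid_from_X(X, feature_indices, grid_resolution=100,
                percentiles=(0.05, 0.95)):
    """Evenly spaced values between two percentiles of each target feature."""
    values = []
    for idx in feature_indices:
        low, high = np.percentile(X[:, idx], [100 * p for p in percentiles])
        values.append(np.linspace(low, high, grid_resolution))
    # cartesian product of the per-feature axes (the 2D-grid for 2-way PDPs)
    grid = np.column_stack([g.ravel() for g in np.meshgrid(*values)])
    return grid, values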

For each value of the 'target' features in the ``grid`` the partial
dependence function needs to marginalize the predictions of the estimator
over all possible values of the 'complement' features. With the ``'brute'``
method, this is done by replacing every target feature value of ``X`` by those
in the grid, and computing the average prediction.
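
Conceptually, the ``'brute'`` computation for a single target feature boils down to the following sketch (illustrative code, not the actual implementation)::

import numpy as np

def brute_partial_dependence(estimator, X, feature_idx, grid_values):
    """Average predictions over X with the target feature forced to each grid value."""
    averaged_predictions = []
    for value in grid_values:
        X_eval = X.copy()
        X_eval[:, feature_idx] = value  # overwrite the target feature for every sample
        # marginalizing over the complement features = averaging over the samples
        averaged_predictions.append(estimator.predict(X_eval).mean())
    return np.array(averaged_predictions)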

In decision trees this can be evaluated efficiently without reference to the
training data (``'recursion'`` method). For each grid point a weighted tree
traversal is performed: if a split node involves a 'target' feature, the
corresponding left or right branch is followed, otherwise both branches are
followed, each branch is weighted by the fraction of training samples that
entered that branch. Finally, the partial dependence is given by a weighted
average of all visited leaves. Note that with the ``'recursion'`` method,
``X`` is only used to generate the grid, not to compute the averaged
predictions. The averaged predictions will always be computed on the data with
which the trees were trained.
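
A rough sketch of that weighted traversal for a single tree and a single grid point (using a simplified node object with ``is_leaf``, ``feature``, ``threshold``, ``value``, ``n_samples``, ``left`` and ``right`` attributes, not scikit-learn's internal tree representation)::

def tree_partial_dependence(node, grid_point, target_features, weight=1.0):
    """Recursively accumulate leaf values, weighted by training-sample fractions."""
    if node.is_leaf:
        return weight * node.value
    if node.feature in target_features:
        # split on a target feature: follow the branch selected by the grid point
        child = node.left if grid_point[node.feature] <= node.threshold else node.right
        return tree_partial_dependence(child, grid_point, target_features, weight)
    # split on a complement feature: follow both branches, each weighted by the
    # fraction of training samples that entered it
    left_weight = weight * node.left.n_samples / node.n_samples
    right_weight = weight * node.right.n_samples / node.n_samples
    return (tree_partial_dependence(node.left, grid_point, target_features, left_weight)
            + tree_partial_dependence(node.right, grid_point, target_features, right_weight))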

.. rubric:: Footnotes

.. [1] For classification, the target response may be the probability of a
class (the positive class for binary classification), or the decision
function.

.. topic:: Examples:

* :ref:`sphx_glr_auto_examples_inspection_plot_partial_dependence.py`

Review comment (Member): rm blank

.. topic:: References

.. [HTF2009] T. Hastie, R. Tibshirani and J. Friedman, `The Elements of
Statistical Learning <https://web.stanford.edu/~hastie/ElemStatLearn//>`_,
Second Edition, Section 10.13.2, Springer, 2009.

.. [Mol2019] C. Molnar, `Interpretable Machine Learning
<https://christophm.github.io/interpretable-ml-book/>`_, Section 5.1, 2019.
1 change: 1 addition & 0 deletions doc/user_guide.rst
@@ -18,6 +18,7 @@ User Guide
supervised_learning.rst
unsupervised_learning.rst
model_selection.rst
inspection.rst
data_transforms.rst
Dataset loading utilities <datasets/index.rst>
modules/computing.rst