DOC Ensures that PassiveAggressiveClassifier passes numpydoc validation #21226

Merged: 2 commits on Oct 12, 2021
1 change: 0 additions & 1 deletion maint_tools/test_docstrings.py
@@ -20,7 +20,6 @@
"MultiTaskLassoCV",
"OrthogonalMatchingPursuit",
"OrthogonalMatchingPursuitCV",
"PassiveAggressiveClassifier",
"PassiveAggressiveRegressor",
"PatchExtractor",
"PolynomialFeatures",
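The change above removes `"PassiveAggressiveClassifier"` from the test's skip list, which opts the estimator back into numpydoc validation. A minimal sketch of the skip-list pattern (`DOCSTRING_IGNORE_LIST` mirrors the real list's name, but `should_validate` is an illustrative helper, not the actual test code):

```python
# Hypothetical sketch of the skip-list mechanism used by
# maint_tools/test_docstrings.py (helper name is illustrative).
DOCSTRING_IGNORE_LIST = [
    "PassiveAggressiveRegressor",  # still exempt from numpydoc validation
]

def should_validate(estimator_name):
    # Estimators absent from the ignore list get their docstrings validated.
    return estimator_name not in DOCSTRING_IGNORE_LIST

# After this PR, PassiveAggressiveClassifier is no longer exempt:
print(should_validate("PassiveAggressiveClassifier"))  # True
print(should_validate("PassiveAggressiveRegressor"))   # False
```

Shrinking the list is the usual workflow for these DOC PRs: fix one estimator's docstring, delete its entry, and the test enforces the fix from then on.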
94 changes: 48 additions & 46 deletions sklearn/linear_model/_passive_aggressive.py
@@ -7,13 +7,12 @@


class PassiveAggressiveClassifier(BaseSGDClassifier):
"""Passive Aggressive Classifier
"""Passive Aggressive Classifier.

Read more in the :ref:`User Guide <passive_aggressive>`.

Parameters
----------

C : float, default=1.0
Maximum step size (regularization). Defaults to 1.0.

@@ -58,10 +57,10 @@ class PassiveAggressiveClassifier(BaseSGDClassifier):
shuffle : bool, default=True
Whether or not the training data should be shuffled after each epoch.

verbose : integer, default=0
The verbosity level
verbose : int, default=0
The verbosity level.

loss : string, default="hinge"
loss : str, default="hinge"
The loss function to be used:
hinge: equivalent to PA-I in the reference paper.
squared_hinge: equivalent to PA-II in the reference paper.
@@ -97,7 +96,7 @@ class PassiveAggressiveClassifier(BaseSGDClassifier):

The "balanced" mode uses the values of y to automatically adjust
weights inversely proportional to class frequencies in the input data
as ``n_samples / (n_classes * np.bincount(y))``
as ``n_samples / (n_classes * np.bincount(y))``.

.. versionadded:: 0.17
parameter *class_weight* to automatically weight samples.
@@ -109,15 +108,15 @@ class PassiveAggressiveClassifier(BaseSGDClassifier):
average. So average=10 will begin averaging after seeing 10 samples.

.. versionadded:: 0.19
parameter *average* to use weights averaging in SGD
parameter *average* to use weights averaging in SGD.
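The `average` behaviour described above (begin averaging the weights after seeing N samples) can be illustrated with a toy sketch; `averaged_weight` is a hypothetical helper, not part of scikit-learn:

```python
def averaged_weight(weights_per_step, average=10):
    """Average the per-step weight vectors from step `average` onward."""
    tail = weights_per_step[average - 1:]   # steps average, average+1, ...
    return [sum(col) / len(tail) for col in zip(*tail)]

history = [[1.0], [2.0], [3.0], [4.0]]      # weight vector after each sample
print(averaged_weight(history, average=3))  # mean of [3.0] and [4.0] -> [3.5]
```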

Attributes
----------
coef_ : array, shape = [1, n_features] if n_classes == 2 else [n_classes,\
n_features]
coef_ : ndarray of shape (1, n_features) if n_classes == 2 else \
(n_classes, n_features)
Weights assigned to the features.

intercept_ : array, shape = [1] if n_classes == 2 else [n_classes]
intercept_ : ndarray of shape (1,) if n_classes == 2 else (n_classes,)
Constants in decision function.

n_features_in_ : int
@@ -135,7 +134,7 @@ class PassiveAggressiveClassifier(BaseSGDClassifier):
The actual number of iterations to reach the stopping criterion.
For multiclass fits, it is the maximum over every binary fit.

classes_ : array of shape (n_classes,)
classes_ : ndarray of shape (n_classes,)
The unique class labels.

t_ : int
@@ -145,11 +144,21 @@ class PassiveAggressiveClassifier(BaseSGDClassifier):
loss_function_ : callable
Loss function used by the algorithm.

See Also
--------
SGDClassifier : Incrementally trained logistic regression.
Perceptron : Linear perceptron classifier.

References
----------
Online Passive-Aggressive Algorithms
<http://jmlr.csail.mit.edu/papers/volume7/crammer06a/crammer06a.pdf>
K. Crammer, O. Dekel, J. Keshet, S. Shalev-Shwartz, Y. Singer - JMLR (2006)

Examples
--------
>>> from sklearn.linear_model import PassiveAggressiveClassifier
>>> from sklearn.datasets import make_classification

>>> X, y = make_classification(n_features=4, random_state=0)
>>> clf = PassiveAggressiveClassifier(max_iter=1000, random_state=0,
... tol=1e-3)
@@ -161,18 +170,6 @@ class PassiveAggressiveClassifier(BaseSGDClassifier):
[1.84127814]
>>> print(clf.predict([[0, 0, 0, 0]]))
[1]

See Also
--------
SGDClassifier
Perceptron

References
----------
Online Passive-Aggressive Algorithms
<http://jmlr.csail.mit.edu/papers/volume7/crammer06a/crammer06a.pdf>
K. Crammer, O. Dekel, J. Keshet, S. Shalev-Shwartz, Y. Singer - JMLR (2006)

"""

def __init__(
@@ -221,12 +218,12 @@ def partial_fit(self, X, y, classes=None):
Parameters
----------
X : {array-like, sparse matrix} of shape (n_samples, n_features)
Subset of the training data
Subset of the training data.

y : numpy array of shape [n_samples]
Subset of the target values
y : array-like of shape (n_samples,)
Subset of the target values.

classes : array, shape = [n_classes]
classes : ndarray of shape (n_classes,)
Classes across all calls to partial_fit.
Can be obtained via `np.unique(y_all)`, where y_all is the
target vector of the entire dataset.
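The reason `classes` must be declared on the first call is that an individual mini-batch may not contain every label. A toy illustration of that constraint (`check_first_batch` is a hypothetical helper, not scikit-learn API):

```python
def check_first_batch(batch_y, classes=None):
    """Validate a first partial_fit-style batch against a declared label set."""
    seen = sorted(set(batch_y))
    if classes is None:
        # Inferring labels from one batch is fragile: later batches may
        # introduce labels this batch never saw.
        classes = seen
    if not set(seen) <= set(classes):
        raise ValueError("batch contains a label missing from `classes`")
    return list(classes)

print(check_first_batch([0, 1], classes=[0, 1, 2]))  # [0, 1, 2]
```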
@@ -236,7 +233,8 @@ def partial_fit(self, X, y, classes=None):

Returns
-------
self : returns an instance of self.
self : object
Fitted estimator.
"""
self._validate_params(for_partial_fit=True)
if self.class_weight == "balanced":
@@ -272,20 +270,21 @@ def fit(self, X, y, coef_init=None, intercept_init=None):
Parameters
----------
X : {array-like, sparse matrix} of shape (n_samples, n_features)
Training data
Training data.

y : numpy array of shape [n_samples]
Target values
y : array-like of shape (n_samples,)
Target values.

coef_init : array, shape = [n_classes,n_features]
coef_init : ndarray of shape (n_classes, n_features)
The initial coefficients to warm-start the optimization.

intercept_init : array, shape = [n_classes]
intercept_init : ndarray of shape (n_classes,)
The initial intercept to warm-start the optimization.

Returns
-------
self : returns an instance of self.
self : object
Fitted estimator.
"""
self._validate_params()
lr = "pa1" if self.loss == "hinge" else "pa2"
@@ -354,9 +353,9 @@ class PassiveAggressiveRegressor(BaseSGDRegressor):
Whether or not the training data should be shuffled after each epoch.

verbose : integer, default=0
The verbosity level
The verbosity level.

loss : string, default="epsilon_insensitive"
loss : str, default="epsilon_insensitive"
The loss function to be used:
epsilon_insensitive: equivalent to PA-I in the reference paper.
squared_epsilon_insensitive: equivalent to PA-II in the reference
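As with the classifier, the two regression losses map to the PA-I and PA-II step sizes, here applied to the epsilon-insensitive loss. A pure-Python sketch of one update (`pa_regress_update` is illustrative, not scikit-learn API; `epsilon=0.1` mirrors the estimator's default):

```python
def pa_regress_update(w, x, y, C=1.0, epsilon=0.1, loss="epsilon_insensitive"):
    """One passive-aggressive regression update on a single (x, y) pair."""
    pred = sum(wi * xi for wi, xi in zip(w, x))
    residual = y - pred
    eps_loss = max(0.0, abs(residual) - epsilon)  # epsilon-insensitive loss
    sq_norm = sum(xi * xi for xi in x)
    if eps_loss == 0.0 or sq_norm == 0.0:
        return list(w)                            # inside the tube: no update
    if loss == "epsilon_insensitive":             # PA-I: step capped by C
        tau = min(C, eps_loss / sq_norm)
    else:                                         # squared_... -> PA-II
        tau = eps_loss / (sq_norm + 1.0 / (2.0 * C))
    sign = 1.0 if residual > 0 else -1.0
    return [wi + tau * sign * xi for wi, xi in zip(w, x)]

print(pa_regress_update([0.0], [1.0], 1.0))  # loss 0.9, tau 0.9 -> [0.9]
```

Predictions within `epsilon` of the target incur zero loss, so the weights are left untouched, which is the "passive" half of the algorithm's name.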
@@ -388,7 +387,7 @@ class PassiveAggressiveRegressor(BaseSGDRegressor):
average. So average=10 will begin averaging after seeing 10 samples.

.. versionadded:: 0.19
parameter *average* to use weights averaging in SGD
parameter *average* to use weights averaging in SGD.

Attributes
----------
@@ -436,13 +435,14 @@ class PassiveAggressiveRegressor(BaseSGDRegressor):

See Also
--------
SGDRegressor
SGDRegressor : Linear model fitted by minimizing a regularized
empirical loss with SGD.

References
----------
Online Passive-Aggressive Algorithms
<http://jmlr.csail.mit.edu/papers/volume7/crammer06a/crammer06a.pdf>
K. Crammer, O. Dekel, J. Keshet, S. Shalev-Shwartz, Y. Singer - JMLR (2006)
K. Crammer, O. Dekel, J. Keshet, S. Shalev-Shwartz, Y. Singer - JMLR (2006).

"""

@@ -490,14 +490,15 @@ def partial_fit(self, X, y):
Parameters
----------
X : {array-like, sparse matrix} of shape (n_samples, n_features)
Subset of training data
Subset of training data.

y : numpy array of shape [n_samples]
Subset of target values
Subset of target values.

Returns
-------
self : returns an instance of self.
self : object
Fitted estimator.
"""
self._validate_params(for_partial_fit=True)
lr = "pa1" if self.loss == "epsilon_insensitive" else "pa2"
@@ -520,10 +521,10 @@ def fit(self, X, y, coef_init=None, intercept_init=None):
Parameters
----------
X : {array-like, sparse matrix} of shape (n_samples, n_features)
Training data
Training data.

y : numpy array of shape [n_samples]
Target values
Target values.

coef_init : array, shape = [n_features]
The initial coefficients to warm-start the optimization.
@@ -533,7 +534,8 @@ def fit(self, X, y, coef_init=None, intercept_init=None):

Returns
-------
self : returns an instance of self.
self : object
Fitted estimator.
"""
self._validate_params()
lr = "pa1" if self.loss == "epsilon_insensitive" else "pa2"