
Commit 2bc4546

bharatr21 authored and glemaitre committed
DOC improve learning-rate AdaBoost estimator (#19919)
1 parent 53cc079 · commit 2bc4546

File tree: 1 file changed, +6 −6 lines changed

sklearn/ensemble/_weight_boosting.py

@@ -311,9 +311,9 @@ class AdaBoostClassifier(ClassifierMixin, BaseWeightBoosting):
         In case of perfect fit, the learning procedure is stopped early.
 
     learning_rate : float, default=1.
-        Learning rate shrinks the contribution of each classifier by
-        ``learning_rate``. There is a trade-off between ``learning_rate`` and
-        ``n_estimators``.
+        Weight applied to each classifier at each boosting iteration. A higher
+        learning rate increases the contribution of each classifier. There is
+        a trade-off between the `learning_rate` and `n_estimators` parameters.
 
     algorithm : {'SAMME', 'SAMME.R'}, default='SAMME.R'
         If 'SAMME.R' then use the SAMME.R real boosting algorithm.
@@ -896,9 +896,9 @@ class AdaBoostRegressor(RegressorMixin, BaseWeightBoosting):
         In case of perfect fit, the learning procedure is stopped early.
 
     learning_rate : float, default=1.
-        Learning rate shrinks the contribution of each regressor by
-        ``learning_rate``. There is a trade-off between ``learning_rate`` and
-        ``n_estimators``.
+        Weight applied to each regressor at each boosting iteration. A higher
+        learning rate increases the contribution of each regressor. There is
+        a trade-off between the `learning_rate` and `n_estimators` parameters.
 
     loss : {'linear', 'square', 'exponential'}, default='linear'
         The loss function to use when updating the weights after each
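
The revised docstring describes a trade-off between `learning_rate` and `n_estimators`. A minimal sketch (not part of this commit, using only the public scikit-learn API) of how the two parameters interact in practice:

# Sketch illustrating the documented trade-off between `learning_rate`
# and `n_estimators` in AdaBoost; not part of commit 2bc4546.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

# Synthetic binary classification data for illustration only.
X, y = make_classification(n_samples=500, random_state=0)

# A higher learning rate increases each classifier's contribution, so
# fewer estimators may suffice; a lower rate damps each stage's weight
# and generally needs more boosting iterations to compensate.
for learning_rate, n_estimators in [(1.0, 50), (0.1, 500)]:
    clf = AdaBoostClassifier(
        n_estimators=n_estimators,
        learning_rate=learning_rate,
        random_state=0,
    )
    score = cross_val_score(clf, X, y, cv=5).mean()
    print(f"learning_rate={learning_rate}, n_estimators={n_estimators}: "
          f"mean CV accuracy={score:.3f}")

The same interaction holds for AdaBoostRegressor, whose `learning_rate` docstring is updated identically in this diff.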
