@@ -311,9 +311,9 @@ class AdaBoostClassifier(ClassifierMixin, BaseWeightBoosting):
         In case of perfect fit, the learning procedure is stopped early.
 
     learning_rate : float, default=1.
-        Learning rate shrinks the contribution of each classifier by
-        ``learning_rate``. There is a trade-off between ``learning_rate`` and
-        ``n_estimators``.
+        Weight applied to each classifier at each boosting iteration. A higher
+        learning rate increases the contribution of each classifier. There is
+        a trade-off between the `learning_rate` and `n_estimators` parameters.
 
     algorithm : {'SAMME', 'SAMME.R'}, default='SAMME.R'
         If 'SAMME.R' then use the SAMME.R real boosting algorithm.
@@ -896,9 +896,9 @@ class AdaBoostRegressor(RegressorMixin, BaseWeightBoosting):
         In case of perfect fit, the learning procedure is stopped early.
 
     learning_rate : float, default=1.
-        Learning rate shrinks the contribution of each regressor by
-        ``learning_rate``. There is a trade-off between ``learning_rate`` and
-        ``n_estimators``.
+        Weight applied to each regressor at each boosting iteration. A higher
+        learning rate increases the contribution of each regressor. There is
+        a trade-off between the `learning_rate` and `n_estimators` parameters.
 
     loss : {'linear', 'square', 'exponential'}, default='linear'
         The loss function to use when updating the weights after each
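For context (not part of the patch itself), a minimal sketch of the trade-off the new wording describes, using the public scikit-learn API: a smaller `learning_rate` reduces the weight applied at each boosting iteration, so more `n_estimators` are typically needed to reach a comparable fit. The dataset and parameter values below are illustrative, not taken from the diff.

```python
# Illustrative sketch of the learning_rate / n_estimators trade-off
# documented above. Dataset and grid values are arbitrary choices.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, random_state=0)

for lr, n in [(1.0, 50), (0.1, 50), (0.1, 500)]:
    clf = AdaBoostClassifier(learning_rate=lr, n_estimators=n, random_state=0)
    score = cross_val_score(clf, X, y, cv=3).mean()
    print(f"learning_rate={lr}, n_estimators={n}: score={score:.3f}")
    clf.fit(X, y)
```

Lowering `learning_rate` alone usually hurts the score; raising `n_estimators` in tandem tends to recover it, which is exactly the trade-off the rewritten docstring calls out.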