DOC improve learning-rate AdaBoost estimator #19919
Conversation
You need to make the same changes for the `AdaBoostRegressor`.
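For reference, `AdaBoostRegressor` exposes the same `learning_rate` parameter, so the reworded docstring applies there as well. A minimal sketch (illustrative data and values, not part of this PR):

```python
# Illustrative only: AdaBoostRegressor accepts the same learning_rate
# parameter whose docstring this PR rewords.
from sklearn.datasets import make_regression
from sklearn.ensemble import AdaBoostRegressor

X, y = make_regression(n_samples=200, n_features=10, random_state=0)

# A smaller learning_rate shrinks each regressor's contribution, usually
# requiring more estimators to reach the same fit.
reg = AdaBoostRegressor(n_estimators=50, learning_rate=0.5, random_state=0)
reg.fit(X, y)
print(f"R^2 on training data: {reg.score(X, y):.3f}")
```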
Updated.
sklearn/ensemble/_weight_boosting.py (outdated)
```diff
@@ -314,8 +314,9 @@ class AdaBoostClassifier(ClassifierMixin, BaseWeightBoosting):
     learning_rate : float, default=1.
         Learning rate shrinks the contribution of each classifier by
-        ``learning_rate``. There is a trade-off between ``learning_rate`` and
-        ``n_estimators``.
+        ``learning_rate`` whereas a higher learning rate increases the
+        contribution of each classifier. There is a trade-off between
+        ``learning_rate`` and ``n_estimators``.
```
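To make the documented trade-off concrete, a minimal sketch (not part of the diff; dataset and values are illustrative) comparing two `learning_rate` settings at a fixed `n_estimators`:

```python
# Illustrative only: the same ensemble size can perform differently
# depending on learning_rate, hence the documented trade-off with
# n_estimators.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, random_state=0)

for lr in (0.1, 1.0):
    clf = AdaBoostClassifier(n_estimators=100, learning_rate=lr,
                             random_state=0)
    score = cross_val_score(clf, X, y, cv=5).mean()
    print(f"learning_rate={lr}: mean CV accuracy = {score:.3f}")
```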
Actually, I don't think that "whereas" is the right conjunction. Can you just rephrase as:

> Weight applied to each classifier at each boosting iteration. A higher
> learning rate increases the contribution of each classifier. There is a
> trade-off between the `learning_rate` and `n_estimators` parameters.
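The suggested wording can be checked against a fitted ensemble: with the SAMME algorithm, each estimator's weight scales with `learning_rate`, so a higher rate gives each classifier a larger contribution to the weighted vote. A minimal sketch (illustrative, not part of this PR; assumes a scikit-learn version where the `algorithm` parameter is still accepted):

```python
# Illustrative only: estimator_weights_ grows with learning_rate, i.e.
# each classifier contributes more to the final weighted vote.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier

X, y = make_classification(n_samples=200, random_state=0)

for lr in (0.5, 1.0):
    clf = AdaBoostClassifier(n_estimators=5, learning_rate=lr,
                             algorithm="SAMME", random_state=0).fit(X, y)
    print(f"learning_rate={lr}: weights = {clf.estimator_weights_.round(2)}")
```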
Updated.

Thanks @Bharat123rox
Reference Issues/PRs
#19521
What does this implement/fix? Explain your changes.
Fixes #19521 by updating the documentation.
Any other comments?
Please review for verbosity.