MAINT DOC HGBT leave updated if loss is not smooth #26254
Conversation
Don't know if that's what you're after, but maybe, I think.
LGTM
@glevv Do you want to give this PR a review?
@lorentzenchr Sorry, I don't think I will be able to.
@glevv No problem. I thought I'd just ask as you seemed interested in the PR. :smirk:
LGTM. Thanks @lorentzenchr
Reference Issues/PRs
Popped up while working on #25964.
What does this implement/fix? Explain your changes.
HGBT leaf value updates now rely on loss.differentiable, and the reasons for this and the differences to the standard gradient boosting algorithm are explained.
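For context (an illustration, not part of the PR diff): the reason non-smooth losses need dedicated leaf handling can be sketched in plain NumPy. A second-order (Newton) update sets the leaf value to the minimizer of the quadratic loss approximation, -sum(g)/sum(h). For the absolute error the Hessian is zero almost everywhere, so a constant proxy has to stand in for it, and the resulting Newton value no longer minimizes the loss in the leaf; the true minimizer is the median. The variable names below are illustrative, not scikit-learn code.

```python
import numpy as np

rng = np.random.default_rng(0)
y = rng.standard_normal(100) ** 3  # skewed targets falling into one leaf
raw_prediction = 0.0               # current model output in this leaf

# Absolute error loss |y - raw_prediction|: the gradient w.r.t. the
# prediction is sign(raw_prediction - y); the Hessian is zero almost
# everywhere, so a constant proxy (here 1.0) stands in for it.
g = np.sign(raw_prediction - y)
h = np.ones_like(y)

# Second-order (Newton) leaf value: minimizer of the quadratic approximation
# sum_i(g_i * v + 0.5 * h_i * v**2), i.e. -sum(g) / sum(h).
newton_value = -g.sum() / h.sum()

print(newton_value)  # mean of the signs, bounded in [-1, 1]
print(np.median(y))  # the actual minimizer of the absolute error
```

This mismatch is why HGBT refines ("updates") the leaf values after each tree is grown for such losses, and why this PR keys that behavior to loss.differentiable.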
Any other comments?
It is hard to find a reference for gradient boosting with a second-order loss approximation (using Hessians) and non-smooth losses.
Edit: https://arxiv.org/abs/1808.03064 explicitly considers the different boosting schemes and mentions the problem of non-smooth loss functions with Newton boosting.
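As a side note (not from the PR text), the flag the leaf updates now key on can be inspected on scikit-learn's loss objects. This is a minimal sketch assuming the layout of the private sklearn._loss.loss module around the time of this PR; being private, the module path and class names may change without deprecation.

```python
# Sketch against scikit-learn's private _loss module; module path and class
# names are assumptions based on the code base around the time of this PR.
from sklearn._loss.loss import AbsoluteError, HalfSquaredError

print(HalfSquaredError().differentiable)  # True: smooth loss, Newton step is sound
print(AbsoluteError().differentiable)     # False: leaf values must be updated
```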