
Use common loss module in gradient boosting #25964

Closed · 7 tasks done
lorentzenchr opened this issue Mar 24, 2023 · 2 comments · Fixed by #26278
Labels: Meta-issue (General issue associated to an identified list of tasks), module:ensemble

Comments

lorentzenchr (Member) commented Mar 24, 2023

GradientBoostingClassifier and GradientBoostingRegressor are not yet migrated to the new common (private) loss function module, see #15123. The steps needed are tracked in this issue's task list (7 tasks, all done).
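For context, a minimal sketch of what using the common loss module looks like. The module (`sklearn._loss`) is private, so these names and signatures are internal and may change between versions; this is an illustration of the shared API, not the migration itself:

```python
import numpy as np
from sklearn._loss.loss import HalfBinomialLoss

# Common loss objects expose per-sample loss and gradient on the scale
# of the raw predictions (log-odds for the binomial loss).
loss = HalfBinomialLoss()
y_true = np.array([0.0, 1.0, 1.0, 0.0])
raw_prediction = np.zeros_like(y_true)

per_sample_loss = loss.loss(y_true=y_true, raw_prediction=raw_prediction)
gradient = loss.gradient(y_true=y_true, raw_prediction=raw_prediction)

# The optimal constant raw prediction for this loss, a natural candidate
# for initializing raw_predictions in gradient boosting.
init_raw = loss.fit_intercept_only(y_true=y_true)
```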

github-actions bot added the Needs Triage (Issue requires triage) label Mar 24, 2023
lorentzenchr (Member, Author) commented:

This time, I'll try small incremental PRs.

I could use some help deciding how to tackle the initialization of raw_predictions. Should it be 100% backwards compatible, or are tiny deviations allowed, e.g. clip(.., 1e-7) vs. clip(.., 1e-14)?
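For illustration, a small NumPy sketch (not scikit-learn code; `init_raw_prediction` is a made-up helper) of why the clipping epsilon matters when the initial probability estimate is degenerate, assuming the clip is applied on the probability scale before the log-odds link:

```python
import numpy as np

def init_raw_prediction(proba, eps):
    # Clip the probability away from 0 and 1, then map to log-odds.
    proba = np.clip(proba, eps, 1 - eps)
    return np.log(proba / (1 - proba))

# Degenerate case: every training sample belongs to the positive class.
proba = 1.0
print(init_raw_prediction(proba, 1e-7))   # ~ 16.1
print(init_raw_prediction(proba, 1e-14))  # ~ 32.2
```

For probabilities strictly between the two thresholds the clip is a no-op and both choices agree; only near 0 or 1 do the initial raw_predictions differ.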

lorentzenchr (Member, Author) commented:

Finally 🎉
