The negative_gradient method of ExponentialLoss returns the positive gradient #9666

Closed

jcrudy opened this issue Sep 1, 2017 · 3 comments · Fixed by #22050
Comments

@jcrudy

jcrudy commented Sep 1, 2017

I've been experimenting with gradient boosting and noticed that the ExponentialLoss loss function has a minor bug: the negative_gradient method actually returns the positive gradient of the loss function. As far as I can tell, this doesn't affect fitting for the sklearn GradientBoostingClassifier because of the way _update_terminal_region works. However, when I used the loss functions in a custom gradient boosting implementation with a line search, the algorithm failed to make progress. Adding a minus sign to this line fixed the problem.
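For concreteness, here is a minimal standalone sketch of the described fix (the function names and signatures are illustrative, not sklearn's internal API; the actual change is just a leading minus sign on the returned expression), assuming binary labels y_true in {0, 1} and raw predictions y_pred:

import numpy as np

def exponential_loss(y_true, y_pred):
    # Per-sample exponential loss; u maps label 0 to +1 and label 1 to -1.
    u = -(2.0 * y_true - 1.0)
    return np.exp(u * y_pred)

def negative_gradient(y_true, y_pred):
    # Negative gradient of the loss w.r.t. y_pred. The reported bug was a
    # missing leading minus sign, so the positive gradient
    # u * np.exp(u * y_pred) was returned instead.
    u = -(2.0 * y_true - 1.0)
    return -u * np.exp(u * y_pred)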

@mohdsanadzakirizvi

I would like to submit a patch for this issue.

@jnothman
Member

jnothman commented Sep 2, 2017 via email

@glemaitre
Member

Indeed, this seems fishy. We used the following as a reference: https://pbil.univ-lyon1.fr/CRAN/web/packages/gbm/vignettes/gbm.pdf

The loss for a given point is given by:

u = -(2.0 * y_true - 1.0)
np.exp(u * y_pred)

Thus, taking the derivative with respect to y_pred, we have:

u = -(2.0 * y_true - 1.0)
u * np.exp(u * y_pred)

Thus, the negative gradient is indeed:

u = -(2.0 * y_true - 1.0)
-u * np.exp(u * y_pred)

All gradients reported in the document appear to be negative gradients, except for the AdaBoost loss.
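A quick finite-difference check (a sketch, not from the thread) confirms that sign: the negated central difference of the loss matches the expression above.

import numpy as np

rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=5).astype(float)
y_pred = rng.normal(size=5)
eps = 1e-6

u = -(2.0 * y_true - 1.0)
analytic = -u * np.exp(u * y_pred)  # claimed negative gradient
# Negated central difference of the loss w.r.t. y_pred:
numeric = -(np.exp(u * (y_pred + eps)) - np.exp(u * (y_pred - eps))) / (2 * eps)

assert np.allclose(analytic, numeric)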
