I've been experimenting with gradient boosting and noticed a minor bug in the ExponentialLoss loss function: the negative_gradient method actually returns the positive gradient of the loss. As far as I can tell, this doesn't affect fitting with sklearn's GradientBoostingClassifier because of the way _update_terminal_region works. However, when I used the loss functions in a custom gradient boosting implementation that performs a line search, it caused the algorithm to fail to make progress. Adding a minus sign to this line fixed the problem.
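For anyone who wants to reproduce this, here is a minimal standalone sketch (my own reformulation for 0/1 labels, not sklearn's actual implementation) of the exponential loss and what its negative_gradient should return. The finite-difference check at the bottom fails if the leading sign flip is removed, which matches the behaviour described above.

```python
import numpy as np

# Exponential loss for labels y in {0, 1}, mapped to {-1, +1}:
# L(y, F) = exp(-(2y - 1) * F), averaged over samples.
def exponential_loss(y, pred):
    return np.mean(np.exp(-(2.0 * y - 1.0) * pred))

# What negative_gradient should return:
# -dL/dF = (2y - 1) * exp(-(2y - 1) * F).
# The bug amounts to returning dL/dF (the positive gradient),
# i.e. this expression without the leading sign flip.
def negative_gradient(y, pred):
    y_signed = 2.0 * y - 1.0
    return y_signed * np.exp(-y_signed * pred)

# Finite-difference check of sign and magnitude.
rng = np.random.RandomState(0)
n = 5
y = rng.randint(0, 2, size=n).astype(float)
pred = rng.randn(n)
eps = 1e-6
numeric = np.array([
    -(exponential_loss(y, pred + eps * e)
      - exponential_loss(y, pred - eps * e)) / (2 * eps)
    for e in np.eye(n)
])
# exponential_loss averages over samples, so scale by n to compare
# against the per-sample analytic negative gradient.
assert np.allclose(numeric * n, negative_gradient(y, pred), atol=1e-5)
```

With the sign dropped from negative_gradient, the assert trips, which is consistent with a line search stepping in the wrong direction.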
I'm not sure that anyone from the core team has confirmed the issue yet, but if you can identify it and test for correct behaviour, go ahead and submit a patch.