
Switch calibration.py to Use scipy's expit to Prevent Warnings (#12896) #12909


Merged
merged 2 commits into scikit-learn:master from fix_issue_12896 on Jan 3, 2019

Conversation

@ZaydH (Contributor) commented Jan 3, 2019

Switch calibration.py to Use scipy's expit Function to Prevent Possible Overflow Warnings

Reference Issues/PRs

Fixes #12896

What does this implement/fix? Explain your changes.

Previously, the logistic function was calculated via direct calls to np.exp. In some cases this can trigger overflow warnings. This was fixed by replacing those calls with scipy's expit function.
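A minimal sketch (not part of the PR) of the warning being avoided; the input values here are illustrative. The naive logistic still produces correct results (the overflowed exp saturates to inf), but it emits a RuntimeWarning that expit does not:

```python
import warnings
import numpy as np
from scipy.special import expit

x = np.array([-1000.0, 0.0, 1000.0])

# Naive logistic: np.exp(-x) overflows for large negative x,
# emitting a RuntimeWarning even though the result is still correct.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    naive = 1.0 / (1.0 + np.exp(-x))
print(len(caught) > 0)              # True: the naive form warned

# scipy's expit computes the same values with no warning.
stable = expit(x)
print(np.allclose(naive, stable))   # True
```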

Any other comments?

There is one remaining call to np.exp in calibration.py. That function is also being used to calculate a logistic function, in _sigmoid_calibration (see the inner function grad). However, an intermediate result is being used elsewhere, so I left that unchanged, as I was not sure of the standard for handling this within sklearn. If you let me know the standard, I can change that too.

…it function instead of directly using np.exp
@rth (Member) left a comment


LGTM, Thanks @ZaydH!

> There is one remaining call to np.exp in calibration.py. That function is also being used to calculate a logistic function, in _sigmoid_calibration (see the inner function grad). However, an intermediate result is being used elsewhere, so I left that unchanged, as I was not sure of the standard for handling this within sklearn. If you let me know the standard, I can change that too.

I would say we leave it unchanged, given that exp is a pretty expensive calculation and it would be suboptimal to do it twice when computing the gradient.
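To illustrate the point about reuse, here is a simplified Platt-style objective (hypothetical names and shapes, not sklearn's actual _sigmoid_calibration code): the exponential is computed once and feeds both the loss and its gradient, so replacing it with expit would force a second expensive exp evaluation.

```python
import numpy as np

# Illustrative sketch only (not sklearn's implementation): a logistic
# negative log-likelihood in which the same exp-derived intermediate
# is shared between the loss value and the gradient.
def loss_and_grad(ab, f, y):
    a, b = ab
    z = a * f + b
    e = np.exp(-z)              # computed once...
    p = 1.0 / (1.0 + e)        # ...turned into probabilities
    loss = -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
    dz = p - y                  # ...and reused (via p) in the gradient
    return loss, np.array([np.sum(dz * f), np.sum(dz)])
```

An expit-based rewrite of the gradient alone would need to recompute the exponential that the loss already produced, which is the inefficiency being avoided.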

@qinhanmin2014 (Member) left a comment


LGTM, thanks @ZaydH, waiting for CI.
I think we can make this change without tests or a what's new entry.

@rth rth merged commit 4223633 into scikit-learn:master Jan 3, 2019
@ZaydH ZaydH deleted the fix_issue_12896 branch January 5, 2019 03:27
@ZaydH ZaydH restored the fix_issue_12896 branch January 5, 2019 03:27
adrinjalali pushed a commit to adrinjalali/scikit-learn that referenced this pull request Jan 7, 2019
xhluca pushed a commit to xhluca/scikit-learn that referenced this pull request Apr 28, 2019
koenvandevelde pushed a commit to koenvandevelde/scikit-learn that referenced this pull request Jul 12, 2019

Successfully merging this pull request may close these issues.

Overflow Runtime Warning for Numpy Logistic Function in CalibratedClassifierCV with SVM
3 participants