
FIX Add error when LeaveOneOut used in CalibratedClassifierCV #29545

Merged: 11 commits into scikit-learn:main on Aug 10, 2024

Conversation

lucyleeow (Member)

Reference Issues/PRs

closes #29000

What does this implement/fix? Explain your changes.

Any other comments?

@lucyleeow changed the title from "add loo error" to "FIX Add error when LeaveOneOut used in CalibratedClassifierCV" on Jul 23, 2024

github-actions bot commented Jul 23, 2024

✔️ Linting Passed

All linting checks passed. Your pull request is in excellent shape! ☀️

Generated for commit: 6321854.

assert np.all(proba[:, :i] > 0)
assert np.all(proba[:, i + 1 :] > 0)
else:
# Check `proba` are all 1/n_classes

lucyleeow (Member, Author) commented Jul 24, 2024

Note `proba` was 1/n_classes here because the original test data was unique (it consisted of 10 samples belonging to 10 classes), and this was not really related to the train subset not containing all classes.
I think the estimator ended up being overfit and the calibrator did not respond well to the low predicted values, calibrating them all to the same value. Note that `proba` was the same value even before normalization of the probabilities.
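
A minimal numpy sketch (my own illustration, not code from this test) of why identical calibrated values normalize to exactly 1/n_classes:

import numpy as np

n_samples, n_classes = 10, 10
# Assume the calibrator mapped every class column to the same value.
raw = np.full((n_samples, n_classes), 0.03)
# Row-wise normalization then gives 1 / n_classes regardless of that value.
proba = raw / raw.sum(axis=1, keepdims=True)
assert np.allclose(proba, 1.0 / n_classes)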

lucyleeow (Member, Author)

I wonder if we even have to check `ensemble=False`, as we use `cross_val_predict` to get the predictions used for calibration, and at predict time only one estimator is fit using all the data.
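
For reference, a rough sketch (a simplification, not the actual `CalibratedClassifierCV` code) of what `ensemble=False` amounts to for a binary problem:

from sklearn.datasets import make_classification
from sklearn.isotonic import IsotonicRegression
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict

X, y = make_classification(n_samples=200, random_state=0)
est = LogisticRegression()

# Out-of-fold decision values are used only to fit the calibrator.
oof_scores = cross_val_predict(est, X, y, cv=5, method="decision_function")
calibrator = IsotonicRegression(out_of_bounds="clip").fit(oof_scores, y)

# At predict time a single estimator, fit on all the data, produces the
# scores that the calibrator then maps to probabilities.
est.fit(X, y)
proba_pos = calibrator.predict(est.decision_function(X))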

if isinstance(self.cv, LeaveOneOut):
    raise ValueError(
        "LeaveOneOut cross-validation does not allow "
        "all classes to be present in test splits."

Member

Let's be explicit by asking people to use an alternative cross-validation strategy.
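
To make that concrete, a small sketch (my own illustration, assuming the check lands roughly as written above; the final message wording and the suggested alternative may differ) of how a user would hit the error and what switching to another splitter looks like:

from sklearn.calibration import CalibratedClassifierCV
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, StratifiedKFold

X, y = make_classification(n_samples=20, random_state=0)

# With the check in place, LeaveOneOut is rejected up front because each
# test split holds a single sample and can never contain every class.
clf = CalibratedClassifierCV(LogisticRegression(), cv=LeaveOneOut())
try:
    clf.fit(X, y)
except ValueError as exc:
    print(exc)

# A splitter whose test folds can contain all classes works as before;
# StratifiedKFold is just one illustrative choice.
clf = CalibratedClassifierCV(LogisticRegression(), cv=StratifiedKFold(n_splits=5))
clf.fit(X, y)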

lucyleeow (Member, Author)

Amended. Hopefully not too long now?

Member

Nope, this is good.

@@ -441,27 +456,30 @@ def test_calibration_prob_sum(ensemble):
def test_calibration_less_classes(ensemble):
# Test to check calibration works fine when train set in a test-train

Member

This is a use case where I'm really wondering if this is valid :). But this is here so let's go with it.

@glemaitre (Member) left a comment

Regarding the scope of the PR, I'm happy to already have this in the codebase. I opened a related PR regarding the cross-validation and the fact that we can get a subset of classes in training. I'm not sure that this is something we should allow, but it requires much more discussion and I might be overlooking some aspects.

@glemaitre (Member) commented Aug 2, 2024

I'll add this PR to the milestone for 1.5.2.

@thomasjpfan (Member) left a comment

LGTM

@thomasjpfan added the "To backport" label (PR merged in master that need a backport to a release branch defined based on the milestone) on Aug 10, 2024
@thomasjpfan enabled auto-merge (squash) August 10, 2024 13:50
@thomasjpfan merged commit 83da530 into scikit-learn:main Aug 10, 2024
33 checks passed
@lucyleeow deleted the calbclass_cv branch August 10, 2024 21:50
MarcBresson pushed a commit to MarcBresson/scikit-learn that referenced this pull request Sep 2, 2024
glemaitre pushed a commit to glemaitre/scikit-learn that referenced this pull request Sep 9, 2024

Successfully merging this pull request may close these issues.

KFold(n_samples=n) not equivalent to LeaveOneOut() cv in CalibratedClassifierCV()