[MRG] MNT Ignore xfail_checks in check_estimator #17219
Conversation
Thanks @NicolasHug! I like the fact that it's very simple.
I'm wondering whether we should raise a warning for each xfailed test (in the `_is_xfail` function).
The more I think about it, the more I think the answer is no. To me it makes sense to rely on pytest for anything more than just ignoring the test.
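To make the idea concrete, here is a minimal, hypothetical sketch of the filtering being discussed: checks listed in an estimator's `_xfail_checks` tag are silently ignored, with no warning raised. The helper names (`_should_ignore`, `run_checks`) are illustrative and are not scikit-learn's actual implementation.

```python
def _should_ignore(estimator, check_name):
    # The `_xfail_checks` tag maps check names to a reason string, e.g.
    # {"check_sample_weights_invariance": "zero sample_weight not supported"}.
    # Attribute-based lookup here stands in for sklearn's tag machinery.
    xfail = getattr(estimator, "_xfail_checks", {})
    return check_name in xfail


def run_checks(estimator, checks):
    """Run each (name, check) pair, silently ignoring expected failures."""
    results = []
    for name, check in checks:
        if _should_ignore(estimator, name):
            # No warning is raised; the check is simply not run.
            results.append((name, "ignored"))
            continue
        check(estimator)
        results.append((name, "passed"))
    return results
```

Under this sketch, an ignored check never executes, so it can neither fail nor emit a warning, which is what makes the approach simple.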
OK, LGTM. We do mention in the documentation that users should be very careful about `_xfail_checks`.
FWIW this is very similar to what I proposed in #16507. I changed my mind about not raising the warning. We raise the warning for skipped tests in
Thanks for catching that! Should we use, I think, the fact that
Good catch, thanks, should be fixed.
Totally agree. I'm also quite puzzled by the need to pass
Opened #17222, which is the same except that checks in the tag will be skipped with a SkipTestWarning.
Fixes #16958
Alternative to, and closes, #16963
Alternative to, and closes, #17222
CC @thomasjpfan @rth.
I would prefer this approach, considering how much simpler it is. I agree with what @rth said: for more advanced control, developers should be using `parametrize_with_checks` and pytest.
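For reference, the pytest-based route mentioned above uses scikit-learn's documented `parametrize_with_checks` decorator, which expands into one pytest test per check and so gives access to pytest's own xfail/skip controls. The estimator choices below are illustrative.

```python
import pytest
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeRegressor
from sklearn.utils.estimator_checks import parametrize_with_checks


# Each (estimator, check) pair becomes its own pytest test case,
# so individual checks can be deselected, skipped, or xfailed
# through pytest's usual mechanisms.
@parametrize_with_checks([LogisticRegression(), DecisionTreeRegressor()])
def test_sklearn_compatible_estimator(estimator, check):
    check(estimator)
```

Running this file with `pytest` reports each check separately, which is the finer-grained control that plain `check_estimator` does not provide.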