Use some more pytest plugins: warnings & rerunfailures #8346
Conversation
With 3 test run failures out of the last 2 builds, …
appveyor.yml (Outdated)
@@ -93,6 +93,7 @@ install:
   - cmd: IF %PYTHON_VERSION% == 2.7 conda install -q backports.functools_lru_cache
   # pytest-cov>=2.3.1 due to https://github.com/pytest-dev/pytest-cov/issues/124
   - conda install -q pytest "pytest-cov>=2.3.1" pytest-timeout pytest-xdist
+ - pip install -q pytest-rerunfailures pytest-warnings
Can we keep everything installed from the same source?
Not available in conda or conda-forge, but I could switch them all to pip.
Let's use pip for everything.
I was going to suggest the … In my notes, I also have these:
Though I haven't checked recently whether they are still flaky.
This one's covered by c3aca56.
I don't recall seeing this one very often lately.
Hmm, you know what, I think the warnings are still not printed... I will investigate tomorrow.
Coincidentally, …
The doc build isn't finding the command …
Aha, I've figured out why warnings are not correctly captured. The …
Edit: Actually, that build is a lock error, which I've not seen before. I do recall a test with a similar name (or just this one) failing semi-regularly, though.
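For context, surfacing warnings with pytest-warnings needs no per-test changes; any warning raised while a test runs is collected and listed in a summary at the end of the run. A minimal sketch, assuming a hypothetical test (not from the matplotlib suite):

```python
import warnings

def test_emits_deprecation():
    # With pytest-warnings installed, this warning appears in the
    # warnings summary after the test run instead of being silently
    # swallowed.
    warnings.warn("this call is deprecated", DeprecationWarning)
```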
@@ -1,5 +1,7 @@
 #! /bin/bash
+
+set -e
This is causing the doc build to fail on line 39 below; that line can just be deleted. (I can't work out how to open a PR against your branch, but https://travis-ci.org/dstansby/matplotlib/builds/221288308 is my Travis build.)
Should probably just drop line 39.
FYI, I'd like to get #8380 in first so that the warning filter plugin starts working correctly.
Well, that's unfortunate; it looks like …
Can we just disable the SVG portion of that test?
Commits b2528c5 to 76e4655 (compare)
I marked it as xfail now.
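For reference, the xfail marking looks roughly like this; a hedged sketch only, where the decorator arguments and the test body are illustrative rather than the actual change:

```python
import pytest

# Sketch only: mark a known-flaky comparison test as an expected
# failure so its intermittent failures no longer break CI.
@pytest.mark.xfail(reason="intermittent SVG rendering differences",
                   strict=False)
def test_mixedsubplots():
    ...
```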
See matplotlib#7911 for why these tests are flaky and matplotlib#7107 for why they are not so easy to fix.
Since it depends on "speed", it may sometimes fail, so try again a couple of times.
Even with a rerun, it doesn't appear to fix itself, and it's been acting up a lot lately, so just ignore any failures for now.
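The rerun marking referenced in these commit messages works roughly as follows with pytest-rerunfailures; the test name and rerun count below are illustrative:

```python
import pytest

# Sketch only: a timing-dependent test is retried automatically a
# couple of times before pytest reports it as a failure.
@pytest.mark.flaky(reruns=2)
def test_timing_dependent_behavior():
    ...
```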
This is rebased now that the documentation is building again.
As the creator of mixedsubplots, I am completely flummoxed as to why it would ever be flaky. I worry that there might actually be a real bug somewhere here: a race condition or something, perhaps.
Not sure it's a race condition, as marking it to rerun a few times did not seem to work. Once it's broken, it's broken, but I don't know what breaks it.
Could you try taking out the antialiased argument in the plot_surface() call? It is an anachronism from a bygone era; perhaps it is causing trouble.
According to #8549, that didn't work, so I think we should just go ahead and merge this. |
🎊 Hopefully there's at least a little reduction in restarted jobs...
Install pytest-warnings so we can see the warnings produced by each test instead of having them entirely hidden and unnoticed (cf. #7334 (comment)).

With pytest-rerunfailures, we can mark a few flaky tests so that they get rerun automatically. I'd rather flaky tests be fixed, of course, but this will hopefully help speed up CI a bit, since we won't have to wait the 12+ minutes for a full rebuild.

I have only marked two tests as flaky, because they are inherently difficult to fix. If there are any suggestions for other tests to mark, I will add them, but hopefully there will be few.
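As a usage note, pytest-rerunfailures also exposes a command-line flag, so reruns can be applied suite-wide without per-test markers. A hypothetical invocation, with an illustrative test path:

```python
import pytest

# Equivalent to `pytest --reruns 2 lib/matplotlib/tests` on the
# command line: any failing test is retried up to two times.
pytest.main(["--reruns", "2", "lib/matplotlib/tests"])
```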