Sometimes, suppress_warnings misses one of its attributes #8413
Comments
Hmmm, a bit odd. Is there some (strange) threading going on in the skimage tests? That would break warning testing pretty badly, since warning handling is not thread safe in Python. I have some difficulty seeing how else this could happen, but maybe we should look at the exact code the test runs. |
Since there is no Of course the whole warnings machinery breaks down if you have threads. For example: thread 1 enters a warning context -> replaces the normal warning printing. Btw., I see you have a "try to clean up after |
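To illustrate the point, here is a standalone sketch (not code from the skimage test suite) of how several threads entering and leaving warning contexts concurrently can restore each other's global state out of order:

```python
import threading
import warnings

def worker(n):
    # catch_warnings snapshots and later restores the module-global
    # warnings.filters list (and warnings.showwarning); with several
    # threads, the snapshots can be restored in the wrong order.
    with warnings.catch_warnings():
        warnings.simplefilter("ignore")
        warnings.warn("warning from thread %d" % n)

threads = [threading.Thread(target=worker, args=(i,)) for i in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Depending on the interleaving, the filters left behind after all threads
# have finished may not match the state before any thread started.
print(warnings.filters[:3])
```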
I just hit the problem when I tried to build skimage for Debian, and I have no idea what is going on here. However, I opened scikit-image/scikit-image#2412 to get them involved. |
Just for completeness: Sometimes, I even get a stacktrace without any involvement of skimage: |
What versions of python, numpy, etc., are you compiling with? |
Numpy 1.12 |
Hmm, I am confused.... I can't see why there would be threading problems, but also can't really fathom this error occurring without either a race condition or incorrect nesting of |
I would also think it is a race condition, since it does not always happen. Running the build twice under exactly the same environment makes the problem pop up in different places (or even not at all). |
@olebole how exactly are you running the test suite? |
Copied from our test suite:
The |
Hmmmpf, does anyone know what exactly happens under the hood when nose starts to test a module? Nose confuses me, and I can't see why it would end up also messing around with warnings.... Or it is a bug in suppress_warnings after all, but I can't really see it, heh. |
In my nosetests man page, I have
It seems that there is no parallel testing by default. At least, the race condition should not be there. |
Is there any remote chance that nose runs this |
No, I am probably being silly. The suppress_warnings context resets |
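For reference, this is how np.testing.suppress_warnings is meant to be used in a single thread (the warning message below is just a placeholder):

```python
import warnings
import numpy as np

# Single-threaded usage: the global warning state is recorded on __enter__
# and restored on __exit__; nothing else must touch it in between.
with np.testing.suppress_warnings() as sup:
    sup.filter(DeprecationWarning, "some deprecated message")
    warnings.warn("some deprecated message", DeprecationWarning)
```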
Well, I see 33 errors with skimage-0.9.3, most from PIL or indexing, but none related to suppress_warnings. How do you deal with all the other errors? |
@charris The issue is reported for 0.12.3, not 0.9.3. :) |
OK, finally got that pulled down from upstream; now 39 errors and a ton of deprecations. Most of the warnings seem to be due to the fact that the tests think they are running under Qt instead of Wayland, which might be a configuration problem here. I also need to run the tests as
as |
Deprecations are normal: we still test our old API, even when a deprecation is expected. Errors, however, are not normal. Would you mind reporting them on our (scikit-image) bug tracker, please? |
@sciunto The docs could use instructions for testing locally. |
OK, I see it in the same place. The test is
The |
Bit more information, doing
Which uses NumPy |
Note that scikit-image has its own context manager for warnings. Out of curiosity, I wonder if it makes a difference to explicitly specify |
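For anyone reading along, scikit-image's own context manager is expected_warnings; the import path below is what I believe it was around scikit-image 0.12 and may have moved since, so treat it as an assumption:

```python
import warnings

# Import path assumed for scikit-image ~0.12; may differ in other versions.
from skimage._shared._warnings import expected_warnings

with expected_warnings(["deprecated"]):
    # The context fails if no warning matching the pattern is raised inside.
    warnings.warn("this function is deprecated", UserWarning)
```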
So I don't see errors with
Whereas
Shows it about 25% of the time. That suggests that the source of the error lies elsewhere and the failing test just exposes it. |
In particular, |
OK, now I suspect the |
EDIT: And now I can't reproduce the problem at all. Hmm... Maybe also depends on what else is running on the machine. |
Hmmm, the |
something in
EDIT: Perhaps because it precedes the |
Hmm, putting it all in |
@shoyer may have an opinion as well. I just implemented |
I don't consider the failure terribly severe, but it would be nice to have it fixed. A (private?) Python version of isnat would be fine with me for 1.12. |
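A rough sketch of what such a private Python-level helper could look like (the name _isnat and the details are illustrative, not the actual implementation), relying on the fact that NaT is stored as the minimum int64 value:

```python
import numpy as np

def _isnat(x):
    # Illustrative private helper: NaT is represented by the smallest int64
    # value for both datetime64 and timedelta64, so viewing the data as
    # int64 detects it without doing a NaT comparison (which is what
    # triggers the comparison warnings that assert_equal suppresses).
    x = np.asarray(x)
    if x.dtype.kind not in "mM":
        raise TypeError("expected a datetime64 or timedelta64 array")
    return x.view("i8") == np.iinfo(np.int64).min

print(_isnat(np.array(["2017-01-01", "NaT"], dtype="datetime64[D]")))
# -> [False  True]
```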
The |
It does :) |
Thanks a lot, guys, for jumping on the issue during the holidays and for the thorough debugging. Is there anything I can do to implement or test a solution for this? Debian is eager to get a fix for this :) |
@sandrotosi, unfortunately it is a bit tricky. The simplest fix may be to simply not use the parallel testing in skimage, but that somewhat defeats the purpose. We can pretty easily do something like in my isnat work (using a private Python version of isnat); however, I am not quite certain that it would not create test regressions elsewhere :/. Adding a mutex for those two cases may actually be a plausible hack that should remove the trouble and seems unlikely to go bad (since you would not call |
@olebole would disabling parallel testing in skimage be an acceptable temporary solution? @seberg yeah, I think the target might be to have a minimal fix for 1.12 and eventually address it in a more complete/comprehensive way in master - that at least would let the next Debian release ship NumPy without this issue |
@sandrotosi I tried the lock approach in gh-8427. I am not sure whether or not the idea is nuts, but if someone wants to try it.... |
Is there some way we could simply modify (I have some ideas, but I'm still thinking through how exactly it could work.) |
Sure, we can just not delete the attribute, but it might create bugs in the tests later on... Though I guess most test suites are not as nitpicky as numpy about testing warnings, so.... |
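A self-contained sketch of that "don't delete the attribute" idea (this is not numpy's actual suppress_warnings; the class and attribute names are made up): replace the bare del in __exit__ with a defensive pop, so a second, racing exit cannot raise AttributeError.

```python
import warnings

class tolerant_suppress:
    """Illustrative context manager, not the real np.testing.suppress_warnings."""

    def __enter__(self):
        self._saved_filters = warnings.filters[:]  # snapshot global state
        warnings.simplefilter("ignore")
        return self

    def __exit__(self, *exc_info):
        # pop() instead of `del self._saved_filters`: if another exit got
        # here first, we simply skip the restore instead of blowing up.
        saved = self.__dict__.pop("_saved_filters", None)
        if saved is not None:
            warnings.filters[:] = saved
```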
This lock prevents unsafe warning filter manipulations during testing when downstream packages do parallel testing. Warning filtering is generally not thread-safe in Python; this is also true for `catch_warnings` and `suppress_warnings`. In NumPy 1.12, however, `assert_equal` and the array comparison asserts use this to filter out some comparison warnings. Since removing this filter may also affect downstream projects, and skimage (and possibly more) does manual parallel testing using `assert_equal`, a quick fix seems to be to lock this less obvious threading trap. Ideally (in master this is the case), there should simply be no warning filter logic in the assert functions themselves. While probably not perfectly safe in principle, this is sufficient in the case of skimage and probably most testing scenarios, and the chance of deadlocks seems very unlikely. Closes numpygh-8413
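A hedged sketch of the idea (names and structure are illustrative, not the actual gh-8427 patch): one module-level lock serializes entering and leaving the warning-suppressing context that the assert helpers use, so concurrent assert_equal calls cannot stomp on each other's filter state.

```python
import threading
import warnings

_assert_warning_lock = threading.Lock()  # illustrative, not numpy's name

class _locked_comparison_suppression:
    """Serialize the suppress/restore done around array comparisons."""

    def __enter__(self):
        _assert_warning_lock.acquire()
        self._ctx = warnings.catch_warnings()
        self._ctx.__enter__()
        # e.g. the NaT / elementwise comparison warnings of NumPy 1.12
        warnings.simplefilter("ignore", FutureWarning)
        return self

    def __exit__(self, *exc_info):
        try:
            return self._ctx.__exit__(*exc_info)
        finally:
            _assert_warning_lock.release()
```

Holding the lock for the whole comparison is what makes concurrent calls safe, at the cost of serializing them, which matches the "not perfectly safe in principle, but sufficient" framing above.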
I've uploaded 1.12.0rc2 (which contains #8427) to Debian and rebuilt skimage 3 times (a slightly old version of the Debian package, without the test suite completely disabled), and every time it built successfully. Thanks a lot, guys, for working on this during the holidays! Now, any plans for the final 1.12.0 release? :) |
I plan on making the final release Jan 15. |
I can confirm that it works with rc2. Thank you very much for your efforts! |
I'm leaving this open until it is fixed in master. |
Fixed by #8421. |
Using numpy version Just wanted to share some code here to reproduce this behavior, since resources are sparse elsewhere. I ended up resolving my issue by not using the suppress_warnings context manager.
import warnings
from concurrent.futures import ThreadPoolExecutor
import numpy as np
def compute_metrics(iteration: int):
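    # Each worker enters and exits suppress_warnings concurrently; the
    # context manager mutates module-global warning state, which is where
    # the race (and the missing-attribute error) comes from.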
    with np.testing.suppress_warnings():
        warnings.warn("Test warning")
    return f"thread {iteration} done"
with ThreadPoolExecutor(max_workers=10) as executor:
    futures = []
    for i in range(100):
        future = executor.submit(compute_metrics, i)
        futures.append(future)
    for future in futures:
        result = future.result()
        print(result) |
It's documented to not be thread-safe and neither is |
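One way to sidestep the problem in user code (an illustrative workaround, not an official recommendation) is to install the filters once in the main thread, before any workers start, instead of entering a warning-suppressing context inside each worker:

```python
import warnings
from concurrent.futures import ThreadPoolExecutor

# Set up the filter once, before any threads exist; nothing below touches
# the global warning state concurrently.
warnings.simplefilter("ignore", UserWarning)

def compute_metrics(iteration: int) -> str:
    warnings.warn("Test warning")  # filtered by the module-level setting
    return f"thread {iteration} done"

with ThreadPoolExecutor(max_workers=10) as executor:
    for result in executor.map(compute_metrics, range(100)):
        print(result)
```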
When trying to build skimage, I sometimes get the following error:
I assume this is a numpy problem, but I am not sure.