[MRG + 1] Run doctests only on numpy 1.14 and fix them with numpy 1.14 formatting changes #10835
Conversation
Fix doctests due to numpy 1.14 formatting changes.
Commits 3e1171a to 80222b9
LGTM. Tough (tedious) one @lesteve :)
Thanks, I automated some of it if that can make you sleep better ;-).
What are the benefits of this for users over keeping the legacy formatting?

At a minimum this PR needs to document the minimum dependencies for running doctests, or we need to make sure that when we suggest users run tests, the invocation excludes doctests.

One small concern I have is that doctests have previously informed us of behaviour varying across platforms or dependency versions. I'm not sure that was ever their intention, so perhaps we don't need to worry about it, but having tight dependencies for doctests means we no longer get that.
I am split regarding @jnothman's comment. It is true that for users this PR has no benefit. However, on the dev side, I find it a bit silly that we ask the numpy devs to maintain a legacy mode because we are testing our doctests. IMO it is a good thing to run the doctests to be sure that we actually have all documentation up to date, but it seems a misuse to rely on doctests to find bugs. I would hope that our unit tests do this job perfectly (sic). But this is the theory, and in practice I must admit @jnothman has a point.
Users are not impacted by this. Very edge-casy, but for users with numpy >= 1.14 (numpy 1.14 was released two months ago, so it will become more and more common with time), it may reduce surprises if they run examples from the documentation and get 0.9999999 instead of 1, or the other way around.

Just to be clear, what this PR does is skip doctests with numpy < 1.14 (the same way some tests are skipped if you don't have pandas installed). I am not sure we need to document this, the same way we don't document that pandas and pyamg should be installed if you want to run all the tests. Suppose a contributor has numpy < 1.14 and breaks one of the doctests due to a numpy >= 1.14 formatting change (to be honest this is very, very unlikely, I would say, unless they add doctests in the PR, which does not happen that often). The Travis build with numpy 1.14 would break, and they would be in the same situation as if the Travis build with numpy 1.8.3 broke. We would tell them to try to reproduce locally by installing the same versions of the dependencies as in the Travis build.

I don't think we should rely on doctests to catch bugs. IIRC doctest variations were due more to platforms than to numpy versions; I may be wrong about that.
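For illustration, here is a minimal sketch of how such version-gated doctest skipping can be done in a root conftest.py. This is a sketch under my assumptions, not necessarily the exact hook merged in this PR (DoctestItem lives in pytest's private `_pytest.doctest` module, which is stable in practice but not formally public):

```python
# conftest.py -- illustrative sketch of version-gated doctest skipping
from distutils.version import LooseVersion

import numpy as np
import pytest
from _pytest.doctest import DoctestItem  # pytest-internal, but stable


def pytest_collection_modifyitems(config, items):
    # Doctest expected outputs follow the numpy >= 1.14 repr formatting,
    # so mark every collected doctest as skipped on older numpy versions.
    if LooseVersion(np.__version__) < LooseVersion('1.14'):
        skip_marker = pytest.mark.skip(
            reason='doctests are only run for numpy >= 1.14')
        for item in items:
            if isinstance(item, DoctestItem):
                item.add_marker(skip_marker)
```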
Your last statement suggests we need a numpy 1.14 run on AppVeyor. We don't already have one, do we?

I'm okay with this.
I don't think we are testing the doctests under AppVeyor currently. For example you can see …
Was that a +1 @jnothman, or do you want something more to be done before it can be merged?
Sorry, I don't have much head space for thinking critically about this kind of thing atm. Can I say +0 for now? :\
Fine with me, you are doing a lot in scikit-learn atm, and although your head space seems quite good at expanding at will, there must be physical constraints that kick in at some point :-). I completely understand this is not an extremely pressing issue, but at the same time I don't think this changes much (if anything) in the thoroughness of our testing and, call me naive maybe ;-), I was not expecting it to be in the least controversial.
Haha :) Atm, my Scikit-learn goal is to respond to updated issues if they can be handled relatively quickly. While I'm teaching (until July), my usual work hours are reduced, and my Scikit-learn time at work is minimal. My out-of-usual-work time has other commitments (including teaching) atm. |
Sure, let's just do this and see if anyone complains
OK, merging this one then.
I think some documentation about this might be useful, as I was actually just wondering "why are all my doctests being skipped?" for a PR I was updating (and I only found out thanks to Guillaume and by looking in the conftest.py file). But of course, I'm not sure people will look in the sklearn dev docs if they encounter such a situation .. :)
I agree it may be a bit surprising, but I would be in favour of lying low and waiting until it goes away ;-). There is a comment in conftest.py (in the root folder); feel free to improve it in case you feel it is not clear enough. By the way, I did not know this, but I googled and found out that you can get the skip reason like this:
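(The snippet from the original comment is not preserved in this transcript; presumably it showed pytest's `-r` reporting flags, which do print a short summary with the reason for each skip.)

```
# show the reason for each skipped test in the terminal summary
pytest -rs sklearn/

# or include all non-passing outcomes (skips, xfails, warnings, ...)
pytest -ra sklearn/
```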
Ah, the …
Why not enable it by default? The question is how many tests are skipped (a quick look at some Travis logs could answer this) and whether identical skip messages are combined into one line or not.
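If one did want it on by default, a hypothetical way (my assumption, not something proposed in the PR) would be via pytest's addopts setting in setup.cfg:

```ini
# setup.cfg -- hypothetical: report skip reasons on every test run
[tool:pytest]
addopts = -rs
```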
Turns out they are combined on a per-file basis.
I don't think identical ones are combined into a single line:

So for the Travis logs that skip all doctests, this will be quite verbose. The Travis build with numpy 1.14 has 23 skipped tests (Travis build 3), but the ones with older numpy have 297/290 skipped tests (Travis builds 1/2).
Fair enough, so if there is no clear avenue for improvement, maybe we should leave it like this? We can potentially revisit if people get tripped up by this.
Quickly made the change in a PR; it will be easier to see what it gives on Travis: #10946
Plenty of doctest fixes due to numpy 1.14 formatting changes (for more context see #10132 for example). The main changes to review are:

- conftest.py: doctests are only run if numpy >= 1.14. Until now we were setting the legacy print format to 1.13.
- .travis.yml: update numpy to 1.14.2 and scipy to 1.0.0. The latter is necessary because otherwise we get a numpy warning from scipy 0.19.2, and one of our tests that does not expect a warning breaks.
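For a concrete illustration of the formatting change involved (my own example, not taken from the PR): numpy >= 1.14 changed the default array repr, and np.set_printoptions(legacy='1.13') is the escape hatch that conftest.py was using until now:

```python
import numpy as np

# numpy >= 1.14 default formatting (what the fixed doctests now expect)
np.set_printoptions(legacy=False)
print(repr(np.array([0.5, 1.0])))   # array([0.5, 1. ])

# the legacy 1.13 mode that conftest.py no longer forces
np.set_printoptions(legacy='1.13')
print(repr(np.array([0.5, 1.0])))   # array([ 0.5,  1. ])
```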