Build failures for master/0.20 #11878
test_classification_report_dictionary_output fails on 2.7
Failure on plat=i686:
Should we change the tolerance? This doesn't seem great.
It also fails on i686, though bumping the tolerance from 1e-6 to 1e-5 or 1e-4 would fix that, I think.
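If it really is just a rounding difference, bumping the tolerance could look roughly like this. The numbers below are made up purely to illustrate the change from 1e-6 to 1e-5; the real test compares values computed from actual data:

```python
import numpy as np

# Made-up numbers just to illustrate the tolerance change: the two values
# differ by roughly 8e-6, so a 1e-6 tolerance rejects them while 1e-5 accepts.
computed = 1.2345678   # value produced on the i686 build (hypothetical)
expected = 1.2345601   # reference value (hypothetical)

# Fails with the tighter tolerance:
#   np.testing.assert_allclose(computed, expected, rtol=0, atol=1e-6)
# Passes after bumping to 1e-5 (or 1e-4):
np.testing.assert_allclose(computed, expected, rtol=0, atol=1e-5)
```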
Is OPTICS supposed to have NaNs in the test:
The classification report test fails because recall changes in the 16th decimal place.
@NicolasHug you can help here if you like ;)
We probably need to port for the classification report test. Or we need to use the
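If the exact comparison of the dictionary output is the problem, one option (just a sketch, not necessarily what the comment above had in mind) is pytest.approx, which compares dictionary values with a tolerance rather than exactly:

```python
import pytest

# Hypothetical per-label metrics standing in for one entry of
# classification_report(..., output_dict=True); only recall differs,
# and only in the 16th decimal place.
expected = {"precision": 0.5, "recall": 0.5333333333333333, "f1-score": 0.5161290322580645}
computed = {"precision": 0.5, "recall": 0.5333333333333334, "f1-score": 0.5161290322580645}

# Exact equality would fail here; pytest.approx tolerates the last-decimal
# difference while still flagging any real discrepancy.
assert computed == pytest.approx(expected)
```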
The failures in test_sparse_oneclasssvm and check_svm_model_equal are probably triggered by the
Going the skip route for now; with #11880 we'll be able to see more clearly what's happening, I hope.
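For the skip route, a platform guard along these lines is one way to do it (a sketch; the condition, test name, and reason string are only illustrative):

```python
import struct

import pytest

# True on 32-bit Python builds (4-byte pointers), e.g. the i686 wheels.
IS_32BIT = struct.calcsize("P") * 8 == 32


@pytest.mark.skipif(IS_32BIT, reason="known numerical mismatch on 32-bit builds, see #11878")
def test_sparse_oneclasssvm():
    # ... existing test body, unchanged ...
    pass
```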
Can someone with OPTICS knowledge comment on the failure there? https://travis-ci.org/MacPython/scikit-learn-wheels/jobs/404543529#L1208
I hope I have a moment to look at various things today. |
Possibly changing the tolerance is enough as a hotfix for OPTICS; it's a very minor mismatch.
AppVeyor fails because I'm not running the tests in the right environment / right folder or something. Ping @ogrisel @rth, I'm not sure what the issue is :-/ That's the last thing really holding back the RC, I think.
It's hard to tell from the output of https://travis-ci.org/MacPython/scikit-learn-wheels/jobs/404543529#L1208 ; the truncated print shows exact matches for the first and last entries. If it's just a slight mismatch due to a rounding difference on 32-bit, bumping the decimal place should be fine as a fix. If there's a difference in ordering, it may still fail even with the decimal increment. I mentioned a possible cause for the latter case in #11857, but I don't have access to a 32-bit instance to check.
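For the first case (a pure rounding mismatch), the decimal bump would look roughly like this; the arrays below are placeholders for whatever the OPTICS test actually compares (e.g. reachability or related distances):

```python
import numpy as np
from numpy.testing import assert_array_almost_equal

# Placeholder arrays: identical except for a ~5e-10 difference in one entry,
# the kind of mismatch a 32-bit build might produce.
expected = np.array([0.5, 1.2, 3.7])
computed = np.array([0.5, 1.2000000005, 3.7])

# decimal=10 would reject the middle entry; decimal=8 tolerates it.
# Note: this only helps for rounding noise -- if the ordering differs,
# the comparison still fails no matter the tolerance.
assert_array_almost_equal(computed, expected, decimal=8)
```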
@espg everybody has access to a virtual machine ;) |
Alternatively, see docker setup in MacPython/scikit-learn-wheels#7 (comment) |
All scikit-learn-wheels failures should have been fixed (or skipped). @jnothman, is there anything remaining before creating the 0.20.X branch and incrementing versions there? http://scikit-learn.org/stable/developers/maintainer.html#making-a-release
I think we're good to branch now. And if my scikit-learn-wheels PR (MacPython/scikit-learn-wheels#7) turns green, I assume @ogrisel or @amueller should merge. Are you intending to do the branching?
If you can, please do :)
Yes, but wouldn't it be better to update the version in the branch and the submodule in the wheels PR to point to
I am assuming that it can be closed since everything seems to be green now. Please reopen with a summary of the remaining problems if I missed something. |
This is to track current build failures in case someone wants to help ;)
https://travis-ci.org/MacPython/scikit-learn-wheels
conda-forge/scikit-learn-feedstock#70
Failures here:
https://travis-ci.org/MacPython/scikit-learn-wheels/jobs/404543529
https://travis-ci.org/MacPython/scikit-learn-wheels/jobs/404543530
related to OneClassSVM and OPTICS.
We should configure pytest to provide nicer tracebacks; this is terrible :-/
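For instance, the wheel-testing step could run pytest with shorter tracebacks and a failure summary; the invocation below is only an illustration (the actual command the wheels repo uses may differ):

```python
import pytest

# Run the installed sklearn test suite with short tracebacks (--tb=short)
# and a summary of failures/errors at the end (-ra), which is much easier
# to read in CI logs than the full default tracebacks.
pytest.main(["--pyargs", "sklearn", "--tb=short", "-ra"])
```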