"RuntimeWarning: invalid value encountered in true_divide" in tests #19334

Closed
7 tasks done
cmarmo opened this issue Feb 3, 2021 · 11 comments
Labels: module:test-suite (everything related to our tests), Sprint

Comments

@cmarmo
Contributor

cmarmo commented Feb 3, 2021

The following tests throw a RuntimeWarning in the current development version:

RuntimeWarning: invalid value encountered in true_divide

  • test_sanity_check_pls_regression_constant_column_Y in sklearn/cross_decomposition/tests/test_pls.py
  • test_iforest_with_uniform_data in sklearn/ensemble/tests/test_iforest.py
  • test_kernel_gradient in sklearn/gaussian_process/tests/test_kernels.py
  • test_radius_neighbors_classifier_zero_distance in sklearn/neighbors/tests/test_neighbors.py
  • test_calibration_multiclass[0-True-isotonic] in sklearn/tests/test_calibration.py
  • test_estimators[LinearDiscriminantAnalysis()-check_classifiers_one_label] in sklearn/tests/test_common.py
  • test_random_choice_csc in sklearn/utils/tests/test_random.py

Those warnings are caused by element-wise divisions in which some element of the denominator is zero, either in the test itself or in the function/class called by the test.
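As an illustration (a minimal NumPy sketch, not the actual fix applied in any of the PRs): the warning fires when a division encounters 0/0, and a common remedy is to divide only where the denominator is nonzero.

```python
import numpy as np

counts = np.array([4.0, 0.0, 2.0])
totals = np.array([8.0, 0.0, 0.0])

# Naive element-wise division: the 0/0 entry triggers
# "RuntimeWarning: invalid value encountered in true_divide"
# (and the 2/0 entry triggers a "divide by zero" warning).
with np.errstate(divide="ignore", invalid="ignore"):
    naive = counts / totals  # array([0.5, nan, inf])

# Common fix: divide only where the denominator is nonzero,
# leaving the remaining entries at a chosen default (here 0).
safe = np.divide(counts, totals, out=np.zeros_like(counts), where=totals != 0)
print(safe)  # [0.5 0.  0. ]
```

The `out`/`where` combination in `np.divide` skips the zero-denominator entries entirely, so no warning is raised.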

The failure can be reproduced locally using

pytest -Werror::RuntimeWarning <test_file.py> -k <test_name>
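The same effect can be obtained programmatically with the standard warnings module (a sketch, independent of pytest): promoting RuntimeWarning to an error makes the offending division fail loudly instead of passing silently.

```python
import warnings
import numpy as np

# Promote RuntimeWarning to an exception, as the pytest flag above does.
with warnings.catch_warnings():
    warnings.simplefilter("error", RuntimeWarning)
    try:
        np.zeros(1) / np.zeros(1)  # 0/0 -> "invalid value encountered"
        raised = False
    except RuntimeWarning:
        raised = True
print(raised)  # True
```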

cc @ogrisel, @reshamas

@cmarmo added the module:test-suite and Sprint labels on Feb 3, 2021
@amueller
Member

amueller commented Feb 3, 2021

If you're starting on this issue, maybe try to do one file per PR, or potentially several files if the errors are related; not one PR that addresses all of them.

@reshamas
Member

reshamas commented Feb 3, 2021

> If you're starting on this issue, maybe try to do one file per PR, or potentially several files if the errors are related; not one PR that addresses all of them.

I think this is confusing. I would say (whether or not the errors are related, since folks can practice having multiple branches):

Submit only one file in each PR.

@nuka137
Contributor

nuka137 commented Feb 6, 2021

@cmarmo

I will join the next dev sprint (9 Feb) and would like to tackle this issue there.

To start, I will work on the test below:

test_radius_neighbors_classifier_zero_distance in sklearn/neighbors/tests/test_neighbors.py

@marenwestermann
Member

I'm working on test_sanity_check_pls_regression_constant_column_Y.

@mabu-dev
Contributor

mabu-dev commented Feb 6, 2021

Working on sklearn/utils/tests/test_random.py

@t-kusanagi
Contributor

I'll work on test_calibration_multiclass[0-True-isotonic] in sklearn/tests/test_calibration.py

@LSturtew
Contributor

LSturtew commented Mar 5, 2021

I'm working on test_iforest_with_uniform_data in sklearn/ensemble/tests/test_iforest.py

@mbatoul
Contributor

mbatoul commented Mar 12, 2021

I'm working on test_estimators in sklearn/tests/test_common.py

@lorentzenchr
Member

@mbatoul It would help us if you could open a PR, even if it is at an early stage.

@mbatoul
Contributor

mbatoul commented Mar 16, 2021

@lorentzenchr I am still trying to figure out the right way to solve this issue, but I can certainly open a PR right now!

@lorentzenchr
Member

The list is complete. Thank you all for your contributions.
