TST use global_random_seed in sklearn/utils/tests/test_optimize.py #30112
Conversation
test_newton_cg uses global_random_seed fixture
sklearn/utils/tests/test_optimize.py
```diff
@@ -28,7 +28,7 @@ def grad_hess(x):
         return grad(x), lambda x: A.T.dot(A.dot(x))
 
     assert_array_almost_equal(
-        _newton_cg(grad_hess, func, grad, x0, tol=1e-10)[0],
+        _newton_cg(grad_hess, func, grad, x0, tol=1e-7)[0],
```
With a tolerance of 1e-10 I had two test failures. I was able to resolve these by relaxing the tolerance to 1e-7. It's not an optimal solution, but the default `tol` is 1e-4, so I think this solution should be acceptable.
I think it's an okay solution. The two functions don't take the same parameters, so it's hard to find a combination that gives them the same behavior. I replaced `assert_array_almost_equal` with `assert_allclose` and added a comment with a bit more explanation.
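For context, here is a minimal sketch of what the seeded test could look like, pieced together from the snippets above. The `global_random_seed` fixture name and the `tol=1e-7` change come from this PR; the problem setup, the Hessian-vector product helper, and the `atol` value are illustrative assumptions, not necessarily the exact code merged into scikit-learn.

```python
# Sketch of a seeded test_newton_cg, assuming scikit-learn's private helper
# sklearn.utils.optimize._newton_cg and the global_random_seed pytest fixture
# from scikit-learn's conftest; the setup and atol below are illustrative.
import numpy as np
from numpy.testing import assert_allclose
from scipy.optimize import fmin_ncg

from sklearn.utils.optimize import _newton_cg


def test_newton_cg(global_random_seed):
    # Draw the test problem from the seed supplied by the fixture so CI can
    # exercise many seeds instead of a single hard-coded one.
    rng = np.random.RandomState(global_random_seed)
    A = rng.normal(size=(10, 10))
    x0 = np.ones(10)

    def func(x):
        Ax = A.dot(x)
        return 0.5 * Ax.dot(Ax)

    def grad(x):
        return A.T.dot(A.dot(x))

    def hess_p(x, p):
        # Hessian-vector product of func: (A.T A) p
        return A.T.dot(A.dot(p))

    def grad_hess(x):
        return grad(x), lambda p: A.T.dot(A.dot(p))

    # The two solvers expose different stopping criteria, so exact agreement
    # at tol=1e-10 is not guaranteed for every seed; compare the minimizers
    # at a looser absolute tolerance instead.
    assert_allclose(
        _newton_cg(grad_hess, func, grad, x0, tol=1e-7)[0],
        fmin_ncg(f=func, x0=x0, fprime=grad, fhess_p=hess_p, disp=0),
        atol=1e-5,
    )
```

Note that this only runs inside a scikit-learn checkout (e.g. `pytest sklearn/utils/tests/test_optimize.py -k newton_cg`), since the `global_random_seed` fixture is provided by scikit-learn's own conftest.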
LGTM. Thanks @marenwestermann
Reference Issues/PRs
Towards #22827.
What does this implement/fix? Explain your changes.
Adds the `global_random_seed` fixture to the test `test_newton_cg`.
Any other comments?
I took over the stale PR #26462.