FIX Bounds in anisotropic GP hyperparameter optimization #2867
Conversation
This is exactly the same solution as in #2798 with another unit test. :)
That's true. I would recommend merging #2798 so that we don't end up with a third bug-fix PR for the same issue.
random_start=random_start, verbose=False)
gp.fit(X, y)
y_pred, MSE = gp.predict(X, eval_MSE=True)

assert_true(np.allclose(y_pred, y) and np.allclose(MSE, 0.))

assert_true(np.all(gp.theta_ >= thetaL)) # Lower bounds of hyperparameters
assert_true(np.all(gp.theta_ <= thetaU)) # Upper bounds of hyperparameters
PEP8:
tests/test_gaussian_process.py:73:45: E261 at least two spaces before inline comment
tests/test_gaussian_process.py:74:45: E261 at least two spaces before inline comment
I have never even looked at this code. Linear classifiers and trees are my game :)
Could someone review this bugfix PR and merge it so that the issue is fixed in the 0.15 release?
@ogrisel This is something you might want to have included in 0.15.
Merged as 9789fdb.
Thanks for merging!
Currently, if multi-dimensional thetaU and thetaL are given as bounds for anisotropic GP hyperparameter optimization, only the bounds of the last dimension are enforced (see the added unit test). This pull request fixes that bug. The root cause is Python's late binding of closure variables in lambdas, as described in https://stackoverflow.com/questions/10452770/python-lambdas-binding-to-local-values
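The linked Stack Overflow question describes the pitfall: lambdas created in a loop capture the loop variable itself, not its value at creation time, so every bound constraint built that way ends up referring to the last dimension. A minimal sketch of the pitfall and the usual fix of binding the index through a default argument; the names below (thetaL, thetaU, theta) only mirror the hyperparameter bounds and are not the actual scikit-learn code.

```python
# Minimal illustration of the late-binding pitfall behind this bug;
# thetaL/thetaU/theta are illustrative, not the scikit-learn internals.
thetaL = [1e-4, 1e-3, 1e-2]
thetaU = [1e-1, 1e0, 1e1]

# Buggy: every lambda closes over the loop variable i itself, so after the
# loop they all evaluate the bound of the last dimension (i == 2).
buggy = [lambda theta: thetaU[i] - theta[i] for i in range(len(thetaL))]

# Fixed: bind the current value of i through a default argument, so each
# lambda keeps its own dimension index.
fixed = [lambda theta, i=i: thetaU[i] - theta[i] for i in range(len(thetaL))]

theta = [5e-4, 5e-3, 5e-2]
print([f(theta) for f in buggy])  # all three use the last upper bound: ~[9.95, 9.95, 9.95]
print([f(theta) for f in fixed])  # per-dimension bounds: ~[0.0995, 0.995, 9.95]
```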