Prevent division by zero in GPR when y_train is constant #19703
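For context, a minimal sketch of the failure mode this PR addresses is shown below; the kernel, data, and shapes are illustrative assumptions, not code taken from the PR. With normalize_y=True, a constant y_train has zero standard deviation, so normalizing the targets used to divide by zero and propagate NaNs into the predictions; the fix treats a zero standard deviation as 1 instead.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.RandomState(0)
X = rng.uniform(0, 10, size=(20, 1))
y = np.full(shape=(20,), fill_value=2.0)  # constant target

# Before the fix, normalize_y=True divided by std(y) == 0 for a constant
# target; after the fix the zero standard deviation is treated as 1.
gpr = GaussianProcessRegressor(kernel=RBF(), normalize_y=True).fit(X, y)
y_mean, y_std = gpr.predict(X, return_std=True)
assert np.isfinite(y_mean).all()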
Conversation
Also, please fix the linting problem reported by the CI and expand the test to cover multi-target data: for instance, a Y matrix where one column is the constant 2.0 and the other is random normal data:

import numpy as np  # already imported at the top of the test module

n_samples = X.shape[0]  # X is the test's training inputs
rng = np.random.RandomState(0)
Y = np.concatenate([
    rng.normal(size=(n_samples, 1)),               # non-constant target
    np.full(shape=(n_samples, 1), fill_value=2.0)  # constant target
], axis=1)
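A sketch of how the expanded test could use this Y matrix follows; it is an illustration under stated assumptions (X here is a stand-in for the test's actual training inputs), not the exact test that was merged. Fitting with normalize_y=True exercises the zero-variance path for the constant column while leaving the non-constant column unaffected.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.RandomState(0)
X = rng.uniform(0, 10, size=(30, 2))           # stand-in for the test's inputs
Y = np.concatenate([
    rng.normal(size=(30, 1)),                  # non-constant target
    np.full(shape=(30, 1), fill_value=2.0)     # constant target
], axis=1)

# normalize_y=True exercises the zero-variance code path for the constant
# column; predictions should be finite for both targets.
gpr = GaussianProcessRegressor(normalize_y=True).fit(X, Y)
Y_pred = gpr.predict(X)
assert Y_pred.shape == (30, 2)
assert np.isfinite(Y_pred).all()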
Thanks for improving the tests. LGTM. Just a few more suggestions below.
I am no GPR specialist, so I would appreciate it if others (e.g. @jaburke166, @boricles, @sobkevich, @jmetzen, @plgreenLIRU) could have a look.
Added a commented test as discussed.
LGTM.
Can this be merged?
Last detail, then LGTM.
Co-authored-by: Chiara Marmo <cmarmo@users.noreply.github.com>
Hi,
But I really don't know whether it is important to preserve this equality for the case where y is constant.
Note that for the fixed kernel:
Maybe)
Anything to be done here?
Ping.
I merged main into the branch. LGTM. I just moved the test code regarding the multi-target case into the related PR.
Thanks a lot everyone!
The bug regarding the covariance and standard deviation is solved in #19939.
Co-authored-by: Sasha Fonari <fonari@schrodinger.com> Co-authored-by: Chiara Marmo <cmarmo@users.noreply.github.com> Co-authored-by: Guillaume Lemaitre <g.lemaitre58@gmail.com>
This is a merge of two PRs: #18388 and #19361.
This fixes #18318.