Prevent division by zero in GPR when y_train is constant#19703
glemaitre merged 10 commits into scikit-learn:main
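For context, the failure this PR addresses can be reproduced with a short sketch: with a constant `y_train` and `normalize_y=True`, the standard deviation of the targets is zero, and dividing by it produced NaN predictions before the fix. This assumes a scikit-learn version that includes this PR; the shapes and values below are illustrative.

```python
import warnings

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.RandomState(0)
X = rng.normal(size=(10, 2))
y = np.full(10, fill_value=2.0)  # constant target

# Before this fix, normalize_y=True divided by y.std() == 0,
# so the de-normalized predictions came back as NaN.
with warnings.catch_warnings():
    warnings.simplefilter("ignore")  # hyperparameter optimization may warn
    gpr = GaussianProcessRegressor(normalize_y=True).fit(X, y)
y_pred = gpr.predict(X)
```

After the fix, the normalized targets are all zero, so the GP mean prediction is zero and de-normalizing recovers the constant value 2.0.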
Conversation
Also please fix the linting problem reported by the CI and expand the test with multi-target data: a Y matrix where one column is a constant (2.0, for instance) and the other is random normal data:

```python
n_samples = X.shape[0]
rng = np.random.RandomState(0)
Y = np.concatenate([
    rng.normal(size=(n_samples, 1)),                 # non-constant target
    np.full(shape=(n_samples, 1), fill_value=2.0),   # constant target
], axis=1)
```
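The fix itself boils down to treating a zero standard deviation as one before dividing, so that normalization is a no-op for constant targets. A minimal numpy-only sketch using the multi-target Y from the comment above; the helper below mirrors scikit-learn's private `_handle_zeros_in_scale`, so its name and exact behavior here are an assumption for illustration:

```python
import numpy as np

def handle_zeros_in_scale(scale):
    # Replace zero scales with 1 so dividing by them is a no-op
    # (illustrative stand-in for sklearn's private helper).
    scale = np.asarray(scale, dtype=float).copy()
    scale[scale == 0.0] = 1.0
    return scale

# Multi-target y: one random column, one constant column.
rng = np.random.RandomState(0)
n_samples = 10
Y = np.concatenate([
    rng.normal(size=(n_samples, 1)),                 # non-constant target
    np.full(shape=(n_samples, 1), fill_value=2.0),   # constant target
], axis=1)

y_mean = Y.mean(axis=0)
y_std = handle_zeros_in_scale(Y.std(axis=0))  # constant column: std 0 -> 1
Y_norm = (Y - y_mean) / y_std                 # finite, no division by zero
```

The constant column normalizes to all zeros instead of NaN, which is exactly what the GPR fit needs downstream.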
ogrisel
left a comment
Thanks for improving the tests. LGTM. Just a few more suggestions below.
I am no GPR specialist so I would appreciate it if others (e.g. @jaburke166 @boricles, @sobkevich, @jmetzen, @plgreenLIRU) could have a look.
Added the test as discussed.
Can this be merged?
Co-authored-by: Chiara Marmo <cmarmo@users.noreply.github.com>
Hi, I really don't know whether it is important to preserve this equality for the case where y is constant.
Note that for the fixed kernel:
Maybe)
Anything to be done here?
Ping.
glemaitre
left a comment
I merged main into the branch. LGTM. I just moved the multi-target test code to the related PR.
Thanks a lot everyone!
The bug regarding the covariance and standard deviation is solved here: #19939
…n#19703) Co-authored-by: Sasha Fonari <fonari@schrodinger.com> Co-authored-by: Chiara Marmo <cmarmo@users.noreply.github.com> Co-authored-by: Guillaume Lemaitre <g.lemaitre58@gmail.com>
This PR merges two earlier PRs: #18388 and #19361.
This fixes: #18318.