Gradient of ConstantKernel is Incorrect #11581

@adeandrade

Description

Given the function f(x, y) = x * y, the partial derivative of f with respect to x is y.

When using a product kernel of the form:

ConstantKernel * RBF

the gradient of the combined kernel with respect to the ConstantKernel hyperparameter should therefore equal the evaluation of RBF.
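Because the product is linear in the constant, this expectation can be cross-checked numerically. A minimal finite-difference sketch (the step size `eps` is an arbitrary choice, not part of the original report):

```python
import numpy as np
from sklearn.gaussian_process.kernels import ConstantKernel, RBF

data = np.array([[1., 3.], [5., 6.]])
eps = 1e-6  # arbitrary small step for the central difference

# f(c) = (ConstantKernel(c) * RBF(1.0))(data) is linear in c,
# so df/dc should equal RBF(1.0)(data).
k_plus = (ConstantKernel(0.5 + eps) * RBF(1.0))(data)
k_minus = (ConstantKernel(0.5 - eps) * RBF(1.0))(data)
numeric_grad = (k_plus - k_minus) / (2 * eps)

print(np.allclose(numeric_grad, RBF(1.0)(data)))  # True
```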

Steps/Code to Reproduce

The following code reproduces the problem:

import numpy as np
from sklearn.gaussian_process.kernels import ConstantKernel, RBF

data = np.array([[1., 3.], [5., 6.]])

kernel = ConstantKernel(0.5) * RBF(1.0)

kernel(data, eval_gradient=True)[1][:, :, 0]

>>> array([[5.00000000e-01, 1.86332659e-06],
           [1.86332659e-06, 5.00000000e-01]])

Expected Results

Should match:

RBF(1.0)(data)

>>> array([[1.00000000e+00, 3.72665317e-06],
           [3.72665317e-06, 1.00000000e+00]])

Actual Results

Instead, the returned gradient is the value of evaluating kernel itself:

kernel(data)

>>> array([[5.00000000e-01, 1.86332659e-06],
           [1.86332659e-06, 5.00000000e-01]])
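The claim that the gradient slice coincides with the kernel evaluation can be verified directly; a small check, assuming the behavior reported above:

```python
import numpy as np
from sklearn.gaussian_process.kernels import ConstantKernel, RBF

data = np.array([[1., 3.], [5., 6.]])
kernel = ConstantKernel(0.5) * RBF(1.0)

# slice 0 of the gradient stack corresponds to the ConstantKernel hyperparameter
grad_wrt_constant = kernel(data, eval_gradient=True)[1][:, :, 0]

# the reported gradient matches kernel(data), not RBF(1.0)(data)
print(np.allclose(grad_wrt_constant, kernel(data)))  # True
```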

Versions

NumPy 1.14.5
SciPy 1.1.0
Scikit-Learn 0.19.2
