Kernel which is composed of a set of other kernels.
Added in version 0.18.
kernels : list of Kernels
    The other kernels
>>> from sklearn.gaussian_process.kernels import WhiteKernel
>>> from sklearn.gaussian_process.kernels import RBF
>>> from sklearn.gaussian_process.kernels import CompoundKernel
>>> kernel = CompoundKernel(
...     [WhiteKernel(noise_level=3.0), RBF(length_scale=2.0)])
>>> print(kernel.bounds)
[[-11.51292546  11.51292546]
 [-11.51292546  11.51292546]]
>>> print(kernel.n_dims)
2
>>> print(kernel.theta)
[1.09861229 0.69314718]
Return the kernel k(X, Y) and optionally its gradient.
Note that this compound kernel returns the results of all simple kernels stacked along an additional axis.
X : array-like of shape (n_samples_X, n_features) or list of object
    Left argument of the returned kernel k(X, Y)
Y : array-like of shape (n_samples_Y, n_features) or list of object, default=None
    Right argument of the returned kernel k(X, Y). If None, k(X, X) is evaluated instead.
eval_gradient : bool, default=False
    Determines whether the gradient with respect to the log of the kernel hyperparameter is computed.
K : ndarray of shape (n_samples_X, n_samples_Y, n_kernels)
    Kernel k(X, Y)
K_gradient : ndarray of shape (n_samples_X, n_samples_X, n_dims, n_kernels), optional
    The gradient of the kernel k(X, X) with respect to the log of the hyperparameter of the kernel. Only returned when eval_gradient is True.
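A minimal sketch of the shapes involved (the kernel values are chosen for illustration): each simple kernel contributes one slice along the trailing axis of the returned array.

```python
import numpy as np
from sklearn.gaussian_process.kernels import CompoundKernel, RBF, WhiteKernel

kernel = CompoundKernel([WhiteKernel(noise_level=3.0), RBF(length_scale=2.0)])
X = np.array([[0.0], [1.0], [2.0]])

# Without the gradient: one (3, 3) slice per simple kernel,
# stacked along a trailing axis of length n_kernels.
K = kernel(X)
print(K.shape)  # (3, 3, 2)

# The first slice comes from the WhiteKernel: noise_level * identity.
print(np.allclose(K[..., 0], 3.0 * np.eye(3)))  # True

# With eval_gradient=True (only valid for Y=None), the gradient with
# respect to the log-hyperparameters is returned as well.
K, K_grad = kernel(X, eval_gradient=True)
```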
Returns the log-transformed bounds on theta.
bounds : ndarray of shape (n_dims, 2)
    The log-transformed bounds on the kernel's hyperparameters theta.
Returns a clone of self with given hyperparameters theta.
theta : ndarray of shape (n_dims,)
    The hyperparameters
Returns the diagonal of the kernel k(X, X).
The result of this method is identical to np.diag(self(X)); however, it can be evaluated more efficiently since only the diagonal is evaluated.
X : array-like of shape (n_samples_X, n_features) or list of object
    Argument to the kernel.
K_diag : ndarray of shape (n_samples_X, n_kernels)
    Diagonal of kernel k(X, X)
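For a compound kernel the diagonal has one column per simple kernel; a small sketch with illustrative values:

```python
import numpy as np
from sklearn.gaussian_process.kernels import CompoundKernel, RBF, WhiteKernel

kernel = CompoundKernel([WhiteKernel(noise_level=3.0), RBF(length_scale=2.0)])
X = np.array([[0.0], [1.0], [2.0]])

d = kernel.diag(X)
print(d.shape)  # (3, 2): one row per sample, one column per simple kernel
# WhiteKernel's diagonal entries equal noise_level; RBF's equal 1.0.
print(d)
```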
Get parameters of this kernel.
If True, will return the parameters for this estimator and contained subobjects that are estimators.
Parameter names mapped to their values.
Returns a list of all hyperparameter specifications.
Returns whether the kernel is stationary.
Returns the number of non-fixed hyperparameters of the kernel.
Returns whether the kernel is defined on discrete structures.
Set the parameters of this kernel.
The method works on simple kernels as well as on nested kernels. The latter have parameters of the form <component>__<parameter> so that it’s possible to update each component of a nested object.
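As a sketch for CompoundKernel itself: its only top-level parameter (as reported by get_params) is the kernels list, so set_params can be used to swap out the simple kernels. The kernel values below are chosen for illustration.

```python
from sklearn.gaussian_process.kernels import CompoundKernel, RBF, WhiteKernel

kernel = CompoundKernel([WhiteKernel(noise_level=3.0), RBF(length_scale=2.0)])
print(kernel.get_params())  # {'kernels': [WhiteKernel(...), RBF(...)]}

# Replace the list of simple kernels via set_params:
kernel.set_params(kernels=[RBF(length_scale=1.0), RBF(length_scale=10.0)])
print(kernel.n_dims)  # 2: one length_scale per RBF
```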
Returns the (flattened, log-transformed) non-fixed hyperparameters.
Note that theta are typically the log-transformed values of the kernel's hyperparameters, as this representation of the search space is more amenable to hyperparameter search: hyperparameters like length-scales naturally live on a log-scale.
theta : ndarray of shape (n_dims,)
    The non-fixed, log-transformed hyperparameters of the kernel.
© 2007–2025 The scikit-learn developers
Licensed under the 3-clause BSD License.
https://scikit-learn.org/1.6/modules/generated/sklearn.gaussian_process.kernels.CompoundKernel.html