This is a MATLAB implementation of the "marginal GP" (MGP) described in:
Garnett, R., Osborne, M., and Hennig, P. Active Learning of Linear Embeddings for Gaussian Processes. (2014). 30th Conference on Uncertainty in Artificial Intelligence (UAI 2014).
Suppose we have a Gaussian process model on a latent function $f$:

$$ p(f \mid \theta) = \mathcal{GP}\bigl(f; \mu(x; \theta), K(x, x'; \theta)\bigr), $$

where $\theta$ are the hyperparameters of the model. Suppose we have a dataset $\mathcal{D} = (X, \mathbf{y})$ of observations and a test point $x_\ast$. This function returns the mean and variance of the approximate marginal predictive distributions for the associated observation value $y_\ast$ and latent function value $f_\ast$:

$$ p(y_\ast \mid x_\ast, \mathcal{D}) = \int p(y_\ast \mid x_\ast, \mathcal{D}, \theta)\, p(\theta \mid \mathcal{D})\, \mathrm{d}\theta, $$

$$ p(f_\ast \mid x_\ast, \mathcal{D}) = \int p(f_\ast \mid x_\ast, \mathcal{D}, \theta)\, p(\theta \mid \mathcal{D})\, \mathrm{d}\theta, $$

where we have marginalized over the hyperparameters $\theta$.
This code is only appropriate for GP regression! Exact inference with a Gaussian observation likelihood is assumed.
The MGP approximation requires that the provided hyperparameters be the MLE hyperparameters:

$$ \hat{\theta} = \operatorname*{arg\,max}_{\theta} \log p(\mathbf{y} \mid X, \theta), $$

or, if using a hyperparameter prior $p(\theta)$, the MAP hyperparameters:

$$ \hat{\theta} = \operatorname*{arg\,max}_{\theta} \bigl[ \log p(\mathbf{y} \mid X, \theta) + \log p(\theta) \bigr]. $$

This function does not perform the maximization over $\theta$ but rather assumes that the given hyperparameters represent $\hat{\theta}$.
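For illustration, one way to obtain such a point estimate is to maximize the GPML log marginal likelihood with the `minimize` routine that ships with the GPML toolbox. The sketch below assumes a simple regression setup with a constant mean, isotropic squared-exponential covariance, and Gaussian likelihood; `x` and `y` are placeholder training data, and `@infExact` refers to the exact inference method in GPML v3.x (newer GPML versions call it `@infGaussLik`).

```matlab
% sketch: obtain (approximately) MLE hyperparameters with GPML before calling mgp
% x (n x d) and y (n x 1) are assumed to hold your training data

mean_function       = {@meanConst};   % constant mean
covariance_function = {@covSEiso};    % isotropic squared-exponential covariance
likelihood          = {@likGauss};    % Gaussian observation noise

% initial hyperparameters (log parameterization, as in GPML)
hyperparameters.mean = 0;
hyperparameters.cov  = [0; 0];        % [log length scale; log output scale]
hyperparameters.lik  = log(0.1);      % log noise standard deviation

% maximize the log marginal likelihood (at most 100 function evaluations)
hyperparameters = minimize(hyperparameters, @gp, -100, @infExact, ...
    mean_function, covariance_function, likelihood, x, y);
```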
This code is written to be interoperable with the GPML MATLAB toolbox, available here:
http://www.gaussianprocess.org/gpml/code/matlab/doc/
The GPML toolbox must be in your MATLAB path for this function to work. This function also depends on the gpml_extensions repository, available here:

https://github.com/rmgarnett/gpml_extensions/

which must also be in your MATLAB path.
The usage of mgp.m is identical to the gp.m function from the GPML toolkit in prediction mode. See mgp.m for more information.
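For example, a prediction-mode call might look like the sketch below, which mirrors the gp.m interface. Here `x`, `y`, `x_star`, and the model setup are the placeholders from the sketch above; note that mgp may require the extended inference, mean, and covariance implementations from gpml_extensions rather than the plain GPML handles shown here, so consult mgp.m and the demo for the exact requirements.

```matlab
% sketch: approximate marginal predictions at test inputs x_star (m x d),
% returning predictive means/variances for observations and latent values
[y_star_mean, y_star_variance, f_star_mean, f_star_variance] = ...
    mgp(hyperparameters, @infExact, mean_function, covariance_function, ...
        likelihood, x, y, x_star);
```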
A demo is provided in demo/demo.m.