Variational learning of inducing variables in sparse Gaussian processes

M. K. Titsias. Artificial Intelligence and Statistics (AISTATS), 2009. proceedings.mlr.press
Abstract
Sparse Gaussian process methods that use inducing variables require the selection of the inducing inputs and the kernel hyperparameters. We introduce a variational formulation for sparse approximations that jointly infers the inducing inputs and the kernel hyperparameters by maximizing a lower bound of the true log marginal likelihood. The key property of this formulation is that the inducing inputs are defined to be variational parameters which are selected by minimizing the Kullback-Leibler divergence between the variational distribution and the exact posterior distribution over the latent function values. We apply this technique to regression and we compare it with other approaches in the literature.
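
For concreteness, the regression case uses the collapsed variational lower bound F = log N(y | 0, Q_nn + sigma^2 I) - tr(K_nn - Q_nn) / (2 sigma^2), where Q_nn = K_nm K_mm^{-1} K_mn and the m inducing inputs Z enter only through the kernel matrices. Below is a minimal NumPy sketch of that bound, assuming an RBF kernel; the function names and the toy usage are illustrative, not code from the paper.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel: variance * exp(-||a - b||^2 / (2 l^2))."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return variance * np.exp(-0.5 * sq / lengthscale**2)

def titsias_bound(y, X, Z, noise_var, lengthscale=1.0, variance=1.0, jitter=1e-6):
    """Collapsed bound F = log N(y | 0, Q_nn + s^2 I) - tr(K_nn - Q_nn) / (2 s^2),
    with Q_nn = K_nm K_mm^{-1} K_mn. Z holds the m inducing inputs."""
    n = X.shape[0]
    Kmm = rbf_kernel(Z, Z, lengthscale, variance) + jitter * np.eye(Z.shape[0])
    Knm = rbf_kernel(X, Z, lengthscale, variance)
    L = np.linalg.cholesky(Kmm)
    A = np.linalg.solve(L, Knm.T)        # A = L^{-1} K_mn, so Q_nn = A^T A
    Qnn = A.T @ A                        # explicit n x n matrix, for clarity only
    Lc = np.linalg.cholesky(Qnn + noise_var * np.eye(n))
    alpha = np.linalg.solve(Lc, y)       # gives y^T (Q_nn + s^2 I)^{-1} y = alpha^T alpha
    log_marg = (-0.5 * (alpha @ alpha)
                - np.sum(np.log(np.diag(Lc)))
                - 0.5 * n * np.log(2 * np.pi))
    # Trace correction: tr(K_nn) = n * variance for the RBF kernel.
    trace_term = (n * variance - np.trace(Qnn)) / (2 * noise_var)
    return log_marg - trace_term

# Toy usage: evaluate the bound for randomly initialized inducing inputs.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(100, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(100)
Z = X[rng.choice(100, size=10, replace=False)]   # initial inducing inputs
print(titsias_bound(y, X, Z, noise_var=0.01))
```

In practice one would maximize this bound jointly over Z, the kernel hyperparameters, and the noise variance with a gradient-based optimizer. Note that the sketch forms the full n x n matrix Q_nn for readability; a practical implementation applies the matrix inversion lemma so that cost stays O(n m^2) rather than O(n^3).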