arXiv:1306.1999 [stat.CO]

Variational inference for sparse spectrum Gaussian process regression

Linda S. L. Tan, Victor M. H. Ong, David J. Nott, Ajay Jasra

Published 2013-06-09, updated 2015-01-26 (version 3)

We develop a fast variational approximation scheme for Gaussian process (GP) regression, where the spectrum of the covariance function is subjected to a sparse approximation. Our approach enables uncertainty in covariance function hyperparameters to be treated without using Monte Carlo methods and is robust to overfitting. Our article makes three contributions. First, we present a variational Bayes algorithm for fitting sparse spectrum GP regression models that uses nonconjugate variational message passing to derive fast and efficient updates. Second, we propose a novel adaptive neighbourhood technique for obtaining predictive inference that is effective in dealing with nonstationarity. Regression is performed locally at each point to be predicted, and the neighbourhood is determined using a measure based on lengthscales estimated from an initial fit. Because dimensions are weighted according to these lengthscales, variables of little relevance are downweighted, leading to automatic variable selection and improved prediction. Third, we introduce a technique for accelerating convergence in nonconjugate variational message passing by adapting step sizes in the direction of the natural gradient of the lower bound. Our adaptive strategy can be easily implemented, and empirical results indicate significant speedups.
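
To illustrate the adaptive neighbourhood idea described in the abstract, the following is a minimal Python sketch assuming the neighbourhood measure is a lengthscale-scaled Euclidean distance around each prediction point. The function name, the use of a fixed number k of nearest points, and the exact form of the scaling are illustrative assumptions, not the paper's definition.

    import numpy as np

    def lengthscale_weighted_neighbours(X, x_star, lengthscales, k):
        """Return indices of the k training points closest to x_star,
        with each dimension scaled by its estimated lengthscale so that
        large-lengthscale (less relevant) variables are downweighted."""
        diff = (X - x_star) / lengthscales   # scale each dimension by its lengthscale
        d2 = np.sum(diff ** 2, axis=1)       # squared weighted distance to x_star
        return np.argsort(d2)[:k]            # indices of the k nearest neighbours

    # Example: 200 points in 3 dimensions; the third dimension has a large
    # estimated lengthscale, so it barely influences which neighbours are chosen.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))
    x_star = np.zeros(3)
    lengthscales = np.array([0.5, 1.0, 50.0])
    idx = lengthscale_weighted_neighbours(X, x_star, lengthscales, k=20)
    print(idx)

In this sketch the local regression at x_star would then be fitted using only X[idx], so the neighbourhood adapts both to where the prediction is made and to which inputs the initial fit deems relevant.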

Related articles:
arXiv:2104.07537 [stat.CO] (Published 2021-04-15)
Variational Inference for the Smoothing Distribution in Dynamic Probit Models
arXiv:1410.6460 [stat.CO] (Published 2014-10-23)
Markov Chain Monte Carlo and Variational Inference: Bridging the Gap
arXiv:1903.06616 [stat.CO] (Published 2019-03-07)
Streamlined Computing for Variational Inference with Higher Level Random Effects