{ "id": "1306.1999", "version": "v3", "published": "2013-06-09T09:31:24.000Z", "updated": "2015-01-26T20:30:40.000Z", "title": "Variational inference for sparse spectrum Gaussian process regression", "authors": [ "Linda S. L. Tan", "Victor M. H. Ong", "David J. Nott", "Ajay Jasra" ], "comment": "20 pages, 11 figures, 1 table", "categories": [ "stat.CO" ], "abstract": "We develop a fast variational approximation scheme for Gaussian process (GP) regression, where the spectrum of the covariance function is subjected to a sparse approximation. Our approach enables uncertainty in covariance function hyperparameters to be treated without using Monte Carlo methods and is robust to overfitting. Our article makes three contributions. First, we present a variational Bayes algorithm for fitting sparse spectrum GP regression models that uses nonconjugate variational message passing to derive fast and efficient updates. Second, we propose a novel adaptive neighbourhood technique for obtaining predictive inference that is effective in dealing with nonstationarity. Regression is performed locally at each point to be predicted, and the neighbourhood is determined using a measure based on lengthscales estimated from an initial fit. Weighting dimensions according to their lengthscales downweights variables of little relevance, leading to automatic variable selection and improved prediction. Third, we introduce a technique for accelerating convergence in nonconjugate variational message passing by adapting step sizes in the direction of the natural gradient of the lower bound. Our adaptive strategy can be easily implemented and empirical results indicate significant speedups.", "revisions": [ { "version": "v2", "updated": "2013-08-15T02:14:19.000Z", "abstract": "We develop a fast deterministic variational approximation scheme for Gaussian process (GP) regression, where the spectrum of the covariance function is subjected to a sparse approximation. The approach enables uncertainty in covariance function hyperparameters to be treated without using Monte Carlo methods and is robust to overfitting. Our article makes three contributions. First, we present a variational Bayes algorithm for fitting sparse spectrum GP regression models, which makes use of nonconjugate variational message passing to derive fast and efficient updates. Second, inspired by related methods in classification, we propose a novel adaptive neighbourhood technique for obtaining predictive inference that is effective in dealing with nonstationarity. Regression is performed locally at each point to be predicted, and the neighbourhood is determined using a measure based on lengthscales estimated from an initial fit. Weighting the dimensions according to the lengthscales effectively downweights variables of little relevance, leading to automatic variable selection and improved prediction. Third, we introduce a technique for accelerating convergence in nonconjugate variational message passing by adapting step sizes in the direction of the natural gradient of the lower bound. Our adaptive strategy can be easily implemented and empirical results indicate significant speedups.", "comment": "36 pages, 7 figures", "journal": null, "doi": null, "authors": [ "Linda S. L. Tan", "David J. Nott" ] }, { "version": "v3", "updated": "2015-01-26T20:30:40.000Z" } ], "analyses": { "keywords": [ "sparse spectrum gaussian process regression", "variational inference", "spectrum gp regression models", "sparse spectrum gp regression", "deterministic variational approximation scheme" ], "note": { "typesetting": "TeX", "pages": 20, "language": "en", "license": "arXiv", "status": "editable", "adsabs": "2013arXiv1306.1999T" } } }