arXiv Analytics

arXiv:2407.03608 [math.NA]

Gaussian process regression with log-linear scaling for common non-stationary kernels

P. Michael Kielstra, Michael Lindsey

Published 2024-07-04Version 1

We introduce a fast algorithm for Gaussian process regression in low dimensions, applicable to a widely used family of non-stationary kernels. The non-stationarity of these kernels is induced by arbitrary spatially varying vertical and horizontal scales. In particular, any stationary kernel can be accommodated as a special case, and we focus especially on the generalization of the standard Mat\'ern kernel. Our subroutine for kernel matrix-vector multiplications scales almost optimally as $O(N\log N)$, where $N$ is the number of regression points. Like the recently developed equispaced Fourier Gaussian process (EFGP) methodology, which is applicable only to stationary kernels, our approach exploits non-uniform fast Fourier transforms (NUFFTs). We offer a complete analysis controlling the approximation error of our method, and we validate the method's practical performance with numerical experiments. In particular, we demonstrate improved scalability compared to state-of-the-art rank-structured approaches in spatial dimension $d>1$.
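The core idea behind EFGP-style matvecs can be illustrated with a minimal NumPy sketch: approximate a stationary kernel by an equispaced trapezoid-rule quadrature of its Fourier integral, so that the kernel matvec factors as $K v \approx F \,\mathrm{diag}(h\,\hat s(\xi_m))\, F^{*} v$ with $F$ a nonuniform Fourier matrix. This is only a simplified 1D illustration with a squared-exponential kernel (the paper targets non-stationary Matérn-type kernels), and the dense exponential matrix below stands in for the NUFFTs that give the $O(N\log N)$ cost; all parameter choices here are assumptions for the demo, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200
x = rng.uniform(0.0, 1.0, N)       # regression points in [0, 1]
v = rng.standard_normal(N)         # vector to multiply against the kernel matrix

ell = 0.1                          # length scale (illustrative choice)
# Squared-exponential kernel and its Fourier transform (spectral density),
# using the convention k(r) = ∫ s_hat(xi) e^{2πi xi r} dxi.
k = lambda r: np.exp(-r**2 / (2 * ell**2))
s_hat = lambda xi: ell * np.sqrt(2 * np.pi) * np.exp(-2 * np.pi**2 * ell**2 * xi**2)

# Equispaced Fourier quadrature: k(r) ≈ h * sum_m s_hat(xi_m) e^{2πi xi_m r},
# with spacing h small enough to avoid aliasing and cutoff M*h large enough
# to capture the decay of s_hat.
h, M = 0.5, 20
xi = h * np.arange(-M, M + 1)

# Fourier-feature matvec: K v ≈ F diag(h * s_hat) F^H v.
# Dense O(N*M) here; replacing F^H v and F(...) with type-1/type-2 NUFFTs
# is what yields the near-linear scaling.
F = np.exp(2j * np.pi * np.outer(x, xi))           # N x (2M+1) nonuniform Fourier matrix
Kv_fourier = (F @ (h * s_hat(xi) * (F.conj().T @ v))).real

# Direct O(N^2) reference matvec.
K = k(x[:, None] - x[None, :])
Kv_direct = K @ v

print(np.max(np.abs(Kv_fourier - Kv_direct)))      # quadrature error, small
```

The accuracy knobs are exactly the ones the paper's error analysis controls: the quadrature spacing `h` (aliasing/periodization error) and the frequency cutoff `M*h` (truncation of the spectral density's tail).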

Related articles:
arXiv:1908.00424 [math.NA] (Published 2019-07-31)
Gaussian Process Regression and Conditional Polynomial Chaos for Parameter Estimation
arXiv:2003.11910 [math.NA] (Published 2020-03-24)
Data-driven surrogates for high dimensional models using Gaussian process regression on the Grassmann manifold
arXiv:2112.02467 [math.NA] (Published 2021-12-05, updated 2022-09-20)
Rectangularization of Gaussian process regression for optimization of hyperparameters