arXiv Analytics

arXiv:2310.20630 [stat.ML]

Projecting basis functions with tensor networks for Gaussian process regression

Clara Menzen, Eva Memmel, Kim Batselier, Manon Kok

Published 2023-10-31 (Version 1)

This paper presents a method for approximate Gaussian process (GP) regression with tensor networks (TNs). A parametric approximation of a GP uses a linear combination of basis functions, where the accuracy of the approximation depends on the total number of basis functions $M$. We develop an approach that allows us to use an exponential number of basis functions without the corresponding exponential computational complexity. The key idea enabling this is the use of low-rank TNs. We first find a suitable low-dimensional subspace from the data, described by a low-rank TN. In this low-dimensional subspace, we then infer the weights of our model by solving a Bayesian inference problem. Finally, we project the resulting weights back to the original space to make GP predictions. The benefit of our approach comes from the projection to a smaller subspace: it modifies the shape of the basis functions in a way it sees fit based on the given data, and it allows for efficient computations in the smaller subspace. In an experiment on an 18-dimensional benchmark data set, we demonstrate the applicability of our method to an inverse dynamics problem.
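To make the pipeline concrete, here is a minimal NumPy sketch of the generic scheme the abstract describes: a basis-function GP approximation whose weights are inferred in a smaller subspace and then projected back for prediction. The random Fourier feature basis, the truncated SVD standing in for the paper's low-rank tensor network, and all names (`Phi`, `V`, `z_mean`, etc.) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumptions throughout): approximate GP regression as
# Bayesian linear regression over M basis functions, with the weights
# inferred in an m-dimensional subspace and projected back. Random
# Fourier features stand in for the paper's basis; a truncated SVD
# stands in for the low-rank tensor network that defines the subspace.
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D data: y = sin(x) + noise
N, M, m = 200, 256, 20           # data points, basis functions, subspace size
sigma2 = 0.01                    # noise variance (assumed known)
x = rng.uniform(-3, 3, size=N)
y = np.sin(x) + np.sqrt(sigma2) * rng.standard_normal(N)

# Basis functions: random Fourier features approximating an RBF kernel
omega = rng.standard_normal(M)               # frequencies
b = rng.uniform(0, 2 * np.pi, size=M)        # phases
Phi = np.sqrt(2.0 / M) * np.cos(np.outer(x, omega) + b)   # N x M

# Low-dimensional subspace from the data (SVD in place of the low-rank TN):
# columns of V span an m-dimensional subspace of the M-dimensional weights.
_, _, Vt = np.linalg.svd(Phi, full_matrices=False)
V = Vt[:m].T                                  # M x m projection basis

# Bayesian linear regression for the projected weights z, prior z ~ N(0, I):
# posterior precision A = I + Phi_V^T Phi_V / sigma2
Phi_V = Phi @ V                               # N x m projected features
A = np.eye(m) + Phi_V.T @ Phi_V / sigma2
z_mean = np.linalg.solve(A, Phi_V.T @ y / sigma2)

# Project the weights back to the original space and make GP predictions
w_mean = V @ z_mean                           # M-dimensional weight mean
x_test = np.linspace(-3, 3, 100)
Phi_test = np.sqrt(2.0 / M) * np.cos(np.outer(x_test, omega) + b)
f_mean = Phi_test @ w_mean                    # predictive mean
f_var = sigma2 + np.einsum(                   # predictive variance
    "ij,jk,ik->i", Phi_test @ V, np.linalg.inv(A), Phi_test @ V)
```

Since all inference happens with the m-dimensional `Phi_V`, the linear solve costs O(m^3) rather than O(M^3); in the paper's setting M can be exponentially large because the projection is carried as a low-rank TN rather than an explicit M x m matrix.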

Related articles:
arXiv:2308.06149 [stat.ML] (Published 2023-08-11)
Gaussian Process Regression for Maximum Entropy Distribution
arXiv:1912.06689 [stat.ML] (Published 2019-12-13)
Frequentist Consistency of Gaussian Process Regression
arXiv:2412.17455 [stat.ML] (Published 2024-12-23)
Learning from Summarized Data: Gaussian Process Regression with Sample Quasi-Likelihood