arXiv Analytics

arXiv:1807.02582 [stat.ML]

Gaussian Processes and Kernel Methods: A Review on Connections and Equivalences

Motonobu Kanagawa, Philipp Hennig, Dino Sejdinovic, Bharath K. Sriperumbudur

Published 2018-07-06 (Version 1)

This paper is an attempt to bridge the conceptual gaps between researchers working on the two widely used approaches based on positive definite kernels: Bayesian learning or inference using Gaussian processes on the one side, and frequentist kernel methods based on reproducing kernel Hilbert spaces on the other. It is widely known in machine learning that these two formalisms are closely related; for instance, the estimator of kernel ridge regression is identical to the posterior mean of Gaussian process regression. However, they have been studied and developed almost independently by two essentially separate communities, which makes it difficult to transfer results between them seamlessly. Our aim is to overcome this difficulty. To this end, we review several old and new results and concepts from either side, and juxtapose algorithmic quantities from each framework to highlight their close similarities. We also discuss subtle philosophical and theoretical differences between the two approaches.
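The equivalence mentioned in the abstract (the kernel ridge regression estimator coincides with the Gaussian process posterior mean when the regularization parameter equals the noise variance) can be checked numerically. The following is a minimal sketch, not taken from the paper, comparing scikit-learn's GaussianProcessRegressor and KernelRidge with a shared RBF kernel; the data, length scale, and noise level are illustrative assumptions.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(30, 1))               # illustrative training inputs
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=30)    # noisy training targets
X_test = np.linspace(-3, 3, 100)[:, None]          # test inputs

length_scale = 1.0       # fixed kernel hyperparameter (assumption)
noise_var = 0.1 ** 2     # assumed noise variance sigma^2

# GP regression: posterior mean is k(x*, X) (K + sigma^2 I)^{-1} y.
# optimizer=None keeps the kernel hyperparameters fixed so the comparison is exact.
gpr = GaussianProcessRegressor(kernel=RBF(length_scale=length_scale),
                               alpha=noise_var, optimizer=None)
gpr.fit(X, y)

# Kernel ridge regression with the same kernel and regularization lambda = sigma^2;
# scikit-learn's 'rbf' kernel uses gamma = 1 / (2 * length_scale^2).
krr = KernelRidge(kernel="rbf", alpha=noise_var,
                  gamma=1.0 / (2.0 * length_scale ** 2))
krr.fit(X, y)

# Both estimators reduce to k(x*, X) (K + sigma^2 I)^{-1} y, so their
# predictions agree up to numerical precision.
print(np.allclose(gpr.predict(X_test), krr.predict(X_test)))  # expected: True

Note that only the posterior mean is reproduced by kernel ridge regression; the GP posterior also provides a predictive variance, which has no direct counterpart in the plain kernel ridge regression estimator.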

Related articles:
arXiv:2310.20630 [stat.ML] (Published 2023-10-31)
Projecting basis functions with tensor networks for Gaussian process regression
arXiv:2410.08361 [stat.ML] (Published 2024-10-10)
Upper Bounds for Learning in Reproducing Kernel Hilbert Spaces for Orbits of an Iterated Function System
arXiv:1212.6246 [stat.ML] (Published 2012-12-26)
Gaussian Process Regression with Heteroscedastic or Non-Gaussian Residuals