arXiv Analytics

arXiv:2106.08443 [stat.ML]

Reproducing Kernel Hilbert Space, Mercer's Theorem, Eigenfunctions, Nyström Method, and Use of Kernels in Machine Learning: Tutorial and Survey

Benyamin Ghojogh, Ali Ghodsi, Fakhri Karray, Mark Crowley

Published 2021-06-15, Version 1

This is a tutorial and survey paper on kernels, kernel methods, and related fields. We start by reviewing the history of kernels in functional analysis and machine learning. Then, Mercer kernels, Hilbert and Banach spaces, Reproducing Kernel Hilbert Space (RKHS), Mercer's theorem and its proof, frequently used kernels, kernel construction from a distance metric, important classes of kernels (including bounded, integrally positive definite, universal, stationary, and characteristic kernels), kernel centering and normalization, and eigenfunctions are explained in detail. Then, we introduce the main uses of kernels in machine learning, including kernel methods (such as kernel support vector machines), kernel learning by semidefinite programming, the Hilbert-Schmidt independence criterion, maximum mean discrepancy, kernel mean embedding, and kernel dimensionality reduction. We also cover the rank and factorization of the kernel matrix as well as the approximation of eigenfunctions and kernels using the Nyström method. This paper can be useful for various fields of science including machine learning, dimensionality reduction, functional analysis in mathematics, and mathematical physics in quantum mechanics.
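As a concrete illustration of one topic the survey covers, the Nyström method approximates a large kernel matrix from a small set of landmark columns: with landmark index set I, C = K[:, I], and W = K[I, I], the approximation is K ≈ C W⁺ Cᵀ. The sketch below is a minimal NumPy example on synthetic data with a Gaussian (RBF) kernel; the data, kernel bandwidth, and number of landmarks are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)


def rbf_kernel(X, Y, gamma=0.5):
    # Gaussian (RBF) kernel: k(x, y) = exp(-gamma * ||x - y||^2)
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)


# Synthetic data and its full n x n kernel matrix
X = rng.standard_normal((200, 5))
K = rbf_kernel(X, X)

# Nystrom approximation from m randomly chosen landmark points
m = 40
idx = rng.choice(len(X), size=m, replace=False)
C = K[:, idx]                 # n x m: kernel between all points and landmarks
W = C[idx, :]                 # m x m: kernel among the landmarks themselves
K_approx = C @ np.linalg.pinv(W) @ C.T

# Relative Frobenius error of the low-rank approximation
rel_err = np.linalg.norm(K - K_approx) / np.linalg.norm(K)
```

The approximation is exact on the landmark block (W W⁺ W = W for a positive semidefinite W), and its quality on the rest of the matrix depends on how fast the kernel spectrum decays and how the landmarks are chosen; the survey discusses this in the context of eigenfunction approximation.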

Comments: To appear as a part of an upcoming textbook on dimensionality reduction and manifold learning
Categories: stat.ML, cs.LG, math.FA
Tags: textbook
Related articles:
arXiv:2106.07263 [stat.ML] (Published 2021-06-14)
Machine Learning for Variance Reduction in Online Experiments
arXiv:1312.2171 [stat.ML] (Published 2013-12-08, updated 2014-11-24)
bartMachine: Machine Learning with Bayesian Additive Regression Trees
arXiv:1711.10781 [stat.ML] (Published 2017-11-29)
Introduction to Tensor Decompositions and their Applications in Machine Learning