arXiv Analytics

arXiv:2207.08406 [stat.ML]

Kullback-Leibler and Rényi divergences in reproducing kernel Hilbert space and Gaussian process settings

Minh Ha Quang

Published 2022-07-18 (Version 1)

In this work, we present formulations of the regularized Kullback-Leibler and Rényi divergences via the Alpha Log-Determinant (Log-Det) divergences between positive Hilbert-Schmidt operators on Hilbert spaces, in two different settings: (i) covariance operators and Gaussian measures defined on reproducing kernel Hilbert spaces (RKHS), and (ii) Gaussian processes with square-integrable sample paths. For characteristic kernels, the first setting yields divergences between arbitrary Borel probability measures on a complete, separable metric space. We show that the Alpha Log-Det divergences are continuous in the Hilbert-Schmidt norm, which allows us to apply laws of large numbers for Hilbert space-valued random variables. Consequently, in both settings the infinite-dimensional divergences can be consistently and efficiently estimated from their finite-dimensional counterparts, using finite-dimensional Gram matrices/Gaussian measures and finite sample data, with dimension-independent sample complexities in all cases. RKHS methodology plays a central role in the theoretical analysis in both settings. The mathematical formulation is illustrated by numerical experiments.
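For concreteness, the finite-dimensional quantity that the paper's operator-theoretic divergences generalize can be written down directly. The sketch below is not taken from the paper: it computes the classical KL divergence between two multivariate Gaussians with a regularization term gamma*I added to both covariance matrices, which mirrors the regularized divergences described in the abstract. The function name regularized_gaussian_kl and the parameter gamma are illustrative choices, not identifiers from the paper.

```python
# Minimal illustrative sketch (not the paper's method): the classical
# finite-dimensional KL divergence between two Gaussians, with a
# regularization term gamma*I added to the covariances. The paper develops
# the infinite-dimensional, operator-theoretic analogue of quantities like
# this, estimated consistently from kernel Gram matrices and finite samples.
import numpy as np

def regularized_gaussian_kl(m1, S1, m2, S2, gamma=1e-3):
    """KL( N(m1, S1 + gamma*I) || N(m2, S2 + gamma*I) ) on R^d."""
    d = len(m1)
    A = S1 + gamma * np.eye(d)
    B = S2 + gamma * np.eye(d)
    B_inv = np.linalg.inv(B)
    diff = m2 - m1
    _, logdet_A = np.linalg.slogdet(A)
    _, logdet_B = np.linalg.slogdet(B)
    return 0.5 * (np.trace(B_inv @ A) + diff @ B_inv @ diff - d
                  + logdet_B - logdet_A)

# Usage: estimate the divergence between two empirical Gaussians
# fitted to finite samples.
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(500, 5))
Y = rng.normal(0.5, 1.2, size=(500, 5))
kl = regularized_gaussian_kl(X.mean(0), np.cov(X, rowvar=False),
                             Y.mean(0), np.cov(Y, rowvar=False))
print(f"regularized KL estimate: {kl:.4f}")
```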

Related articles:
arXiv:2106.08443 [stat.ML] (Published 2021-06-15)
Reproducing Kernel Hilbert Space, Mercer's Theorem, Eigenfunctions, Nyström Method, and Use of Kernels in Machine Learning: Tutorial and Survey
arXiv:2402.04613 [stat.ML] (Published 2024-02-07)
Wasserstein Gradient Flows for Moreau Envelopes of f-Divergences in Reproducing Kernel Hilbert Spaces
arXiv:2311.13548 [stat.ML] (Published 2023-11-22)
Efficient Numerical Integration in Reproducing Kernel Hilbert Spaces via Leverage Scores Sampling