arXiv:1611.08642 [math.PR]

Gaussian Approximations for Probability Measures on $\mathbf{R}^d$

Yulong Lu, Andrew M. Stuart, Hendrik Weber

Published 2016-11-26, Version 1

This paper concerns the approximation of probability measures on $\mathbf{R}^d$ with respect to the Kullback-Leibler divergence. Given an admissible target measure, we show the existence of the best approximation, with respect to this divergence, from certain sets of Gaussian measures and Gaussian mixtures. The asymptotic behavior of such best approximations is then studied in the small-parameter limit in which the measure concentrates; this asymptotic behavior is characterized using $\Gamma$-convergence. The theory developed is then applied to understanding the frequentist consistency of Bayesian inverse problems. For a fixed realization of the noise, we show the asymptotic normality of the posterior measure in the small-noise limit. Taking into account the randomness of the noise, we prove a Bernstein-von Mises type result for the posterior measure.
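
To make the variational setup concrete, here is a schematic formulation of the problem described above; the direction of the divergence and the symbols $\mathcal{A}$, $N(m,\Sigma)$, $\mu_\varepsilon$ are illustrative choices, not notation taken from the paper. Given a target measure $\mu$ on $\mathbf{R}^d$, one seeks
$$ \nu^\star \in \operatorname{arg\,min}_{\nu \in \mathcal{A}} D_{\mathrm{KL}}(\nu\,\|\,\mu), \qquad D_{\mathrm{KL}}(\nu\,\|\,\mu) = \int_{\mathbf{R}^d} \log\frac{d\nu}{d\mu}\,d\nu \ \text{ if } \nu \ll \mu, \quad +\infty \text{ otherwise}, $$
where $\mathcal{A}$ is, for instance, the set of Gaussian measures $N(m,\Sigma)$ with $m \in \mathbf{R}^d$ and $\Sigma$ positive definite, or a set of finite Gaussian mixtures. The small-parameter asymptotics then amount to replacing $\mu$ by a family $\mu_\varepsilon$ that concentrates as $\varepsilon \to 0$ and studying the $\Gamma$-limit of the associated Kullback-Leibler functionals.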

Related articles:
arXiv:1704.02660 [math.PR] (Published 2017-04-09)
Centers of probability measures without the mean
arXiv:1312.6589 [math.PR] (Published 2013-12-23, updated 2014-10-02)
The linear topology associated with weak convergence of probability measures
arXiv:2007.10293 [math.PR] (Published 2020-07-20)
Weak Convergence of Probability Measures