arXiv Analytics

arXiv:1408.1920 [math.NA]

Algorithms for Kullback-Leibler Approximation of Probability Measures in Infinite Dimensions

Frank J. Pinski, Gideon Simpson, Andrew M. Stuart, Hendrik Weber

Published 2014-08-08, Version 1

In this paper we study algorithms to find a Gaussian approximation to a target measure defined on a Hilbert space of functions; the target measure itself is defined via its density with respect to a reference Gaussian measure. We employ the Kullback-Leibler divergence as a distance and find the best Gaussian approximation by minimizing this distance. It then follows that the approximate Gaussian must be equivalent to the Gaussian reference measure, defining a natural function space setting for the underlying calculus of variations problem. We introduce a computational algorithm which is well-adapted to the required minimization, seeking to find the mean as a function, and parameterizing the covariance in two different ways: through low rank perturbations of the reference covariance; and through Schrödinger potential perturbations of the inverse reference covariance. Two applications are shown: to a nonlinear inverse problem in elliptic PDEs, and to a conditioned diffusion process. We also show how the Gaussian approximations we obtain may be used to produce improved pCN-MCMC methods which are not only well-adapted to the high-dimensional setting, but also behave well with respect to small observational noise (resp. small temperatures) in the inverse problem (resp. conditioned diffusion).
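The minimization the abstract describes can be illustrated in a one-dimensional analogue (not the authors' infinite-dimensional algorithm): take a target measure with density proportional to exp(-Phi(u)) with respect to a reference Gaussian N(0, c0), and seek the Gaussian N(m, s^2) minimizing KL(N(m, s^2) || target). Up to an additive constant this objective is J(m, s) = E_{N(m, s^2)}[Phi] + KL(N(m, s^2) || N(0, c0)), where the second term has a closed form. The potential Phi, the constants, and the reparameterization-trick gradient descent below are all illustrative assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed 1D setup: target density is proportional to exp(-Phi(u))
# with respect to the reference Gaussian N(0, c0).
c0 = 1.0                                  # reference variance (assumed)
Phi = lambda u: (u**2 - 1.0)**2           # example nonlinear potential (assumed)
dPhi = lambda u: 4.0 * u * (u**2 - 1.0)   # its derivative

def kl_to_reference(m, s):
    """Closed-form KL( N(m, s^2) || N(0, c0) )."""
    return 0.5 * (s**2 / c0 + m**2 / c0 - 1.0 - np.log(s**2 / c0))

def objective(m, s, n=20000):
    """Monte Carlo estimate of J(m, s) = E[Phi] + KL-to-reference."""
    z = rng.standard_normal(n)
    return Phi(m + s * z).mean() + kl_to_reference(m, s)

def grad(m, s, n=20000):
    """Reparameterization-trick gradients: u = m + s*z, z ~ N(0,1)."""
    z = rng.standard_normal(n)
    g = dPhi(m + s * z)
    dm = g.mean() + m / c0
    ds = (g * z).mean() + s / c0 - 1.0 / s
    return dm, ds

# Simple gradient descent on (m, s).
m, s, lr = 0.5, 1.0, 0.02
J0 = objective(m, s)
for _ in range(500):
    dm, ds = grad(m, s)
    m -= lr * dm
    s = max(s - lr * ds, 1e-3)  # keep the standard deviation positive
J1 = objective(m, s)
```

In the paper's setting the mean is a function and the covariance operator is parameterized either by low-rank perturbations of the reference covariance or by Schrödinger potential perturbations of its inverse; the scalar pair (m, s) above stands in for those infinite-dimensional unknowns.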

Related articles: Most relevant | Search more
arXiv:1707.09687 [math.NA] (Published 2017-07-31)
Convergence of Levenberg-Marquardt method for the Inverse Problem with an Interior Measurement
arXiv:1709.03092 [math.NA] (Published 2017-09-10)
Conjugate gradient based acceleration for inverse problems
arXiv:1903.02762 [math.NA] (Published 2019-03-07)
The Inverse Problem of Numerical Differentiation