arXiv Analytics

arXiv:2102.12956 [stat.ML]

Stein Variational Gradient Descent: many-particle and long-time asymptotics

Nikolas Nüsken, D. R. Michiel Renger

Published 2021-02-25, Version 1

Stein variational gradient descent (SVGD) refers to a class of methods for Bayesian inference based on interacting particle systems. In this paper, we consider the originally proposed deterministic dynamics as well as a stochastic variant, each of which represents one of the two main paradigms in Bayesian computational statistics: variational inference and Markov chain Monte Carlo. As it turns out, these are tightly linked through a correspondence between gradient flow structures and large-deviation principles rooted in statistical physics. To expose this relationship, we develop the cotangent space construction for the Stein geometry, prove its basic properties, and determine the large-deviation functional governing the many-particle limit for the empirical measure. Moreover, we identify the Stein-Fisher information (or kernelised Stein discrepancy) as its leading-order contribution in the long-time and many-particle regime in the sense of $\Gamma$-convergence, shedding some light on the finite-particle properties of SVGD. Finally, we establish a comparison principle between the Stein-Fisher information and RKHS-norms that might be of independent interest.
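For readers unfamiliar with the deterministic dynamics the abstract refers to, here is a minimal sketch of the standard SVGD particle update (Liu and Wang's original scheme, not this paper's construction). It uses an RBF kernel with a fixed bandwidth and a standard-normal target; the bandwidth, step size, and target are illustrative choices, not anything prescribed by the paper:

```python
import numpy as np

def svgd_step(x, score, h=0.5, eps=0.1):
    """One SVGD update: x_i <- x_i + eps * phi(x_i), where
    phi(x) = (1/n) * sum_j [ k(x_j, x) score(x_j) + grad_{x_j} k(x_j, x) ]
    and k is an RBF kernel with (illustrative) fixed bandwidth h."""
    n = x.shape[0]
    diffs = x[:, None, :] - x[None, :, :]            # diffs[i, j] = x_i - x_j
    K = np.exp(-np.sum(diffs**2, -1) / (2 * h**2))   # kernel matrix, (n, n)
    # First term pulls particles toward high-density regions (weighted scores);
    # second term is the kernel-gradient repulsion that keeps particles spread out.
    phi = (K @ score(x) + np.einsum('ij,ijd->id', K, diffs) / h**2) / n
    return x + eps * phi

# Illustration: transport particles toward a standard normal target,
# whose score is grad log p(x) = -x.
rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=0.5, size=(100, 1))    # particles start far off
for _ in range(1000):
    x = svgd_step(x, score=lambda x: -x)
# the empirical mean of x drifts toward the target mean 0
```

The stochastic variant mentioned in the abstract adds noise to this deterministic map; in practice the bandwidth is often set by the median heuristic rather than fixed as above.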

Related articles: Most relevant | Search more
arXiv:1609.06144 [stat.ML] (Published 2016-09-15)
Multilevel Monte Carlo for Scalable Bayesian Computations
arXiv:1906.06663 [stat.ML] (Published 2019-06-16)
Sampler for Composition Ratio by Markov Chain Monte Carlo
arXiv:1910.06539 [stat.ML] (Published 2019-10-15)
Challenges in Bayesian inference via Markov chain Monte Carlo for neural networks