arXiv Analytics

arXiv:1705.03439 [stat.ML]

Frequentist Consistency of Variational Bayes

Yixin Wang, David M. Blei

Published 2017-05-09 (Version 1)

A key challenge for modern Bayesian statistics is how to perform scalable inference of posterior distributions. To address this challenge, variational Bayes (VB) methods have emerged as a popular alternative to classical MCMC methods. VB methods tend to be faster while achieving comparable predictive performance. However, there are few rigorous theoretical results for VB. In this paper, we establish frequentist consistency and asymptotic normality of VB methods. Specifically, we connect VB methods to point estimates based on variational approximations, called frequentist variational approximations, and we use this connection to prove a variational Bernstein-von-Mises theorem. The theorem leverages the theoretical characterizations of frequentist variational approximations to understand asymptotic properties of VB. In summary, we prove that (1) the VB posterior converges to the KL minimizer of a normal distribution, centered at the truth, and (2) the corresponding variational expectation of the parameter is consistent and asymptotically normal. As applications of the theorem, we derive asymptotic properties of VB posteriors in Bayesian mixture models, Bayesian generalized linear mixed models, and Bayesian stochastic block models. We conduct a simulation study to illustrate these theoretical results.
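The asymptotic behavior described above, with the VB posterior mean converging to the truth and the variational posterior variance shrinking at rate 1/n, can be illustrated with a small simulation. The sketch below is not from the paper; it runs standard mean-field coordinate-ascent VB (CAVI) for a Gaussian model with unknown mean and precision under a Normal-Gamma prior (a textbook example), with all hyperparameter choices being illustrative assumptions.

```python
import numpy as np

def cavi_normal_gamma(x, mu0=0.0, lam0=1.0, a0=1.0, b0=1.0, iters=50):
    """Mean-field CAVI for x_i ~ N(mu, 1/tau) with prior
    mu | tau ~ N(mu0, 1/(lam0*tau)), tau ~ Gamma(a0, b0).
    Returns parameters of q(mu) = N(mu_n, 1/lam_n) and q(tau) = Gamma(a_n, b_n).
    Hyperparameter defaults are illustrative, not from the paper."""
    x = np.asarray(x, dtype=float)
    n, xbar = len(x), x.mean()
    a_n = a0 + (n + 1) / 2.0          # fixed across iterations
    E_tau = a0 / b0                   # initialize E[tau] from the prior
    for _ in range(iters):
        # Update q(mu) given current E[tau]
        mu_n = (lam0 * mu0 + n * xbar) / (lam0 + n)
        lam_n = (lam0 + n) * E_tau
        # Moments of q(mu) needed for the q(tau) update
        E_mu, E_mu2 = mu_n, mu_n**2 + 1.0 / lam_n
        # Update q(tau) given the moments of q(mu)
        b_n = b0 + 0.5 * (lam0 * (E_mu2 - 2 * mu0 * E_mu + mu0**2)
                          + np.sum(x**2) - 2 * E_mu * np.sum(x) + n * E_mu2)
        E_tau = a_n / b_n
    return mu_n, lam_n, a_n, b_n

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_mu = 2.0
    for n in [10, 100, 10000]:
        x = rng.normal(true_mu, 1.0, size=n)
        mu_n, lam_n, _, _ = cavi_normal_gamma(x)
        # VB posterior mean approaches the truth; VB variance shrinks ~ 1/n
        print(n, mu_n, 1.0 / lam_n)
```

Running this shows the VB posterior mean `mu_n` settling near the true value 2.0 and the variational variance `1/lam_n` collapsing as n grows, consistent with the consistency and asymptotic-normality results the abstract states.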

Related articles:
arXiv:1905.10859 [stat.ML] (Published 2019-05-26)
Variational Bayes under Model Misspecification
arXiv:1903.00617 [stat.ML] (Published 2019-03-02)
Approximation Properties of Variational Bayes for Vector Autoregressions
arXiv:2202.05650 [stat.ML] (Published 2022-02-11)
Bernstein Flows for Flexible Posteriors in Variational Bayes