arXiv:1905.10859 [stat.ML]

Variational Bayes under Model Misspecification

Yixin Wang, David M. Blei

Published 2019-05-26 (Version 1)

Variational Bayes (VB) is a scalable alternative to Markov chain Monte Carlo (MCMC) for Bayesian posterior inference. Though popular, VB comes with few theoretical guarantees, most of which focus on well-specified models. However, models are rarely well-specified in practice. In this work, we study VB under model misspecification. We prove that the VB posterior is asymptotically normal and that it centers at the value that minimizes the Kullback-Leibler (KL) divergence to the true data-generating distribution. Moreover, the VB posterior mean centers at the same value and is also asymptotically normal. These results generalize the variational Bernstein–von Mises theorem [29] to misspecified models. As a consequence, we find that the model misspecification error dominates the variational approximation error in VB posterior predictive distributions. This finding explains the widely observed phenomenon that VB achieves predictive accuracy comparable to that of MCMC even though VB uses an approximating family. As illustrations, we study VB under three forms of model misspecification, ranging from model over-/under-dispersion to latent dimensionality misspecification. We conduct two simulation studies that demonstrate the theoretical results.
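
To make the central claim concrete, here is an informal sketch of the main result, in notation introduced for this summary rather than taken from the abstract: let $p_0$ denote the true data-generating distribution, $q^*_n$ the VB posterior after $n$ observations, and

\[ \theta^* = \arg\min_{\theta} \, \mathrm{KL}\!\left( p_0 \,\|\, p(\cdot \mid \theta) \right) \]

the KL minimizer. The variational Bernstein–von Mises theorem, extended to misspecified models, then says, roughly,

\[ \left\| q^*_n - \mathcal{N}\!\left( \theta^*, \, n^{-1} V \right) \right\|_{\mathrm{TV}} \longrightarrow 0 \quad \text{in } P_0\text{-probability}, \]

where $V$ is a covariance determined by the choice of variational family (e.g., diagonal under a mean-field family), and the VB posterior mean likewise converges to the same $\theta^*$.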

Related articles:
arXiv:1910.06539 [stat.ML] (Published 2019-10-15)
Challenges in Bayesian inference via Markov chain Monte Carlo for neural networks
arXiv:1906.06663 [stat.ML] (Published 2019-06-16)
Sampler for Composition Ratio by Markov Chain Monte Carlo
arXiv:2002.12253 [stat.ML] (Published 2020-02-27)
MetFlow: A New Efficient Method for Bridging the Gap between Markov Chain Monte Carlo and Variational Inference