arXiv:1411.1810 [stat.ML]
Deterministic Annealing for Stochastic Variational Inference
Farhan Abrol, Stephan Mandt, Rajesh Ranganath, David Blei
Published 2014-11-07 (Version 1)
Stochastic variational inference (SVI) maps posterior inference in latent variable models to non-convex stochastic optimization. While variational methods enable approximate posterior inference for many otherwise intractable models, they suffer from local optima. We introduce deterministic annealing for SVI to overcome this issue. The method adds a temperature parameter that deterministically deforms the objective and is reduced over the course of the optimization. High initial temperatures encourage high-entropy variational distributions, which we find eases convergence to better optima. We test our method with Latent Dirichlet Allocation on three large corpora and, compared to SVI, show improved predictive likelihoods on held-out data.
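The annealing idea in the abstract can be sketched in a few lines. The sketch below is a hypothetical illustration, not the paper's implementation: it assumes the tempered objective scales the entropy term of the ELBO by a temperature T (so T = 1 recovers the standard ELBO), and uses a one-dimensional Gaussian posterior where the annealed optimum is available in closed form. The schedule and all function names are assumptions for illustration.

```python
import math

def temperature(t, T0=10.0, decay=0.9):
    # Hypothetical exponential decay schedule, clamped at 1 so the
    # annealed objective eventually reduces to the standard ELBO.
    return max(1.0, T0 * decay ** t)

def annealed_elbo(m, s2, mu, sigma2, T):
    # Annealed ELBO for q = N(m, s2) approximating an exact Gaussian
    # posterior N(mu, sigma2): E_q[log p] + T * H(q), up to constants
    # that do not depend on (m, s2).
    expected_log_p = -0.5 * ((m - mu) ** 2 + s2) / sigma2
    entropy = 0.5 * math.log(2 * math.pi * math.e * s2)
    return expected_log_p + T * entropy

def optimal_q(mu, sigma2, T):
    # Maximizing the annealed ELBO in closed form: the variational mean
    # matches the posterior mean, while the variance is inflated by a
    # factor of T -- the high-entropy distributions the abstract
    # describes. At T = 1 the exact posterior is recovered.
    return mu, T * sigma2
```

Running the schedule from a high T0 down to 1 while updating q at each step smooths the objective early on and then sharpens it, which is the mechanism the paper credits for reaching better optima.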