
arXiv:1411.1810 [stat.ML]

Deterministic Annealing for Stochastic Variational Inference

Farhan Abrol, Stephan Mandt, Rajesh Ranganath, David Blei

Published 2014-11-07 (Version 1)

Stochastic variational inference (SVI) maps posterior inference in latent variable models to non-convex stochastic optimization. While variational methods enable approximate posterior inference for many otherwise intractable models, they suffer from local optima. We introduce deterministic annealing for SVI to overcome this issue: a temperature parameter deterministically deforms the objective and is then reduced over the course of the optimization. High temperatures initially encourage high-entropy variational distributions, which we find eases convergence to better optima. We test our method with Latent Dirichlet Allocation on three large corpora and, compared to SVI, show improved predictive likelihoods on held-out data.
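The core idea — weight the entropy of the variational distribution by a temperature T and cool T toward 1, at which point the standard ELBO is recovered — can be illustrated on a toy problem. The sketch below (my own illustration, not the paper's LDA setup) fits a Gaussian q(z) = N(m, exp(2ρ)) to a standard-normal target by gradient ascent on the annealed objective E_q[log p(z)] + T·H(q), whose closed-form gradients are simple; the linear cooling schedule and step sizes are arbitrary choices for the demo:

```python
import math

def annealed_step(m, rho, T, lr=0.05):
    """One gradient-ascent step on the annealed objective
    f(m, rho) = -0.5 * (m**2 + exp(2*rho)) + T * rho + const,
    i.e. E_q[log p(z)] + T * H(q) for q = N(m, exp(2*rho))
    and target p = N(0, 1). At temperature T the optimum has
    variance T, so high T means a broad, high-entropy q."""
    var = math.exp(2.0 * rho)
    m = m + lr * (-m)            # df/dm   = -m
    rho = rho + lr * (T - var)   # df/drho = T - exp(2*rho)
    return m, rho

# Cool T linearly from 5 down to 1, then refine at T = 1
# (plain, un-annealed inference corresponds to T = 1 throughout).
m, rho = 2.0, math.log(3.0)      # start far off, with variance 9
for t in range(300):
    T = max(1.0, 5.0 - 4.0 * t / 199.0)
    m, rho = annealed_step(m, rho, T)

print(m, math.exp(2.0 * rho))    # -> roughly 0.0 and 1.0
```

In this convex toy case annealing is not needed to find the optimum; its benefit, per the abstract, shows up in non-convex problems such as LDA, where the early high-entropy phase helps avoid poor local optima.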

Related articles:
arXiv:1406.3650 [stat.ML] (Published 2014-06-13, updated 2014-11-18)
Smoothed Gradients for Stochastic Variational Inference
arXiv:2004.00115 [stat.ML] (Published 2020-03-31)
Exact marginal inference in Latent Dirichlet Allocation
arXiv:1510.08628 [stat.ML] (Published 2015-10-29)
WarpLDA: a Simple and Efficient O(1) Algorithm for Latent Dirichlet Allocation