arXiv Analytics

arXiv:2002.12253 [stat.ML]

MetFlow: A New Efficient Method for Bridging the Gap between Markov Chain Monte Carlo and Variational Inference

Achille Thin, Nikita Kotelevskii, Jean-Stanislas Denain, Leo Grinsztajn, Alain Durmus, Maxim Panov, Eric Moulines

Published 2020-02-27, Version 1

In this contribution, we propose a new computationally efficient method to combine Variational Inference (VI) with Markov Chain Monte Carlo (MCMC). This approach can be used with generic MCMC kernels, but is especially well suited to MetFlow, a novel family of MCMC algorithms we introduce, in which proposals are obtained using Normalizing Flows. The marginal distribution produced by such MCMC algorithms is a mixture of flow-based distributions, thus drastically increasing the expressivity of the variational family. Unlike previous methods following this direction, our approach is amenable to the reparametrization trick and does not rely on computationally expensive reverse kernels. Extensive numerical experiments show clear computational and performance improvements over state-of-the-art methods.
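To make the core idea concrete, here is a minimal, hedged sketch of a Metropolis-Hastings sampler whose proposal is a simple invertible affine map plus Gaussian noise, written so the proposal is reparametrized (x' = a*x + b + sigma*eps). This is only an illustration of a flow-style, reparametrizable MCMC proposal; the affine map, the parameter values, and the 1-D Gaussian target are assumptions for the example, not the MetFlow algorithm from the paper.

```python
import numpy as np

def log_target(x):
    # Log-density of the target, up to a constant: a standard normal
    # chosen purely for this illustration.
    return -0.5 * x**2

def log_proposal(x_new, x_old, a, b, sigma):
    # log q(x_new | x_old) for the affine-plus-noise proposal
    # x_new = a * x_old + b + sigma * eps, eps ~ N(0, 1).
    mu = a * x_old + b
    return -0.5 * ((x_new - mu) / sigma) ** 2 - np.log(sigma)

def mh_step(x, a, b, sigma, rng):
    # Reparametrized draw from the proposal: noise first, then a
    # deterministic (here affine) map, as the reparametrization
    # trick requires.
    eps = rng.standard_normal()
    x_prop = a * x + b + sigma * eps
    # Standard Metropolis-Hastings acceptance ratio with the
    # forward/backward proposal correction.
    log_alpha = (log_target(x_prop) - log_target(x)
                 + log_proposal(x, x_prop, a, b, sigma)
                 - log_proposal(x_prop, x, a, b, sigma))
    if np.log(rng.uniform()) < log_alpha:
        return x_prop
    return x

rng = np.random.default_rng(0)
x, samples = 0.0, []
for _ in range(20000):
    x = mh_step(x, a=0.9, b=0.0, sigma=0.6, rng=rng)
    samples.append(x)
samples = np.asarray(samples)
# The chain's marginal should approach the standard normal target.
print(samples.mean(), samples.std())
```

Because the proposal is a deterministic map of the chain state and fresh noise, gradients can flow through `x_prop`; in MetFlow the affine map is replaced by a learned normalizing flow, which is what lets the variational family become a mixture of flow-based distributions.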

Related articles:
arXiv:1810.07151 [stat.ML] (Published 2018-10-16)
Metropolis-Hastings view on variational inference and adversarial training
arXiv:1910.06539 [stat.ML] (Published 2019-10-15)
Challenges in Bayesian inference via Markov chain Monte Carlo for neural networks
arXiv:1906.06663 [stat.ML] (Published 2019-06-16)
Sampler for Composition Ratio by Markov Chain Monte Carlo