arXiv Analytics

arXiv:1609.08203 [stat.ML]

Variational Inference with Hamiltonian Monte Carlo

Christopher Wolf, Maximilian Karl, Patrick van der Smagt

Published 2016-09-26, Version 1

Variational inference lies at the core of many state-of-the-art algorithms. To improve the approximation of the posterior beyond parametric families, it has been proposed to incorporate MCMC steps into the variational lower bound. In this work we explore this idea using steps of the Hamiltonian Monte Carlo (HMC) algorithm, an efficient MCMC method. In particular, we incorporate the acceptance step of the HMC algorithm, guaranteeing asymptotic convergence to the true posterior. Additionally, we introduce extensions to the HMC algorithm geared towards faster convergence. The theoretical advantages of these modifications are reflected by performance improvements in our experimental results.
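The abstract highlights the HMC acceptance step as the ingredient that guarantees asymptotic convergence to the true posterior. As a rough illustration of what a single HMC transition with a Metropolis acceptance step looks like, here is a minimal NumPy sketch; function and parameter names are illustrative and not taken from the paper, and this omits the variational-inference machinery entirely.

```python
import numpy as np

def hmc_step(x, log_prob, grad_log_prob, step_size=0.3, n_leapfrog=10, rng=None):
    """One HMC transition targeting the density proportional to exp(log_prob).

    Illustrative sketch only: simulates Hamiltonian dynamics with a leapfrog
    integrator, then applies the Metropolis acceptance step that corrects for
    discretization error and preserves the target distribution.
    """
    rng = np.random.default_rng() if rng is None else rng
    p = rng.standard_normal(x.shape)              # resample momentum
    x_new, p_new = x.copy(), p.copy()

    # Leapfrog integration: half momentum step, alternating full steps, half step.
    p_new = p_new + 0.5 * step_size * grad_log_prob(x_new)
    for _ in range(n_leapfrog - 1):
        x_new = x_new + step_size * p_new
        p_new = p_new + step_size * grad_log_prob(x_new)
    x_new = x_new + step_size * p_new
    p_new = p_new + 0.5 * step_size * grad_log_prob(x_new)

    # Metropolis acceptance step on the Hamiltonian H(x, p) = -log_prob(x) + |p|^2 / 2.
    h_old = -log_prob(x) + 0.5 * float(p @ p)
    h_new = -log_prob(x_new) + 0.5 * float(p_new @ p_new)
    if np.log(rng.uniform()) < h_old - h_new:
        return x_new, True                        # proposal accepted
    return x, False                               # proposal rejected; chain stays put
```

Dropping the acceptance step (as some earlier variational-MCMC hybrids effectively do) leaves a bias from the leapfrog discretization; keeping it is what yields the convergence guarantee the abstract refers to.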

Related articles:
arXiv:1508.04319 [stat.ML] (Published 2015-08-18)
Non-Stationary Gaussian Process Regression with Hamiltonian Monte Carlo
arXiv:2209.12771 [stat.ML] (Published 2022-09-26)
Hamiltonian Monte Carlo for efficient Gaussian sampling: long and random steps
arXiv:2310.20053 [stat.ML] (Published 2023-10-30)
Estimating optimal PAC-Bayes bounds with Hamiltonian Monte Carlo