arXiv Analytics

arXiv:2307.03460 [stat.CO]

On the convergence of dynamic implementations of Hamiltonian Monte Carlo and No U-Turn Samplers

Alain Durmus, Samuel Gruffaz, Miika Kailas, Eero Saksman, Matti Vihola

Published 2023-07-07, Version 1

There is substantial empirical evidence for the success of dynamic implementations of Hamiltonian Monte Carlo (HMC), such as the No U-Turn Sampler (NUTS), in many challenging inference problems, but theoretical results about their behavior are scarce. The aim of this paper is to fill this gap. More precisely, we consider a general class of MCMC algorithms we call dynamic HMC. First, we show that this general framework encompasses NUTS as a particular case, which implies the invariance of the target distribution as a by-product. Second, we establish conditions under which NUTS is irreducible and aperiodic and, as a corollary, ergodic. Under conditions similar to those existing for HMC, we also show that NUTS is geometrically ergodic. Finally, we improve existing convergence results for HMC by showing that this method is ergodic without any boundedness condition on the stepsize and the number of leapfrog steps, in the case where the target is a perturbation of a Gaussian distribution.
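To make the quantities in the abstract concrete, here is a minimal, illustrative sketch (not code from the paper) of one standard HMC transition using the leapfrog integrator, showing what the "stepsize" and "number of leapfrog steps" refer to; the target, function names, and parameter values are hypothetical choices for the example.

    # Illustrative HMC step with a leapfrog integrator (standard construction,
    # not the paper's dynamic HMC/NUTS algorithm). All names are hypothetical.
    import numpy as np

    def leapfrog(q, p, grad_log_pi, step_size, n_steps):
        """Leapfrog integration of Hamiltonian dynamics (reversible, volume-preserving)."""
        q, p = q.copy(), p.copy()
        p += 0.5 * step_size * grad_log_pi(q)        # initial half step for momentum
        for _ in range(n_steps - 1):
            q += step_size * p                        # full step for position
            p += step_size * grad_log_pi(q)           # full step for momentum
        q += step_size * p
        p += 0.5 * step_size * grad_log_pi(q)        # final half step for momentum
        return q, p

    def hmc_step(q, log_pi, grad_log_pi, step_size, n_steps, rng):
        """One HMC transition: refresh momentum, integrate, Metropolis accept/reject."""
        p0 = rng.standard_normal(q.shape)
        q_new, p_new = leapfrog(q, p0, grad_log_pi, step_size, n_steps)
        # Hamiltonian H(q, p) = -log pi(q) + |p|^2 / 2
        h0 = -log_pi(q) + 0.5 * p0 @ p0
        h1 = -log_pi(q_new) + 0.5 * p_new @ p_new
        return q_new if np.log(rng.uniform()) < h0 - h1 else q

    # Usage: standard Gaussian target (the unperturbed special case of the
    # Gaussian-perturbation setting mentioned in the abstract).
    rng = np.random.default_rng(0)
    log_pi = lambda q: -0.5 * q @ q
    grad_log_pi = lambda q: -q
    q = np.zeros(3)
    samples = [q := hmc_step(q, log_pi, grad_log_pi, 0.3, 10, rng) for _ in range(1000)]

NUTS differs from this sketch in that the number of leapfrog steps is chosen dynamically per iteration (by expanding a trajectory until a U-turn criterion is met) rather than being fixed in advance; the paper's dynamic HMC framework covers such schemes.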

Comments: 24 pages excluding appendix and references, 2 figures; a journal version is planned
Related articles:
arXiv:1905.09813 [stat.CO] (Published 2019-05-23)
A Condition Number for Hamiltonian Monte Carlo
arXiv:1705.00166 [stat.CO] (Published 2017-04-29)
On the convergence of Hamiltonian Monte Carlo
arXiv:1903.03704 [stat.CO] (Published 2019-03-09)
NeuTra-lizing Bad Geometry in Hamiltonian Monte Carlo Using Neural Transport