
arXiv:2209.12771 [stat.ML]

Hamiltonian Monte Carlo for efficient Gaussian sampling: long and random steps

Simon Apers, Sander Gribling, Dániel Szilágyi

Published 2022-09-26 (version 1)

Hamiltonian Monte Carlo (HMC) is a Markov chain algorithm for sampling from a high-dimensional distribution with density $e^{-f(x)}$, given access to the gradient of $f$. A particular case of interest is that of a $d$-dimensional Gaussian distribution with covariance matrix $\Sigma$, in which case $f(x) = \frac{1}{2} x^\top \Sigma^{-1} x$. We show that HMC can sample from a distribution that is $\varepsilon$-close in total variation distance using $\widetilde{O}(\sqrt{\kappa}\, d^{1/4} \log(1/\varepsilon))$ gradient queries, where $\kappa$ is the condition number of $\Sigma$. Our algorithm uses long and random integration times for the Hamiltonian dynamics. This contrasts with (and was motivated by) recent results that give an $\widetilde\Omega(\kappa d^{1/2})$ query lower bound for HMC with fixed integration times, even in the Gaussian case.
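The idea can be illustrated with a toy sampler. The sketch below is not the paper's algorithm, only a minimal HMC implementation for a zero-mean Gaussian in which the number of leapfrog steps per iteration is drawn at random, a simple stand-in for the long, random integration times described above; all function and parameter names are illustrative.

```python
import numpy as np

def hmc_gaussian(Sigma_inv, n_samples, step_size, max_steps, rng=None):
    """Toy HMC sampler targeting the density proportional to
    exp(-x^T Sigma_inv x / 2), i.e. a zero-mean Gaussian with
    covariance Sigma = Sigma_inv^{-1}.

    Each iteration uses a number of leapfrog steps drawn uniformly
    from {1, ..., max_steps} (a randomized integration time).
    """
    if rng is None:
        rng = np.random.default_rng(0)
    d = Sigma_inv.shape[0]
    grad = lambda x: Sigma_inv @ x           # gradient of f(x) = x^T Sigma_inv x / 2
    f = lambda x: 0.5 * x @ Sigma_inv @ x    # potential energy

    x = np.zeros(d)
    samples = np.empty((n_samples, d))
    for i in range(n_samples):
        p = rng.standard_normal(d)           # resample Gaussian momentum
        x_new, p_new = x.copy(), p.copy()
        L = rng.integers(1, max_steps + 1)   # random number of leapfrog steps

        # Leapfrog integration of the Hamiltonian dynamics.
        p_new = p_new - 0.5 * step_size * grad(x_new)
        for step in range(L):
            x_new = x_new + step_size * p_new
            if step != L - 1:
                p_new = p_new - step_size * grad(x_new)
        p_new = p_new - 0.5 * step_size * grad(x_new)

        # Metropolis correction for the discretization error.
        dH = (f(x_new) + 0.5 * p_new @ p_new) - (f(x) + 0.5 * p @ p)
        if np.log(rng.random()) < -dH:
            x = x_new
        samples[i] = x
    return samples
```

For a well-conditioned target (say `Sigma_inv = np.diag([1.0, 0.25])`, i.e. variances 1 and 4) the empirical variances of the returned samples should approach the diagonal of $\Sigma$; the paper's analysis concerns how the required number of gradient queries scales with $\kappa$ and $d$ under such randomized integration times.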
