arXiv Analytics

arXiv:2310.20053 [stat.ML]

Estimating optimal PAC-Bayes bounds with Hamiltonian Monte Carlo

Szilvia Ujváry, Gergely Flamich, Vincent Fortuin, José Miguel Hernández-Lobato

Published 2023-10-30, Version 1

An important yet underexplored question in the PAC-Bayes literature is how much tightness we lose by restricting the posterior family to factorized Gaussian distributions when optimizing a PAC-Bayes bound. We investigate this issue by estimating data-independent PAC-Bayes bounds using the optimal posteriors and comparing them to bounds obtained with mean-field variational inference (MFVI). Concretely, we (1) sample from the optimal Gibbs posterior using Hamiltonian Monte Carlo, (2) estimate its KL divergence from the prior with thermodynamic integration, and (3) propose three methods to obtain high-probability bounds under different assumptions. Our experiments on the MNIST dataset reveal significant tightness gaps, as large as 5-6% in some cases.
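To illustrate steps (1) and (2), here is a minimal, self-contained sketch (not the authors' code) of thermodynamic integration combined with basic HMC. It uses a 1D Gaussian toy problem where the prior is N(0,1) and the scaled loss is U(w) = a*w²/2, so the Gibbs posterior π_β ∝ p(w)·exp(−β·U(w)) is Gaussian and KL(π_1 ‖ p) has a closed form to check against. The identity used is log Z(1) = −∫₀¹ E_{π_β}[U] dβ, estimated by running HMC at each β on a grid and applying the trapezoid rule; all constants (a, step size, trajectory length, grid size) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

a = 3.0  # curvature of the toy quadratic "loss"; an illustrative choice

def U(w):
    # Scaled empirical loss (toy): U(w) = a * w^2 / 2
    return a * w**2 / 2

def neg_logp(w, beta):
    # Negative log of the unnormalized Gibbs posterior pi_beta(w)
    # = -log[ N(w; 0, 1) * exp(-beta * U(w)) ] up to a constant
    return (1 + beta * a) * w**2 / 2

def grad_neg_logp(w, beta):
    return (1 + beta * a) * w

def hmc_samples(beta, n=4000, eps=0.2, L=10, w=0.0):
    """Basic HMC (leapfrog + Metropolis correction) targeting pi_beta."""
    out = np.empty(n)
    for i in range(n):
        p = rng.standard_normal()          # resample momentum
        w_new, p_new = w, p
        # Leapfrog integration of Hamiltonian dynamics
        p_new -= 0.5 * eps * grad_neg_logp(w_new, beta)
        for _ in range(L - 1):
            w_new += eps * p_new
            p_new -= eps * grad_neg_logp(w_new, beta)
        w_new += eps * p_new
        p_new -= 0.5 * eps * grad_neg_logp(w_new, beta)
        # Metropolis accept/reject on the total energy change
        dH = (neg_logp(w_new, beta) + p_new**2 / 2) - (neg_logp(w, beta) + p**2 / 2)
        if np.log(rng.random()) < -dH:
            w = w_new
        out[i] = w
    return out

# Thermodynamic integration: log Z(1) = -integral_0^1 E_{pi_beta}[U] d beta
betas = np.linspace(0.0, 1.0, 11)
mean_U = np.array([U(hmc_samples(b)).mean() for b in betas])
log_Z1 = -np.sum((mean_U[:-1] + mean_U[1:]) / 2 * np.diff(betas))  # trapezoid rule

# KL(pi_1 || prior) = -E_{pi_1}[U] - log Z(1)
kl_ti = -mean_U[-1] - log_Z1

# Closed form for this Gaussian toy, for comparison
kl_exact = -a / (2 * (1 + a)) + 0.5 * np.log(1 + a)
print(f"TI estimate: {kl_ti:.3f}, exact: {kl_exact:.3f}")
```

In the paper's setting the posterior lives over neural-network weights, so there is no closed form; this toy merely shows how the β-grid of HMC chains and the trapezoid quadrature fit together, with estimation error coming from both the MCMC averages and the discretization of the integral.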

Comments: Mathematics of Modern Machine Learning Workshop at NeurIPS 2023
Categories: stat.ML, cs.LG
Subjects: G.3
Related articles:
arXiv:2209.12771 [stat.ML] (Published 2022-09-26)
Hamiltonian Monte Carlo for efficient Gaussian sampling: long and random steps
arXiv:1508.04319 [stat.ML] (Published 2015-08-18)
Non-Stationary Gaussian Process Regression with Hamiltonian Monte Carlo
arXiv:1609.08203 [stat.ML] (Published 2016-09-26)
Variational Inference with Hamiltonian Monte Carlo