arXiv Analytics

arXiv:1903.10328 [stat.ML]

Stochastic Gradient Hamiltonian Monte Carlo for Non-Convex Learning in the Big Data Regime

Huy N. Chau, Miklos Rasonyi

Published 2019-03-25, Version 1

Stochastic Gradient Hamiltonian Monte Carlo (SGHMC) is a momentum variant of stochastic gradient descent with suitably injected Gaussian noise, used to find a global minimum. In this paper, a non-asymptotic convergence analysis of SGHMC is given in the context of non-convex optimization, where subsampling techniques are used over an i.i.d. dataset for the gradient updates. Our results complement those of [RRT17] and improve on those of [GGZ18].
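
The abstract describes SGHMC only at the level of its ingredients (momentum, stochastic gradients, injected Gaussian noise). The sketch below shows one common Euler discretization of the underdamped Langevin dynamics underlying SGHMC, with friction gamma, step size eta, and inverse temperature beta; all function names, parameter names, and default values are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def sghmc(grad_minibatch, theta0, data, n_iters=10_000, step=1e-3,
          friction=1.0, beta=1e3, batch_size=32, rng=None):
    """Minimal SGHMC sketch (illustrative, not the paper's algorithm):
    momentum SGD with friction plus Gaussian noise scaled so the chain
    approximately targets the Gibbs measure exp(-beta * U(theta))."""
    rng = np.random.default_rng() if rng is None else rng
    theta = np.asarray(theta0, dtype=float).copy()
    v = np.zeros_like(theta)
    # Noise scale sqrt(2 * gamma * eta / beta) from the Euler discretization.
    noise_scale = np.sqrt(2.0 * friction * step / beta)
    for _ in range(n_iters):
        # Subsample a minibatch from the i.i.d. dataset for the gradient update.
        batch = data[rng.choice(len(data), size=batch_size)]
        g = grad_minibatch(theta, batch)  # unbiased stochastic gradient estimate
        v = (v - step * friction * v - step * g
             + noise_scale * rng.standard_normal(theta.shape))
        theta = theta + step * v
    return theta

# Toy usage: a data-free double-well potential U(t) = (t^2 - 2)^2,
# whose gradient is 4*t^3 - 8*t; `data` is a placeholder i.i.d. sample.
data = np.zeros((1000, 1))
grad = lambda th, batch: 4 * th**3 - 8 * th
theta_hat = sghmc(grad, theta0=np.array([3.0]), data=data)
```

The noise scale sqrt(2 * gamma * eta / beta) is what distinguishes SGHMC from plain momentum SGD: for large beta the invariant measure concentrates near global minimizers of U, which is the mechanism behind the non-asymptotic guarantees studied in this line of work.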

Related articles:
arXiv:2408.02839 [stat.ML] (Published 2024-08-05)
Optimizing Cox Models with Stochastic Gradient Descent: Theoretical Foundations and Practical Guidances
arXiv:2006.10840 [stat.ML] (Published 2020-06-18)
Stochastic Gradient Descent in Hilbert Scales: Smoothness, Preconditioning and Earlier Stopping
arXiv:1912.00018 [stat.ML] (Published 2019-11-29)
On the Heavy-Tailed Theory of Stochastic Gradient Descent for Deep Neural Networks