arXiv:2502.06719 [stat.ML]

Gaussian Approximation and Multiplier Bootstrap for Stochastic Gradient Descent

Marina Sheshukova, Sergey Samsonov, Denis Belomestny, Eric Moulines, Qi-Man Shao, Zhuo-Song Zhang, Alexey Naumov

Published 2025-02-10 (Version 1)

In this paper, we establish non-asymptotic convergence rates in the central limit theorem for Polyak-Ruppert-averaged iterates of stochastic gradient descent (SGD). Our analysis builds on the Gaussian approximation result for nonlinear statistics of independent random variables of Shao and Zhang (2022). Using this result, we prove the non-asymptotic validity of the multiplier bootstrap for constructing confidence sets for the optimal solution of an optimization problem. In particular, our approach avoids the need to approximate the limiting covariance of the Polyak-Ruppert SGD iterates, which allows us to derive approximation rates in convex distance of order up to $1/\sqrt{n}$.
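
To illustrate the kind of procedure the abstract describes, the sketch below runs Polyak-Ruppert-averaged SGD together with a multiplier bootstrap on a toy least-squares problem. This is a minimal sketch under our own assumptions, not the paper's implementation: the toy objective, the step-size schedule $\eta_t = 0.5\,t^{-3/4}$, the exponential multiplier weights, and all variable names are illustrative choices. What it mirrors is the point emphasized in the abstract: the confidence set is read off from the bootstrap trajectories directly, without estimating the limiting covariance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem (our choice): minimize E[(x^T theta - y)^2 / 2], linear regression.
d, n, B = 5, 20_000, 200            # dimension, SGD steps, bootstrap trajectories
theta_star = rng.normal(size=d)     # ground-truth parameter of the toy model

def sample():
    x = rng.normal(size=d)
    y = x @ theta_star + rng.normal()
    return x, y

theta = np.zeros(d)                 # main SGD iterate
theta_b = np.zeros((B, d))          # bootstrap iterates, one per multiplier stream
avg, avg_b = np.zeros(d), np.zeros((B, d))

for t in range(1, n + 1):
    x, y = sample()
    eta = 0.5 / t**0.75             # polynomially decaying step size (illustrative)
    # Main SGD step on the stochastic gradient of the toy loss.
    theta -= eta * (x @ theta - y) * x
    # Multiplier bootstrap: reuse the SAME sample, but rescale each bootstrap
    # gradient by an i.i.d. weight with mean 1 and variance 1.
    w = rng.exponential(1.0, size=B)[:, None]
    g_b = (theta_b @ x - y)[:, None] * x[None, :]
    theta_b -= eta * w * g_b
    # Polyak-Ruppert averaging of the main and bootstrap trajectories.
    avg += (theta - avg) / t
    avg_b += (theta_b - avg_b) / t

# The empirical quantile of ||avg_b - avg|| calibrates a confidence ball around
# the averaged iterate, with no covariance estimation step.
r = np.quantile(np.linalg.norm(avg_b - avg, axis=1), 0.95)
covered = np.linalg.norm(avg - theta_star) <= r
print(f"95% bootstrap radius: {r:.4f}, covers theta_star: {covered}")
```

Each bootstrap trajectory here consumes the same data stream as the main iterate and differs from it only through the random gradient weights; the spread of the bootstrap averages around the main average then serves as a proxy for the spread of the main average around the optimum.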

Related articles:
arXiv:2405.16644 [stat.ML] (Published 2024-05-26)
Gaussian Approximation and Multiplier Bootstrap for Polyak-Ruppert Averaged Linear Stochastic Approximation with Applications to TD Learning
arXiv:2006.10840 [stat.ML] (Published 2020-06-18)
Stochastic Gradient Descent in Hilbert Scales: Smoothness, Preconditioning and Earlier Stopping
arXiv:1911.01483 [stat.ML] (Published 2019-11-04)
Statistical Inference for Model Parameters in Stochastic Gradient Descent via Batch Means