arXiv Analytics

arXiv:2211.10234 [math.OC]

Iteration Complexity of Fixed-Step-Momentum Methods for Convex Quadratic Functions

Melinda Hagedorn, Florian Jarre

Published 2022-11-18, Version 1

This note considers the momentum method without line search but with fixed step length, applied to strictly convex quadratic functions, under the assumptions that exact gradients are used and that appropriate upper and lower bounds for the extreme eigenvalues of the Hessian matrix are known. Simple two-dimensional examples show that the Euclidean distance of the iterates to the optimal solution is non-monotone. In this context, an explicit bound is derived on the number of iterations needed to guarantee a reduction of the Euclidean distance to the optimal solution by a factor $\varepsilon$. The bound is optimal up to a constant factor and complements earlier asymptotically optimal results.
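The iteration in question is the fixed-step momentum (heavy-ball) method. The Python sketch below is a minimal illustration, assuming the classical Polyak step length and momentum coefficient computed from the eigenvalue bounds $\mu$ and $L$; the exact constants analyzed in the note may differ, and the small 2-d example only mirrors the abstract's observation that the distance to the optimum can behave non-monotonically.

```python
import numpy as np


def heavy_ball_quadratic(A, b, x0, mu, L, num_iters):
    """Fixed-step momentum (heavy-ball) iteration for f(x) = 0.5*x^T A x - b^T x.

    mu and L are lower/upper bounds on the extreme eigenvalues of the Hessian A.
    The classical Polyak parameter choices below are an illustrative assumption,
    not necessarily the exact constants analyzed in the note.
    """
    alpha = 4.0 / (np.sqrt(L) + np.sqrt(mu)) ** 2                          # fixed step length
    beta = ((np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))) ** 2  # momentum weight
    x_prev = x0.copy()
    x = x0.copy()
    x_star = np.linalg.solve(A, b)      # optimal solution, used only for monitoring
    dists = []
    for _ in range(num_iters):
        grad = A @ x - b                # exact gradient of the quadratic
        x, x_prev = x - alpha * grad + beta * (x - x_prev), x
        dists.append(np.linalg.norm(x - x_star))
    return x, dists


# Illustrative 2-d example (hypothetical data): the recorded Euclidean distances
# to x* need not decrease monotonically, as the 2-d examples in the note show.
A = np.diag([1.0, 100.0])
b = np.zeros(2)
x_final, dists = heavy_ball_quadratic(A, b, np.array([10.0, 1.0]),
                                       mu=1.0, L=100.0, num_iters=60)
```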

Related articles:
arXiv:2207.06304 [math.OC] (Published 2022-07-13)
On the Iteration Complexity of Smoothed Proximal ALM for Nonconvex Optimization Problem with Convex Constraints
arXiv:1707.02670 [math.OC] (Published 2017-07-10)
Accelerated Stochastic Power Iteration
arXiv:1709.03384 [math.OC] (Published 2017-09-11)
Ghost Penalties in Nonconvex Constrained Optimization: Diminishing Stepsizes and Iteration Complexity