arXiv Analytics

arXiv:2003.09174 [math.OC]

Rates of Superlinear Convergence for Classical Quasi-Newton Methods

Anton Rodomanov, Yurii Nesterov

Published 2020-03-20, Version 1

We study the local convergence of classical quasi-Newton methods for nonlinear optimization. Although it was established long ago that these methods converge superlinearly in the asymptotic sense, the corresponding rates of convergence have remained unknown. In this paper, we address this problem. We obtain the first explicit non-asymptotic rates of superlinear convergence for the standard quasi-Newton methods based on the updating formulas from the convex Broyden class. In particular, for the well-known DFP and BFGS methods, we obtain rates of the form $(\frac{n L^2}{\mu^2 k})^{k/2}$ and $(\frac{n L}{\mu k})^{k/2}$, respectively, where $k$ is the iteration counter, $n$ is the dimension of the problem, $\mu$ is the strong convexity parameter, and $L$ is the Lipschitz constant of the gradient.
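To get a feel for these bounds, one can evaluate them numerically. The sketch below is a minimal illustration, not part of the paper: it plugs assumed values of $n$, $\mu$, and $L$ into the two rate expressions from the abstract. Note that, directly from the formulas, the BFGS bound $(\frac{n L}{\mu k})^{k/2}$ drops below 1 only once $k > nL/\mu$, and the DFP bound $(\frac{n L^2}{\mu^2 k})^{k/2}$ only once $k > nL^2/\mu^2$, so the BFGS bound becomes informative earlier.

```python
# Hedged sketch: evaluate the superlinear-rate bounds stated in the abstract
# for illustrative, assumed problem parameters (n, mu, L are not from the paper).

def dfp_rate(k, n, mu, L):
    # DFP bound from the abstract: (n * L^2 / (mu^2 * k))^(k/2)
    return (n * L**2 / (mu**2 * k)) ** (k / 2)

def bfgs_rate(k, n, mu, L):
    # BFGS bound from the abstract: (n * L / (mu * k))^(k/2)
    return (n * L / (mu * k)) ** (k / 2)

# Assumed values: dimension n = 10, strong convexity mu = 1, gradient Lipschitz L = 10.
n, mu, L = 10, 1.0, 10.0

# The BFGS bound turns contractive once k > n*L/mu = 100; the DFP bound
# needs k > n*L^2/mu^2 = 1000 before it falls below 1.
for k in (200, 2000):
    print(f"k={k}: DFP bound = {dfp_rate(k, n, mu, L):.3e}, "
          f"BFGS bound = {bfgs_rate(k, n, mu, L):.3e}")
```

Once past their respective thresholds, both bounds decay superlinearly in $k$, which is the qualitative content of the rates above.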

Related articles:
arXiv:2004.14866 [math.OC] (Published 2020-04-29)
New Results on Superlinear Convergence of Classical Quasi-Newton Methods
arXiv:1910.06894 [math.OC] (Published 2019-10-15)
Stability of KKT systems and superlinear convergence of the SQP method under parabolic regularity
arXiv:1605.03529 [math.OC] (Published 2016-05-11)
On the Iteration Complexity of Oblivious First-Order Optimization Algorithms