arXiv Analytics

arXiv:1606.09365 [math.OC]

On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions

Etienne de Klerk, François Glineur, Adrien B. Taylor

Published 2016-06-30 (Version 1)

We consider the gradient (or steepest) descent method with exact line search applied to a strongly convex function with a Lipschitz continuous gradient. We establish the exact worst-case rate of convergence of this scheme, and show that this worst-case behavior is exhibited by a certain convex quadratic function. We also extend the result to a noisy variant of the gradient descent method, where exact line search is performed along a search direction that differs from the negative gradient by at most a prescribed relative tolerance. The proof is computer-assisted and relies on solving semidefinite programming performance estimation problems, as introduced in the paper [Y. Drori and M. Teboulle. Performance of first-order methods for smooth convex minimization: a novel approach. Mathematical Programming, 145(1-2):451-482, 2014].
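A minimal numerical sketch of the setting described above (not the paper's computer-assisted SDP proof): exact line-search gradient descent on a convex quadratic with extreme Hessian eigenvalues mu and L, the kind of function the abstract identifies as worst-case. The starting point (1/mu, 1/L) and the comparison against the classical contraction factor ((L - mu)/(L + mu))^2 in function values are assumptions made for illustration.

```python
import numpy as np

mu, L = 1.0, 10.0                      # strong convexity / gradient Lipschitz constants
A = np.diag([mu, L])                   # Hessian of the quadratic f(x) = x^T A x / 2
f = lambda x: 0.5 * x @ A @ x          # objective; minimum value 0 at x = 0

x = np.array([1.0 / mu, 1.0 / L])      # starting point chosen to trigger the zigzag (assumption)
bound = ((L - mu) / (L + mu)) ** 2     # classical worst-case per-iteration ratio (assumption)

for k in range(5):
    g = A @ x                          # gradient of the quadratic
    t = (g @ g) / (g @ A @ g)          # exact line-search step length for a quadratic
    x_next = x - t * g
    ratio = f(x_next) / f(x)           # observed contraction in function values
    print(f"iter {k}: ratio = {ratio:.6f}, bound = {bound:.6f}")
    x = x_next
```

On this quadratic and starting point, the observed ratio matches the stated factor at every iteration, which illustrates why a quadratic can exhibit the worst-case behavior; establishing that the same rate holds for all smooth strongly convex functions is the content of the paper.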

Related articles:
arXiv:2407.04914 [math.OC] (Published 2024-07-06)
Analytic analysis of the worst-case complexity of the gradient method with exact line search and the Polyak stepsize
arXiv:2401.06809 [math.OC] (Published 2024-01-10)
Greedy Newton: Newton's Method with Exact Line Search
arXiv:2106.08020 [math.OC] (Published 2021-06-15)
A note on the optimal convergence rate of descent methods with fixed step sizes for smooth strongly convex functions