arXiv:2401.06809 [math.OC]

Greedy Newton: Newton's Method with Exact Line Search

Betty Shea, Mark Schmidt

Published 2024-01-10 (Version 1)

A defining characteristic of Newton's method is local superlinear convergence within a neighborhood of a strict local minimum. Outside this neighborhood, however, Newton's method can converge slowly or even diverge. A common way to handle non-convergence is to use a step size set by an Armijo backtracking line search. With suitable initialization, the backtracking line search preserves local superlinear convergence, but it may make sub-optimal progress when far from a solution. In this work we consider Newton's method under an exact line search, which we call "greedy Newton" (GN). We show that this leads to an improved global convergence rate while retaining a local superlinear convergence rate. We empirically show that GN can outperform backtracking Newton by allowing significantly larger step sizes.
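To make the contrast concrete, here is a minimal sketch of the idea in Python: at each iterate, compute the Newton direction and choose the step size by minimizing the one-dimensional restriction of the objective along that direction, rather than accepting the unit step or backtracking from it. This is an illustration assembled from the abstract, not the authors' code: the function name greedy_newton and the quadratic test problem are hypothetical, and the "exact" line search is approximated numerically with SciPy's minimize_scalar.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def greedy_newton(f, grad, hess, x0, tol=1e-8, max_iter=50):
    """Newton's method with a (numerically) exact line search.

    Sketch only: the 1-D minimization over the step size t is done
    with scipy.optimize.minimize_scalar, standing in for the exact
    line search described in the abstract.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        # Newton direction: solve H d = -g.
        d = np.linalg.solve(hess(x), -g)
        # Exact line search along d, approximated by a 1-D minimizer,
        # instead of t = 1 or Armijo backtracking.
        t = minimize_scalar(lambda t: f(x + t * d)).x
        x = x + t * d
    return x

if __name__ == "__main__":
    # Hypothetical strictly convex quadratic test problem.
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, -1.0])
    f = lambda x: 0.5 * x @ A @ x - b @ x
    grad = lambda x: A @ x - b
    hess = lambda x: A
    print(greedy_newton(f, grad, hess, np.array([5.0, 5.0])))
```

On a quadratic, the exact line search recovers the unit Newton step and GN converges in one iteration; the abstract's point is that on non-quadratic problems the exact search can also select step sizes much larger than one, which backtracking from t = 1 cannot.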

Related articles:
arXiv:2103.14987 [math.OC] (Published 2021-03-27): Quadratic Convergence of Newton's Method for 0/1 Loss Optimization
arXiv:2101.11140 [math.OC] (Published 2021-01-27): Newton's Method for M-Tensor Equations
arXiv:2407.04914 [math.OC] (Published 2024-07-06): Analytic analysis of the worst-case complexity of the gradient method with exact line search and the Polyak stepsize