arXiv Analytics

arXiv:2302.10065 [math.OC]

Yet another fast variant of Newton's method for nonconvex optimization

Serge Gratton, Sadok Jerad, Philippe L. Toint

Published 2023-02-20 (Version 1)

A second-order algorithm is proposed for minimizing smooth nonconvex functions that alternates between regularized Newton and negative curvature steps. In most cases, the Hessian matrix is regularized with the square root of the current gradient and an additional term taking moderate negative curvature into account, so that a negative curvature step is taken only exceptionally. As a consequence, the proposed method requires the solution of only a single linear system at nearly all iterations. We establish that at most $\mathcal{O}\left( |\log\epsilon|\,\epsilon^{-3/2}\right)$ evaluations of the problem's objective function and derivatives are needed for this algorithm to obtain an $\epsilon$-approximate first-order minimizer, and at most $\mathcal{O}\left(|\log\epsilon|\,\epsilon^{-3}\right)$ to obtain a second-order one. Initial numerical experiments with two variants of the new method are finally presented.
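
As a rough illustration of the idea sketched in the abstract, the Python snippet below regularizes the Hessian by the square root of the gradient norm plus a shift absorbing moderate negative curvature, and resorts to a negative curvature direction only when the leftmost eigenvalue is markedly negative. The function names, the switching threshold, the exact regularization formula, the explicit eigendecomposition, and the toy test problem are all assumptions made for readability; this is a plausible reading of the abstract, not the authors' algorithm or parameter choices.

```python
import numpy as np

def regularized_newton_step(grad, hess, curvature_tol=1e-4):
    """One step of a gradient-regularized Newton scheme (illustrative sketch).

    When the leftmost Hessian eigenvalue is only moderately negative, it is
    absorbed into the shift sigma = sqrt(||grad||) + max(0, -lambda_min) and a
    single linear system is solved; otherwise a unit negative curvature
    direction is returned.  The threshold, the regularization formula and the
    explicit eigendecomposition are assumptions, not the authors' method.
    """
    eigvals, eigvecs = np.linalg.eigh(hess)
    lam_min = eigvals[0]
    if lam_min < -curvature_tol:
        # Exceptional case: follow a direction of significant negative
        # curvature, oriented so it is not an ascent direction for the gradient.
        d = eigvecs[:, 0]
        return -d if grad @ d > 0 else d
    # Usual case: one regularized Newton system with a positive definite matrix.
    sigma = np.sqrt(np.linalg.norm(grad)) + max(0.0, -lam_min)
    return np.linalg.solve(hess + sigma * np.eye(grad.size), -grad)


# Toy nonconvex example (not from the paper): f(x, y) = (x^2 - 1)^2 + y^2.
def gradient(x):
    return np.array([4.0 * x[0] * (x[0] ** 2 - 1.0), 2.0 * x[1]])

def hessian(x):
    return np.diag([12.0 * x[0] ** 2 - 4.0, 2.0])

x = np.array([0.1, 1.0])            # starts in a region of negative curvature
for _ in range(50):
    g = gradient(x)
    if np.linalg.norm(g) < 1e-8:
        break
    x = x + regularized_newton_step(g, hessian(x))
print(x)                            # approaches the minimizer (1, 0)
```

In this sketch the first iterate triggers the negative curvature branch, after which every iteration solves one regularized Newton system, echoing the single-linear-system claim of the abstract; the paper's actual step control and adaptive safeguards are omitted.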

Related articles:
arXiv:2209.02118 [math.OC] (Published 2022-09-05)
On radially epidifferentiable functions and regularity conditions in nonconvex optimization
arXiv:1611.04207 [math.OC] (Published 2016-11-13)
On the superlinear convergence of Newton's method on Riemannian manifolds
arXiv:2406.06100 [math.OC] (Published 2024-06-10)
Primitive Heavy-ball Dynamics Achieves $O(\varepsilon^{-7/4})$ Convergence for Nonconvex Optimization