arXiv:2303.03534 [math.OC]

Global convergence of the gradient method for functions definable in o-minimal structures

Cédric Josz

Published 2023-03-06, updated 2023-04-21 (Version 2)

We consider the gradient method with variable step size for minimizing functions that are definable in o-minimal structures on the real field and differentiable with locally Lipschitz gradients. We prove that global convergence holds if continuous gradient trajectories are bounded, with the minimum gradient norm vanishing at the rate $o(1/k)$ if the step sizes are greater than a positive constant. If additionally the gradient is continuously differentiable, all saddle points are strict, and the step sizes are constant, then convergence to a local minimum holds almost surely over any bounded set of initial points.
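The following is a minimal sketch of the gradient method with variable step size described in the abstract: iterates $x_{k+1} = x_k - \alpha_k \nabla f(x_k)$ with step sizes bounded below by a positive constant, tracking the minimum gradient norm. The test function $f(x,y) = (x^2-1)^2 + y^2$ (semialgebraic, hence definable in an o-minimal structure, with locally Lipschitz gradient), the step-size schedule, and the iteration count are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Illustrative semialgebraic (hence definable) nonconvex test function
# with locally Lipschitz gradient: f(x, y) = (x^2 - 1)^2 + y^2.
def f(z):
    x, y = z
    return (x**2 - 1)**2 + y**2

def grad_f(z):
    x, y = z
    return np.array([4.0 * x * (x**2 - 1), 2.0 * y])

def gradient_method(z0, step, num_iters=500):
    """Gradient method x_{k+1} = x_k - step(k) * grad f(x_k).

    Returns the final iterate and the minimum gradient norm seen,
    the quantity whose o(1/k) decay the paper analyzes when the
    step sizes exceed a positive constant.
    """
    z = np.asarray(z0, dtype=float)
    min_grad_norm = np.inf
    for k in range(num_iters):
        g = grad_f(z)
        min_grad_norm = min(min_grad_norm, np.linalg.norm(g))
        z = z - step(k) * g
    return z, min_grad_norm

# Variable step sizes bounded below by the positive constant 0.05
# (a hypothetical schedule chosen only for demonstration).
z_final, m = gradient_method([0.3, 1.0], step=lambda k: 0.05 + 0.01 / (k + 1))
print("final iterate:", z_final, "min gradient norm:", m)
```

Here the iterates approach the local minimizer $(1, 0)$; bounded trajectories and step sizes bounded away from zero are exactly the hypotheses under which the abstract's global convergence result applies.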

Comments: 33 pages, 1 figure
Journal: Mathematical Programming 2023
Categories: math.OC
Subjects: 03C64, 90C26, 90C30
Related articles:
arXiv:2203.00775 [math.OC] (Published 2022-03-01)
Tight convergence rates of the gradient method on hypoconvex functions
arXiv:2204.00647 [math.OC] (Published 2022-04-01)
Conditions for linear convergence of the gradient method for non-convex optimization
arXiv:1907.10494 [math.OC] (Published 2019-07-22)
An Improved Gradient Method with Approximately Optimal Stepsize Based on Conic Model for Unconstrained Optimization