
arXiv:1808.07382 [math.OC]

Convergence of Cubic Regularization for Nonconvex Optimization under KL Property

Yi Zhou, Zhe Wang, Yingbin Liang

Published 2018-08-22 (Version 1)

Cubic-regularized Newton's method (CR) is a popular algorithm that is guaranteed to produce a second-order stationary solution for nonconvex optimization problems. However, the existing understanding of the convergence rate of CR is conditioned on special geometric properties of the objective function. In this paper, we explore the asymptotic convergence rate of CR by exploiting the ubiquitous Kurdyka-Lojasiewicz (KL) property of nonconvex objective functions. Specifically, we characterize the asymptotic convergence rates of various optimality measures for CR, including the function value gap, the variable distance gap, the gradient norm, and the least eigenvalue of the Hessian matrix. Our results fully characterize the diverse convergence behaviors of these optimality measures across the full parameter regime of the KL property. Moreover, we show that the obtained asymptotic convergence rates of CR are order-wise faster than those of first-order gradient descent algorithms under the KL property.
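
For context, a minimal sketch (not taken from the paper) of the standard CR update and of the KL inequality in its Lojasiewicz form; the paper's exact formulation, norms, and constants may differ:

% Cubic-regularized Newton (CR) step at iterate x_k with cubic penalty parameter M > 0:
% the next iterate minimizes a cubic-regularized second-order model of f.
x_{k+1} \in \arg\min_{y}\; f(x_k) + \nabla f(x_k)^\top (y - x_k)
    + \tfrac{1}{2}(y - x_k)^\top \nabla^2 f(x_k)(y - x_k) + \tfrac{M}{6}\,\|y - x_k\|^3 .

% KL (Lojasiewicz) inequality at a critical point \bar{x} with exponent \theta \in [0,1):
% there exist c > 0 and a neighborhood U of \bar{x} such that, for all x \in U,
\|\nabla f(x)\| \;\ge\; c\,\big| f(x) - f(\bar{x}) \big|^{\theta} .

The exponent \theta governs how fast the optimality measures can decay near a critical point, which is why the convergence rates are stated over the full range of the KL parameter.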

Related articles:
arXiv:1810.03763 [math.OC] (Published 2018-10-09)
Cubic Regularization with Momentum for Nonconvex Optimization
arXiv:2412.09556 [math.OC] (Published 2024-12-12)
Enhancing Convergence of Decentralized Gradient Tracking under the KL Property
arXiv:1812.00558 [math.OC] (Published 2018-12-03)
Metric Subregularity of Subdifferential and KL Property of Exponent 1/2