arXiv Analytics


arXiv:1710.05782 [math.OC]

Second-Order Methods with Cubic Regularization Under Inexact Information

Saeed Ghadimi, Han Liu, Tong Zhang

Published 2017-10-16 (Version 1)

In this paper, we generalize (accelerated) Newton's method with cubic regularization to inexact second-order information for (strongly) convex optimization problems. Under mild assumptions, we establish global rates of convergence for these methods and show how the rates depend explicitly on the problem parameters. While the complexity bounds of the presented algorithms are theoretically worse than those of their exact counterparts, they are at least as good as those of optimal first-order methods. Our numerical experiments also show that using inexact Hessians can significantly speed up the algorithms in practice.
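To make the setting concrete, here is a minimal sketch of a cubic-regularized Newton iteration that only has access to an approximate Hessian. It is not the paper's algorithm (which also includes an accelerated variant and precise inexactness conditions): the regularization constant M, the generic BFGS solve of the cubic subproblem, the toy quadratic objective, and the perturbed Hessian oracle are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def cubic_newton_step(grad, hess_approx, M):
    """Approximately minimize the cubic-regularized model
       m(s) = g^T s + 0.5 s^T H s + (M/6) ||s||^3
       over the step s, using a generic solver for illustration."""
    d = grad.shape[0]
    def model(s):
        return grad @ s + 0.5 * s @ hess_approx @ s + (M / 6.0) * np.linalg.norm(s) ** 3
    return minimize(model, np.zeros(d), method="BFGS").x

def cubic_newton(grad_f, inexact_hess_f, x0, M=10.0, iters=50):
    """Cubic-regularized Newton iteration with a (possibly inexact) Hessian oracle."""
    x = x0.copy()
    for _ in range(iters):
        g = grad_f(x)
        H = inexact_hess_f(x)  # inexact second-order information
        x = x + cubic_newton_step(g, H, M)
    return x

# Hypothetical toy problem: strongly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# with a Hessian oracle that is only known up to a small perturbation.
rng = np.random.default_rng(0)
A = np.diag(np.linspace(1.0, 10.0, 5))
b = rng.standard_normal(5)
grad_f = lambda x: A @ x - b
inexact_hess = lambda x: A + 0.05 * np.eye(5)  # assumed perturbation level

x_final = cubic_newton(grad_f, inexact_hess, np.zeros(5))
print("gradient norm at final iterate:", np.linalg.norm(grad_f(x_final)))
```

In this sketch the inexactness enters only through the Hessian oracle; the paper's analysis quantifies how such approximation errors affect the global convergence rate relative to the exact cubic-regularized method.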

Related articles:
arXiv:1810.03763 [math.OC] (Published 2018-10-09)
Cubic Regularization with Momentum for Nonconvex Optimization
arXiv:1611.04982 [math.OC] (Published 2016-11-15)
Oracle Complexity of Second-Order Methods for Finite-Sum Problems
arXiv:1705.07260 [math.OC] (Published 2017-05-20)
Oracle Complexity of Second-Order Methods for Smooth Convex Optimization