arXiv:1705.07260 [math.OC]

Oracle Complexity of Second-Order Methods for Smooth Convex Optimization

Ohad Shamir, Ron Shiff

Published 2017-05-20 (Version 1)

Second-order methods, which utilize Hessians as well as gradients to optimize a given function, are of major importance in mathematical optimization. In this work, we study the oracle complexity of such methods, or equivalently, the number of iterations required to optimize a function to a given accuracy. Focusing on smooth convex functions, we derive (to the best of our knowledge) the first algorithm-independent lower bounds for such methods. These bounds shed light on the limits of attainable performance with second-order methods, and on the cases in which they can or cannot require fewer iterations than gradient-based methods, whose oracle complexity is much better understood.

Related articles:
arXiv:1611.04982 [math.OC] (Published 2016-11-15)
Oracle Complexity of Second-Order Methods for Finite-Sum Problems
arXiv:2205.09647 [math.OC] (Published 2022-05-19)
The First Optimal Acceleration of High-Order Methods in Smooth Convex Optimization
arXiv:2103.00667 [math.OC] (Published 2021-03-01)
Smooth Convex Optimization using Sub-Zeroth-Order Oracles