arXiv Analytics


arXiv:1612.06965 [math.OC]

Quasi-Newton Methods: Superlinear Convergence Without Line Search for Self-Concordant Functions

Wenbo Gao, Donald Goldfarb

Published 2016-12-21 (Version 1)

We derive a curvature-adaptive step size for gradient-based iterative methods, including quasi-Newton methods, for minimizing self-concordant functions. This step size has a simple expression that can be computed analytically; hence, line searches are not needed. We show that using this step size in the BFGS method (and quasi-Newton methods generally) results in superlinear convergence for strongly convex self-concordant functions. We present numerical experiments comparing gradient descent and BFGS methods using the curvature-adaptive step size to traditional methods on logistic regression problems.
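Since the abstract's central object is a step size computable in closed form, a small sketch may make the idea concrete. The snippet below runs BFGS on logistic regression with a line-search-free damped step t_k = 1/(1 + delta_k), where delta_k = (d_k^T grad^2 f(x_k) d_k)^{1/2} is the local norm of the search direction. This is the classical damped step for self-concordant functions and is only an illustrative stand-in for the paper's more refined curvature-adaptive formula; all function and variable names here are assumptions, not the authors' code.

```python
import numpy as np

def logistic_loss_grad_hess(w, X, y):
    """Logistic regression loss, gradient, and Hessian.
    X: (n, d) feature matrix; y: (n,) labels in {0, 1}."""
    z = X @ w
    p = 1.0 / (1.0 + np.exp(-z))
    loss = -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
    grad = X.T @ (p - y) / len(y)
    H = (X.T * (p * (1 - p))) @ X / len(y)   # X^T diag(p(1-p)) X / n
    return loss, grad, H

def bfgs_adaptive_step(X, y, iters=50):
    """BFGS with a damped, line-search-free step size.
    Uses t = 1/(1 + delta), delta = ||d||_x (local Hessian norm),
    as an illustrative stand-in for the paper's adaptive formula."""
    n, d = X.shape
    w = np.zeros(d)
    B = np.eye(d)                              # inverse-Hessian approximation
    _, g, H = logistic_loss_grad_hess(w, X, y)
    for _ in range(iters):
        direction = -B @ g                     # quasi-Newton direction d_k
        delta = np.sqrt(direction @ H @ direction)  # local norm ||d_k||_{x_k}
        t = 1.0 / (1.0 + delta)                # damped step: no line search
        s = t * direction
        w_new = w + s
        _, g_new, H = logistic_loss_grad_hess(w_new, X, y)
        yk = g_new - g
        sy = s @ yk
        if sy > 1e-12:                         # standard BFGS inverse update
            rho = 1.0 / sy
            V = np.eye(d) - rho * np.outer(s, yk)
            B = V @ B @ V.T + rho * np.outer(s, s)
        w, g = w_new, g_new
    return w
```

For example, calling `bfgs_adaptive_step(X, y)` on a synthetic binary-classification dataset converges without any backtracking. Note that delta_k is computed with the exact Hessian here only for simplicity; a single Hessian-vector product would suffice for this quantity.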

Comments: 16 pages, 2 figures
Categories: math.OC
Subjects: 90C53, 90C30
Related articles:
arXiv:1511.02540 [math.OC] (Published 2015-11-08)
Speed learning on the fly
arXiv:1901.09063 [math.OC] (Published 2019-01-25)
Analysis of the BFGS Method with Errors
arXiv:1603.06772 [math.OC] (Published 2016-03-22)
Line Search for Averaged Operator Iteration