arXiv:1901.09063 [math.OC]

Analysis of the BFGS Method with Errors

Yuchen Xie, Richard Byrd, Jorge Nocedal

Published 2019-01-25 (v1)

The classical convergence analysis of quasi-Newton methods assumes that the function and gradients employed at each iteration are exact. In this paper, we consider the case when there are (bounded) errors in both computations and establish conditions under which a slight modification of the BFGS algorithm with an Armijo-Wolfe line search converges to a neighborhood of the solution that is determined by the size of the errors. One of our results is an extension of the analysis presented in Byrd, R. H., & Nocedal, J. (1989), which establishes that, for strongly convex functions, a fraction of the BFGS iterates are good iterates. We present numerical results illustrating the performance of the new BFGS method in the presence of noise.
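The paper's modified algorithm is not reproduced in this abstract, but the setting it analyzes is easy to sketch: BFGS run on function and gradient evaluations that are each corrupted by bounded errors, with convergence expected only to a neighborhood of the solution whose radius scales with the error size. Below is a minimal, illustrative Python sketch under those assumptions; the noise model (`sigma`), the simple backtracking Armijo search (standing in for the full Armijo-Wolfe search analyzed in the paper), and the curvature-skip safeguard are all illustrative choices, not the authors' method.

```python
import numpy as np

def bfgs_with_noise(f, grad, x0, sigma=1e-3, max_iter=100, tol=1e-6, seed=0):
    """Toy BFGS driven by noisy oracles (illustrative, not the paper's algorithm).

    f, grad -- exact function/gradient; bounded uniform noise of size
    `sigma` is injected to mimic the bounded errors studied in the paper.
    """
    rng = np.random.default_rng(seed)
    noisy_f = lambda x: f(x) + sigma * rng.uniform(-1.0, 1.0)
    noisy_g = lambda x: grad(x) + sigma * rng.uniform(-1.0, 1.0, size=x.shape)

    n = x0.size
    H = np.eye(n)                    # inverse-Hessian approximation
    x = x0.astype(float)
    g = noisy_g(x)
    for _ in range(max_iter):
        # Under bounded errors we can only hope to drive the (noisy)
        # gradient down to the noise level, i.e. into a neighborhood.
        if np.linalg.norm(g) <= tol + sigma:
            break
        p = -H @ g                   # quasi-Newton search direction
        # Backtracking Armijo line search on the *noisy* function values
        # (the paper uses a full Armijo-Wolfe search).
        alpha, c1 = 1.0, 1e-4
        fx = noisy_f(x)
        while noisy_f(x + alpha * p) > fx + c1 * alpha * (g @ p) and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * p
        g_new = noisy_g(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        # Skip the update when curvature information is unreliable
        # (noise can make s'y small or negative).
        if sy > 1e-10 * np.linalg.norm(s) * np.linalg.norm(y):
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)   # BFGS inverse update
        x, g = x_new, g_new
    return x

# Example on a strongly convex quadratic f(x) = 0.5 x'Ax:
A = np.diag([1.0, 10.0])
x_star = bfgs_with_noise(lambda x: 0.5 * x @ A @ x, lambda x: A @ x,
                         x0=np.array([5.0, -3.0]))
```

With `sigma > 0` the iterates stall near the minimizer rather than converging to it, which is the qualitative behavior the paper's analysis quantifies.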

Related articles:
arXiv:2004.14866 [math.OC] (Published 2020-04-29): New Results on Superlinear Convergence of Classical Quasi-Newton Methods
arXiv:2001.07335 [math.OC] (Published 2020-01-21): A Dynamic Subspace Based BFGS Method for Large Scale Optimization Problem
arXiv:2109.00072 [math.OC] (Published 2021-08-31): Quasi-Newton methods for minimizing a quadratic function subject to uncertainty