arXiv Analytics


arXiv:1710.05200 [math.OC]

Objective acceleration for unconstrained optimization

Asbjørn Nilsen Riseth

Published 2017-10-14 (version 1)

Acceleration schemes can dramatically improve existing optimization procedures. In most work on such schemes, including nonlinear GMRES (N-GMRES), acceleration is based on minimizing the $\ell_2$ norm of some target on subspaces of $\mathbb{R}^n$. Many numerical examples show that accelerating general-purpose and domain-specific optimizers with N-GMRES yields large improvements. We propose a natural modification of N-GMRES that significantly improves performance in a testing environment originally used to advocate N-GMRES. Our approach, which we call O-ACCEL, is novel in that it minimizes an approximation to the \emph{objective function} on subspaces of $\mathbb{R}^n$. We prove a convergence theorem for O-ACCEL. Comparisons with L-BFGS and N-CG indicate that O-ACCEL is competitive. Since it can be combined with domain-specific optimizers, it may also be beneficial in settings where L-BFGS or N-CG are not suitable.
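To illustrate the idea of objective-based subspace acceleration, here is a minimal sketch (not the paper's exact O-ACCEL formulation): a plain gradient iteration is periodically corrected by minimizing a quadratic model of the objective on the affine subspace spanned by differences of recent iterates. The step size, history length, and the secant-style Hessian approximation are illustrative assumptions.

```python
import numpy as np

def subspace_accelerate(grad, x, history):
    """Minimize a quadratic model of the objective on
    x + span{x - x_i : x_i in history} (illustrative sketch)."""
    D = np.column_stack([x - xi for xi in history])  # subspace basis
    g = grad(x)
    # Secant-style approximation of the Hessian acting on each basis
    # vector (assumption: cheap finite differences of the gradient).
    eps = 1e-6
    HD = np.column_stack([(grad(x + eps * d) - g) / eps for d in D.T])
    # Model m(a) = f(x) + g^T D a + 0.5 a^T (D^T H D) a; its minimizer
    # solves (D^T H D) a = -D^T g.  Least-squares for robustness when
    # the basis is (nearly) rank deficient.
    a, *_ = np.linalg.lstsq(D.T @ HD, -D.T @ g, rcond=None)
    return x + D @ a

# Usage on a simple ill-conditioned convex quadratic f(x) = 0.5 x^T Q x.
Q = np.diag([1.0, 10.0, 100.0])
f = lambda x: 0.5 * x @ Q @ x
grad = lambda x: Q @ x

x = np.array([1.0, 1.0, 1.0])
history = []
for _ in range(6):
    history.append(x.copy())
    x = x - 0.005 * grad(x)          # plain (slow) gradient step
    if len(history) >= 2:
        x = subspace_accelerate(grad, x, history[-2:])
print(f(x))  # far below f(x0) = 55.5
```

For a quadratic objective the model is exact, so each accelerated step is an exact minimization over the chosen subspace and can only decrease the objective; this mirrors why minimizing an objective approximation (rather than a residual norm, as in N-GMRES) is a natural target for acceleration.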

Related articles:
arXiv:0808.2316 [math.OC] (Published 2008-08-17)
A new secant method for unconstrained optimization
arXiv:2308.15145 [math.OC] (Published 2023-08-29)
Limited memory gradient methods for unconstrained optimization
arXiv:1212.5929 [math.OC] (Published 2012-12-24)
A Globally and Superlinearly Convergent Modified BFGS Algorithm for Unconstrained Optimization