arXiv:1710.05200 [math.OC]
Objective acceleration for unconstrained optimization
Published 2017-10-14 (Version 1)
Acceleration schemes can dramatically improve existing optimization procedures. In most work on these schemes, such as nonlinear GMRES (N-GMRES), acceleration is based on minimizing the $\ell_2$ norm of some target on subspaces of $\mathbb{R}^n$. Many numerical examples show that accelerating general-purpose and domain-specific optimizers with N-GMRES yields large improvements. We propose a natural modification to N-GMRES that significantly improves performance in a testing environment originally used to advocate N-GMRES. Our proposed approach, which we refer to as O-ACCEL, is novel in that it minimizes an approximation to the \emph{objective function} on subspaces of $\mathbb{R}^n$. We prove a convergence theorem for the proposed approach. Comparisons with L-BFGS and N-CG indicate the competitiveness of O-ACCEL. As it can be combined with domain-specific optimizers, it may also be beneficial in areas where L-BFGS or N-CG are not suitable.
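To illustrate the subspace-acceleration idea behind such schemes, here is a hedged toy sketch (not the paper's O-ACCEL algorithm): on a quadratic objective $f(x) = \tfrac12 x^\top A x - b^\top x$, minimizing the true objective over the affine subspace spanned by recent steps has a closed form, $\alpha = (P^\top A P)^{-1} P^\top (b - A x_k)$, so we can interleave plain gradient-descent steps with exact subspace minimizations. The matrix, vector, and step-size choices below are illustrative assumptions, not values from the paper.

```python
# Toy sketch of subspace acceleration (assumed setup, not the paper's
# O-ACCEL): gradient descent on f(x) = 0.5 x^T A x - b^T x, with a
# periodic exact minimization of f over x + span{recent steps}.

def matvec(A, x):
    """Dense matrix-vector product with plain lists."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def solve2(M, r):
    """Solve a 2x2 linear system M y = r by Cramer's rule."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(r[0] * M[1][1] - r[1] * M[0][1]) / det,
            (M[0][0] * r[1] - M[1][0] * r[0]) / det]

def f(A, b, x):
    """Quadratic objective 0.5 x^T A x - b^T x."""
    Ax = matvec(A, x)
    return (0.5 * sum(xi * axi for xi, axi in zip(x, Ax))
            - sum(bi * xi for bi, xi in zip(b, x)))

# Illustrative symmetric positive-definite problem (assumed data).
A = [[3.0, 1.0, 0.0], [1.0, 2.0, 0.5], [0.0, 0.5, 1.0]]
b = [1.0, 2.0, 3.0]
eta = 0.2          # step size, chosen below 2 / lambda_max(A)

x = [0.0, 0.0, 0.0]
steps = []          # most recent gradient steps span the search subspace
for it in range(1, 61):
    g = [ai - bi for ai, bi in zip(matvec(A, x), b)]   # grad f = A x - b
    step = [-eta * gi for gi in g]
    x = [xi + si for xi, si in zip(x, step)]
    steps = (steps + [step])[-2:]
    if it % 3 == 0 and len(steps) == 2:
        # Accelerate: minimize f exactly over x + span(steps).
        # Optimality condition: (P^T A P) alpha = P^T (b - A x).
        r = [bi - ai for bi, ai in zip(b, matvec(A, x))]
        Ap = [matvec(A, p) for p in steps]
        M = [[sum(pi * qi for pi, qi in zip(steps[i], Ap[j]))
              for j in range(2)] for i in range(2)]
        rhs = [sum(pi * ri for pi, ri in zip(p, r)) for p in steps]
        alpha = solve2(M, rhs)
        x = [xi + alpha[0] * p0 + alpha[1] * p1
             for xi, p0, p1 in zip(x, steps[0], steps[1])]
```

Because the zero coefficient vector lies in the subspace, each acceleration step can never increase the objective; O-ACCEL's distinguishing idea, per the abstract, is to minimize an approximation to the objective itself on such subspaces rather than an $\ell_2$ residual norm as in N-GMRES.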