arXiv:2211.08578 [math.OC]
Anderson acceleration of gradient methods with energy for optimization problems
Hailiang Liu, Jia-Hao He, Xuping Tian
Published 2022-11-15, Version 1
Anderson acceleration (AA), an efficient technique for speeding up the convergence of fixed-point iterations, can also be adapted to accelerate optimization methods. We propose a novel optimization algorithm by adapting Anderson acceleration to the energy-adaptive gradient method (AEGD) [arXiv:2010.05109]. The feasibility of our algorithm is examined in light of convergence results for AEGD, even though AEGD is not a fixed-point iteration. We also quantify the accelerated convergence rate of AA applied to gradient descent, showing an improvement by a factor of the gain at each application of the Anderson mixing. Our experimental results show that the proposed algorithm requires little hyperparameter tuning and exhibits markedly fast convergence.
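To make the idea concrete, below is a minimal sketch of generic (Type-II) Anderson acceleration applied to a plain gradient-descent fixed-point map. It is not the authors' AA-AEGD scheme, and the function name `anderson_gd` and parameters `lr`, `m`, `reg` are illustrative assumptions, not notation from the paper; the sketch only shows how a window of past iterates is mixed via a constrained least-squares fit of residuals.

```python
import numpy as np

def anderson_gd(grad, x0, lr=1e-2, m=5, iters=200, reg=1e-10):
    """Generic Anderson-accelerated gradient descent (illustrative sketch).

    Treats the gradient step g(x) = x - lr * grad(x) as a fixed-point map
    and mixes the last m+1 mapped iterates with weights obtained from a
    least-squares fit of the residuals f_i = g(x_i) - x_i.
    """
    x = np.asarray(x0, dtype=float)
    X, G = [], []                      # histories of iterates and mapped iterates
    for _ in range(iters):
        gx = x - lr * grad(x)          # fixed-point map g(x)
        X.append(x.copy())
        G.append(gx.copy())
        if len(X) > m + 1:             # keep a sliding window of m+1 points
            X.pop(0); G.pop(0)
        k = len(X)
        if k == 1:
            x = gx
            continue
        # Minimize ||sum_i alpha_i f_i|| subject to sum_i alpha_i = 1,
        # solved via the regularized normal equations.
        F = np.stack([G[i] - X[i] for i in range(k)], axis=1)   # (n, k)
        A = F.T @ F + reg * np.eye(k)
        alpha = np.linalg.solve(A, np.ones(k))
        alpha /= alpha.sum()
        # Anderson mixing: combine the mapped iterates with these weights.
        x = np.stack(G, axis=1) @ alpha
    return x

# Example usage on a simple quadratic, f(x) = 0.5 * x^T D x:
if __name__ == "__main__":
    D = np.array([1.0, 10.0, 100.0])
    x_star = anderson_gd(lambda x: D * x, x0=np.ones(3), lr=1e-2)
    print(x_star)                      # should be close to the zero vector
```

The windowed least-squares mixing above is the standard AA template; the paper's contribution lies in combining this mixing with AEGD's energy variable, which the sketch does not attempt to reproduce.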