arXiv:2104.04974 [math.OC]

Alternating cyclic extrapolation methods for optimization algorithms

Nicolas Lepage-Saucier

Published 2021-04-11 (Version 1)

This article introduces new acceleration methods for fixed point iterations. Speed and stability are achieved by alternating the number of mappings used to compute step lengths and reusing those step lengths multiple times by cycling. A new type of step length is also proposed, with good properties for nonlinear mappings. The methods require no problem-specific adaptation and are especially efficient for high-dimensional problems. Computation requires few objective function evaluations, no matrix inversion, and little extra memory. A convergence analysis is followed by seven applications, including gradient descent acceleration for unconstrained optimization. Performance is on par with or better than that of alternatives. The algorithm is available as a stand-alone Julia package and may be downloaded at https://github.com/NicolasL-S/ACX.
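To make the general idea concrete, the sketch below shows a classic extrapolation scheme of this family: the SQUAREM method of Varadhan and Roland, which computes a scalar step length from two successive applications of the mapping. This is NOT the paper's ACX algorithm (which alternates the number of mappings and cycles step lengths); it is only a minimal illustration of how a step length derived from successive mappings can accelerate a fixed point iteration x = g(x). All names here (`squarem_step`, `accelerate`) are ours, not from the package.

```python
import numpy as np

def squarem_step(g, x):
    """One SQUAREM-style extrapolation step for the iteration x = g(x).

    Illustrative only: this is the classic Varadhan-Roland scheme,
    not the ACX method of the paper. It uses two mappings to form a
    scalar step length, then one stabilizing mapping.
    """
    x1 = g(x)                    # first mapping
    x2 = g(x1)                   # second mapping
    r = x1 - x                   # residual
    v = x2 - 2.0 * x1 + x        # curvature term
    nv = np.linalg.norm(v)
    if nv < 1e-14:               # (nearly) converged: skip extrapolation
        return x2
    alpha = -np.linalg.norm(r) / nv          # extrapolated step length
    x_ex = x - 2.0 * alpha * r + alpha**2 * v
    return g(x_ex)               # stabilizing mapping after the jump

def accelerate(g, x0, tol=1e-10, max_iter=200):
    """Run the accelerated iteration until successive iterates agree."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        x_new = squarem_step(g, x)
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x
```

Note that no Jacobian or matrix inversion is involved: each step costs only a few mapping evaluations and vector operations, which is the property the abstract emphasizes for high-dimensional problems. For an affine contraction g(x) = cx + b the extrapolated point is the exact fixed point after a single step.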
