arXiv Analytics


arXiv:2211.14103 [math.OC]

Conditional Gradient Methods

Gábor Braun, Alejandro Carderera, Cyrille W. Combettes, Hamed Hassani, Amin Karbasi, Aryan Mokhtari, Sebastian Pokutta

Published 2022-11-25 (Version 1)

The purpose of this survey is to serve as both a gentle introduction to and a coherent overview of state-of-the-art Frank--Wolfe algorithms, also called conditional gradient algorithms, for function minimization. These algorithms are especially useful in convex optimization when linear optimization over the feasible region is cheaper than projection onto it. The selection of material has been guided by the principle of highlighting crucial ideas as well as presenting new approaches that we believe might become important in the future, with ample citations even of older works that were imperative in the development of newer methods. Yet our selection is sometimes biased, need not reflect the consensus of the research community, and we have certainly missed recent important contributions. After all, the research area of Frank--Wolfe methods is very active, making it a moving target. We apologize sincerely in advance for any such distortions, and we fully acknowledge: we stand on the shoulders of giants.
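To make the "linear optimization cheaper than projection" point concrete, here is a minimal sketch of the vanilla Frank--Wolfe iteration in Python. The probability-simplex domain, the quadratic toy objective, and the classic 2/(k+2) step-size rule are illustrative choices for this sketch, not details taken from the survey.

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, num_iters=1000):
    """Vanilla Frank--Wolfe (conditional gradient) over the probability simplex.

    The linear minimization oracle (LMO) over the simplex is trivial:
    argmin_{v in simplex} <g, v> is the vertex e_i with the smallest
    gradient coordinate -- no projection step is ever needed.
    """
    x = x0.copy()
    for k in range(num_iters):
        g = grad(x)
        i = np.argmin(g)                 # LMO: cheapest coordinate wins
        v = np.zeros_like(x)
        v[i] = 1.0                       # vertex e_i of the simplex
        gamma = 2.0 / (k + 2)            # classic agnostic step-size rule
        x = (1 - gamma) * x + gamma * v  # convex combination stays feasible
    return x

# Toy example: minimize f(x) = 0.5 * ||x - b||^2 over the simplex,
# i.e. compute the Euclidean projection of b onto the simplex.
b = np.array([0.2, 0.5, 0.1])
x = frank_wolfe_simplex(lambda x: x - b, np.ones(3) / 3)
```

Every iterate is a convex combination of simplex vertices, so feasibility is maintained for free; this is the structural feature that lets conditional gradient methods avoid projections entirely.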

Comments: 238 pages with many figures. The FrankWolfe.jl Julia package (https://github.com/ZIB-IOL/FrankWolfe.jl) provides state-of-the-art implementations of many Frank--Wolfe methods
Categories: math.OC
Related articles:
arXiv:1805.07311 [math.OC] (Published 2018-05-18)
Blended Conditional Gradients: the unconditioning of conditional gradients
arXiv:1906.11580 [math.OC] (Published 2019-06-27)
Gradient projection and conditional gradient methods for constrained nonconvex minimization
arXiv:2007.00153 [math.OC] (Published 2020-06-30)
Conditional Gradient Methods for convex optimization with function constraints