arXiv Analytics

arXiv:1702.03828 [math.OC]

Sharpness, Restart and Acceleration

Vincent Roulet, Alexandre d'Aspremont

Published 2017-02-13 (Version 1)

The Łojasiewicz inequality shows that sharpness bounds on the minimum of convex optimization problems hold almost generically. Here, we show that sharpness directly controls the performance of restart schemes. The constants quantifying sharpness are of course unobservable, but we show that optimal restart strategies are fairly robust, and searching for the best scheme only increases the complexity by a logarithmic factor compared to the optimal bound. Overall then, restart schemes generically accelerate accelerated methods.
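To make the restart idea concrete, below is a minimal Python sketch, not the paper's exact scheduled-restart scheme: it restarts Nesterov's accelerated gradient method at a fixed period and searches for that period on a doubling (logarithmic) grid, mirroring the log-factor overhead the abstract mentions. The function names (accelerated_gradient, restart_scheme, grid_search_restart) and the fixed-period schedule are illustrative assumptions.

import numpy as np

def accelerated_gradient(grad, x0, L, n_iters):
    # Nesterov's accelerated gradient for an L-smooth convex objective.
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(n_iters):
        x_next = y - grad(y) / L
        t_next = (1 + np.sqrt(1 + 4 * t**2)) / 2
        y = x_next + ((t - 1) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x

def restart_scheme(grad, x0, L, total_iters, period):
    # Restart the accelerated method every `period` iterations,
    # warm-starting each run from the previous output. Under a
    # sharpness bound, a well-chosen period yields faster rates.
    x = x0.copy()
    for _ in range(total_iters // period):
        x = accelerated_gradient(grad, x, L, period)
    return x

def grid_search_restart(f, grad, x0, L, total_iters):
    # The sharpness constants are unobservable, so try restart
    # periods on a doubling grid and keep the best final value;
    # this multiplies the total cost by only a logarithmic factor.
    best_x, best_val = x0, np.inf
    period = 1
    while period <= total_iters:
        x = restart_scheme(grad, x0, L, total_iters, period)
        if f(x) < best_val:
            best_x, best_val = x, f(x)
        period *= 2
    return best_x

if __name__ == "__main__":
    # Toy least-squares example: f(x) = 0.5 * ||A x - b||^2,
    # which is smooth with L = ||A||_2^2.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 20))
    b = rng.standard_normal(50)
    f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
    grad = lambda x: A.T @ (A @ x - b)
    L = np.linalg.norm(A, 2) ** 2
    x = grid_search_restart(f, grad, np.zeros(20), L, total_iters=512)
    print(f(x))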

Related articles:
arXiv:1805.12591 [math.OC] (Published 2018-05-31)
On Acceleration with Noise-Corrupted Gradients
arXiv:1511.06566 [math.OC] (Published 2015-11-20)
Acceleration of the PDHGM on strongly convex subspaces
arXiv:2002.06315 [math.OC] (Published 2020-02-15)
Bregman Augmented Lagrangian and Its Acceleration