arXiv:2107.08011 [math.OC]

Adaptive first-order methods revisited: Convex optimization without Lipschitz requirements

Kimon Antonakopoulos, Panayotis Mertikopoulos

Published 2021-07-16 (Version 1)

We propose a new family of adaptive first-order methods for a class of convex minimization problems that may fail to be Lipschitz continuous or smooth in the standard sense. Specifically, motivated by a recent flurry of activity on non-Lipschitz (NoLips) optimization, we consider problems that are continuous or smooth relative to a reference Bregman function - as opposed to a global, ambient norm (Euclidean or otherwise). These conditions encompass a wide range of problems with singular objectives, such as Fisher markets, Poisson tomography, D-design, and the like. In this setting, the application of existing order-optimal adaptive methods - like UnixGrad or AcceleGrad - is not possible, especially in the presence of randomness and uncertainty. The proposed method - which we call adaptive mirror descent (AdaMir) - aims to close this gap by concurrently achieving min-max optimal rates in problems that are relatively continuous or smooth, including stochastic ones.
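To make the setting concrete, the sketch below shows a generic adaptive mirror-descent step of the kind the abstract describes: the method is run relative to a reference Bregman function rather than the Euclidean norm, with an AdaGrad-style step size built from accumulated gradients. This is an illustrative sketch only, not the paper's exact AdaMir update; the entropic Bregman function on the simplex, the specific step-size rule, and the test data (`A`, `b`) are all assumptions made for the example.

```python
import numpy as np

def adaptive_mirror_descent(grad, x0, num_iters=1000):
    """Illustrative adaptive mirror descent on the probability simplex.

    Uses the entropic Bregman function h(x) = sum_i x_i log x_i, whose
    induced mirror step is the exponentiated-gradient update, together
    with an AdaGrad-style step size eta_t = 1 / sqrt(1 + sum of squared
    gradient norms). A generic sketch, not the AdaMir update itself.
    """
    x = np.asarray(x0, dtype=float)
    x /= x.sum()                                 # start on the simplex
    grad_sq_sum = 0.0
    x_avg = np.zeros_like(x)
    for t in range(1, num_iters + 1):
        g = grad(x)
        grad_sq_sum += np.dot(g, g)
        eta = 1.0 / np.sqrt(1.0 + grad_sq_sum)   # adaptive step size
        x = x * np.exp(-eta * g)                 # entropic mirror step
        x /= x.sum()                             # renormalize to the simplex
        x_avg += (x - x_avg) / t                 # running ergodic average
    return x_avg

# Usage on hypothetical data: minimize ||Ax - b||^2 over the simplex.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
x_star = adaptive_mirror_descent(lambda x: 2.0 * A.T @ (A @ x - b),
                                 np.ones(5) / 5)
print(x_star)
```

The appeal of this template in the relative (NoLips) setting is that the mirror map, not a global Lipschitz constant, controls the geometry, so the step-size rule can adapt without prior knowledge of smoothness parameters.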

Related articles:
arXiv:2001.06511 [math.OC] (Published 2020-01-17)
A perturbation view of level-set methods for convex optimization
arXiv:1604.04713 [math.OC] (Published 2016-04-16)
Stochastic Optimization Algorithms for Convex Optimization with Fixed Point Constraints
arXiv:1008.2814 [math.OC] (Published 2010-08-17, updated 2013-12-02)
Convex optimization for the planted k-disjoint-clique problem