arXiv:1710.06612 [math.OC]

Mirror Descent and Convex Optimization Problems With Non-Smooth Inequality Constraints

Anastasia Bayandina, Pavel Dvurechensky, Alexander Gasnikov, Fedor Stonyakin, Alexander Titov

Published 2017-10-18 (version 1)

We consider the problem of minimizing a convex function over a simple set subject to a convex non-smooth inequality constraint, and we describe first-order methods for solving such problems in different situations: smooth or non-smooth objective function; convex or strongly convex objective and constraint; deterministic or randomized information about the objective and constraint. We hope it is convenient for the reader to have all the methods for these different settings in one place. The described methods are based on the Mirror Descent algorithm and the switching subgradient scheme. One of our goals is to propose, for each of the listed settings, a Mirror Descent method with adaptive stepsizes and an adaptive stopping rule, meaning that neither the stepsizes nor the stopping rule requires knowledge of the Lipschitz constant of the objective or the constraint. We also construct a Mirror Descent method for problems whose objective function is not Lipschitz continuous, e.g., a quadratic function. Besides that, we address the problem of recovering a solution of the dual problem.
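As a concrete illustration, here is a minimal Euclidean sketch of the kind of switching-subgradient Mirror Descent with adaptive stepsizes that the abstract describes. It assumes the prox function is (1/2)||x||^2, so each mirror step reduces to a projected subgradient step; the names (switching_mirror_descent, project, theta0_sq) and the exact stopping constant are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def switching_mirror_descent(grad_f, g, grad_g, project, x0,
                             eps=1e-3, theta0_sq=1.0, max_iter=100_000):
    """Switching subgradient scheme (Euclidean sketch).

    A "productive" step follows a subgradient of the objective f when the
    constraint g(x) <= eps is nearly satisfied; otherwise a "non-productive"
    step follows a subgradient of g.  Stepsizes use only the norm of the
    current subgradient, so no Lipschitz constant is needed.
    """
    x = project(np.asarray(x0, dtype=float))
    prod_x, prod_h = [], []
    h_sum = 0.0
    for _ in range(max_iter):
        if g(x) <= eps:                      # productive step on f
            d = grad_f(x)
            h = eps / max(float(d @ d), 1e-16)
            prod_x.append(x.copy())
            prod_h.append(h)
        else:                                # non-productive step on g
            d = grad_g(x)
            h = eps / max(float(d @ d), 1e-16)
        x = project(x - h * d)               # Euclidean "mirror" step
        h_sum += h
        # Adaptive stopping rule (illustrative constant): stop once the
        # accumulated stepsizes dominate theta0_sq, an assumed upper bound
        # on the squared distance from x0 to a solution.
        if eps * h_sum >= 2.0 * theta0_sq:
            break
    if not prod_x:
        return x                             # no productive step was taken
    w = np.array(prod_h)
    X = np.array(prod_x)
    return (w[:, None] * X).sum(axis=0) / w.sum()  # weighted productive average
```

A toy run on a hypothetical instance, minimizing a smooth quadratic over the Euclidean unit ball under the non-smooth constraint g(x) = ||x||_1 - 1 <= 0:

```python
c = np.array([2.0, 0.5])
grad_f = lambda x: 2.0 * (x - c)                       # f(x) = ||x - c||^2
g = lambda x: float(np.abs(x).sum() - 1.0)
grad_g = lambda x: np.sign(x)                          # a subgradient of the l1 norm
project = lambda x: x / max(1.0, np.linalg.norm(x))    # onto the unit ball

x_bar = switching_mirror_descent(grad_f, g, grad_g, project,
                                 x0=np.zeros(2), eps=1e-2, theta0_sq=2.0)
```

In a non-Euclidean setup the projected step would be replaced by a genuine mirror (prox) step for the chosen prox function, e.g. the entropy setup on the simplex.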
