arXiv:1502.06259 [math.OC]
Gradient and gradient-free methods for stochastic convex optimization with inexact oracle
Alexander Gasnikov, Pavel Dvurechensky, Dmitry Kamzolov
Published 2015-02-22Version 1
In this paper we generalize the universal gradient method (Yu. Nesterov) to the strongly convex case and to the intermediate gradient method (Devolder-Glineur-Nesterov). We also consider possible generalizations to the stochastic and online settings. We show how these results extend to gradient-free methods and to the method of random direction search. The main ingredient of this paper is the assumption about the oracle: we consider the oracle to be inexact.
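As a rough illustration of the random direction search mentioned in the abstract (not the authors' exact scheme, whose details are in the Russian text), the following sketch uses a standard two-point random-direction gradient estimator inside plain gradient descent on a strongly convex quadratic; the function `random_direction_grad`, the smoothing parameter `mu`, and the step size are illustrative choices, not taken from the paper:

```python
import numpy as np

def random_direction_grad(f, x, mu=1e-4, rng=None):
    """Two-point random-direction gradient estimator:
    g = n * (f(x + mu*e) - f(x - mu*e)) / (2*mu) * e,
    where e is a random unit direction and n = dim(x).
    In expectation g approximates the gradient of a smoothed version of f."""
    rng = np.random.default_rng() if rng is None else rng
    e = rng.standard_normal(x.shape)
    e /= np.linalg.norm(e)          # uniform random direction on the unit sphere
    n = x.size
    return n * (f(x + mu * e) - f(x - mu * e)) / (2 * mu) * e

# Gradient-free descent on a strongly convex quadratic f(x) = 0.5*||x||^2.
rng = np.random.default_rng(0)
f = lambda x: 0.5 * np.dot(x, x)
x = np.ones(5)
for _ in range(2000):
    x -= 0.1 * random_direction_grad(f, x, rng=rng)
print(np.linalg.norm(x))  # close to 0: the iterates approach the minimizer
```

An inexact oracle in this setting would return `f` only up to an additive error delta; the estimator then carries an extra bias of order `n * delta / mu`, which is why the accuracy of the oracle limits how small `mu` can usefully be taken.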
Comments: in Russian, 9 pages
Categories: math.OC
Related articles:
Universal Gradient Methods for Stochastic Convex Optimization
arXiv:2207.02750 [math.OC] (Published 2022-07-06)
An SDE perspective on stochastic convex optimization
arXiv:1806.05140 [math.OC] (Published 2018-06-13)
Generalized Mirror Prox: Solving Variational Inequalities with Monotone Operator, Inexact Oracle, and Unknown Hölder Parameters