arXiv:2402.02461 [math.OC]

Zeroth-order Median Clipping for Non-Smooth Convex Optimization Problems with Heavy-tailed Symmetric Noise

Nikita Kornilov, Yuriy Dorn, Aleksandr Lobanov, Nikolay Kutuzov, Innokentiy Shibaev, Eduard Gorbunov, Alexander Gasnikov, Alexander Nazin

Published 2024-02-04, updated 2024-02-07 (version 2)

In this paper, we consider non-smooth convex optimization with a zeroth-order oracle corrupted by symmetric stochastic noise. Unlike existing high-probability results, which require the noise to have a bounded $\kappa$-th moment with $\kappa \in (1,2]$, our results allow even heavier noise with any $\kappa > 0$; e.g., the noise distribution can have an unbounded $1$-st moment. Moreover, our results match the best-known ones for the case of bounded variance. To achieve this, we take the mini-batched median of sampled gradient differences, apply gradient clipping to the result, and plug the final estimate into an accelerated method. We apply this technique to the stochastic multi-armed bandit problem with heavy-tailed reward distributions and achieve $O(\sqrt{dT})$ regret under the additional assumption of noise symmetry.
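The gradient estimator described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the two-point randomized-smoothing difference, and all parameter values (`tau`, `batch`, `clip_level`) are assumptions chosen for the sketch; the paper's accelerated method and its tuned parameters are not reproduced here.

```python
import numpy as np

def median_clipped_zo_grad(f, x, tau=1e-3, batch=11, clip_level=1.0, rng=None):
    """Illustrative sketch of a median-clipped zeroth-order gradient estimate:
    1) sample a mini-batch of noisy two-point gradient-difference estimates,
    2) take the coordinate-wise median over the batch (robust to heavy tails),
    3) clip the resulting vector to norm `clip_level`.
    All names and defaults are hypothetical choices for this example."""
    rng = np.random.default_rng() if rng is None else rng
    d = x.size
    samples = np.empty((batch, d))
    for i in range(batch):
        e = rng.standard_normal(d)
        e /= np.linalg.norm(e)                    # random direction on the unit sphere
        delta = f(x + tau * e) - f(x - tau * e)   # noisy two-point function difference
        samples[i] = d * delta / (2 * tau) * e    # one-sample gradient estimate
    g = np.median(samples, axis=0)                # median suppresses heavy-tailed outliers
    norm = np.linalg.norm(g)
    if norm > clip_level:
        g *= clip_level / norm                    # gradient clipping
    return g
```

Even when each oracle call is corrupted by symmetric noise with an unbounded first moment (e.g., Cauchy), the median over the batch stays finite with high probability, and the clipping step bounds the estimate's norm deterministically, which is what makes high-probability guarantees possible.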
