arXiv Analytics

arXiv:2408.01848 [math.OC]

Methods for Optimization Problems with Markovian Stochasticity and Non-Euclidean Geometry

Vladimir Solodkin, Andrew Veprikov, Aleksandr Beznosikov

Published 2024-08-03, Version 1

This paper examines a variety of classical optimization problems, including well-known minimization tasks and more general variational inequalities. We consider a stochastic formulation of these problems and, unlike most previous work, account for the Markovian nature of the noise. We also treat the geometry of the problem in an arbitrary non-Euclidean setting and propose four methods based on the Mirror Descent iteration. Theoretical analysis is provided for smooth convex minimization problems and for variational inequalities with Lipschitz, monotone operators. The convergence guarantees obtained are optimal for first-order stochastic methods, as evidenced by the lower bounds established in this paper.
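
As a rough illustration of the kind of iteration the abstract refers to, the sketch below runs stochastic Mirror Descent with the entropic mirror map on the probability simplex, where the component loss observed at each step is selected by a Markov chain rather than drawn i.i.d. The two-state chain, the quadratic component losses, and the constant step size are illustrative assumptions, not the specific methods or problem setting analyzed in the paper.

    # Minimal sketch (not the paper's algorithm): stochastic Mirror Descent with
    # the negative-entropy mirror map on the simplex, driven by Markovian samples.
    import numpy as np

    rng = np.random.default_rng(0)

    # Two-state Markov chain selecting which component loss is observed (assumed).
    P = np.array([[0.9, 0.1],
                  [0.2, 0.8]])

    # Component losses f_s(x) = 0.5 * ||x - c_s||^2 with illustrative centers.
    centers = [np.array([0.7, 0.2, 0.1]), np.array([0.1, 0.3, 0.6])]

    def stoch_grad(x, state):
        # Gradient of the component loss chosen by the current chain state.
        return x - centers[state]

    def md_entropy_step(x, g, eta):
        # Mirror Descent step with the entropic mirror map: multiplicative
        # update followed by normalization back onto the simplex.
        y = x * np.exp(-eta * g)
        return y / y.sum()

    x = np.ones(3) / 3      # start at the uniform distribution
    state = 0
    eta = 0.1               # constant step size, chosen for illustration

    for t in range(2000):
        g = stoch_grad(x, state)
        x = md_entropy_step(x, g, eta)
        state = rng.choice(2, p=P[state])   # Markovian, not i.i.d., sampling

    print("approximate minimizer on the simplex:", np.round(x, 3))

Because the samples come from a Markov chain, consecutive gradients are correlated, which is the key difficulty the paper's analysis addresses; the sketch only shows that the Mirror Descent update itself is unchanged by the sampling scheme.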

Related articles:
arXiv:2110.04882 [math.OC] (Published 2021-10-10, updated 2022-04-29)
First- and Second-Order Analysis for Optimization Problems with Manifold-Valued Constraints
arXiv:2011.06064 [math.OC] (Published 2020-11-11)
Non-local Optimization: Imposing Structure on Optimization Problems by Relaxation
arXiv:2402.12090 [math.OC] (Published 2024-02-19)
Characterization of optimization problems that are solvable iteratively with linear convergence