arXiv Analytics


arXiv:2506.20630 [math.OC]

First-order methods for stochastic and finite-sum convex optimization with deterministic constraints

Zhaosong Lu, Yifeng Xiao

Published 2025-06-25, Version 1

In this paper, we study a class of stochastic and finite-sum convex optimization problems with deterministic constraints. Existing methods typically aim to find an $\epsilon$-expectedly feasible stochastic optimal solution, in which the expected constraint violation and expected optimality gap are both within a prescribed tolerance $\epsilon$. However, in many practical applications, constraints must be nearly satisfied with certainty, so such solutions can be unsuitable due to the risk of substantial violations. To address this issue, we propose stochastic first-order methods for finding an $\epsilon$-surely feasible stochastic optimal ($\epsilon$-SFSO) solution, in which the constraint violation is deterministically bounded by $\epsilon$ and the expected optimality gap is at most $\epsilon$. Our methods apply an accelerated stochastic gradient (ASG) scheme or a modified variance-reduced ASG scheme only once to a sequence of quadratic penalty subproblems with appropriately chosen penalty parameters. We establish first-order oracle complexity bounds for the proposed methods in computing an $\epsilon$-SFSO solution. As a byproduct, we also derive first-order oracle complexity results for the sample average approximation method in computing an $\epsilon$-SFSO solution of the stochastic optimization problem, where the sample average problem is solved by our proposed methods.
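To make the quadratic-penalty idea concrete, here is a minimal toy sketch (not the paper's ASG scheme, and with hypothetical step-size and penalty choices): for $\min f(x)$ s.t. $g(x) \le 0$, each subproblem $\min_x f(x) + \tfrac{\rho}{2}\max(g(x),0)^2$ is solved approximately by noisy gradient steps, warm-started as the penalty parameter $\rho$ grows, which drives the final constraint violation down.

```python
import numpy as np

def quadratic_penalty_sgd(grad_f, g, grad_g, x0, rhos, steps=2000, rng=None):
    """Toy quadratic-penalty loop: for each rho, take noisy gradient steps on
    f(x) + (rho/2) * max(g(x), 0)^2, warm-starting from the previous stage.
    Illustration only -- not the accelerated scheme from the paper."""
    rng = rng or np.random.default_rng(0)
    x = float(x0)
    for rho in rhos:
        eta = 1.0 / (2.0 + rho)  # step size ~ 1/L for this toy instance
        for _ in range(steps):
            viol = max(g(x), 0.0)                  # constraint violation g(x)_+
            grad = grad_f(x) + rho * viol * grad_g(x)
            grad += 0.01 * rng.standard_normal()   # toy stochastic-oracle noise
            x -= eta * grad
    return x

# Toy instance: min (x-2)^2 s.t. x <= 1; the constrained optimum is x* = 1.
x_hat = quadratic_penalty_sgd(
    grad_f=lambda x: 2.0 * (x - 2.0),
    g=lambda x: x - 1.0,
    grad_g=lambda x: 1.0,
    x0=0.0,
    rhos=[1.0, 10.0, 100.0, 1000.0],
)
```

With the final $\rho = 1000$, the penalty minimizer sits at roughly $x \approx 1.002$, so the violation of $x \le 1$ is on the order of $1/\rho$; the paper's contribution is, in part, choosing the penalty parameters and solving each subproblem only once so that this violation is bounded deterministically by $\epsilon$.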

Related articles:
arXiv:2101.00935 [math.OC] (Published 2021-01-04)
First-Order Methods for Convex Optimization
arXiv:2412.11330 [math.OC] (Published 2024-12-15, updated 2025-05-05)
Exact Verification of First-Order Methods via Mixed-Integer Linear Programming
arXiv:2412.00640 [math.OC] (Published 2024-12-01)
Stability of first-order methods in tame optimization