arXiv:1905.11957 [math.OC]

Sample Complexity of Sample Average Approximation for Conditional Stochastic Optimization

Yifan Hu, Xin Chen, Niao He

Published 2019-05-28 (Version 1)

In this paper, we study a class of stochastic optimization problems, referred to as \emph{Conditional Stochastic Optimization} (CSO), of the form $\min_{x \in \mathcal{X}} \mathbb{E}_{\xi}f_\xi\Big({\mathbb{E}_{\eta|\xi}[\mathbf{g}_\eta(x,\xi)]}\Big)$. CSO finds a wide spectrum of applications, including portfolio selection, reinforcement learning, and robust and invariant learning. We establish the sample complexity of the sample average approximation (SAA) for CSO under a variety of structural assumptions, such as Lipschitz continuity, smoothness, and error bound conditions. We show that the total sample complexity improves from $\mathcal{O}(d/\epsilon^4)$ to $\mathcal{O}(d/\epsilon^3)$ when the outer function is smooth, and further to $\mathcal{O}(1/\epsilon^2)$ when the empirical function satisfies the quadratic growth condition. We also establish the sample complexity of a modified SAA when $\xi$ and $\eta$ are independent. Numerical results from several experiments further support our theoretical findings.

Keywords: stochastic optimization, sample average approximation, large deviations theory
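For concreteness, the SAA for CSO uses nested sampling: draw $n$ outer samples $\xi_1,\dots,\xi_n$ and, for each $\xi_i$, $m$ conditional samples $\eta_{i1},\dots,\eta_{im} \sim \mathbb{P}(\eta \mid \xi_i)$, then minimize the empirical objective. The following is a sketch of this standard construction; the sample-size symbols $n$ and $m$ are our notation, not taken from the abstract:

$$\widehat{F}_{n,m}(x) \;=\; \frac{1}{n}\sum_{i=1}^{n} f_{\xi_i}\Big(\frac{1}{m}\sum_{j=1}^{m}\mathbf{g}_{\eta_{ij}}(x,\xi_i)\Big), \qquad x \in \mathcal{X}.$$

The total sample count is $nm$, which is the quantity the $\mathcal{O}(d/\epsilon^4)$ and $\mathcal{O}(d/\epsilon^3)$ bounds above refer to. When $\xi$ and $\eta$ are independent, the inner distribution no longer depends on $\xi_i$, so a single pool $\eta_1,\dots,\eta_m$ can plausibly be shared across all outer samples, replacing the inner average with $\frac{1}{m}\sum_{j=1}^{m}\mathbf{g}_{\eta_j}(x,\xi_i)$; this is the spirit of the modified SAA mentioned in the abstract.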

Related articles:
arXiv:1711.04734 [math.OC] (Published 2017-11-13)
Sample average approximation with heavier tails II: localization in stochastic convex optimization and persistence results for the Lasso
arXiv:1912.13078 [math.OC] (Published 2019-12-30)
On Sample Average Approximation for Two-stage Stochastic Programs without Relatively Complete Recourse
arXiv:2311.04161 [math.OC] (Published 2023-11-07)
Breaking the Heavy-Tailed Noise Barrier in Stochastic Optimization Problems