arXiv:1705.00822 [math.OC]

Sample average approximation with heavier tails I: non-asymptotic bounds with weak assumptions and stochastic constraints

Roberto I. Oliveira, Philip Thompson

Published 2017-05-02 (Version 1)

We give statistical guarantees for the sample average approximation (SAA) of stochastic optimization problems. Specifically, we derive exponential non-asymptotic finite-sample deviation inequalities for the approximate optimal solutions and the optimal value of the SAA estimator. We make three main contributions. First, our bounds do not require \emph{sub-Gaussian} assumptions on the data, as in the previous stochastic optimization (SO) literature. Instead, we only assume H\"older continuous and \emph{heavy-tailed} data (i.e., finite second moments), a framework suited for risk-averse portfolio optimization. Second, we derive new deviation inequalities for SO problems with \emph{expected-valued stochastic constraints} that guarantee \emph{joint} approximate feasibility and optimality without requiring metric regularity of the \emph{solution set} or the use of reformulations. Thus, unlike previous works, we do not require strong growth conditions on the objective function, penalization, or necessary first-order conditions. Instead, we use metric regularity of the \emph{feasible set} as a sufficient condition, making our analysis applicable to many classes of problems. Our bounds imply \emph{exact} feasibility and approximate optimality for convex feasible sets satisfying strict feasibility, and approximate feasibility and optimality for metric regular sets that are either non-convex or convex but not strictly feasible. In our bounds, the feasible set's metric regularity constant appears as an additional condition number. For convex sets, we use localization arguments for concentration of measure, obtaining feasibility estimates in terms of smaller metric entropies. Third, using empirical process theory, we obtain a general uniform concentration inequality for heavy-tailed H\"older continuous random functions. This is the main tool in our analysis, but it is also a result of independent interest.
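For readers less familiar with the setting, the display below sketches the generic SAA scheme the abstract refers to; the notation ($F$, $G$, $\xi_i$, sample size $N$) is a standard placeholder choice, not necessarily the paper's exact formulation. The true stochastic program $\min_{x \in X} \mathbb{E}[F(x,\xi)]$ subject to the expected-value constraint $\mathbb{E}[G(x,\xi)] \le 0$ is replaced by its empirical counterpart over an i.i.d. sample $\xi_1,\dots,\xi_N$:

\[
  \min_{x \in X} \; \frac{1}{N}\sum_{i=1}^{N} F(x,\xi_i)
  \qquad \text{s.t.} \qquad \frac{1}{N}\sum_{i=1}^{N} G(x,\xi_i) \le 0 .
\]

The deviation inequalities described above bound, non-asymptotically and with exponential tails, how far the optimal value and near-optimal solutions of this empirical problem can be from their true counterparts, assuming only H\"older continuity and finite second moments of the data.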

Related articles:
arXiv:1711.04734 [math.OC] (Published 2017-11-13)
Sample average approximation with heavier tails II: localization in stochastic convex optimization and persistence results for the Lasso
arXiv:1905.11957 [math.OC] (Published 2019-05-28)
Sample Complexity of Sample Average Approximation for Conditional Stochastic Optimization
arXiv:1912.13078 [math.OC] (Published 2019-12-30)
On Sample Average Approximation for Two-stage Stochastic Programs without Relatively Complete Recourse