arXiv:1711.04734 [math.OC]

Sample average approximation with heavier tails II: localization in stochastic convex optimization and persistence results for the Lasso

Roberto I. Oliveira, Philip Thompson

Published 2017-11-13 (Version 1)

We present exponential, finite-sample, nonasymptotic deviation inequalities for the SAA estimator's near-optimal solution set over the class of \emph{convex} stochastic optimization problems with heavy-tailed random H\"older continuous functions in the objective and constraints. Such a setting is better suited to problems where a sub-Gaussian data-generating distribution is less plausible, e.g., in stochastic portfolio optimization. One of our contributions is to exploit \emph{convexity} of the perturbed objective and the perturbed constraints as a property that entails \emph{localized} deviation inequalities for joint feasibility and optimality guarantees. As a consequence, our bounds are significantly tighter in terms of diameter and metric entropy, since they depend only on the near-optimal solution set rather than on the whole feasible set, and we obtain a much sharper sample complexity estimate than for a general nonconvex problem. In our analysis, we derive localized deterministic perturbation error bounds for convex optimization problems which are of independent interest. To obtain our results, we assume only a metrically regular convex feasible set, possibly not satisfying the Slater condition and possibly without a metrically regular solution set. In this general setting, joint near feasibility and near optimality are guaranteed. If, in addition, the set satisfies the Slater condition, we obtain finite-sample guarantees of simultaneous exact feasibility and near optimality. Another contribution of our work is to present, as a proof of concept of our localized techniques, a persistence result for a variant of the LASSO estimator under very weak assumptions on the data-generating distribution.
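For orientation, recall the shape of an SAA estimator: the true problem $\min_{x \in X} \mathbb{E}[F(x,\xi)]$ is replaced by the minimization of the empirical average $\frac{1}{n}\sum_{i=1}^{n} F(x,\xi_i)$ over an i.i.d. sample. The sketch below solves such an empirical problem for a toy convex portfolio objective with heavy-tailed (Student-t) returns; the loss, sample size, and solver here are illustrative assumptions, not the paper's construction.

```python
# A minimal SAA sketch (illustrative only, not the paper's method):
# replace min_{x in simplex} E[F(x, xi)] by its empirical average over
# n i.i.d. heavy-tailed samples, then solve the resulting convex problem.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
d, n = 5, 2000                            # assets, sample size (assumed)
xi = rng.standard_t(df=3, size=(n, d))    # Student-t returns: heavy tails

def saa_objective(x):
    # empirical mean of the convex loss F(x, xi) = -<xi, x> + 0.5*<xi, x>^2
    r = xi @ x
    return np.mean(-r + 0.5 * r ** 2)

# convex feasible set: the probability simplex {x >= 0, sum(x) = 1}
constraints = ({"type": "eq", "fun": lambda x: np.sum(x) - 1.0},)
bounds = [(0.0, 1.0)] * d
x0 = np.full(d, 1.0 / d)

res = minimize(saa_objective, x0, bounds=bounds, constraints=constraints)
print("SAA solution:", np.round(res.x, 3))
```

The paper's results quantify, with exponential tails, how close the minimizers of such an empirical problem come to the true near-optimal solution set as $n$ grows, even when the samples $\xi_i$ lack sub-Gaussian moments.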

Related articles:
arXiv:2401.00664 [math.OC] (Published 2024-01-01)
New Sample Complexity Bounds for (Regularized) Sample Average Approximation in Several Heavy-Tailed, Non-Lipschitzian, and High-Dimensional Cases
arXiv:1705.00822 [math.OC] (Published 2017-05-02)
Sample average approximation with heavier tails I: non-asymptotic bounds with weak assumptions and stochastic constraints
arXiv:2402.03210 [math.OC] (Published 2024-02-05, updated 2024-07-11)
Universal Gradient Methods for Stochastic Convex Optimization