arXiv:1512.06290 [math.ST]

On the Finite Sample Properties of Regularized M-estimators

Demian Pouzo

Published 2015-12-19 (Version 1)

We propose a general framework for regularization in M-estimation problems under time-dependent (absolutely regular mixing) data which encompasses many existing estimators. We derive non-asymptotic concentration bounds for the regularized M-estimator. The concentration rate exhibits a "variance-bias" trade-off, with the "variance" term governed by a novel measure of the "size" of the parameter set. We also show that the mixing structure affects the variance term by scaling the number of observations; depending on the decay rate of the mixing coefficients, this scaling can even affect the asymptotic behavior. Finally, we propose a data-driven method for choosing the tuning parameters of the regularized estimator which yields the same (up to constants) concentration bound as one that optimally balances the "(squared) bias" and "variance" terms. We illustrate the results with several canonical examples of both non-parametric and high-dimensional models.
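For concreteness, a generic regularized M-estimation problem of the kind described above can be sketched as follows. The notation (loss m, penalty Pen, tuning parameter lambda_n, effective sample size n_eff) is illustrative and not taken from the paper; it only indicates the shape of the objective and of a "bias + variance" concentration bound.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Illustrative (generic) regularized M-estimator; the symbols m, Pen,
% \lambda_n and n_eff are placeholders, not the paper's notation.
\[
  \hat{\theta}_n \in \arg\min_{\theta \in \Theta}\;
    \frac{1}{n}\sum_{t=1}^{n} m(Z_t,\theta) \;+\; \lambda_n\,\mathrm{Pen}(\theta),
\]
% and a schematic concentration bound of the "bias--variance" type:
\[
  \Pr\Bigl( d\bigl(\hat{\theta}_n,\theta_0\bigr) \;\ge\;
    \underbrace{B(\lambda_n)}_{\text{bias}}
    \;+\; \underbrace{V(\Theta, n_{\mathrm{eff}})}_{\text{variance}} \Bigr)
  \;\le\; \delta_n,
\]
% where $n_{\mathrm{eff}}$ denotes an effective sample size obtained by
% rescaling $n$ with the mixing coefficients of the dependent data, in the
% spirit of the abstract's remark that mixing scales the number of observations.
\end{document}
```

Under this reading, a data-driven choice of the tuning parameter would aim to pick lambda_n so that the resulting bound matches, up to constants, the one obtained by balancing B(lambda_n) against the variance term.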
