arXiv Analytics

arXiv:1710.02950 [stat.ML]

Maximum Regularized Likelihood Estimators: A General Prediction Theory and Applications

Rui Zhuang, Johannes Lederer

Published 2017-10-09, Version 1

Maximum regularized likelihood estimators (MRLEs) are arguably the most established class of estimators in high-dimensional statistics. In this paper, we derive guarantees for MRLEs in Kullback-Leibler divergence, a general measure of prediction accuracy. We assume only that the densities have a convex parametrization and that the regularizer is definite and positively homogeneous. The results therefore apply to a wide variety of models and estimators, including tensor regression and graphical models with convex or non-convex regularization. A main conclusion is that MRLEs are broadly consistent in prediction, regardless of whether restricted eigenvalue or similar conditions hold.
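The estimator class described in the abstract can be illustrated with its best-known instance: for a Gaussian linear model, the MRLE with an l1 regularizer is the lasso, and the Kullback-Leibler prediction error reduces to a scaled squared in-sample prediction error. The following is a minimal sketch, not code from the paper: the data, the tuning parameter, and helper names such as `lasso_mrle` are our own illustrative choices, and the optimizer is plain proximal gradient descent (ISTA).

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated sparse Gaussian linear model: y ~ N(X beta*, sigma^2 I).
n, p, sigma = 200, 50, 1.0
beta_star = np.zeros(p)
beta_star[:3] = [2.0, -1.5, 1.0]  # three active coefficients
X = rng.standard_normal((n, p))
y = X @ beta_star + sigma * rng.standard_normal(n)

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_mrle(X, y, lam, n_iter=500):
    """MRLE for the Gaussian likelihood with an l1 regularizer (the lasso):
    minimize 0.5 * ||y - X b||^2 + lam * ||b||_1 via proximal gradient
    descent (ISTA)."""
    step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1/L, L = Lipschitz const of grad
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y)
        beta = soft_threshold(beta - step * grad, step * lam)
    return beta

# A standard universal-type tuning parameter (an illustrative choice here).
lam = sigma * np.sqrt(2.0 * n * np.log(p))
beta_hat = lasso_mrle(X, y, lam)

def kl_gaussian(beta_a, beta_b):
    """Average in-sample KL divergence between N(X beta_a, sigma^2) and
    N(X beta_b, sigma^2), which equals ||X (beta_a - beta_b)||^2 / (2 n sigma^2)."""
    diff = X @ (beta_a - beta_b)
    return float(diff @ diff / (2.0 * n * sigma**2))

kl_hat = kl_gaussian(beta_star, beta_hat)      # prediction error of the MRLE
kl_zero = kl_gaussian(beta_star, np.zeros(p))  # error of the trivial zero fit
print(f"KL(beta*, beta_hat) = {kl_hat:.4f}, KL(beta*, 0) = {kl_zero:.4f}")
```

The point of the comparison at the end is the paper's main message in miniature: the KL prediction error of the fitted MRLE is far below that of the trivial estimator, without any check on restricted eigenvalues of the design.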

Related articles:
arXiv:1209.3079 [stat.ML] (Published 2012-09-14)
Signal Recovery in Unions of Subspaces with Applications to Compressive Imaging
arXiv:1406.0067 [stat.ML] (Published 2014-05-31, updated 2015-05-10)
Optimization via Low-rank Approximation for Community Detection in Networks
arXiv:1712.01934 [stat.ML] (Published 2017-12-05)
Concentration of weakly dependent Banach-valued sums and applications to kernel learning methods