arXiv:1601.02177 [math.ST]

Optimal-order bounds on the rate of convergence to normality for maximum likelihood estimators

Iosif Pinelis

Published 2016-01-10 (Version 1)

It is well known that under general regularity conditions the distribution of the maximum likelihood estimator (MLE) is asymptotically normal. Very recently, bounds of the optimal order $O(1/\sqrt n)$ on the closeness of the distribution of the MLE to normality in the so-called bounded Wasserstein distance were obtained, where $n$ is the sample size. However, the corresponding bounds on the Kolmogorov distance were only of the order $O(1/n^{1/4})$. In this note, bounds of the optimal order $O(1/\sqrt n)$ on the closeness of the distribution of the MLE to normality in the Kolmogorov distance are given, as well as their nonuniform counterparts, which work better for large deviations of the MLE. These results are based on previously obtained general optimal-order bounds on the rate of convergence to normality in the multivariate delta method. The crucial observation is that, under natural conditions, the MLE can be bracketed tightly enough between two sufficiently smooth functions of a sum of independent random vectors, which makes the delta method applicable.
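As an illustration of the kind of result meant here (a sketch in illustrative notation, not the paper's own statement), a uniform Kolmogorov-distance bound of the optimal order for a one-dimensional parameter $\theta$ with Fisher information $I(\theta)$ would read
$$\sup_{x\in\mathbb R}\Bigl|\mathbb P\bigl(\sqrt{n\,I(\theta)}\,(\hat\theta_n-\theta)\le x\bigr)-\Phi(x)\Bigr|\le\frac{C}{\sqrt n},$$
where $\hat\theta_n$ is the MLE based on $n$ i.i.d. observations, $\Phi$ is the standard normal distribution function, and $C$ depends on the model but not on $n$. A nonuniform counterpart of this illustrative form would replace the right-hand side by a bound decaying in $|x|$, such as $C/\bigl(\sqrt n\,(1+|x|)^3\bigr)$, which is why such bounds work better for large deviations of the MLE.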

Related articles:
arXiv:1506.01831 [math.ST] (Published 2015-06-05)
Handy sufficient conditions for the convergence of the maximum likelihood estimator in observation-driven models
arXiv:0711.3933 [math.ST] (Published 2007-11-26, updated 2009-11-20)
Sparsistency and rates of convergence in large covariance matrix estimation
arXiv:1311.2038 [math.ST] (Published 2013-11-08, updated 2014-07-18)
The Rate of Convergence for Approximate Bayesian Computation