arXiv:1510.08226 [math.ST]
Asymptotic expansion of the risk of maximum likelihood estimator with respect to $\alpha$-divergence as a measure of the difficulty of specifying a parametric model -- with detailed proof
Published 2015-10-28 (Version 1)
For a given parametric probability model, we consider the risk of the maximum likelihood estimator with respect to $\alpha$-divergence, which includes Kullback--Leibler divergence, the Hellinger distance, and the $\chi^2$ divergence as special cases. The asymptotic expansion of the risk is given in the sample size $n$ up to order $n^{-2}$. Each term in the expansion is expressed in terms of the geometric properties of the Riemannian manifold formed by the parametric probability model. We attempt to measure the difficulty of specifying a model through this expansion.
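For reference (not stated in the abstract), a common parameterization of the $\alpha$-divergence is Amari's convention; the paper's normalization may differ by a constant factor:
$$D_\alpha(p\,\|\,q) = \frac{4}{1-\alpha^2}\left(1 - \int p(x)^{\frac{1-\alpha}{2}}\, q(x)^{\frac{1+\alpha}{2}}\,dx\right), \qquad \alpha \neq \pm 1,$$
with the limits $\alpha \to \mp 1$ recovering the Kullback--Leibler divergences $\mathrm{KL}(p\,\|\,q)$ and $\mathrm{KL}(q\,\|\,p)$, $\alpha = 0$ giving $2\int(\sqrt{p}-\sqrt{q})^2\,dx$ (proportional to the squared Hellinger distance), and $\alpha = \pm 3$ giving the $\chi^2$ divergences up to a factor of $1/2$.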
Comments: 84 pages, 4 figures
Related articles:
arXiv:1705.10445 [math.ST] (Published 2017-05-30)
Asymptotic Properties of the Maximum Likelihood Estimator in Regime Switching Econometric Models
arXiv:1510.03679 [math.ST] (Published 2015-10-13)
Bounds for the multivariate normal approximation of the maximum likelihood estimator
arXiv:1609.05714 [math.ST] (Published 2016-09-19)
Bounds for the normal approximation of the maximum likelihood estimator from m-dependent random variables