arXiv:1004.3692 [math.PR]

Compound Poisson Approximation via Information Functionals

A. D. Barbour, Oliver Johnson, Ioannis Kontoyiannis, Mokshay Madiman

Published 2010-04-21 (Version 1)

An information-theoretic development is given for the problem of compound Poisson approximation, which parallels earlier treatments for Gaussian and Poisson approximation. Let $P_{S_n}$ be the distribution of a sum $S_n=\sum_{i=1}^n Y_i$ of independent integer-valued random variables $Y_i$. Nonasymptotic bounds are derived for the distance between $P_{S_n}$ and an appropriately chosen compound Poisson law. In the case where all $Y_i$ have the same conditional distribution given $\{Y_i\neq 0\}$, a bound on the relative entropy distance between $P_{S_n}$ and the compound Poisson distribution is derived, based on the data-processing property of relative entropy and earlier Poisson approximation results. When the $Y_i$ have arbitrary distributions, corresponding bounds are derived in terms of the total variation distance. The main technical ingredient is the introduction of two "information functionals," and the analysis of their properties. These information functionals play a role analogous to that of the classical Fisher information in normal approximation. Detailed comparisons are made between the resulting inequalities and related bounds.
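As a rough numerical illustration of the quantities named in the abstract (not of the paper's information-functional method), the sketch below compares the exact law of $S_n$ with the compound Poisson law $\mathrm{CP}(\lambda,Q)$, where $\lambda=\sum_i p_i$ with $p_i=P(Y_i\neq 0)$ and $Q$ is the common conditional law of $Y_i$ given $\{Y_i\neq 0\}$, reporting their total variation distance. All function names and the specific example distribution are illustrative assumptions.

import numpy as np
from math import exp

def sum_pmf(y_pmfs):
    # Exact pmf of S_n = sum of independent nonnegative-integer-valued Y_i,
    # each given as a pmf array on {0, 1, 2, ...}.
    out = np.array([1.0])
    for pmf in y_pmfs:
        out = np.convolve(out, pmf)
    return out

def compound_poisson_pmf(lam, q, size, terms=60):
    # Pmf of CP(lam, Q) = sum_{k>=0} e^{-lam} lam^k / k! * Q^{*k},
    # truncated to `terms` Poisson terms (enough when lam is small)
    # and to the first `size` support points.
    pmf = np.zeros(size)
    qk = np.array([1.0])          # Q^{*0}: point mass at 0
    w = exp(-lam)                 # Poisson weight P(N = 0)
    for k in range(terms):
        m = min(len(qk), size)
        pmf[:m] += w * qk[:m]
        qk = np.convolve(qk, q)   # Q^{*(k+1)}
        w *= lam / (k + 1)        # P(N = k + 1)
    return pmf

def tv_distance(p, q):
    n = max(len(p), len(q))
    p = np.pad(p, (0, n - len(p)))
    q = np.pad(q, (0, n - len(q)))
    return 0.5 * np.abs(p - q).sum()

# Illustrative example: i.i.d. Y_i with P(Y_i != 0) = p and
# conditional law Q putting mass 0.7 on 1 and 0.3 on 2.
n, p = 50, 0.05
q = np.array([0.0, 0.7, 0.3])
y_pmf = np.array([1 - p, p * 0.7, p * 0.3])
exact = sum_pmf([y_pmf] * n)
cp = compound_poisson_pmf(n * p, q, len(exact) + 20)
print("total variation distance:", tv_distance(exact, cp))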

Comments: 27 pages
Journal: Electronic Journal of Probability, Vol 15, Paper no. 42, pages 1344-1369, 2010
Categories: math.PR, cs.IT, math.IT
Related articles:
arXiv:1312.5276 [math.PR] (Published 2013-12-18, updated 2014-04-18)
Integration by parts and representation of information functionals
arXiv:1110.5381 [math.PR] (Published 2011-10-24, updated 2012-07-17)
Compound Poisson approximation for triangular arrays with application to threshold estimation
arXiv:1710.06341 [math.PR] (Published 2017-10-17)
Compound Poisson approximation of subgraph counts in stochastic block models with multiple edges