arXiv:1104.4997 [math.PR]

Concentration and Moment Inequalities for Polynomials of Independent Random Variables

Warren Schudy, Maxim Sviridenko

Published 2011-04-26, updated 2012-06-08 (version 3)

In this work we design a general method for proving moment inequalities for polynomials of independent random variables. Our method works for a wide range of random variables, including Gaussian, Boolean, exponential, Poisson, and many others. We apply it to derive general concentration inequalities for polynomials of independent random variables and show that it yields concentration inequalities for some previously open problems, e.g. the permanent of a random symmetric matrix. We also show that our concentration inequality is stronger than the well-known concentration inequality due to Kim and Vu. The main advantages of our method over existing ones are the wide range of random variables it can handle and the bounds it gives in previously intractable regimes: high-degree polynomials and small expectations. On the negative side, we show that even for Boolean random variables each term in our concentration inequality is tight.
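As a rough illustration of the kind of concentration the abstract refers to, and not the paper's method or its actual bound, the sketch below empirically checks how the permanent of a small random symmetric ±1 matrix fluctuates around its sample mean. The matrix size, number of trials, and the Ryser-formula helper are assumptions chosen purely for the illustration.

```python
import numpy as np

def permanent(A):
    """Exact permanent via Ryser's inclusion-exclusion formula, O(2^n * n^2)."""
    n = A.shape[0]
    total = 0.0
    for mask in range(1, 1 << n):                 # nonempty column subsets S
        cols = [j for j in range(n) if mask >> j & 1]
        row_sums = A[:, cols].sum(axis=1)         # sum_{j in S} a_{ij} for each row i
        total += (-1) ** (n - len(cols)) * np.prod(row_sums)
    return total

rng = np.random.default_rng(0)
n, trials = 8, 2000                               # small n keeps Ryser's formula fast
vals = np.empty(trials)
for t in range(trials):
    U = rng.choice([-1.0, 1.0], size=(n, n))
    A = np.triu(U) + np.triu(U, 1).T              # symmetric matrix of independent +/-1 entries
    vals[t] = permanent(A)

# The permanent is a degree-n polynomial in the independent upper-triangular entries;
# report how the samples spread around their empirical mean.
mean, std = vals.mean(), vals.std()
print(f"empirical mean {mean:.1f}, std {std:.1f}")
for k in (1, 2, 3):
    frac = np.mean(np.abs(vals - mean) > k * std)
    print(f"fraction beyond {k} std: {frac:.3f}")
```

The exact exponential-time permanent keeps the example self-contained for small n; it only illustrates the empirical fluctuation statistics, not any specific inequality from the paper.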
