arXiv Analytics

arXiv:1812.03548 [math.PR]

Uniform Hanson-Wright type concentration inequalities for unbounded entries via the entropy method

Yegor Klochkov, Nikita Zhivotovskiy

Published 2018-12-09 (Version 1)

This paper is devoted to uniform versions of the Hanson-Wright inequality for a random vector $X \in \mathbb{R}^n$ with independent subgaussian components. The core technique of the paper is based on the entropy method combined with truncations of both the gradients of the functions of interest and the coordinates of $X$ itself. Our results recover, in particular, the classic uniform bound of Talagrand (1996) for Rademacher chaoses and the more recent uniform result of Adamczak (2015), which holds under rather strong assumptions on the distribution of $X$. We provide several applications of our techniques: we establish a version of the standard Hanson-Wright inequality that is tighter in some regimes, and, extending our techniques, we show a version of the dimension-free matrix Bernstein inequality that holds for random matrices with a subexponential spectral norm. We apply the derived inequality to the problem of covariance estimation with missing observations and prove an almost optimal high-probability version of the recent result of Lounici (2014). Finally, we show a uniform Hanson-Wright type inequality in the Ising model under Dobrushin's condition. A closely related question was posed by Marton (2003).
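For reference, a standard formulation of the classical (non-uniform) Hanson-Wright inequality, which the uniform bounds above extend, can be stated as follows, with $A$ a fixed $n \times n$ matrix, $X \in \mathbb{R}^n$ having independent mean-zero components satisfying $\max_i \|X_i\|_{\psi_2} \le K$, and $c > 0$ an absolute constant (the exact constants and normalization vary between references):

$$
\mathbb{P}\left( \left| X^\top A X - \mathbb{E}\, X^\top A X \right| > t \right) \;\le\; 2 \exp\left( -c \min\left\{ \frac{t^2}{K^4 \|A\|_F^2},\; \frac{t}{K^2 \|A\|} \right\} \right), \qquad t \ge 0,
$$

where $\|A\|_F$ denotes the Frobenius norm and $\|A\|$ the operator norm of $A$.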
