arXiv:1507.02803 [math.PR]
Logarithmic Sobolev inequalities in discrete product spaces: a proof by a transportation cost distance
Published 2015-07-10, Version 1
The aim of this paper is to prove an inequality between relative entropy and the sum of average conditional relative entropies of the following form: for a fixed probability measure $q^n$ on $\mathcal X^n$ ($\mathcal X$ a finite set) and any probability measure $p^n=\mathcal L(Y^n)$ on $\mathcal X^n$, we have \begin{equation}\tag{*} D(p^n\|q^n)\leq C \sum_{i=1}^n \Bbb E_{p^n}\, D\bigl(p_i(\cdot|Y_1,\dots, Y_{i-1},Y_{i+1},\dots, Y_n) \,\big\|\, q_i(\cdot|Y_1,\dots, Y_{i-1},Y_{i+1},\dots, Y_n)\bigr), \end{equation} where $p_i(\cdot|y_1,\dots, y_{i-1},y_{i+1},\dots, y_n)$ and $q_i(\cdot|x_1,\dots, x_{i-1},x_{i+1},\dots, x_n)$ denote the local specifications of $p^n$ and $q^n$, respectively. The constant $C$ depends on the properties of the local specifications of $q^n$.

Inequality (*) is meaningful in product spaces, in both the discrete and the continuous case, and can be used to prove a logarithmic Sobolev inequality for $q^n$, provided uniform logarithmic Sobolev inequalities are available for the conditional measures $q_i(\cdot|x_1,\dots, x_{i-1},x_{i+1},\dots, x_n)$, for every $i$ and every fixed $(x_1,\dots, x_{i-1},x_{i+1},\dots, x_n)$. Inequality (*) directly implies that the Gibbs sampler associated with $q^n$ is a contraction for relative entropy.

We derive inequality (*), and thereby a logarithmic Sobolev inequality, in discrete product spaces by proving inequalities for an appropriate Wasserstein-like distance. A logarithmic Sobolev inequality is, roughly speaking, a contractivity property of relative entropy with respect to some Markov semigroup. It is much easier to prove contractivity for a distance between measures than for relative entropy, since distances satisfy the triangle inequality, and well-known linear tools, such as estimates through matrix norms, can be applied to them.
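To make the two sides of (*) concrete, here is a minimal numerical sketch (not from the paper; the helper name rel_entropy and the toy instance $n=2$, $\mathcal X=\{0,1\}$ are illustrative assumptions). It draws random $p^n$ and $q^n$ on $\{0,1\}^2$ and prints $D(p^n\|q^n)$ alongside the sum of average conditional relative entropies:

```python
import numpy as np

def rel_entropy(p, q):
    """D(p||q) = sum_x p(x) log(p(x)/q(x)), with the convention 0*log(0) = 0."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Toy instance of (*): n = 2 coordinates, X = {0, 1}; joint laws as 2x2 arrays.
rng = np.random.default_rng(0)
p = rng.random((2, 2)); p /= p.sum()   # p^n, the law of (Y_1, Y_2)
q = rng.random((2, 2)); q /= q.sum()   # q^n, the fixed reference measure

lhs = rel_entropy(p.ravel(), q.ravel())   # D(p^n || q^n)

# Right-hand side of (*) with C = 1: for each coordinate i and each value of
# the remaining coordinate, compare the conditional laws of coordinate i under
# p and q, weighted by the p-probability of the conditioning event.
rhs = 0.0
for i in (0, 1):
    for y_rest in (0, 1):
        p_cond = p[:, y_rest] if i == 0 else p[y_rest, :]
        q_cond = q[:, y_rest] if i == 0 else q[y_rest, :]
        weight = p_cond.sum()             # P_{p^n}(Y_{-i} = y_rest)
        rhs += weight * rel_entropy(p_cond / p_cond.sum(),
                                    q_cond / q_cond.sum())

print(f"D(p^n||q^n) = {lhs:.4f}, sum of conditional terms = {rhs:.4f}")
```

When $q^n$ is a product measure, the right-hand side dominates the left-hand side with constant $C=1$ (a standard tensorization of relative entropy); the content of the paper is to control $C$ when the coordinates of $q^n$ are dependent, in terms of its local specifications.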