arXiv Analytics

arXiv:2407.17949 [stat.ML]

Fast convergence of the Expectation Maximization algorithm under a logarithmic Sobolev inequality

Rocco Caprio, Adam M Johansen

Published 2024-07-25 (Version 1)

By utilizing recently developed tools for constructing gradient flows on Wasserstein spaces, we extend an analysis technique commonly employed to understand alternating minimization algorithms on Euclidean space to the Expectation Maximization (EM) algorithm, via its representation, due to Neal and Hinton (1998), as coordinate-wise minimization on the product of a Euclidean space and a space of probability distributions. In so doing, we obtain finite-sample error bounds and exponential convergence of the EM algorithm under a natural generalisation of a log-Sobolev inequality. We further demonstrate that the analysis technique is sufficiently flexible also to accommodate several variants of the EM algorithm.
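The coordinate-wise minimization view referenced above can be made concrete with a small sketch: in the Neal–Hinton free-energy formulation, the E-step is the exact minimizer of the free energy over the distribution q for fixed parameters, and the M-step minimizes over the parameters for fixed q. The example below is not from the paper; it is a minimal illustration for a two-component one-dimensional Gaussian mixture with unit variances and equal weights, where only the component means are estimated.

```python
# Minimal sketch (an assumption, not the paper's setup): EM for a 1-D
# two-component Gaussian mixture with unit variances and equal weights,
# written so the two coordinate-wise minimization steps are explicit.
import numpy as np

rng = np.random.default_rng(0)
# Synthetic data drawn from components with means -2 and 2.
x = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(2.0, 1.0, 200)])

mu = np.array([-0.5, 0.5])  # initial parameter estimate (theta)
for _ in range(50):
    # E-step: responsibilities q(z=k | x_i), the unique minimizer of the
    # free energy over q with theta held fixed.
    logp = -0.5 * (x[:, None] - mu[None, :]) ** 2   # log N(x | mu_k, 1) + const
    logp -= logp.max(axis=1, keepdims=True)         # stabilize before exp
    q = np.exp(logp)
    q /= q.sum(axis=1, keepdims=True)
    # M-step: minimize the free energy over theta with q held fixed,
    # which here gives responsibility-weighted means.
    mu = (q * x[:, None]).sum(axis=0) / q.sum(axis=0)

print(sorted(mu.tolist()))
```

Each iteration decreases the free energy, which is the monotonicity property that the alternating-minimization analysis exploits.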

Related articles:
arXiv:2406.13036 [stat.ML] (Published 2024-06-18)
Sharp detection of low-dimensional structure in probability measures via dimensional logarithmic Sobolev inequalities
arXiv:2502.06007 [stat.ML] (Published 2025-02-09)
Transformers versus the EM Algorithm in Multi-class Clustering
arXiv:1401.6276 [stat.ML] (Published 2014-01-24)
The EM algorithm and the Laplace Approximation