arXiv:1301.6730 [cs.LG]

Accelerating EM: An Empirical Study

Luis E. Ortiz, Leslie Pack Kaelbling

Published 2013-01-23Version 1

Many applications require learning the parameters of a model from data. EM is a method for learning the parameters of probabilistic models in which the data for some of the variables are either missing or hidden. In some instances the method is slow to converge, and several accelerations have been proposed to improve it. None of the proposed acceleration methods is theoretically dominant, and experimental comparisons are lacking. In this paper, we present the proposed accelerations and compare them experimentally. From the results of the experiments, we argue that some acceleration of EM is always possible, but which acceleration is superior depends on properties of the problem.
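As a point of reference for the algorithm being accelerated, here is a minimal sketch of plain (unaccelerated) EM for a two-component 1-D Gaussian mixture with unit variances. This is an illustrative example only, not code from the paper; the function name `em_gmm_1d` and the fixed-variance simplification are assumptions made for brevity.

```python
import math
import random

def em_gmm_1d(data, iters=50):
    # EM for a two-component 1-D Gaussian mixture with unit variances.
    # The component memberships are the hidden variables; the mixing
    # weights pi and means mu are the parameters being learned.
    mu = [min(data), max(data)]  # crude but sufficient initialization
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point
        resp = []
        for x in data:
            w = [pi[k] * math.exp(-0.5 * (x - mu[k]) ** 2) for k in range(2)]
            s = sum(w)
            resp.append([wk / s for wk in w])
        # M-step: re-estimate mixing weights and means from responsibilities
        for k in range(2):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
    return pi, mu
```

Each E-step/M-step pair monotonically increases the data likelihood; the accelerations surveyed in the paper aim to reduce the number of such iterations needed near convergence.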

Comments: Appears in Proceedings of the Fifteenth Conference on Uncertainty in Artificial Intelligence (UAI 1999)
Categories: cs.LG, stat.ML