arXiv Analytics


arXiv:1201.5907 [stat.CO]

Kullback Proximal Algorithms for Maximum Likelihood Estimation

Stéphane Chrétien, Alfred O. Hero

Published 2012-01-27, Version 1

Accelerated algorithms for maximum likelihood image reconstruction are essential for emerging applications such as 3D tomography, dynamic tomographic imaging, and other high-dimensional inverse problems. In this paper, we introduce and analyze a class of fast and stable sequential optimization methods for computing maximum likelihood estimates and study their convergence properties. These methods are based on a {\it proximal point algorithm} implemented with the Kullback-Leibler (KL) divergence between posterior densities of the complete data as the proximal penalty function. When the proximal relaxation parameter is set to unity, one obtains the classical expectation maximization (EM) algorithm. For a decreasing sequence of relaxation parameters, relaxed versions of EM are obtained which can have much faster asymptotic convergence without sacrificing monotonicity. We present an implementation of the algorithm using Mor\'{e}'s {\it Trust Region} update strategy. For illustration, the method is applied to a non-quadratic inverse problem with Poisson distributed data.
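To make the abstract's claim concrete, the iteration it describes is $\theta^{k+1} = \arg\max_\theta \{\ell_y(\theta) - \beta_k \, I(\theta^k, \theta)\}$, where $I$ is the KL divergence between complete-data posterior densities, and $\beta_k = 1$ recovers classical EM. The sketch below implements only that $\beta_k = 1$ (EM) special case for a toy Poisson linear inverse problem $y \sim \mathrm{Poisson}(A\lambda)$, illustrating the monotonicity property the abstract refers to; the matrix, problem sizes, and iteration count are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Toy Poisson linear inverse problem: y ~ Poisson(A @ lam).
# This is the beta_k = 1 (classical EM) special case of the
# Kullback-proximal iteration; all problem data here are
# illustrative assumptions, not from the paper.
rng = np.random.default_rng(0)
A = rng.uniform(0.1, 1.0, size=(30, 10))   # known system matrix
lam_true = rng.uniform(1.0, 5.0, size=10)  # true intensities
y = rng.poisson(A @ lam_true)              # observed counts

def loglik(lam):
    """Poisson log-likelihood, dropping the constant log(y!) term."""
    mu = A @ lam
    return float(np.sum(y * np.log(mu) - mu))

def em_step(lam):
    """One EM (multiplicative) update for the Poisson model."""
    s = A.sum(axis=0)                      # column sums of A
    return lam * (A.T @ (y / (A @ lam))) / s

lam = np.ones(10)
lls = [loglik(lam)]
for _ in range(50):
    lam = em_step(lam)
    lls.append(loglik(lam))

# EM is monotone: the log-likelihood never decreases.
assert all(b >= a - 1e-9 for a, b in zip(lls, lls[1:]))
```

The relaxed variants studied in the paper replace the unit penalty weight with a decreasing sequence $\beta_k$, which weakens the KL proximal penalty over iterations and can accelerate the asymptotic convergence of this same update scheme.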

Comments: 6 figures
Journal: IEEE Transactions on Information Theory, vol. 46, no. 5, pp. 1800--1810, 2000
Categories: stat.CO
Related articles:
arXiv:2501.11604 [stat.CO] (Published 2025-01-20)
A revisit to maximum likelihood estimation of Weibull model parameters
arXiv:1907.10397 [stat.CO] (Published 2019-07-24)
Some computational aspects of maximum likelihood estimation of the skew-$t$ distribution
arXiv:1402.4281 [stat.CO] (Published 2014-02-18)
Bayesian and Maximum Likelihood Estimation for Gaussian Processes on an Incomplete Lattice