Kullback proximal algorithms for maximum-likelihood estimation


Authors: S. Chretien (Univ. Libre de Bruxelles, Belgium); A. O. Hero

Accelerated algorithms for maximum-likelihood image reconstruction are essential for emerging applications such as three-dimensional (3-D) tomography, dynamic tomographic imaging, and other high-dimensional inverse problems. In this paper, we introduce a class of fast and stable sequential optimization methods for computing maximum-likelihood estimates and analyze their convergence properties. These methods are based on a proximal point algorithm implemented with the Kullback-Leibler (KL) divergence between posterior densities of the complete data as a proximal penalty function. When the proximal relaxation parameter is set to unity, one obtains the classical expectation-maximization (EM) algorithm. For a decreasing sequence of relaxation parameters, relaxed versions of EM are obtained that can have much faster asymptotic convergence without sacrificing monotonicity. We present an implementation of the algorithm using Moré's (1983) trust region update strategy. For illustration, the method is applied to a nonquadratic inverse problem with Poisson-distributed data.
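To make the iteration described in the abstract concrete, here is a minimal Python sketch of a Kullback proximal step for a Poisson linear inverse problem. This is our own illustrative rendering, not the paper's trust-region implementation: the function names (kl_proximal_step, kl_penalty), the toy problem sizes, and the relaxation schedule are all assumptions made for the example. The inner maximization is solved with a generic bounded quasi-Newton method.

# A minimal sketch (not the paper's trust-region implementation) of the
# Kullback proximal iteration for a Poisson linear inverse problem
# y_i ~ Poisson((A theta)_i). Function names and the relaxation schedule
# are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
m, n = 40, 10
A = rng.uniform(0.1, 1.0, size=(m, n))      # known nonnegative system matrix
theta_true = rng.uniform(0.5, 2.0, size=n)
y = rng.poisson(A @ theta_true)             # Poisson-distributed data

def loglik(theta):
    # Poisson log-likelihood, up to an additive constant
    mu = A @ theta
    return np.sum(y * np.log(mu) - mu)

def kl_penalty(theta_bar, theta):
    # KL divergence between the conditional (posterior) densities of the
    # complete data at theta_bar and theta: given y, the complete data are
    # multinomial with cell probabilities a_ij theta_j / (A theta)_i.
    p_bar = (A * theta_bar) / (A @ theta_bar)[:, None]
    p = (A * theta) / (A @ theta)[:, None]
    return np.sum(y[:, None] * p_bar * np.log(p_bar / p))

def kl_proximal_step(theta_bar, beta):
    # One proximal update:
    #   theta_next = argmax_theta  loglik(theta) - beta * kl_penalty(theta_bar, theta)
    # solved here with a generic bounded quasi-Newton method for illustration.
    obj = lambda t: -(loglik(t) - beta * kl_penalty(theta_bar, t))
    res = minimize(obj, theta_bar, method="L-BFGS-B",
                   bounds=[(1e-8, None)] * n)
    return res.x

theta = np.ones(n)
for k in range(20):
    beta = max(1.0 / (k + 1), 0.1)          # decreasing relaxation sequence
    theta = kl_proximal_step(theta, beta)

# Sanity check: with beta = 1 the proximal step should recover the classical
# EM (Richardson-Lucy type) multiplicative update.
theta_em = theta * (A.T @ (y / (A @ theta))) / A.sum(axis=0)
print(np.max(np.abs(kl_proximal_step(theta, 1.0) - theta_em)))

Setting beta = 1 at every iteration reproduces classical EM, since maximizing loglik(theta) - kl_penalty(theta_bar, theta) is equivalent to maximizing the EM Q-function; driving beta_k toward zero weights the likelihood more heavily relative to the proximal penalty, which is the source of the faster asymptotic convergence the paper analyzes.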

Published in:

IEEE Transactions on Information Theory (Volume 46, Issue 5)