
The role of likelihood and entropy in incomplete-data problems: Applications to estimating point-process intensities and Toeplitz constrained covariances


Authors: M. I. Miller and Donald L. Snyder (Washington University, St. Louis, MO, USA)

The principle of maximum entropy has played an important role in the solution of problems in which the measurements correspond to moment constraints on some many-to-one mapping h(x). In this paper we explore its role in estimation problems in which the measured data are statistical observations and moment constraints on the observation function h(x) do not exist. We conclude that: 1) For the class of likelihood problems arising in a complete-incomplete data context, in which the complete data x are nonuniquely determined by the measured incomplete data y via the many-to-one mapping y = h(x), the density maximizing entropy is identical to the conditional density of the complete data given the incomplete data. This equivalence results from viewing the measurements as specifying the domain over which the density is defined, rather than as a moment constraint on h(x). 2) The identity between the maximum-entropy density and the conditional density implies that maximum-likelihood estimates may be obtained via a joint maximization (minimization) of the entropy function (Kullback-Leibler divergence). This provides the basis for the iterative algorithm of Dempster, Laird, and Rubin [1] for the maximization of likelihood functions. 3) This iterative method is used for maximum-likelihood estimation of image parameters in emission tomography and gamma-ray astronomy. We demonstrate that unconstrained likelihood estimation of image intensities from finite data sets yields unstable estimates. We show how Grenander's method of sieves can be used with the iterative algorithm to remove the instability. A bandwidth sieve is introduced, resulting in an estimator that is smoothed via exponential splines.
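The emission-tomography application in 3) uses the multiplicative EM update for Poisson intensities. A minimal NumPy sketch of that iteration is below; the detection-probability matrix `P` and the detector counts are hypothetical placeholders, and the sieve smoothing discussed in the abstract is omitted, so this is an illustration of the unconstrained iteration only, not the authors' full estimator.

```python
import numpy as np

def em_intensity(counts, P, n_iter=50):
    """EM iterations for Poisson intensity estimation (emission tomography).

    counts : observed photon counts per detector, shape (D,)
    P      : detection probabilities P[d, b] = Pr(an emission in box b
             is recorded in detector d), shape (D, B)  [hypothetical]
    Returns the intensity estimate lam, shape (B,).
    """
    D, B = P.shape
    lam = np.ones(B)                       # flat initial intensity
    sens = P.sum(axis=0)                   # sensitivity of each box
    for _ in range(n_iter):
        pred = P @ lam                     # expected counts per detector
        ratio = np.where(pred > 0, counts / pred, 0.0)
        lam = lam * (P.T @ ratio) / sens   # multiplicative EM update
    return lam
```

Each iteration increases the Poisson likelihood and keeps the intensities nonnegative; the instability the abstract mentions appears as high-frequency noise amplification when the iteration is run to convergence on finite data.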
4) We also derive a recursive algorithm for the generation of Toeplitz constrained maximum-likelihood estimators which, at each iteration, evaluates conditional mean estimates of the lag products based on the previous estimate of the covariance, from which the updated Toeplitz covariance is generated. We prove that the sequence of Toeplitz estimators increases in likelihood, remains in the set of positive-definite Toeplitz covariances, and has all of its limit points stable and satisfying the necessary conditions for maximizing the likelihood.
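The Toeplitz iteration in 4) alternates a conditional-mean (E) step on the lag products with a Toeplitz-generating (M) step. The sketch below illustrates the idea under a simplifying assumption that is mine, not the paper's: the observations are the Gaussian process plus white noise of known variance, so the conditional lag products have a closed Wiener-filter form, and the Toeplitz step averages along diagonals.

```python
import numpy as np

def toeplitz_em(Y, sigma2, n_iter=100):
    """EM-style iteration for a Toeplitz constrained covariance (sketch).

    Assumed model (illustrative only): each column of Y is y = x + w,
    with x ~ N(0, R), R Toeplitz, and w ~ N(0, sigma2 * I) white noise
    of known variance sigma2.

    E-step: conditional mean of the lag products, E[x x^T | y], under
    the current R.  M-step: average along the diagonals of that matrix
    to generate the updated Toeplitz covariance.
    """
    p, n = Y.shape
    idx = np.abs(np.arange(p)[:, None] - np.arange(p)[None, :])
    R = np.eye(p)                          # positive-definite start
    S = (Y @ Y.T) / n                      # sample covariance of y
    for _ in range(n_iter):
        C = R + sigma2 * np.eye(p)
        G = R @ np.linalg.inv(C)           # Wiener gain
        # E[x x^T | y], averaged over the n observations
        Q = G @ S @ G.T + R - G @ R
        # Toeplitz step: average each diagonal of Q
        r = np.array([np.diagonal(Q, k).mean() for k in range(p)])
        R = r[idx]                         # symmetric Toeplitz from lags r
    return R
```

By construction each iterate is a symmetric Toeplitz matrix built from the conditionally estimated lags, mirroring the monotone-likelihood, positive-definite sequence the abstract describes for the authors' algorithm.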

Published in:

Proceedings of the IEEE (Volume 75, Issue 7)

Date of Publication:

July 1987
