The asymptotics of posterior entropy and error probability for Bayesian estimation

Authors: F. Kanaya (Shonan Institute of Technology, Fujisawa, Japan); Te Sun Han

We consider the Bayesian parameter estimation problem in which the value of a finitary parameter X is to be decided on the basis of an i.i.d. sample Y^n of size n. In this context, the amount of missing information about X after observing Y^n may be evaluated by the posterior entropy of X given Y^n, often called the equivocation or the conditional entropy, while it is well known that the minimum possible probability of error in estimating X is achieved by the maximum a posteriori probability (MAP) estimator. In this work, the focus is on the asymptotic relation between the posterior entropy and the MAP error probability as the sample size n becomes sufficiently large. It is shown that if the sample size n is large enough, the posterior entropy and the MAP error probability both decay with n to zero at the same exponential rate, and that the maximum achievable exponent for this decay is given by the minimum Chernoff information over all pairs of distinct parameter values.
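The exponent described above can be sketched numerically. The following is a minimal illustration, not code from the paper: the function names, the grid-search optimization over s, and the Bernoulli example are our own choices, and we assume each parameter value induces a distribution over a finite observation alphabet.

```python
import math
from itertools import combinations

def chernoff_information(p, q, grid=1000):
    """Chernoff information C(p, q) = max over s in [0, 1] of
    -log sum_y p(y)^s * q(y)^(1 - s), found here by a simple grid
    search over s. p and q are probability vectors over the same
    finite observation alphabet."""
    best = 0.0
    for i in range(grid + 1):
        s = i / grid
        total = sum((pi ** s) * (qi ** (1 - s)) for pi, qi in zip(p, q))
        best = max(best, -math.log(total))
    return best

def min_pairwise_chernoff(dists):
    """Minimum Chernoff information over all pairs of distinct parameter
    values -- per the abstract, the common exponential decay rate of the
    posterior entropy and the MAP error probability."""
    return min(chernoff_information(p, q) for p, q in combinations(dists, 2))

# Hypothetical example: three parameter values, each inducing a
# Bernoulli observation law on the alphabet {0, 1}.
dists = [[0.9, 0.1], [0.5, 0.5], [0.2, 0.8]]
exponent = min_pairwise_chernoff(dists)
```

The grid search stands in for a proper one-dimensional optimization; since the log-sum is convex in s, any 1-D convex optimizer would do as well.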

Published in:

IEEE Transactions on Information Theory (Volume 41, Issue 6)