Probability Functionals for Self-Consistent and Invariant Inference: Entropy and Fisher Information

Author: R. S. Langley, Department of Engineering, University of Cambridge, Cambridge, UK

Two existing methods of probabilistic inference are based on variational principles: maximum entropy and minimum Fisher information. In each case, a probability density function is inferred by setting the first variation of a functional to zero, subject to information constraints. This study considers whether other functionals could serve the same purpose. Starting from requirements of self-consistency and invariance, it is shown that the most general admissible functional is a linear combination of entropy and Fisher information, with the proviso that the usual definition of Fisher information is modified by the inclusion of a prior. This amounts to an axiomatic derivation of entropy and Fisher information. The analysis concerns continuous random variables, and both the single-variable and multivariable cases are considered. A number of examples are presented to compare inference based on entropy with inference based on Fisher information, and to highlight the role of boundary conditions in the latter.
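As a sketch of the setting the abstract describes (the functional forms below are standard textbook definitions, not quoted from the paper; the weights alpha, beta, the sign convention, and the placement of the prior q are assumptions inferred from the abstract's wording):

% Relative entropy of p with respect to a prior q:
S[p] = -\int p(x)\,\ln\frac{p(x)}{q(x)}\,dx

% Fisher information modified by the inclusion of the prior q,
% as the abstract indicates (assumed form):
I[p] = \int p(x)\left[\frac{d}{dx}\ln\frac{p(x)}{q(x)}\right]^{2} dx

% The abstract's result: the most general admissible functional is a
% linear combination of the two (weights and signs are an assumed
% convention here):
F[p] = \alpha\, S[p] - \beta\, I[p], \qquad \alpha,\beta \ge 0

% Inference: set the first variation to zero subject to normalization
% and moment constraints, enforced with Lagrange multipliers:
\frac{\delta}{\delta p}\left[\, F[p] - \lambda_{0}\int p(x)\,dx
  - \sum_{k}\lambda_{k}\int f_{k}(x)\,p(x)\,dx \right] = 0

With beta = 0 this recovers the familiar maximum entropy solution p(x) proportional to q(x) exp(-sum_k lambda_k f_k(x)); with beta > 0 the stationarity condition becomes a second-order differential equation in p, which is where the boundary conditions mentioned in the abstract come into play.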

Published in: IEEE Transactions on Information Theory (Volume 59, Issue 7)