Information bounds of the Fano-Kullback type


1 Author(s)

A large class of lower bounds on the performance of hypothesis testers, channel codes, and source compression codes is developed. These are extensions of Fano's inequality on the one hand, and of the discrimination inequality of Kullback on the other. The hypothesis testing and channel coding bounds are of interest primarily at small blocklengths and, in general, are asymptotically inferior to the well-known exponentially decreasing bounds. The source compression results include new proofs of converse coding theorems. A lower bound is given on the probability that a source produces an output block which cannot be encoded within a desired maximum distortion.
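As a concrete illustration of the Fano-type bounds the abstract refers to, the sketch below numerically inverts Fano's inequality, H(X|Y) ≤ h(Pe) + Pe·log₂(M−1), to obtain a lower bound on the error probability Pe of any M-ary hypothesis tester given the conditional entropy. This is a generic illustration of Fano's inequality only, not of the specific extensions developed in the paper; the function names are hypothetical.

```python
import math

def binary_entropy(p):
    """Binary entropy h(p) in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def fano_lower_bound(h_cond, m, tol=1e-9):
    """Smallest Pe in [0, 1 - 1/m] satisfying Fano's inequality
    h(Pe) + Pe * log2(m - 1) >= h_cond, found by bisection.
    Valid because g(p) = h(p) + p*log2(m-1) is increasing on [0, 1 - 1/m]."""
    def g(p):
        return binary_entropy(p) + p * math.log2(m - 1)
    lo, hi = 0.0, 1.0 - 1.0 / m
    if g(hi) < h_cond:
        return hi  # inequality cannot be met: bound saturates at uniform guessing
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if g(mid) < h_cond:
            lo = mid
        else:
            hi = mid
    return hi

# Example: 8 hypotheses with 2 bits of residual uncertainty H(X|Y)
# force an error probability somewhere between 0.3 and 0.4.
pe = fano_lower_bound(2.0, 8)
```

Any decision rule whose posterior uncertainty is 2 bits over 8 hypotheses must therefore err roughly a third of the time, which is the kind of small-blocklength converse statement the paper generalizes.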

Published in:

IEEE Transactions on Information Theory (Volume: 22, Issue: 4)