
The Interplay Between Entropy and Variational Distance

Authors: Siu-Wai Ho (Institute for Telecommunications Research, University of South Australia, Adelaide, SA, Australia); R. W. Yeung

The relation between the Shannon entropy and the variational distance, two fundamental and frequently used quantities in information theory, is studied in this paper by means of certain bounds on the entropy difference between two probability distributions in terms of the variational distance between them and their alphabet sizes. We also show how to find the distribution achieving the minimum (or maximum) entropy among all distributions within a given variational distance from any given distribution. These results are applied to solve a number of problems of fundamental interest. For entropy estimation, we obtain an analytic formula for the confidence interval, solving a problem that has been open for more than 30 years. For approximation of probability distributions, we find the minimum entropy difference between two distributions in terms of their alphabet sizes and the variational distance between them. In particular, we show that the entropy difference between two distributions that are close in variational distance can be arbitrarily large if the alphabet sizes of the two distributions are unconstrained. For random number generation, we characterize the tradeoff between the amount of randomness required and the distortion measured by the variational distance. New tools for non-convex optimization are developed to establish the results in this paper.
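As a numerical illustration of the claim about unconstrained alphabet sizes, the following sketch (Python with NumPy; the function names, the perturbation size eps, and the alphabet sizes K are illustrative assumptions, not taken from the paper) computes the Shannon entropy and the variational distance V(p, q) = sum_x |p(x) - q(x)|, and constructs pairs of distributions whose variational distance stays at 2*eps while their entropy difference grows roughly like eps*log2(K) as the alphabet size K increases.

    import numpy as np

    def entropy(p):
        # Shannon entropy in bits; zero-probability symbols contribute nothing.
        p = np.asarray(p, dtype=float)
        nz = p[p > 0]
        return float(-np.sum(nz * np.log2(nz)))

    def variational_distance(p, q):
        # Variational distance V(p, q) = sum_x |p(x) - q(x)|.
        p = np.asarray(p, dtype=float)
        q = np.asarray(q, dtype=float)
        return float(np.sum(np.abs(p - q)))

    eps = 0.01                        # perturbation size (assumed for illustration)
    for K in (10, 10**3, 10**6):      # alphabet sizes (assumed for illustration)
        p = np.zeros(K)
        p[0] = 1.0                    # point mass: H(p) = 0
        q = np.full(K, eps / (K - 1)) # spread eps uniformly over the other K-1 symbols
        q[0] = 1.0 - eps
        # V(p, q) stays at 2*eps, while H(q) - H(p) grows with log2(K).
        print(K, variational_distance(p, q), entropy(q) - entropy(p))

Running this prints a constant variational distance of 0.02 for every K, while the entropy difference keeps increasing with the alphabet size, consistent with the statement that the difference is unbounded when the alphabet sizes are unconstrained.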

Published in: IEEE Transactions on Information Theory (Volume 56, Issue 12)