On the maximum entropy of the sum of two dependent random variables

Authors: T. M. Cover (Dept. of Electrical Engineering, Stanford University, CA, USA) and Zhen Zhang

We investigate the maximization of the differential entropy h(X+Y) of arbitrary dependent random variables X and Y under the constraint that X and Y have the same fixed marginal density f. We show that max[h(X+Y)] = h(2X) if and only if f is log-concave, and that the maximum is achieved when X = Y. If f is not log-concave, the maximum is strictly greater than h(2X). For example, identically distributed Gaussian random variables have log-concave densities and therefore satisfy max[h(X+Y)] = h(2X), with the maximum attained at X = Y. More general inequalities in this direction should lead to capacity bounds for additive noise channels with feedback.
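As a quick illustration of the Gaussian example mentioned in the abstract (a sketch assuming X ~ N(0, sigma^2); this worked computation is not part of the original abstract), the value of the maximum can be verified directly:

% Differential entropy of a Gaussian: h(X) = (1/2) log(2*pi*e*sigma^2).
% Taking Y = X (perfectly dependent, identical marginals), the sum is 2X.
\begin{align*}
  X &\sim \mathcal{N}(0, \sigma^2), & h(X) &= \tfrac{1}{2}\log\!\left(2\pi e \sigma^2\right), \\
  X + Y = 2X &\sim \mathcal{N}(0, 4\sigma^2), & h(X+Y) &= \tfrac{1}{2}\log\!\left(2\pi e \cdot 4\sigma^2\right) = h(X) + \log 2 = h(2X).
\end{align*}

Since the Gaussian density is log-concave, the theorem says no other coupling of X and Y with the same marginal can exceed this value.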

Published in: IEEE Transactions on Information Theory (Volume 40, Issue 4)