Chain Independence and Common Information

Authors: Makarychev, K. (Microsoft Res., Redmond, WA, USA); Makarychev, Y.

We present a new proof of a celebrated result of Gács and Körner that the common information is far less than the mutual information. Consider two sequences α1, ..., αn and β1, ..., βn of random variables, where the pairs (α1, β1), ..., (αn, βn) are independent and identically distributed. Gács and Körner proved that it is not possible to extract “common information” from these two sequences unless the joint distribution matrix of the random variables (αi, βi) is a block matrix. In 2000, Romashchenko introduced the notion of chain independent random variables and gave a simple proof of the result of Gács and Körner for chain independent random variables. Furthermore, Romashchenko showed that Boolean random variables α and β are chain independent unless α = β a.s. or α = 1 − β a.s. In this paper, we generalize this result to arbitrary (finite) distributions of α and β, and thus give a simple proof of the result of Gács and Körner.
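As an illustration (a sketch, not taken from the paper): the block-matrix condition in the abstract can be tested mechanically. The joint distribution matrix P of (α, β) is a block matrix exactly when the bipartite graph that joins row i to column j whenever P[i][j] > 0 splits into more than one connected component; the block index is then a common bit extractable from either variable alone. The helper name `num_blocks` below is our own.

```python
def num_blocks(P):
    """Count connected components of the support of a joint pmf matrix P.

    Rows index values of alpha, columns index values of beta. A return
    value > 1 means P is a block matrix, so nontrivial common
    information (the block index) can be extracted.
    """
    n, m = len(P), len(P[0])
    # Union-find over n row-nodes (0..n-1) and m column-nodes (n..n+m-1).
    parent = list(range(n + m))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    # Join row i and column j whenever (alpha=i, beta=j) has positive mass.
    for i in range(n):
        for j in range(m):
            if P[i][j] > 0:
                union(i, n + j)

    # Count components among rows that actually carry probability.
    return len({find(i) for i in range(n) if sum(P[i]) > 0})

# Two blocks: a common bit (which block) is a function of alpha and of beta.
P_block = [[0.30, 0.20, 0.00, 0.00],
           [0.00, 0.00, 0.25, 0.25]]

# Fully supported matrix: one block, no common information extractable,
# even though the mutual information is positive.
P_mixed = [[0.30, 0.20],
           [0.20, 0.30]]

print(num_blocks(P_block))  # 2
print(num_blocks(P_mixed))  # 1
```

Note that `P_mixed` is exactly the regime the theorem addresses: mutual information is positive, yet the support graph is connected, so no common random variable can be extracted.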

Published in:

IEEE Transactions on Information Theory (Volume 58, Issue 8)