We present a new proof of a celebrated result of Gács and Körner that the common information can be far less than the mutual information. Consider two sequences α1, …, αn and β1, …, βn of random variables, where the pairs (α1, β1), …, (αn, βn) are independent and identically distributed. Gács and Körner proved that it is not possible to extract "common information" from these two sequences unless the joint distribution matrix of the random variables (αi, βi) is a block matrix. In 2000, Romashchenko introduced the notion of chain independent random variables and gave a simple proof of the result of Gács and Körner for chain independent random variables. Furthermore, Romashchenko showed that Boolean random variables α and β are chain independent unless α = β a.s. or α = 1 − β a.s. In this paper, we generalize this result to arbitrary (finite) distributions of α and β and thus give a simple proof of the result of Gács and Körner.
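The block-matrix condition in the abstract can be made concrete with a small sketch (not part of the paper; the function and variable names here are illustrative assumptions). Viewing the values of α and β as the two sides of a bipartite graph, with an edge wherever the joint probability is positive, the "blocks" of the joint distribution matrix are exactly the connected components of this graph. When there is more than one block, the block index is a nonconstant random variable computable from α alone and from β alone, so nontrivial common information exists precisely in the block-matrix case.

```python
def common_blocks(p):
    """Label the blocks of a joint distribution matrix p[a][b].

    Assumes every value of alpha (row) and beta (column) has positive
    marginal probability.  Values a and b are connected when p[a][b] > 0;
    connected components of this bipartite graph are the blocks.
    Returns (row_label, col_label): the block index of each value of
    alpha and of beta.  More than one distinct label means the joint
    distribution matrix is a block matrix, i.e. a nonconstant common
    random variable (the block index) can be extracted.
    """
    n_rows, n_cols = len(p), len(p[0])
    row_label = [-1] * n_rows
    col_label = [-1] * n_cols
    block = 0
    for start in range(n_rows):
        if row_label[start] != -1:
            continue
        # Flood-fill one connected component of the bipartite graph.
        stack = [("row", start)]
        while stack:
            side, i = stack.pop()
            if side == "row":
                if row_label[i] != -1:
                    continue
                row_label[i] = block
                stack.extend(("col", j) for j in range(n_cols) if p[i][j] > 0)
            else:
                if col_label[i] != -1:
                    continue
                col_label[i] = block
                stack.extend(("row", j) for j in range(n_rows) if p[j][i] > 0)
        block += 1
    return row_label, col_label
```

For example, the distribution of α = β uniform on {0, 1} has matrix [[0.5, 0], [0, 0.5]] and splits into two blocks, while any full-support matrix forms a single block, matching the Gács–Körner dichotomy described above.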