Coding theorems for Shannon's cipher system with correlated source outputs, and common information

Author: H. Yamamoto (Dept. of Commun. & Syst., Univ. of Electro-Commun., Tokyo, Japan)

Source coding problems are treated for Shannon's (1949) cipher system with correlated source outputs (X, Y). Several cases are considered based on whether both X and Y, only X, or only Y must be transmitted to the receiver; whether both X and Y, only X, or only Y must be kept secret; and whether the security level is measured by (1/K)H(X^K|W), (1/K)H(Y^K|W), or (1/K)H(X^K Y^K|W), where W is a cryptogram. The admissible region of cryptogram rate and key rate for a given security level is derived for each case. Furthermore, two new kinds of common information of X and Y, denoted C_1(X;Y) and C_2(X;Y), are considered. C_1(X;Y) is defined as the rate of the attainable minimum core of (X^K, Y^K), obtained by removing as much private information as possible from (X^K, Y^K), while C_2(X;Y) is defined as the rate of the attainable maximum core V_C such that, if V_C is lost, the uncertainty of each of X^K and Y^K becomes H(V_C). It is proved that C_1(X;Y) = I(X;Y) and C_2(X;Y) = min{H(X), H(Y)}. The result for C_1(X;Y) justifies the intuitive feeling that the mutual information represents a common information of X and Y.
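To make the two characterizations above concrete, the following is a minimal Python sketch that numerically evaluates C_1(X;Y) = I(X;Y) and C_2(X;Y) = min{H(X), H(Y)} for a hypothetical toy joint distribution p(x, y). The specific distribution and variable names are assumptions chosen only for illustration; the sketch does not reproduce the paper's coding theorems, only the two closed-form expressions.

```python
import numpy as np

# Hypothetical toy joint distribution p(x, y) over binary X and Y
# (an assumption for illustration, not taken from the paper).
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1)   # marginal distribution of X
p_y = p_xy.sum(axis=0)   # marginal distribution of Y

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_X = entropy(p_x)
H_Y = entropy(p_y)
H_XY = entropy(p_xy.ravel())
I_XY = H_X + H_Y - H_XY          # mutual information I(X;Y)

# The two common-information quantities characterized in the abstract:
C1 = I_XY                        # C_1(X;Y) = I(X;Y)
C2 = min(H_X, H_Y)               # C_2(X;Y) = min{H(X), H(Y)}

print(f"H(X)    = {H_X:.3f} bits")
print(f"H(Y)    = {H_Y:.3f} bits")
print(f"C1(X;Y) = I(X;Y)          = {C1:.3f} bits")
print(f"C2(X;Y) = min{{H(X),H(Y)}} = {C2:.3f} bits")
```

For this distribution the sketch prints C1(X;Y) ≈ 0.278 bits and C2(X;Y) = 1.000 bit, illustrating that the "minimum core" rate C_1 is in general strictly smaller than the "maximum core" rate C_2 unless X and Y determine each other.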

Published in:

IEEE Transactions on Information Theory (Volume: 40, Issue: 1)