On essentially conditional information inequalities

Author(s):

Kaced, T. (LIF de Marseille, Univ. Aix-Marseille, Marseille, France); Romashchenko, A.

In 1997, Z. Zhang and R.W. Yeung found the first example of a conditional information inequality in four variables that is not “Shannon-type”. This linear inequality for entropies is called conditional (or constrained) since it holds only under the condition that certain linear equations are satisfied by the involved entropies. Later, the same authors and other researchers discovered several unconditional information inequalities that do not follow from Shannon's inequalities for entropy. In this paper we show that some non-Shannon-type conditional inequalities are “essentially” conditional, i.e., they cannot be extended to any unconditional inequality. We prove one new essentially conditional information inequality for Shannon's entropy and discuss conditional information inequalities for Kolmogorov complexity.
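For illustration, the 1997 Zhang-Yeung conditional inequality referred to above is commonly stated as follows (rendered here in LaTeX from standard references; the paper's own notation may differ):

% Zhang--Yeung (1997) conditional non-Shannon-type inequality, as commonly
% stated in the literature; A, B, C, D are arbitrary jointly distributed
% random variables.
\[
  I(A;B) = I(A;B \mid C) = 0
  \;\Longrightarrow\;
  I(C;D) \le I(C;D \mid A) + I(C;D \mid B).
\]

The two vanishing mutual informations on the left-hand side are the linear constraints under which the inequality is claimed to hold; in the paper's terminology, such an inequality is "essentially" conditional if it cannot be extended to any unconditional linear inequality for entropies.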

Published in:

2011 IEEE International Symposium on Information Theory Proceedings (ISIT)

Date of Conference:

July 31 - Aug. 5, 2011