We provide a novel upper bound on Witsenhausen's rate, the rate required in the zero-error analogue of the Slepian-Wolf problem. Our bound is given in terms of a new information-theoretic functional defined on a certain graph and is derived by upper bounding complementary graph entropy. We use the functional, along with graph entropy, to give a single-letter lower bound on the error exponent for the Slepian-Wolf problem under the vanishing error probability criterion, where the decoder has full (i.e., unencoded) side information. We demonstrate that our error exponent can beat the “expurgated” source-coding exponent of Csiszár and Körner for some sources that have zeroes in the “channel” matrix connecting the source with the side information. An extension of our scheme to the lossy case (i.e., Wyner-Ziv) is given. For the case in which the side information is a deterministic function of the source, the exponent of our improved scheme agrees with the sphere-packing bound exactly (thus determining the reliability function). An application of our functional to zero-error channel capacity is also given.
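For background, the graph entropy invoked above is Körner's graph entropy, which for a graph $G$ with vertex set distributed according to $P$ can be written as

```latex
\[
  H(G, P) \;=\; \min_{\substack{X \in Y,\; Y \in \Gamma(G)}} I(X; Y),
\]
```

where $\Gamma(G)$ denotes the collection of independent sets of $G$, $X \sim P$, and the minimization is over joint distributions of $X$ and a random independent set $Y$ containing $X$. (This standard definition is included here for orientation only; the new functional introduced in the paper is distinct from it.)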
Date of Publication: Sept. 2011