
The rate loss in the Wyner-Ziv problem


The rate-distortion function for source coding with side information at the decoder (the “Wyner-Ziv problem”) is given in terms of an auxiliary random variable, which forms a Markov chain with the source and the side information. This Markov chain structure, typical of the solutions to multiterminal source coding problems, corresponds to a loss in coding rate with respect to the conditional rate-distortion function, i.e., to the case where the encoder is fully informed. We show that for difference (or balanced) distortion measures, this loss is bounded by a universal constant, which is the minimax capacity of a suitable additive-noise channel. Furthermore, in the worst case, this loss is equal to the maximin redundancy over the rate-distortion function of the additive-noise “test” channel. For example, the loss in the Wyner-Ziv problem is less than 0.5 bit/sample in the squared-error distortion case, and it is less than 0.22 bit for a binary source with Hamming distance. These results also have implications for universal quantization with side information and for more general multiterminal source coding problems.
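
For reference, the quantities compared in the abstract can be written out explicitly. The following is a sketch using the standard Wyner-Ziv formulation; the auxiliary variable U, the decoder map g, and the loss notation L(D) are not introduced in the abstract and are assumed here as conventional notation.

\[
  R_{\mathrm{WZ}}(D) \;=\; \min_{\substack{p(u\mid x):\; U - X - Y \\ g:\; \mathbb{E}\, d\left(X,\, g(U,Y)\right) \le D}} \bigl[\, I(X;U) - I(Y;U) \,\bigr],
  \qquad
  R_{X\mid Y}(D) \;=\; \min_{p(\hat{x}\mid x,y):\; \mathbb{E}\, d(X,\hat{X}) \le D} I\left(X;\hat{X}\mid Y\right).
\]

The rate loss discussed above is the difference

\[
  L(D) \;=\; R_{\mathrm{WZ}}(D) - R_{X\mid Y}(D),
\]

which, for difference (or balanced) distortion measures, is bounded by a universal constant: it is less than 0.5 bit/sample under squared-error distortion and less than 0.22 bit for a binary source under Hamming distortion, as stated in the abstract.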

Published in: IEEE Transactions on Information Theory (Volume: 42, Issue: 6)