Gaussian codes and Shannon bounds for multiple descriptions

Author: R. Zamir, Dept. of Electr. Eng.-Syst., Tel Aviv Univ., Israel

A pair of well-known inequalities, due to Shannon, upper/lower bound the rate-distortion function of a real source by the rate-distortion function of the Gaussian source with the same variance/entropy. We extend these bounds to multiple descriptions, a problem for which a general "single-letter" solution is not known. We show that the set D_X(R_1, R_2) of achievable marginal (d_1, d_2) and central (d_0) mean-squared errors in decoding X from two descriptions at rates R_1 and R_2 satisfies D*(σ_x², R_1, R_2) ⊆ D_X(R_1, R_2) ⊆ D*(P_x, R_1, R_2), where σ_x² and P_x are the variance and the entropy power of X, respectively, and D*(σ², R_1, R_2) is the multiple-description distortion region for a Gaussian source with variance σ² found by Ozarow (1980). We further show that, as in the single-description case, a Gaussian random code achieves the outer bound in the limit as d_1, d_2 → 0; thus the outer bound is asymptotically tight under high-resolution conditions.
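The Shannon bounds that the abstract extends are a standard result; under the abstract's notation (σ_x² the variance, P_x the entropy power of X, and mean-squared-error distortion D), they can be sketched as:

```latex
% Shannon upper/lower bounds on the rate-distortion function R(D)
% of a real-valued source X under mean-squared error.
% The entropy power is P_x = \frac{1}{2\pi e}\, e^{2h(X)},
% where h(X) is the differential entropy of X; P_x \le \sigma_x^2,
% with equality iff X is Gaussian.
\frac{1}{2}\log^{+}\!\frac{P_x}{D}
  \;\le\; R(D) \;\le\;
\frac{1}{2}\log^{+}\!\frac{\sigma_x^2}{D},
\qquad \log^{+} t \triangleq \max\{\log t,\, 0\}.
```

Both bounds are rate-distortion functions of Gaussian sources (with variance P_x and σ_x², respectively), which is what makes the set-inclusion form D*(σ_x², R_1, R_2) ⊆ D_X(R_1, R_2) ⊆ D*(P_x, R_1, R_2) the natural multiple-description analogue.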

Published in: IEEE Transactions on Information Theory (Volume: 45, Issue: 7)