
Remote Vector Gaussian Source Coding With Decoder Side Information Under Mutual Information and Distortion Constraints


Authors: C. Tian (AT&T Labs-Research, Florham Park, NJ, USA); Jun Chen

Let X, Y, Z be zero-mean, jointly Gaussian random vectors of dimensions n_x, n_y, and n_z, respectively. Let P be the set of random variables W such that W ↔ Y ↔ (X, Z) forms a Markov string. We consider the following optimization problem: minimize I(Y; W|Z) over W ∈ P, subject to one of two possible constraints: 1) I(X; W|Z) ≥ R_I, and 2) the mean squared error between X and X̂ = E(X|W, Z) is less than d. The problem under the first constraint is motivated by multiple-input multiple-output (MIMO) relay channels with an oblivious transmitter and a relay connected to the receiver through a dedicated link, while the second case is motivated by source coding with decoder side information where the sensor observation is noisy. In both cases, we show that jointly Gaussian solutions are optimal. Moreover, explicit water-filling interpretations are given for both cases; these suggest transform coding approaches performed in different transform domains, and show that the optimal solution for one problem is, in general, suboptimal for the other.
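To make the quantities in the optimization concrete, the following is a minimal numerical sketch (not from the paper): for a toy jointly Gaussian model Y = X + U, Z = X + V and a jointly Gaussian test channel W = Y + N, it evaluates the rate objective I(Y; W|Z), the mutual-information constraint I(X; W|Z), and the mean squared error of X̂ = E(X|W, Z) via standard Gaussian conditional-covariance (Schur complement) formulas. The additive-noise structure and all covariance values are illustrative assumptions, not taken from the paper.

# Sketch: evaluate I(Y; W|Z), I(X; W|Z) and E||X - E[X|W,Z]||^2 for a toy
# jointly Gaussian model.  Model assumptions (not from the paper):
#   Y = X + U (noisy observation), Z = X + V (decoder side information),
#   W = Y + N (jointly Gaussian test channel), with X, U, V, N independent.
import numpy as np

def cond_cov(S, a, b):
    """Conditional covariance Cov(A | B) from a joint covariance S (Gaussian case)."""
    Saa = S[np.ix_(a, a)]
    Sab = S[np.ix_(a, b)]
    Sbb = S[np.ix_(b, b)]
    return Saa - Sab @ np.linalg.solve(Sbb, Sab.T)

def logdet(M):
    return np.linalg.slogdet(M)[1]

n = 2                                    # toy dimension (n_x = n_y = n_z = 2)
Sx = np.array([[2.0, 0.5], [0.5, 1.0]])  # Cov(X), illustrative
Su = 0.4 * np.eye(n)                     # Cov(U): observation noise
Sv = 0.8 * np.eye(n)                     # Cov(V): side-information noise
Sn = 0.3 * np.eye(n)                     # Cov(N): test-channel noise

# Joint covariance of the stacked vector (X, Y, Z, W).
S = np.block([
    [Sx, Sx,           Sx,      Sx          ],
    [Sx, Sx + Su,      Sx,      Sx + Su     ],
    [Sx, Sx,           Sx + Sv, Sx          ],
    [Sx, Sx + Su,      Sx,      Sx + Su + Sn],
])
iX, iY, iZ, iW = [list(range(k * n, (k + 1) * n)) for k in range(4)]

# I(Y; W|Z) = h(W|Z) - h(N), since W = Y + N and N is independent of (Y, Z).
rate = 0.5 * (logdet(cond_cov(S, iW, iZ)) - logdet(Sn))
# I(X; W|Z) = h(W|Z) - h(W|X, Z).
mi_con = 0.5 * (logdet(cond_cov(S, iW, iZ)) - logdet(cond_cov(S, iW, iX + iZ)))
# MSE of the estimator E[X | W, Z] is the trace of the conditional covariance.
mse = np.trace(cond_cov(S, iX, iW + iZ))

print(f"I(Y;W|Z) = {rate:.3f} nats, I(X;W|Z) = {mi_con:.3f} nats, MSE = {mse:.3f}")

Sweeping Cov(N) in such a sketch traces out the trade-off between the rate objective and each constraint; the paper's result says one may restrict attention to jointly Gaussian W of this kind, with the optimal noise shaping given by water filling in a problem-dependent transform domain.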

Published in: IEEE Transactions on Information Theory (Volume: 55, Issue: 10)