Noise prediction for channels with side information at the transmitter

2 Author(s)
Erez, U.; Zamir, R. — Dept. of Electr. Eng.-Syst., Tel Aviv Univ., Israel

The computation of channel capacity with side information at the transmitter side (but not at the receiver side) requires, in general, extension of the input alphabet to a space of "strategies", and is often hard. We consider the special case of a discrete memoryless modulo-additive noise channel $Y = X + Z_S$, where the encoder causally observes the random state $S \in \mathcal{S}$ that governs the distribution of the noise $Z_S$. We show that the capacity of this channel is given by $C = \log|\mathcal{X}| - \min_{t:\,\mathcal{S}\to\mathcal{X}} H(Z_S - t(S))$. This capacity is realized by a state-independent code, followed by a shift by the "noise prediction" $t_{\min}(S)$ that minimizes the entropy of $Z_S - t(S)$. If the set of conditional noise distributions $\{p(z|s),\, s \in \mathcal{S}\}$ is such that the optimum predictor $t_{\min}(\cdot)$ is independent of the state weights, then $C$ is also the capacity for a noncausal encoder that observes the entire state sequence in advance. Furthermore, for this case we also derive a simple formula for the capacity when the state process has memory.
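To make the capacity expression concrete, the following is a minimal brute-force sketch (not from the paper): it enumerates all predictors $t:\mathcal{S}\to\mathcal{X}$ for a small modulo-additive channel, computes $H(Z_S - t(S))$ for each, and subtracts the minimum from $\log|\mathcal{X}|$. The alphabet size, state weights, and conditional noise distributions below are made-up illustration values.

```python
import itertools
import math
import numpy as np

# Hypothetical toy channel: Y = X + Z_S (mod |X|), |X| = 4 symbols,
# two states S in {0, 1}, with assumed state weights and noise pmfs p(z|s).
X_SIZE = 4
state_probs = np.array([0.5, 0.5])       # P(S = s), the "state weights"
p_z_given_s = np.array([
    [0.7, 0.1, 0.1, 0.1],                # p(z | s = 0)
    [0.1, 0.1, 0.7, 0.1],                # p(z | s = 1)
])

def entropy_bits(p):
    """Shannon entropy in bits, skipping zero-probability entries."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def residual_entropy(t):
    """H(Z_S - t(S) mod |X|) for a given predictor t: S -> X."""
    residual = np.zeros(X_SIZE)
    for s, shift in enumerate(t):
        for z in range(X_SIZE):
            residual[(z - shift) % X_SIZE] += state_probs[s] * p_z_given_s[s, z]
    return entropy_bits(residual)

# Exhaustive search over all predictors t: S -> X (feasible for small alphabets).
t_min, min_h = min(
    ((t, residual_entropy(t))
     for t in itertools.product(range(X_SIZE), repeat=len(state_probs))),
    key=lambda pair: pair[1],
)

capacity = math.log2(X_SIZE) - min_h     # C = log|X| - min_t H(Z_S - t(S))
print(f"optimal noise predictor t_min: {t_min}")
print(f"capacity: {capacity:.4f} bits per channel use")
```

For the toy numbers above, the best predictor shifts out the dominant noise value in each state, so the residual noise has the same (minimal) entropy in both states and the formula reduces to $\log|\mathcal{X}|$ minus that residual entropy.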

Published in:

IEEE Transactions on Information Theory (Volume: 46, Issue: 4)