
Second-order asymptotics of mutual information

Authors: Prelov, V. V. (Inst. for Problems of Inf. Transmission, Russian Acad. of Sci., Moscow, Russia); Verdú, S.

A formula for the second-order expansion of the input-output mutual information of multidimensional channels as the signal-to-noise ratio (SNR) goes to zero is obtained. While the additive noise is assumed to be Gaussian, we deal with very general classes of input and channel distributions. As special cases, these channel models include fading channels, channels with random parameters, and channels with almost Gaussian noise. When the channel is unknown at the receiver, the second term in the asymptotic expansion depends not only on the covariance matrix of the input signal but also on the fourth mixed moments of its components. The study of the second-order asymptotics of mutual information finds application in the analysis of the bandwidth-power tradeoff achieved by various signaling strategies in the wideband regime.
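For orientation only (this is the textbook scalar reference case, not the paper's general multidimensional result): for the channel Y = sqrt(snr) X + N with unit-power Gaussian input X and standard Gaussian noise N, the mutual information in nats admits the second-order expansion

I(\mathrm{snr}) = \tfrac{1}{2}\ln(1+\mathrm{snr}) = \frac{\mathrm{snr}}{2} - \frac{\mathrm{snr}^2}{4} + O(\mathrm{snr}^3), \qquad \mathrm{snr} \to 0.

The paper characterizes the analogue of the second (quadratic) term for general input and channel distributions, where, with an unknown channel, it involves the fourth mixed moments of the input components rather than only the input covariance.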

Published in:

IEEE Transactions on Information Theory (Volume: 50, Issue: 8)