This paper considers a general linear vector Gaussian channel with arbitrary signaling and pursues two closely related goals: i) closed-form expressions for the gradient of the mutual information with respect to arbitrary parameters of the system, and ii) fundamental connections between information theory and estimation theory. Generalizing the fundamental relationship recently unveiled by Guo, Shamai, and Verdú, we show that the gradient of the mutual information with respect to the channel matrix is equal to the product of the channel matrix and the error covariance matrix of the best estimate of the input given the output. Gradients and derivatives with respect to other parameters are then found via the differentiation chain rule.
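The stated identity can be checked numerically in the Gaussian-input special case, where both sides have closed forms. The sketch below (my own illustrative setup, not code from the paper) takes a channel y = Hx + n with x ~ N(0, Σ) and white Gaussian noise, computes the mutual information I(X;Y) = ½ log det(I + HΣHᵀ) in nats, and compares the finite-difference gradient with respect to H against the closed-form H·E, where E = (Σ⁻¹ + HᵀH)⁻¹ is the MMSE error covariance matrix:

```python
import numpy as np

# Numerical check of grad_H I(X;Y) = H E (in nats) for a Gaussian input.
# Channel: y = H x + n, with x ~ N(0, Sigma) and n ~ N(0, I).
# Dimensions and matrices are illustrative choices, not from the paper.

rng = np.random.default_rng(0)
n_in, n_out = 3, 4
H = rng.standard_normal((n_out, n_in))
A = rng.standard_normal((n_in, n_in))
Sigma = A @ A.T + np.eye(n_in)          # input covariance (positive definite)

def mutual_info(H):
    # I(X;Y) = (1/2) log det(I + H Sigma H^T), in nats
    return 0.5 * np.linalg.slogdet(np.eye(n_out) + H @ Sigma @ H.T)[1]

# MMSE error covariance of the best estimate of x given y:
# E = (Sigma^{-1} + H^T H)^{-1}
E = np.linalg.inv(np.linalg.inv(Sigma) + H.T @ H)
analytic = H @ E                        # closed-form gradient

# Central finite differences over each entry of H
eps = 1e-6
numeric = np.zeros_like(H)
for i in range(n_out):
    for j in range(n_in):
        Hp, Hm = H.copy(), H.copy()
        Hp[i, j] += eps
        Hm[i, j] -= eps
        numeric[i, j] = (mutual_info(Hp) - mutual_info(Hm)) / (2 * eps)

print(np.max(np.abs(numeric - analytic)))   # near zero (finite-difference accuracy)
```

The agreement reflects the matrix push-through identity (I + HΣHᵀ)⁻¹HΣ = H(Σ⁻¹ + HᵀH)⁻¹; the point of the paper is that the relation ∇_H I = H·E holds for arbitrary input signaling, not only the Gaussian case verified here.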
Date of Publication: Jan. 2006