This paper builds upon our recent work, which computed the moment generating function of the multiple-input multiple-output mutual information exactly in terms of a Painlevé V differential equation. By exploiting this key analytical tool, we provide an in-depth characterization of the mutual information distribution for sufficiently large (but finite) antenna numbers. In particular, we derive systematic closed-form expansions for the high-order cumulants. These results yield considerable new insight, such as a technical explanation of why the well-known Gaussian approximation is quite robust at large signal-to-noise ratios for unequal antenna arrays, yet deviates strongly for equal antenna arrays. In addition, drawing upon our high-order cumulant expansions, we employ the Edgeworth expansion technique to propose a refined Gaussian approximation, which is shown to give a very accurate closed-form characterization of the mutual information distribution, both around the mean and for moderate deviations into the tails (where the Gaussian approximation fails markedly). For stronger deviations, where the Edgeworth expansion becomes unwieldy, we employ the saddle point method and asymptotic integration tools to establish new analytical characterizations which are shown to be very simple and accurate. Based on these results, we also recover key well-established properties of the tail distribution, including the diversity-multiplexing tradeoff.
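To illustrate the Edgeworth-based refinement mentioned above, the following is a minimal sketch of a second-order Edgeworth correction to the Gaussian CDF approximation, driven by the first four cumulants of a random variable. The function name and interface are hypothetical; in the paper's setting, the cumulant inputs would come from the closed-form high-order cumulant expansions of the mutual information, whereas here they are generic inputs.

```python
import math

def _phi(z):
    """Standard normal pdf."""
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def _Phi(z):
    """Standard normal cdf."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def edgeworth_cdf(x, mean, var, k3, k4):
    """Second-order Edgeworth approximation to P(X <= x),
    given the first four cumulants (mean, var, k3, k4) of X.
    Setting k3 = k4 = 0 recovers the plain Gaussian approximation."""
    s = math.sqrt(var)
    z = (x - mean) / s
    g1 = k3 / s**3   # skewness
    g2 = k4 / s**4   # excess kurtosis
    # Probabilists' Hermite polynomials He_2, He_3, He_5
    he2 = z**2 - 1.0
    he3 = z**3 - 3.0 * z
    he5 = z**5 - 10.0 * z**3 + 15.0 * z
    # Gaussian term plus skewness/kurtosis corrections
    return _Phi(z) - _phi(z) * (g1 / 6.0 * he2
                                + g2 / 24.0 * he3
                                + g1**2 / 72.0 * he5)
```

As a sanity check, for an Exponential(1) variable (cumulants 1, 1, 2, 6) the corrected CDF at x = 1 is far closer to the true value 1 - e^{-1} than the plain Gaussian term, matching the abstract's point that cumulant corrections sharpen the approximation away from the symmetric Gaussian regime.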