The rate distortion function R(D) of an information source was introduced by Shannon to specify the channel capacity required in transmitting information from the source with an average distortion not exceeding D. Exact rates have been calculated for Gaussian sources under a mean-square error criterion. For non-Gaussian continuous sources, Shannon has given upper and lower bounds on R(D). In specific cases, the difference between these two bounds may not be small enough to provide a useful estimate of R(D). The present paper is concerned with improving estimates of information rates of non-Gaussian sources under a mean-square error criterion. The sources considered are ergodic, and their statistical properties are characterized by a bounded and continuous n-dimensional probability density function. The paper gives a set of necessary and sufficient conditions for R(D) to equal Shannon's lower bound. For sources satisfying these conditions, exact rate calculations are possible. For sources that do not satisfy the required conditions, an improved upper bound is obtained that never exceeds Shannon's upper bound. Under rather general conditions, the new upper bound approaches Shannon's lower bound for small values of distortion, so that the true value of R(D) can be estimated very accurately for small D.
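To make the gap between Shannon's bounds concrete, the sketch below evaluates them for a memoryless Laplacian source under mean-square error. The Laplacian source is an illustrative assumption, not an example taken from the paper; the lower bound is h(X) - (1/2)log(2*pi*e*D) and the upper bound is the rate of a Gaussian source of the same variance, (1/2)log(sigma^2/D), both in nats.

```python
import math

def shannon_bounds_laplace(b, D):
    """Shannon's lower and upper bounds (in nats) on R(D) for a
    memoryless Laplace(b) source under mean-square error distortion.
    Illustrative sketch only; the Laplace source is an assumed example."""
    var = 2.0 * b * b                       # variance of Laplace(b)
    h = 1.0 + math.log(2.0 * b)             # differential entropy in nats
    # Lower bound: h(X) - (1/2) log(2*pi*e*D), clamped at zero
    lower = max(0.0, h - 0.5 * math.log(2.0 * math.pi * math.e * D))
    # Upper bound: rate of a Gaussian source with the same variance
    upper = max(0.0, 0.5 * math.log(var / D))
    return lower, upper

lo, up = shannon_bounds_laplace(1.0, 0.01)
```

For small D (before either bound clamps at zero), both bounds grow like (1/2)log(1/D) and their difference stays constant at (1/2)log(2*pi*e*sigma^2) - h(X), which is why Shannon's bounds alone leave a fixed uncertainty in R(D) even as D shrinks.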