By relating the average probability of error to the distortion measure of a source-sink pair, we prove a converse to the channel coding theorem. This converse lower-bounds the probability of error per source letter. It differs from the more familiar "weak" and "strong" converses which bound the probability of error of an entire message. The result is applicable to all stationary sources, all channels, and all block lengths. Lower-bounding the rate distortion function of the source-sink pair with which the channel is to be used reduces the new result to a lower bound on the achievable probability of error per source letter expressed in terms of the source entropy, alphabet size, and maximum achievable average mutual information on the channel. This latter result had been proved previously only for a memoryless channel operating with an independent letter source.
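The abstract does not state the closed form of the per-letter bound, but results of this kind classically take the Fano-inequality shape: if the source emits letters from an alphabet of size M with per-letter entropy H and the channel supports at most C bits of mutual information per letter, then any achievable per-letter error probability p must satisfy h(p) + p·log₂(M−1) ≥ H − C, where h(·) is the binary entropy function. The sketch below, which assumes this familiar form rather than the paper's exact statement, inverts the inequality numerically to obtain the smallest admissible p:

```python
import math

def binary_entropy(p):
    # h(p) = -p*log2(p) - (1-p)*log2(1-p), with h(0) = h(1) = 0
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def per_letter_error_bound(H, C, M, tol=1e-12):
    """Smallest p with h(p) + p*log2(M-1) >= H - C (assumed Fano-type form).

    H: source entropy per letter (bits), C: maximum achievable average
    mutual information per letter (bits), M: source alphabet size.
    """
    gap = H - C
    if gap <= 0.0:
        return 0.0  # channel rate covers the source rate; bound is vacuous
    # f(p) = h(p) + p*log2(M-1) increases on [0, (M-1)/M], so bisect there.
    lo, hi = 0.0, (M - 1) / M
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if binary_entropy(mid) + mid * math.log2(M - 1) >= gap:
            hi = mid
        else:
            lo = mid
    return hi
```

For example, a binary equiprobable source (H = 1, M = 2) over a channel with C = 0.5 bit per letter yields a lower bound of roughly p ≈ 0.11, since h(0.11) ≈ 0.5. The names `binary_entropy` and `per_letter_error_bound` are illustrative, not from the paper.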