When transmitting a sampled signal digitally, data and error-correction bits must be transmitted at least as fast as the sampling rate. Typically, each bit is allocated the same transmission time interval, which means the optimal detector yields the same error probability for each bit. An alternative is to vary each bit interval's duration according to that bit's contribution to the reconstructed sample. The optimal allocation yields significant gains in mean-squared error (several dB) over equal-duration bit intervals, and these gains hold over a wide range of signal-to-noise ratios. When block error correction is performed, we derive the optimal decoder from a Bayesian viewpoint and show that similar gains are obtained there as well.
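
A minimal numerical sketch of the unequal-interval idea, assuming BPSK signaling over an AWGN channel, a 4-bit uniform quantizer, and a 6 dB average per-bit SNR (all illustrative assumptions, not values from the paper). It weights each bit's error probability by that bit's squared contribution to the reconstructed sample, numerically minimizes the resulting mean-squared error over the duration fractions, and compares against equal-duration intervals; the exact optimization in the paper may differ.

```python
import numpy as np
from scipy.special import erfc
from scipy.optimize import minimize

def q_func(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * erfc(x / np.sqrt(2.0))

B = 4                                  # bits per sample (assumed)
snr_bit = 10.0 ** (6.0 / 10.0)         # 6 dB average SNR per bit (assumed)

# Squared contribution of each bit to the reconstructed sample:
# flipping bit i of a uniform quantizer perturbs the value by 2^i steps.
weights = (2.0 ** np.arange(B)) ** 2
weights /= weights.sum()

def mse(durations):
    # BPSK over AWGN: per-bit error probability Q(sqrt(2 * SNR_i)),
    # where SNR_i scales with the fraction of the total time budget
    # (hence transmitted energy) allotted to bit i.
    p = q_func(np.sqrt(2.0 * snr_bit * B * durations))
    return np.dot(weights, p)

equal = np.full(B, 1.0 / B)            # equal-duration baseline

# Minimize MSE over duration fractions that sum to the fixed budget.
res = minimize(
    mse, equal, method="SLSQP",
    bounds=[(1e-6, 1.0)] * B,
    constraints=[{"type": "eq", "fun": lambda d: d.sum() - 1.0}],
)

gain_db = 10.0 * np.log10(mse(equal) / mse(res.x))
print("optimized duration fractions (LSB -> MSB):", np.round(res.x, 3))
print("MSE gain over equal intervals: %.2f dB" % gain_db)
```

As expected, the optimizer shifts transmission time toward the most significant bits, whose errors dominate the reconstruction error, yielding an MSE gain of a few dB consistent with the abstract's claim.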