Quantizers for probabilistic sources are usually optimized for mean-squared error. In many applications, maintaining low relative error is a more suitable objective. This measure has previously been heuristically connected with the use of logarithmic companding in perceptual coding. We derive optimal companding quantizers for both fixed-rate and variable-rate coding under high-resolution assumptions. The analysis shows that logarithmic companding is optimal for variable-rate quantization but generally not for fixed-rate quantization. Naturally, the improvement in relative error from using a correctly optimized quantizer can be arbitrarily large. We extend this framework to a large class of nondifference distortion measures.
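The contrast between mean-squared and relative error can be made concrete with a small numerical sketch. The snippet below (an illustration, not the paper's derivation) implements a generic companding quantizer, which maps a sample through a compander, quantizes uniformly, and maps back, and compares the relative error of a plain uniform quantizer against one with a logarithmic compander on a positive-valued source. The exponential source, the rate of 6 bits, and the clipping quantiles are all assumptions chosen for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
# Assumed positive-valued test source: shifted exponential (illustrative only)
x = rng.exponential(1.0, 100_000) + 1e-3

R = 6                # assumed rate in bits
levels = 2 ** R      # number of quantizer cells

def companded_quantize(x, c, c_inv, lo, hi):
    """Quantize x by companding: uniform quantization in the c-domain."""
    y = np.clip(c(x), c(lo), c(hi))
    step = (c(hi) - c(lo)) / levels
    idx = np.clip(np.floor((y - c(lo)) / step), 0, levels - 1)
    # Reconstruct at cell midpoints, then map back through the inverse compander
    return c_inv(c(lo) + (idx + 0.5) * step)

# Support the quantizer on the central mass of the source (assumed clipping)
lo, hi = np.quantile(x, [0.001, 0.999])

# Uniform quantizer = identity compander
xq_uni = companded_quantize(x, lambda t: t, lambda t: t, lo, hi)
# Logarithmic compander
xq_log = companded_quantize(x, np.log, np.exp, lo, hi)

def rel(xh):
    """Mean-squared relative error E[((x - x_hat)/x)^2]."""
    return np.mean(((x - xh) / x) ** 2)

print(f"uniform compander: {rel(xq_uni):.3e}")
print(f"log compander    : {rel(xq_log):.3e}")
```

Under the log compander every cell spans a roughly constant multiplicative range, so the per-sample relative error is nearly constant across the support; the uniform quantizer instead spends most of its cells where they contribute little, and its relative error blows up for small samples. This is the heuristic the abstract refers to; the paper's point is that the truly optimal fixed-rate compander is in general a different function.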