Bucklew's (1984) high-rate vector quantizer mismatch result is extended from fixed-rate coding to variable-rate coding using a Lagrangian formulation. It is shown that if an asymptotically (high-rate) optimal sequence of variable-rate codes is designed for a k-dimensional probability density function (PDF) g and then applied to another PDF f for which f/g is bounded, then the resulting mismatch, or loss of performance relative to the optimum, is given by the relative entropy or Kullback-Leibler (1968) divergence I(f||g). It is also shown that under the same assumptions, an asymptotically optimal code sequence for g can be converted into an asymptotically optimal code sequence for a mismatched source f by modifying only the lossless component of the code. Applications to quantizer design using uniform and Gaussian densities are described, including a high-rate analog of the Shannon rate-distortion results of Sakrison (1975) and Lapidoth (1997) showing that the Gaussian is the "worst case" for lossy compression of a source with known covariance. By coupling the mismatch result with composite quantizers, the worst-case properties of uniform and Gaussian densities are extended to conditionally uniform and conditionally Gaussian densities, which yields a Lloyd clustering algorithm for fitting mixtures to general densities.
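The central claim, that the performance loss from designing for g but coding f equals the relative entropy I(f||g), can be made concrete numerically. The following is a minimal sketch, not taken from the paper, assuming one-dimensional Gaussian densities so that I(f||g) has a closed form; the function name and the specific parameter values are illustrative choices only:

```python
import math

def kl_gauss(mu_f, var_f, mu_g, var_g):
    """Relative entropy I(f||g) in nats for 1-D Gaussians
    f = N(mu_f, var_f) and g = N(mu_g, var_g), via the
    standard closed-form expression."""
    return (0.5 * math.log(var_g / var_f)
            + (var_f + (mu_f - mu_g) ** 2) / (2.0 * var_g)
            - 0.5)

# Illustrative mismatch: a code designed for g = N(0, 2)
# applied to the true source f = N(0, 1).
loss_nats = kl_gauss(0.0, 1.0, 0.0, 2.0)
loss_bits = loss_nats / math.log(2)  # penalty in bits per sample
print(loss_nats, loss_bits)
```

Under the abstract's boundedness condition on f/g, this value is the asymptotic rate (or Lagrangian) penalty per sample incurred by the mismatched design; it is zero exactly when f = g, consistent with I(f||g) >= 0.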