Alphabet-constrained vector quantization

Authors: R.P. Rao (Compression Labs Inc., San Jose, CA, USA); W.A. Pearlman

Alphabet-constrained rate-distortion theory is extended to the coding of sources with memory. Two cases are considered: one in which only the size of the codebook is constrained, and one in which the codevector values are also held fixed. For both cases, nth-order constrained-alphabet rate-distortion functions are defined and a convergent algorithm for their evaluation is presented. Simulations using AR(1) sources show that performance near the rate-distortion bound is possible with a reproduction alphabet consisting of a small number of codevectors. It is also shown that the additional constraint of holding the codevector values fixed does not degrade coder performance relative to the size-only constrained case. This observation motivates the development of a fixed-codebook vector quantizer, the alphabet- and entropy-constrained vector quantizer, whose performance is comparable to that of the entropy-constrained vector quantizer. Examples using an AR(1) source and a speech source are presented to corroborate the theory.
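To make the fixed-codebook idea concrete, the following Python sketch runs an entropy-constrained encoding loop over a synthetic AR(1) source with the codevector values held fixed, so that only the index probabilities (and hence the codeword lengths) are re-estimated between passes. This is a rough illustration of the constraint described in the abstract, not the authors' algorithm; the codebook grid, the AR coefficient rho, and the Lagrange multiplier lam are arbitrary choices made for the example.

import numpy as np

rng = np.random.default_rng(0)

def ar1_source(n, rho=0.9):
    """Generate n samples of a unit-variance AR(1) process."""
    x = np.zeros(n)
    w = rng.normal(scale=np.sqrt(1 - rho**2), size=n)
    for t in range(1, n):
        x[t] = rho * x[t - 1] + w[t]
    return x

def ecvq_fixed_codebook(vectors, codebook, lam, iters=20):
    """Entropy-constrained encoding with the codevectors held fixed.

    Each training vector is mapped to the codeword minimizing
    distortion + lam * codeword length (-log2 p_j); only the index
    probabilities p_j are updated between iterations.
    """
    k = len(codebook)
    # Squared-error distortion of every vector to every codevector;
    # constant across iterations because the codebook is never updated.
    d = ((vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    p = np.full(k, 1.0 / k)                    # initial index probabilities
    for _ in range(iters):
        cost = d + lam * (-np.log2(np.maximum(p, 1e-12)))
        idx = cost.argmin(axis=1)              # Lagrangian nearest codeword
        counts = np.bincount(idx, minlength=k)
        p = counts / counts.sum()              # re-estimate probabilities only
    rate = -np.log2(np.maximum(p[idx], 1e-12)).mean() / vectors.shape[1]
    dist = d[np.arange(len(idx)), idx].mean() / vectors.shape[1]
    return rate, dist

# Block the scalar AR(1) stream into 2-dimensional vectors and quantize
# with a small fixed reproduction alphabet (a 3x3 grid of codevectors).
x = ar1_source(20000).reshape(-1, 2)
grid = np.linspace(-2, 2, 3)
codebook = np.array([[a, b] for a in grid for b in grid])

for lam in (0.05, 0.2, 1.0):
    r, d = ecvq_fixed_codebook(x, codebook, lam)
    print(f"lambda={lam:4.2f}  rate={r:.3f} bits/sample  MSE={d:.4f}")

Sweeping lam traces out rate-distortion operating points for the fixed alphabet; the abstract's claim is that such points can lie near the rate-distortion bound even when the codebook is small and its values are never adapted.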

Published in:

IEEE Transactions on Information Theory (Volume 39, Issue 4)