Finite state residual vector quantization using tree-structured competitive neural network

Authors:

Rizvi, S.A. (Dept. of Electr. & Comput. Eng., State Univ. of New York, Buffalo, NY, USA); Nasrabadi, N.M.

Abstract:

The performance of an ordinary vector quantizer (VQ) can be improved by incorporating memory into the VQ scheme. A VQ scheme with finite memory, known as finite-state vector quantization (FSVQ), has been shown to give better performance than ordinary VQ. The major problems with FSVQ are the lack of accurate prediction of the current state, the design of the state codebooks, and the amount of memory required to store all the state codebooks. The paper presents a new FSVQ scheme called finite-state residual vector quantization (FSRVQ), in which a neural network is used for state prediction. Furthermore, a novel tree-structured competitive neural network is used to jointly design the next-state function and the state codebooks for the proposed FSRVQ. Simulation results show that the new scheme gives better performance with a significant reduction in memory requirements when compared to conventional FSVQ schemes.
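To make the finite-state framework concrete, the sketch below shows a generic FSVQ encoder: the current state selects a state codebook, the nearest codeword is chosen, and the state is updated so the decoder can track it. This is only an illustrative sketch of plain FSVQ, not the authors' FSRVQ (which adds residual quantization and neural-network state prediction); the function and parameter names are hypothetical.

import numpy as np

def fsvq_encode(vectors, state_codebooks, next_state, initial_state=0):
    """Illustrative finite-state VQ encoder (not the paper's FSRVQ).

    vectors:         (N, d) array of input vectors
    state_codebooks: dict mapping state -> (K, d) codebook array
    next_state:      function (state, codeword_index) -> new state
    """
    state = initial_state
    indices = []
    for x in vectors:
        codebook = state_codebooks[state]            # codebook selected by the current state
        dists = np.sum((codebook - x) ** 2, axis=1)  # squared-error distortion to each codeword
        idx = int(np.argmin(dists))                  # nearest codeword in this state's codebook
        indices.append(idx)
        state = next_state(state, idx)               # state update the decoder can replicate
    return indices

The memory cost highlighted in the abstract comes from storing one codebook per state; FSRVQ's joint design of the next-state function and state codebooks is aimed at reducing that cost.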

Published in:

1995 International Conference on Acoustics, Speech, and Signal Processing (ICASSP-95), Volume 4

Date of Conference:

9-12 May 1995