
Using the self-organizing map to speed up the probability density estimation for speech recognition with mixture density HMMs

Authors: Kurimo, M.; Somervuo, P. (Neural Networks Res. Centre, Helsinki Univ. of Technol., Espoo, Finland)

This paper presents methods for improving probability density estimation in hidden Markov models for phoneme recognition by exploiting the self-organizing map (SOM) algorithm. The advantage of using the SOM stems from the approximative topology it creates among the mixture densities when the Gaussian mean vectors used as kernel centers are trained with the SOM algorithm. This topology makes neighboring mixtures respond strongly to the same inputs, so most of the nearest mixtures used to approximate the observation probability of the current input can be found in the topological neighborhood of the "winner" mixture. Knowledge of the previous winners is also used to speed up the search for new winners. Tree-search SOMs and segmental SOM training are studied, aiming at a faster search and better suitability for HMM training. The experimental framework includes mel-cepstrum features and phoneme-wise tied mixture density HMMs.
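The core speedup idea can be illustrated with a small sketch. This is not the authors' implementation; it is a minimal illustration assuming a 2-D SOM grid of Gaussian mean vectors with a shared isotropic variance and equal mixture weights. Instead of evaluating every mixture density for each observation, only the mixtures in the topological neighborhood of the previous winner are evaluated, and the winner is re-centered for the next frame. The function names (`neighborhood`, `approx_obs_logprob`) and the fixed neighborhood radius are hypothetical choices for this sketch.

```python
import numpy as np

def gaussian_logpdf(x, mean, var):
    # Log-density of an isotropic Gaussian (diagonal covariance assumed).
    d = x - mean
    return -0.5 * np.dot(d, d) / var - 0.5 * len(x) * np.log(2 * np.pi * var)

def neighborhood(idx, grid_shape, radius=1):
    # Indices of SOM units within a square neighborhood of unit `idx`
    # on a 2-D grid stored in row-major order.
    r, c = divmod(idx, grid_shape[1])
    out = []
    for dr in range(-radius, radius + 1):
        for dc in range(-radius, radius + 1):
            rr, cc = r + dr, c + dc
            if 0 <= rr < grid_shape[0] and 0 <= cc < grid_shape[1]:
                out.append(rr * grid_shape[1] + cc)
    return out

def approx_obs_logprob(x, means, var, grid_shape, prev_winner):
    # Approximate the observation log-probability by evaluating only the
    # mixtures in the topological neighborhood of the previous winner,
    # then return the new winner so the search can be re-centered.
    cand = neighborhood(prev_winner, grid_shape, radius=1)
    logs = {i: gaussian_logpdf(x, means[i], var) for i in cand}
    winner = max(logs, key=logs.get)
    # Log-sum-exp over the neighborhood mixtures (equal weights assumed).
    vals = np.array(list(logs.values()))
    m = vals.max()
    return m + np.log(np.mean(np.exp(vals - m))), winner
```

For a 3x3 grid this evaluates at most 9 mixtures per frame instead of all of them; on the larger SOMs used in practice the neighborhood is a small fraction of the codebook, which is where the speedup comes from.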

Published in:

Proceedings of the Fourth International Conference on Spoken Language Processing (ICSLP 96), Volume 1

Date of Conference:

3-6 Oct 1996