
Connectionist training of non-linear hidden Markov models for speech recognition

Author: Z. Zhao, Dept. of Electron. Syst. Eng., Essex Univ., Colchester, UK

Abstract:

Neural networks and hidden Markov models (HMMs) are compared. It is shown that conventional HMMs are equivalent to linear recurrent networks (LRNs) with time-varying weights. Inspired by the nonlinear nature of nodes in neural networks, nonlinearity is introduced into the HMMs, and a connectionist training approach is proposed to train such nonlinear HMMs. The training is discriminative when the objective function is defined as the mutual information between the observed event and the Markov model. Introducing nonlinearity also allows HMMs to be viewed from a broader perspective; for instance, the normalization of the forward probabilities can be interpreted as a kind of nonlinearity in a nonlinear HMM. The proposed training algorithm has been tested on a speaker-dependent isolated-digit recognition problem, and the test demonstrated that the discriminative power of the HMMs can be enhanced.
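The sketch below (not the paper's implementation) illustrates the observation in the abstract: the standard HMM forward recursion is a linear recurrence whose weight matrix varies with time through the emission probabilities, and normalizing the forward probabilities at each step acts as a nonlinearity on the state activations. The model parameters and observation sequence in the example are hypothetical placeholders.

import numpy as np

def forward_as_recurrent_net(pi, A, B, obs, normalize=True):
    """HMM forward pass written as a recurrent update.

    pi  : (N,)   initial state distribution
    A   : (N, N) transition probabilities, A[i, j] = P(state j | state i)
    B   : (N, M) emission probabilities,   B[j, k] = P(symbol k | state j)
    obs : sequence of observation symbols (ints in [0, M))
    """
    alpha = pi * B[:, obs[0]]            # initial "activations"
    if normalize:
        alpha = alpha / alpha.sum()      # normalization = the nonlinearity
    for o in obs[1:]:
        # Time-varying weight matrix W_t[i, j] = A[i, j] * B[j, o_t];
        # without normalization the recurrence alpha <- alpha @ W_t is purely linear.
        W_t = A * B[:, o]
        alpha = alpha @ W_t
        if normalize:
            alpha = alpha / alpha.sum()
    return alpha

# Example usage with a hypothetical 2-state, 3-symbol model.
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1],
              [0.1, 0.3, 0.6]])
print(forward_as_recurrent_net(pi, A, B, [0, 2, 1]))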

Published in:

1991 IEEE International Joint Conference on Neural Networks

Date of Conference:

18-21 November 1991