Comments on "Efficient training algorithms for HMMs using incremental estimation"

2 Author(s): W. Byrne (Dept. of Electr. & Comput. Eng., Johns Hopkins Univ., Baltimore, MD, USA); A. Gunawardana

The paper "Efficient training algorithms for HMMs using incremental estimation" by Gotoh et al. (IEEE Trans. Speech Audio Processing, vol. 6, pp. 539-548, Nov. 1998) investigated expectation-maximization (EM) procedures that increase training speed. In the present paper, we show that the claim of Gotoh et al. that these procedures are generalized EM (Dempster et al., 1977) procedures is incorrect. We discuss why this is so, give an example in which the likelihood converges nonmonotonically to a local maximum, and outline conditions that guarantee monotone convergence.
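The monotonicity property at issue can be illustrated on a toy problem. The sketch below (a hypothetical example, not from the paper; it uses a two-coin mixture rather than an HMM) runs standard batch EM and checks at every iteration that the observed-data log-likelihood never decreases, which is exactly the guarantee that the comment shows the incremental variants of Gotoh et al. need not preserve:

```python
import math
import random

random.seed(0)

# Toy data: each observation is the number of heads in n = 10 flips of a
# coin drawn uniformly from two biased coins (illustrative parameters).
true_p = (0.8, 0.3)
n = 10
data = [sum(random.random() < random.choice(true_p) for _ in range(n))
        for _ in range(200)]
# NOTE: the generator above redraws the coin per flip; regenerate properly:
data = []
for _ in range(200):
    p = random.choice(true_p)          # pick one coin per observation
    data.append(sum(random.random() < p for _ in range(n)))

def log_lik(pA, pB, w):
    """Observed-data log-likelihood of the two-coin mixture."""
    total = 0.0
    for h in data:
        la = w * pA**h * (1 - pA)**(n - h)
        lb = (1 - w) * pB**h * (1 - pB)**(n - h)
        total += math.log(la + lb)
    return total

pA, pB, w = 0.6, 0.5, 0.5              # arbitrary initialization
prev = log_lik(pA, pB, w)
for _ in range(50):
    # E-step: posterior responsibility of coin A for each observation.
    hA = tA = hB = tB = wA = 0.0
    for h in data:
        la = w * pA**h * (1 - pA)**(n - h)
        lb = (1 - w) * pB**h * (1 - pB)**(n - h)
        r = la / (la + lb)
        hA += r * h;        tA += r * (n - h)
        hB += (1 - r) * h;  tB += (1 - r) * (n - h)
        wA += r
    # M-step: re-estimate parameters from expected counts.
    pA, pB, w = hA / (hA + tA), hB / (hB + tB), wA / len(data)
    cur = log_lik(pA, pB, w)
    # Standard EM never decreases the likelihood (up to rounding error).
    assert cur >= prev - 1e-7, "EM must not decrease the likelihood"
    prev = cur
```

The assertion inside the loop is the point: for exact batch EM it holds by construction, whereas the comment exhibits an incremental schedule for which the analogous check fails.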

Published in:

IEEE Transactions on Speech and Audio Processing (Volume: 8, Issue: 6)