
MMI-based training for a probabilistic neural network


Authors: Nan Bu (Dept. of Artificial Complex Systems Engineering, Hiroshima University, Japan); Tsuji, T.; Fukuda, O.

Probabilistic neural networks (PNNs) that incorporate the Bayesian decision rule and statistical models have been widely used for pattern classification. Efficient estimation of a PNN's weights, however, remains a major problem. In this paper, we propose a new training scheme based on a discriminative criterion, maximum mutual information (MMI), and apply this method to the log-linearized Gaussian mixture network (LLGMN), which is one of the PNNs. MMI training yields a consistent estimator of the network weights and includes the conventional maximum likelihood (ML) algorithm as a special case. In addition, the dynamics of a terminal attractor (TA) are introduced for iteration control of the MMI training. Finally, the classification ability of the proposed method is examined on a pattern classification problem with electromyogram (EMG) signals, and it is found that MMI training yields better classification than the conventional ML algorithm.
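The difference between the two criteria mentioned in the abstract can be made concrete with a small sketch. Below is a hedged illustration, not the authors' implementation: class-conditional densities are reduced to single 1-D Gaussians (the paper uses a log-linearized Gaussian mixture network), and all names and toy data are assumptions for illustration. The ML criterion maximizes the log-likelihood of the data under the correct class's model, while the MMI criterion maximizes the log-posterior of the correct class, normalizing over all competing classes.

```python
import numpy as np

def log_gauss(x, mu, var):
    """Log density of a 1-D Gaussian N(mu, var) at x."""
    return -0.5 * (np.log(2 * np.pi * var) + (x - mu) ** 2 / var)

def ml_objective(x, y, mus, vars_, priors):
    """ML criterion: sum over samples of log p(x_i | correct class).
    The competing classes never enter the objective."""
    return sum(log_gauss(xi, mus[yi], vars_[yi]) for xi, yi in zip(x, y))

def mmi_objective(x, y, mus, vars_, priors):
    """MMI (discriminative) criterion: sum over samples of
    log p(correct class | x_i), i.e. the joint log-density of the
    correct class normalized over all classes."""
    total = 0.0
    for xi, yi in zip(x, y):
        # Joint log-density log p(x, c) for every class c.
        joint = np.log(priors) + np.array(
            [log_gauss(xi, m, v) for m, v in zip(mus, vars_)]
        )
        # Log-posterior of the correct class: joint minus log-evidence.
        total += joint[yi] - np.logaddexp.reduce(joint)
    return total

# Toy two-class problem (illustrative values only).
x = np.array([0.0, 2.0])
y = np.array([0, 1])
mus, vars_, priors = [0.0, 2.0], [1.0, 1.0], np.array([0.5, 0.5])

print("ML objective: ", ml_objective(x, y, mus, vars_, priors))
print("MMI objective:", mmi_objective(x, y, mus, vars_, priors))
```

Training under MMI would adjust the model parameters to increase the posterior of the correct class, which penalizes overlap with competing classes rather than only fitting each class's own density; this is the discriminative property the abstract attributes to the MMI scheme.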

Published in:

Proceedings of the International Joint Conference on Neural Networks, 2003 (Volume 4)

Date of Conference:

20-24 July 2003