Discriminative training of GMM based on Maximum Mutual Information for language identification

Authors: Qu Dan (Dept. of Signal Analyzing Eng., Inf. Eng. Univ., Zhengzhou); Wang Bingxi; Yan Honggang; Dai Guannan

In this paper, a discriminative training procedure based on maximum mutual information (MMI) for a Gaussian mixture model (GMM) language identification system is described. The idea is to find the model parameters λ that minimize the conditional entropy H_λ(C|X) of the random variable C given the random variable X, i.e., that minimize the uncertainty about which language was spoken given access to the utterance X. The method is implemented with the generalized probabilistic descent (GPD) algorithm, formulated to estimate the GMM parameters. The evaluation is conducted on the OGI multi-language telephone speech corpus. The experimental results show that such a system is very effective for language identification tasks.

Published in:

The Sixth World Congress on Intelligent Control and Automation (WCICA 2006), Volume 1
