This paper extends our previous work on large margin estimation (LME) of GMM parameters with extended Baum-Welch (EBW) for spoken language recognition. To overcome the limitation of LME that negative samples in the training set are not used in parameter estimation, we propose a soft margin estimation (SME) method in this paper. The soft margin is scaled by a loss function that measures the distance between a negative sample and the classification boundary. We formulate the constrained optimization of SME as an unconstrained optimization over both positive and negative samples using a penalty function, and update the GMM parameters with the EBW algorithm. Experiments on the NIST language recognition evaluation (LRE) 2007 task show that the SME method effectively improves on LME performance.
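As a rough illustration of the kind of objective described above (not necessarily the paper's exact formulation), a generic soft margin estimation criterion can be written as an unconstrained problem in which a hinge-style penalty replaces the hard margin constraints; here $d(O_i;\Lambda)$ denotes an assumed separation measure between sample $O_i$ and the classification boundary under model parameters $\Lambda$, $\rho$ is the soft margin, and $\lambda$ is an assumed trade-off weight:

\[
\min_{\Lambda,\;\rho > 0} \;\; \frac{\lambda}{\rho} \;+\; \frac{1}{N}\sum_{i=1}^{N} \max\bigl(0,\; \rho - d(O_i;\Lambda)\bigr)
\]

Under this form, samples with $d(O_i;\Lambda) \ge \rho$ contribute nothing, while samples inside the margin or on the wrong side of the boundary (including negative samples, which a hard-margin criterion would discard) incur a penalty proportional to their distance from the margin, so all training samples can influence the parameter update.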