Joint Parsimonious Modeling and Model Order Selection for Multivariate Gaussian Mixtures

Authors: Markley, S.C. (RKF Eng. LLC, Washington, DC, USA); Miller, D.J.

Abstract: Multivariate Gaussian mixture models (GMMs) are widely used for density estimation, model-based data clustering, and statistical classification. A difficult problem is estimating the model order, i.e., the number of mixture components, and the model structure. Use of full covariance matrices, with a number of parameters quadratic in the feature dimension, entails high model complexity and thus may lead to order underestimates, while naive Bayes mixtures may introduce model bias and lead to order overestimates. We develop a parsimonious modeling and model order and structure selection method for GMMs which allows for, and optimizes over, parameter-tying configurations across mixture components, applied to each individual parameter, including the covariates. We derive a generalized Expectation-Maximization algorithm for [Bayesian information criterion (BIC)]-based penalized-likelihood minimization. This, coupled with sequential model order reduction, forms our joint learning and model selection method. Our method searches over a rich space of models and, consistent with minimizing BIC, achieves fine-grained matching of model complexity to the available data. We have found our method to be effective and largely robust in learning accurate model orders and parameter-tying structures for simulated ground-truth mixtures. We compared against naive Bayes and standard full-covariance GMMs on several criteria: 1) model order and structure accuracy (for synthetic data sets); 2) test set log-likelihood; 3) unsupervised classification accuracy; and 4) accuracy when class-conditional mixtures are used in a plug-in Bayes classifier. Our method, which chooses model orders intermediate between those of standard and naive Bayes GMMs, gives improved accuracy with respect to each of these performance measures.
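To make the selection principle concrete, the following is a minimal sketch of BIC-based model order and structure selection for GMMs, written with scikit-learn rather than the authors' algorithm. It is an illustration only: it compares just the two extreme structures discussed in the abstract (full covariance vs. naive Bayes/diagonal) across candidate orders, whereas the paper's generalized EM method searches a much richer space of per-parameter tying configurations with sequential order reduction. The synthetic data here is a hypothetical example, not from the paper.

```python
# Sketch: pick GMM order and covariance structure by minimizing BIC,
# where BIC = -2 ln L + p ln n (p = free parameters, n = samples).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Hypothetical data: two well-separated Gaussian clusters in 4 dimensions.
X = np.vstack([
    rng.normal(loc=0.0, scale=1.0, size=(200, 4)),
    rng.normal(loc=5.0, scale=1.0, size=(200, 4)),
])

best = None
for cov_type in ("full", "diag"):   # structure candidates: standard vs. naive Bayes
    for k in range(1, 7):           # model order candidates
        gmm = GaussianMixture(n_components=k, covariance_type=cov_type,
                              random_state=0).fit(X)
        bic = gmm.bic(X)            # scikit-learn computes BIC on the fitted model
        if best is None or bic < best[0]:
            best = (bic, k, cov_type)

print(f"Selected order k={best[1]}, structure={best[2]}, BIC={best[0]:.1f}")
```

Because the full-covariance parameter count grows quadratically in the feature dimension while the diagonal count grows linearly, the BIC penalty term trades the two structures off against fit quality, which is the same complexity-matching idea the paper carries out at a much finer granularity.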

Published in:

IEEE Journal of Selected Topics in Signal Processing (Volume: 4, Issue: 3)

Date of Publication:

June 2010
