
Convex surrogates and stable message-passing: joint parameter estimation and prediction in coupled Gaussian mixture models

1 Author(s)
Wainwright, M.J.; Dept. of Electr. Eng. & Comput. Sci. & Stat., Univ. of California, Berkeley, CA

The coupled mixture of Gaussians (MoG) model is a graphical model useful for various applications in signal processing. The parameter estimation and prediction problems, though tractable for tree-structured graphs, become intractable when the local mixture models are coupled by a more complex graph with cycles. We present a joint approach to the parameter estimation and prediction/smoothing problems in a coupled MoG model on an arbitrary graph with cycles. Our method exploits a convex surrogate to the cumulant generating function, for which both the parameter estimation and prediction steps can be solved efficiently by a tree-reweighted sum-product algorithm. We prove that our methods are globally Lipschitz stable, and provide bounds on the increase in MSE relative to the (unattainable) Bayes optimum. We also present results of experimental simulations that confirm these theoretical results and show that our method outperforms the analogous method based on the ordinary sum-product algorithm.
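For readers unfamiliar with tree-reweighted message passing, the sketch below illustrates the kind of update the abstract refers to. It is not the authors' implementation and works on discrete pairwise models rather than coupled Gaussian mixtures; the function name `trw_sum_product`, the edge appearance probabilities `rho`, and all variable names are illustrative assumptions.

```python
# Minimal sketch of tree-reweighted (TRW) sum-product on a discrete pairwise
# graphical model. Assumptions: k-state variables, node log-potentials
# theta_node, edge log-potentials theta_edge, and fixed edge appearance
# probabilities rho in (0, 1]. Illustrative only; not the paper's method for
# coupled Gaussian mixture models.
import numpy as np

def trw_sum_product(theta_node, theta_edge, edges, rho, n_iters=50):
    """Run TRW sum-product; returns approximate node (pseudo-)marginals.

    theta_node : (n, k) array of node log-potentials
    theta_edge : dict mapping (u, v) -> (k, k) array indexed by (x_u, x_v)
    edges      : list of (u, v) pairs with u < v
    rho        : dict mapping (u, v) -> edge appearance probability in (0, 1]
    """
    n, k = theta_node.shape
    # messages[(u, v)] is the message from u to v, a length-k vector
    messages = {(u, v): np.ones(k) / k for u, v in edges}
    messages.update({(v, u): np.ones(k) / k for u, v in edges})
    neighbors = {i: [] for i in range(n)}
    for u, v in edges:
        neighbors[u].append(v)
        neighbors[v].append(u)

    def edge_rho(u, v):
        return rho[(u, v)] if (u, v) in rho else rho[(v, u)]

    def edge_pot(u, v):
        # Return the (k, k) edge log-potential indexed by (x_u, x_v).
        return theta_edge[(u, v)] if (u, v) in theta_edge else theta_edge[(v, u)].T

    for _ in range(n_iters):
        for u in range(n):
            for v in neighbors[u]:
                r_uv = edge_rho(u, v)
                # Reweighted incoming evidence: product over w != v of
                # M_{wu}^{rho_wu}, divided by M_{vu}^{1 - rho_uv}.
                log_in = theta_node[u].copy()
                for w in neighbors[u]:
                    if w == v:
                        continue
                    log_in += edge_rho(w, u) * np.log(messages[(w, u)])
                log_in -= (1.0 - r_uv) * np.log(messages[(v, u)])
                # Edge potential is tempered by 1/rho_uv in the TRW update;
                # sum over x_u (axis 0) in the log domain.
                log_msg = np.logaddexp.reduce(
                    edge_pot(u, v) / r_uv + log_in[:, None], axis=0)
                msg = np.exp(log_msg - log_msg.max())
                messages[(u, v)] = msg / msg.sum()

    # Approximate marginals from reweighted products of incoming messages.
    marg = np.zeros((n, k))
    for u in range(n):
        log_b = theta_node[u].copy()
        for w in neighbors[u]:
            log_b += edge_rho(w, u) * np.log(messages[(w, u)])
        b = np.exp(log_b - log_b.max())
        marg[u] = b / b.sum()
    return marg
```

Setting every edge appearance probability to 1 recovers the ordinary sum-product (loopy belief propagation) update, which is the baseline the abstract compares against.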

Published in:

2005 IEEE/SP 13th Workshop on Statistical Signal Processing

Date of Conference:

17-20 July 2005