
Maximum entropy relaxation for multiscale graphical model selection


Authors: Myung Jin Choi, V. Chandrasekaran, and A. S. Willsky
Dept. of Electrical Engineering & Computer Science, Massachusetts Institute of Technology, Cambridge, MA

We consider the problem of learning multiscale graphical models. Given a collection of variables along with covariance specifications for these variables, we introduce hidden variables and learn a sparse graphical model approximation on the entire set of variables (original and hidden). Our method for learning such models is based on maximizing entropy over an exponential family of graphical models, subject to divergence constraints on small subsets of variables. We demonstrate the advantages of our approach compared to methods that do not use hidden variables (which do not capture long-range behavior) and methods that use tree-structure approximations (which result in blocky artifacts).
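To make the idea concrete, here is a minimal sketch (not the paper's algorithm, which uses divergence-constrained entropy maximization over an exponential family) of the closely related fact that underlies maximum-entropy model selection for Gaussians: the maximum-entropy covariance consistent with specified marginal statistics on small subsets of variables has a precision (information) matrix that is zero outside those subsets. The 3-variable chain and all numbers below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Illustrative 3-variable chain 1 - 2 - 3.
# Second moments are specified only on the cliques {1,2} and {2,3}.
s11, s22, s33 = 1.0, 1.0, 1.0   # variances (assumed for the example)
s12, s23 = 0.6, 0.5             # specified cross-covariances (assumed)

# For a chain, the max-entropy completion of the unspecified entry
# sigma_13 is the one that makes X1 and X3 conditionally independent
# given X2, i.e. sigma_13 = sigma_12 * sigma_23 / sigma_22.
s13 = s12 * s23 / s22

Sigma = np.array([[s11, s12, s13],
                  [s12, s22, s23],
                  [s13, s23, s33]])

# The resulting precision matrix is sparse: the (1,3) entry vanishes,
# reflecting the missing edge in the graphical model.
J = np.linalg.inv(Sigma)
print(np.round(J, 6))
assert abs(J[0, 2]) < 1e-10
```

The paper's contribution goes beyond this exact-completion picture: it relaxes the marginal-matching requirement to divergence constraints on small subsets and introduces hidden (coarse-scale) variables, so that the learned sparse model also captures long-range dependencies.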

Published in: 2008 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2008)

Date of Conference: March 31 – April 4, 2008