
Learning probability density functions from marginal distributions with applications to Gaussian mixtures

3 Author(s)
Qutang Cai (Dept. of Autom., Tsinghua Univ., Beijing, China); Changshui Zhang; Chunyi Peng

Probability density function (PDF) estimation is a persistently important topic in fields related to artificial intelligence and machine learning. This paper addresses the problem of estimating a density function solely from its marginal distributions. The feasibility of this learning problem is first investigated, and a uniqueness proposition covering a large family of distribution functions is proposed. The learning problem is then reformulated as an optimization task, which is studied and applied to Gaussian mixture models (GMM) via the generalized expectation-maximization (GEM) procedure and Monte Carlo methods. Experimental results show that our approach for GMM, using only partial coordinate information of the samples, achieves satisfactory performance, which in turn supports the proposed reformulation and the uniqueness proposition.
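The abstract's premise rests on the fact that the coordinate marginals of a Gaussian mixture are themselves one-dimensional Gaussian mixtures with the same weights, so the marginals carry recoverable information about the joint model. The following minimal sketch (illustrative parameters, not the paper's actual method or data) verifies this numerically for a two-component 2D mixture by comparing the closed-form x-marginal against direct integration of the joint density:

```python
import numpy as np

# Hypothetical 2-component 2D Gaussian mixture (illustrative parameters only).
weights = np.array([0.4, 0.6])
means = np.array([[0.0, 0.0], [3.0, 1.0]])
covs = np.array([[[1.0, 0.5], [0.5, 2.0]],
                 [[1.5, -0.3], [-0.3, 1.0]]])

def gauss1d(x, mu, var):
    """1D Gaussian density."""
    return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

def gmm_joint(x, y):
    """Joint density of the 2D mixture at the point (x, y)."""
    p = 0.0
    for w, m, c in zip(weights, means, covs):
        d = np.array([x - m[0], y - m[1]])
        inv = np.linalg.inv(c)
        det = np.linalg.det(c)
        p += w * np.exp(-0.5 * d @ inv @ d) / (2 * np.pi * np.sqrt(det))
    return p

def gmm_marginal_x(x):
    """Closed-form x-marginal: a 1D mixture with the same weights,
    component means means[:, 0] and variances covs[:, 0, 0]."""
    return sum(w * gauss1d(x, m[0], c[0, 0])
               for w, m, c in zip(weights, means, covs))

# Numerical check: integrating the joint over y recovers the x-marginal.
x0 = 1.2
ys = np.linspace(-15.0, 15.0, 40001)
numeric = np.trapz([gmm_joint(x0, y) for y in ys], ys)
assert abs(numeric - gmm_marginal_x(x0)) < 1e-6
```

The paper's harder question is the converse: when do the marginals determine the joint density uniquely, and how can the joint be recovered in practice (here, via GEM with Monte Carlo integration)? The sketch only illustrates the forward direction that makes the problem well-posed.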

Published in:

Proceedings of the 2005 IEEE International Joint Conference on Neural Networks (IJCNN '05), Volume 2

Date of Conference:

31 July-4 Aug. 2005