Semi-supervised Feature Extraction Using Independent Factor Analysis

4 Author(s)
Oukhellou, L. (GRETTIA-IFSTTAR, Univ. Paris-Est (UPE), Noisy-le-Grand, France); Come, E.; Aknin, P.; Denoeux, T.

Efficient dimensionality reduction can involve generative latent variable models such as probabilistic principal component analysis (PPCA) or independent component analysis (ICA). Such models aim to extract a reduced set of latent variables from the original ones. In most cases, the learning of these models occurs within an unsupervised framework, where only unlabeled samples are used. In this paper, we investigate the possibility of estimating an independent factor analysis (IFA) model, and thus projecting the original data onto a lower-dimensional space, when prior knowledge of the cluster membership of some training samples is incorporated. We propose to learn this model within a semi-supervised framework in which the likelihood of both labeled and unlabeled samples is maximized by a generalized expectation-maximization (GEM) algorithm. Experimental results on real data sets are provided to demonstrate the ability of our approach to find a low-dimensional manifold with good explanatory power.
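The semi-supervised principle summarized in the abstract can be illustrated with a much simpler model. The sketch below clamps the cluster responsibilities of labeled samples in the E-step of a plain Gaussian mixture EM, so the M-step fits the parameters to the joint likelihood of labeled and unlabeled samples. It is not the authors' IFA/GEM algorithm (there is no mixing matrix or independent-source model here); the function name semi_supervised_gmm and the label convention (y = -1 for unlabeled) are illustrative assumptions.

```python
# Minimal sketch of the semi-supervised EM idea: labeled samples have their
# cluster responsibilities fixed to their known class, unlabeled samples get
# soft responsibilities from the E-step. Illustrative only -- a Gaussian
# mixture stands in for the paper's IFA model.
import numpy as np

def semi_supervised_gmm(X, y, n_clusters, n_iter=50, reg=1e-6):
    """X: (n, d) data; y: (n,) int labels, -1 for unlabeled samples."""
    n, d = X.shape
    rng = np.random.default_rng(0)
    means = X[rng.choice(n, n_clusters, replace=False)]
    covs = np.array([np.cov(X.T) + reg * np.eye(d) for _ in range(n_clusters)])
    weights = np.full(n_clusters, 1.0 / n_clusters)

    for _ in range(n_iter):
        # E-step: log responsibilities under the current parameters
        log_r = np.empty((n, n_clusters))
        for k in range(n_clusters):
            diff = X - means[k]
            cov_inv = np.linalg.inv(covs[k])
            _, logdet = np.linalg.slogdet(covs[k])
            maha = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)
            log_r[:, k] = np.log(weights[k]) - 0.5 * (maha + logdet + d * np.log(2 * np.pi))
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)

        # Clamp responsibilities of labeled samples to their known cluster
        labeled = y >= 0
        r[labeled] = 0.0
        r[labeled, y[labeled]] = 1.0

        # M-step: weighted updates of mixture weights, means, covariances
        nk = r.sum(axis=0)
        weights = nk / n
        means = (r.T @ X) / nk[:, None]
        for k in range(n_clusters):
            diff = X - means[k]
            covs[k] = (r[:, k, None] * diff).T @ diff / nk[k] + reg * np.eye(d)

    return weights, means, covs
```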

Published in:

2011 10th International Conference on Machine Learning and Applications and Workshops (ICMLA), Volume 2

Date of Conference:

18-21 Dec. 2011