
Semi-supervised logistic regression via manifold regularization

Authors:

Yu Mao (Center of Information Science and Technology, Department of Computer Science, Beijing University of Posts and Telecommunications, Beijing); Muyuan Xi; Hao Yu; Xiaojie Wang

Abstract:

In this paper, we propose a novel algorithm that extends classical probabilistic models to the semi-supervised learning framework via manifold regularization. The regularization term controls the complexity of the model as measured by the geometry of the data distribution. Specifically, the intrinsic geometric structure of the data is modeled by an adjacency graph; the graph Laplacian, analogous to the Laplace-Beltrami operator on a manifold, is then applied to smooth the data distributions. We realize the regularization framework by applying manifold regularization to conditionally trained log-linear maximum entropy models, also known as multinomial logistic regression models. Experimental evidence suggests that our algorithm exploits the geometry of the data distribution effectively and yields consistent improvements in accuracy. Finally, we give a short discussion of generalizing the manifold regularization framework to other probabilistic models.
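To make the abstract's recipe concrete, here is a minimal sketch of the general idea (not the authors' implementation): build a k-nearest-neighbour adjacency graph over labeled and unlabeled points, form the graph Laplacian L = D - W, and add the smoothness penalty f^T L f to a binary logistic regression loss. All function names, hyperparameters, and the toy data below are illustrative assumptions.

```python
import numpy as np

def knn_adjacency(X, k=5):
    """Symmetric binary k-nearest-neighbour adjacency matrix over all points."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    n = X.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        W[i, np.argsort(d2[i])[1:k + 1]] = 1.0  # skip self at index 0
    return np.maximum(W, W.T)                   # symmetrize

def graph_laplacian(W):
    """Unnormalized graph Laplacian L = D - W (analogue of Laplace-Beltrami)."""
    return np.diag(W.sum(axis=1)) - W

def fit_manifold_logreg(X, y, labeled, lam=1e-2, gamma=1e-1, lr=0.1, iters=500):
    """Binary logistic regression with a manifold (Laplacian) penalty.

    Loss = cross-entropy on labeled points
         + lam * ||w||^2                    (ambient ridge penalty)
         + (gamma / n^2) * f^T L f          (manifold smoothness penalty),
    where f = X @ w + b are scores on ALL points, labeled and unlabeled.
    Trained by plain gradient descent; `labeled` holds indices with known y.
    """
    n, d = X.shape
    L = graph_laplacian(knn_adjacency(X))
    w, b = np.zeros(d), 0.0
    Xl, yl = X[labeled], y[labeled]
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(Xl @ w + b)))   # predictions on labeled points
        f = X @ w + b                              # scores on all points
        Lf = L @ f
        grad_w = Xl.T @ (p - yl) / len(yl) + 2 * lam * w + 2 * gamma * X.T @ Lf / n**2
        grad_b = (p - yl).mean() + 2 * gamma * Lf.sum() / n**2
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b
```

With only a handful of labels, the Laplacian term pushes the decision scores to vary smoothly along the graph built from all points, which is how the unlabeled data enters the objective; the multinomial case in the paper generalizes this to a softmax likelihood.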

Published in:

2011 IEEE International Conference on Cloud Computing and Intelligence Systems

Date of Conference:

15-17 Sept. 2011