In this paper, we propose a novel algorithm that extends classical probabilistic models to the semi-supervised learning setting via manifold regularization. The regularizer controls the complexity of the model as measured by the geometry of the data distribution. Specifically, the intrinsic geometric structure of the data is modeled by an adjacency graph; the graph Laplacian, the discrete analogue of the Laplace-Beltrami operator on a manifold, is then applied to smooth the conditional distributions along the graph. We instantiate this framework by applying manifold regularization to conditionally trained log-linear maximum entropy models, also known as multinomial logistic regression models. Experimental evidence suggests that the algorithm exploits the geometry of the data distribution effectively and yields consistent improvements in accuracy. Finally, we briefly discuss generalizing the manifold regularization framework to other probabilistic models.
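The graph-based smoothing step described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes a symmetric k-nearest-neighbour graph with binary edge weights, and the function names are hypothetical. The penalty tr(F^T L F) equals one half of the weighted sum of squared differences of predictions across graph edges, which is the quantity the manifold regularizer adds to the supervised training objective.

```python
import numpy as np

def knn_adjacency(X, k=3):
    # Symmetric k-nearest-neighbour adjacency graph with binary weights
    # (an assumption for illustration; other weightings, e.g. heat kernels,
    # are common in manifold regularization).
    n = len(X)
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    W = np.zeros((n, n))
    for i in range(n):
        for j in np.argsort(dist[i])[1:k + 1]:  # skip self at index 0
            W[i, j] = W[j, i] = 1.0
    return W

def graph_laplacian(W):
    # Combinatorial graph Laplacian L = D - W, the discrete analogue of
    # the Laplace-Beltrami operator on the data manifold.
    return np.diag(W.sum(axis=1)) - W

def manifold_penalty(F, L):
    # tr(F^T L F) = 0.5 * sum_ij W_ij * ||f_i - f_j||^2:
    # penalizes model outputs F (rows = per-point prediction vectors)
    # that vary sharply between neighbouring points on the graph.
    return np.trace(F.T @ L @ F)
```

In a manifold-regularized maximum entropy model, `F` would hold the predicted class probabilities for labeled and unlabeled points, and `manifold_penalty` would be added (with a tunable weight) to the conditional log-likelihood objective.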