This paper describes a semi-supervised, regularized method for additive logistic regression. A graph-regularization term on the combined function is added to the cost functional minimized by AdaBoost; this term constrains the learned function to vary smoothly over a graph built on the labeled and unlabeled data. A gradient solution is then derived, with the advantage that the regularization parameter can be selected adaptively. Finally, the step size of each iteration is computed by Newton-Raphson iteration. Experiments on benchmark data sets show that the algorithm gives better results than existing methods.
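The procedure the abstract outlines — exponential-loss boosting with a graph-Laplacian smoothness penalty, a weak learner fit to the negative gradient of the regularized cost, and a Newton-Raphson line search for the per-round step size — can be sketched as follows. This is a minimal transductive illustration under stated assumptions, not the paper's exact algorithm: the stump weak learner, the unnormalized Laplacian `L = D - W`, and the parameter values (`lam`, `rounds`, the Gaussian affinity bandwidth) are all choices made here for concreteness.

```python
import numpy as np

def graph_laplacian(W):
    # Unnormalized graph Laplacian L = D - W from a symmetric affinity matrix W.
    return np.diag(W.sum(axis=1)) - W

def fit_stump(x, r):
    # Weak learner (assumed here): a regression stump fit to the
    # pseudo-residuals r by least squares over all thresholds of 1-D input x.
    best_err, best_pred = np.inf, np.full_like(r, r.mean())
    for t in np.unique(x):
        left = x <= t
        pred = np.empty_like(r)
        pred[left] = r[left].mean()
        if (~left).any():
            pred[~left] = r[~left].mean()
        err = np.sum((r - pred) ** 2)
        if err < best_err:
            best_err, best_pred = err, pred
    return best_pred

def boost(x, y, labeled, W, lam=0.1, rounds=20):
    # Regularized cost: sum_{i in labeled} exp(-y_i F_i) + lam * F' L F.
    # F holds the combined function's values on all (labeled + unlabeled) points.
    n = len(x)
    L = graph_laplacian(W)
    F = np.zeros(n)
    for _ in range(rounds):
        # Negative gradient of the cost w.r.t. F: exponential-loss part on the
        # labeled points plus the graph-smoothness part on every point.
        g = np.zeros(n)
        g[labeled] = y[labeled] * np.exp(-y[labeled] * F[labeled])
        g -= 2.0 * lam * (L @ F)
        h = fit_stump(x, g)  # weak learner approximates the negative gradient
        # Newton-Raphson on the step size a for the cost of F + a*h.
        a = 0.0
        for _ in range(10):
            Fa = F + a * h
            e = np.exp(-y[labeled] * Fa[labeled])
            d1 = -np.sum(y[labeled] * h[labeled] * e) + 2.0 * lam * (h @ L @ Fa)
            d2 = np.sum(h[labeled] ** 2 * e) + 2.0 * lam * (h @ L @ h)
            if d2 <= 1e-12:
                break
            a -= d1 / d2
        F = F + a * h
    return F
```

As a usage sketch, two well-separated clusters on a line with one labeled point each give a Gaussian affinity graph on which the boosted function takes the correct sign at the labeled points:

```python
x = np.array([0.0, 0.1, 0.2, 1.0, 1.1, 1.2])
y = np.array([1.0, 0.0, 0.0, -1.0, 0.0, 0.0])   # nonzero only where labeled
labeled = np.array([0, 3])
W = np.exp(-(x[:, None] - x[None, :]) ** 2 / 0.1)
F = boost(x, y, labeled, W, lam=0.1, rounds=20)  # F[0] > 0, F[3] < 0
```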