When training data is insufficient, boosting algorithms tend to overfit as more weak learners are combined into a strong classifier. In this paper, we propose a new variant of RealBoost, called W-Boost, which is based on a novel weight update scheme and uses a changeable bin number to estimate marginal distributions in weak learner design. This new boosting procedure achieves both a fast convergence rate and a small generalization error. Experimental results on synthetic data and Web image classification demonstrate the effectiveness of our approach.
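To make the RealBoost baseline concrete, the following is a minimal sketch of RealBoost with histogram (bin-based) weak learners, the kind of setup the abstract builds on. This is an illustrative implementation of standard RealBoost, not of W-Boost itself: the paper's novel weight update and changeable bin number are not reproduced here, and a fixed `n_bins` is assumed. All function and variable names are our own.

```python
import numpy as np

def realboost_train(X, y, n_rounds=10, n_bins=8):
    """Minimal RealBoost with histogram (bin-based) weak learners.

    X: (n_samples, n_features) array; y: labels in {-1, +1}.
    Each round picks the feature whose bin-wise real-valued
    response minimizes the weighted normalizer Z.
    """
    n, d = X.shape
    w = np.full(n, 1.0 / n)   # sample weights, uniform at start
    eps = 1e-10               # smoothing to avoid log(0)
    learners = []             # list of (feature index, bin edges, bin responses)

    for _ in range(n_rounds):
        best = None
        for j in range(d):
            edges = np.histogram_bin_edges(X[:, j], bins=n_bins)
            bins = np.clip(np.digitize(X[:, j], edges[1:-1]), 0, n_bins - 1)
            # Weighted class mass per bin (the estimated marginal distributions)
            Wp = np.bincount(bins, weights=w * (y == 1), minlength=n_bins)
            Wm = np.bincount(bins, weights=w * (y == -1), minlength=n_bins)
            Z = 2.0 * np.sum(np.sqrt(Wp * Wm))  # bound minimized by RealBoost
            if best is None or Z < best[0]:
                h = 0.5 * np.log((Wp + eps) / (Wm + eps))  # real-valued bin responses
                best = (Z, j, edges, h, bins)
        _, j, edges, h, bins = best
        w = w * np.exp(-y * h[bins])  # standard RealBoost weight update
        w /= w.sum()                  # renormalize to a distribution
        learners.append((j, edges, h))
    return learners

def realboost_predict(learners, X):
    """Sign of the summed real-valued weak-learner responses."""
    score = np.zeros(len(X))
    for j, edges, h in learners:
        bins = np.clip(np.digitize(X[:, j], edges[1:-1]), 0, len(h) - 1)
        score += h[bins]
    return np.sign(score)
```

The `Z` quantity is the per-round normalizer that RealBoost's weak-learner selection minimizes; with scarce data the bin-wise mass estimates `Wp`/`Wm` become noisy, which is the overfitting pressure the abstract refers to.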