An effective ensemble should consist of a set of networks that are both accurate and diverse. Ensemble learning is a technique for improving the generalization ability of unstable classifiers. We propose an improved boosting algorithm based on the cloud model for constructing neural network ensembles, in which the cloud model is used to group the trained networks into clusters according to their similarity, and the most accurate individual network is selected from each cluster to make up the ensemble. Empirical studies on regression over typical datasets show that this approach yields a significantly smaller ensemble that achieves better performance than traditional methods such as Bagging and Boosting. A bias-variance decomposition of the prediction error suggests that the success of the proposed approach lies in properly tuning the bias/variance trade-off to reduce the prediction error.
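The cluster-then-select step described above can be sketched as follows. This is a minimal illustration, not the paper's method: `KMeans` stands in for the cloud-model similarity grouping, and the function names, cluster count, and validation-error criterion are assumptions for the sake of the example.

```python
import numpy as np
from sklearn.cluster import KMeans

def select_ensemble(predictions, y_val, n_clusters=3):
    """Cluster trained networks by the similarity of their validation
    predictions, then keep the most accurate network from each cluster.

    predictions: (n_models, n_samples) array of validation outputs.
    y_val:       (n_samples,) validation targets.
    NOTE: KMeans is used here as a stand-in for the cloud-model
    classification of trained networks described in the abstract.
    """
    labels = KMeans(n_clusters=n_clusters, n_init=10,
                    random_state=0).fit_predict(predictions)
    # per-model mean squared error on the validation set
    errors = np.mean((predictions - y_val) ** 2, axis=1)
    # within each cluster, pick the index of the lowest-error model
    chosen = [int(np.argmin(np.where(labels == c, errors, np.inf)))
              for c in range(n_clusters)]
    return sorted(set(chosen))

# toy demo: six "networks" falling into two behavioral groups
rng = np.random.default_rng(0)
y = rng.normal(size=50)
preds = np.vstack([y + rng.normal(scale=s, size=50)
                   for s in (0.1, 0.2, 0.3, 1.0, 1.1, 1.2)])
subset = select_ensemble(preds, y, n_clusters=2)
ensemble_pred = preds[subset].mean(axis=0)  # averaged ensemble output
```

The selected subset is smaller than the full pool, matching the paper's observation that pruned ensembles can outperform ones that combine every trained network.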