Multiple classifier systems (MCSs) have been shown, both theoretically and empirically, to outperform a single classifier in many applications. However, many ensemble training algorithms produce a very large MCS composed of many individual classifiers. A large MCS not only consumes computational resources but can also reduce effectiveness. One solution is pruning, which reduces the number of individual classifiers in an MCS while keeping performance equal to, or only slightly worse than, that of the original ensemble. In this paper, a new pruning method for neural network ensembles based on a sensitivity measure, called NNEPSM, is proposed. Classifiers that have less impact on the final output of the MCS are removed. The advantages of this method include efficient performance, low complexity, and independence of the ensemble training method. NNEPSM has been applied to Web applications and other benchmark datasets, and the experimental results show that our approach performs well across different datasets.
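The abstract does not specify how the sensitivity measure is computed, so the following is only a minimal sketch of the general idea of sensitivity-based ensemble pruning: score each member by how much the averaged ensemble output changes when that member is removed, then keep only the most influential members. The drop-one scoring rule, the averaging combiner, and the function name `prune_by_sensitivity` are all illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def prune_by_sensitivity(member_outputs, keep):
    """Rank ensemble members by how much removing each one perturbs
    the averaged ensemble output, then keep the `keep` most influential.

    member_outputs: array of shape (n_members, n_samples) holding each
    classifier's output (e.g. a sigmoid score) on a validation set.
    NOTE: this drop-one measure is an assumed stand-in for the paper's
    sensitivity measure, which the abstract does not define.
    """
    n_members = member_outputs.shape[0]
    full = member_outputs.mean(axis=0)  # full-ensemble averaged output
    sensitivity = np.empty(n_members)
    for i in range(n_members):
        reduced = np.delete(member_outputs, i, axis=0).mean(axis=0)
        # mean absolute change in the ensemble output when member i is removed
        sensitivity[i] = np.abs(full - reduced).mean()
    # keep the members whose removal would change the output the most;
    # the rest have little impact and are pruned
    return np.argsort(sensitivity)[::-1][:keep]

# toy example: 5 members, 4 validation samples with random scores
rng = np.random.default_rng(0)
outputs = rng.random((5, 4))
selected = prune_by_sensitivity(outputs, keep=3)
print(sorted(selected.tolist()))
```

A member whose outputs closely track the ensemble mean gets a low sensitivity score and is pruned first, which matches the abstract's criterion of removing classifiers with the least impact on the final MCS output.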