Multiple classifier systems can considerably improve the recognition performance of a discrimination task, which makes them very attractive for pattern recognition products. Two aspects are especially important: first, how a powerful classifier ensemble can be generated effectively, and second, which classifier combination rule produces the best collective result. This paper proposes a new boosting strategy for generating a powerful classifier ensemble. The strategy trains the ensemble on sequentially selected learning-sample subsets. The first subset is drawn from the initial learning sample set; each subsequent subset is obtained from the previous step's subset by eliminating selected items. The selection criterion is a recognition-quality limit that divides the current subset into an error-free region and an error-containing region. The portion corresponding to the error-containing region forms the basis for training the next classifier. The sampling subset is reduced iteratively until discrimination with the last-trained classifier is almost errorless. A boosting system designed and developed in this way shows excellent reclassification performance and reduces the generalization error by about 30 percent.
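The iterative subset-reduction loop described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the base learner (a 1-D decision stump), the stopping rule, and all function names are assumptions introduced here for clarity.

```python
import numpy as np

def train_stump(X, y):
    """Fit a 1-D decision stump: pick the threshold/polarity minimizing training error."""
    best = (0.0, 1, np.inf)  # (threshold, polarity, error)
    for t in np.unique(X):
        for pol in (1, -1):
            pred = np.where(pol * (X - t) >= 0, 1, 0)
            err = np.mean(pred != y)
            if err < best[2]:
                best = (t, pol, err)
    return best

def predict_stump(model, X):
    t, pol, _ = model
    return np.where(pol * (X - t) >= 0, 1, 0)

def boost_by_subset_reduction(X, y, max_rounds=10):
    """Sketch of the strategy: each round trains a classifier on the
    error-containing portion of the previous round's subset, shrinking
    the subset until the last-trained classifier is errorless on it."""
    ensemble = []
    Xs, ys = X, y
    for _ in range(max_rounds):
        model = train_stump(Xs, ys)
        ensemble.append(model)
        wrong = predict_stump(model, Xs) != ys  # error-containing region
        if not wrong.any():                     # discrimination is errorless: stop
            break
        Xs, ys = Xs[wrong], ys[wrong]           # next subset = misclassified items
    return ensemble
```

Unlike AdaBoost-style reweighting, each stage here discards the correctly classified samples entirely, so later classifiers specialize on progressively harder regions of the learning set.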