This paper presents an interactive training method for neural network ensembles (NNEs). The proposed method trains the component neural networks (NNs) of an NNE one after another, and interactions among the NNs are maintained indirectly via an intermediate space, called the information center (IC), which manages the outputs of all previously trained NNs. The update rule for training an NN in conjunction with the IC is derived from negative correlation learning (NCL), and the proposed method is therefore called progressive NCL (pNCL). Introducing such an information center reduces the training-time interaction among component NNs. The effectiveness of the proposed method is evaluated on several benchmark classification problems, and the experimental results show that the proposed approach can improve the performance of NNEs. pNCL is also incorporated into two popular NNE methods, bagging and boosting; it is found that the performance of both algorithms can be further improved by incorporating pNCL into their training processes.
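The sequential training scheme described above can be sketched in code. The following is a minimal illustration only, not the paper's implementation: the toy regression data, network sizes, learning rate, penalty weight `lam`, and the running-sum form of the information center are all assumptions made for demonstration. The key idea shown is that the IC stores the summed outputs of previously trained networks, so the k-th network can be penalized toward negative correlation with the current ensemble estimate without retraining or directly querying its predecessors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (a stand-in for the paper's benchmark problems).
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(200)

def train_component(X, y, ic_sum, k, lam=0.5, hidden=16, lr=0.05, epochs=500):
    """Train the k-th one-hidden-layer net against the information center (IC).

    ic_sum holds the summed training-set outputs of the k-1 previously
    trained nets, so the running ensemble estimate is (ic_sum + f_k) / k.
    The per-sample error is an NCL-style objective (assumed form):
        E = 0.5 * (f_k - y)^2 - lam * (f_k - f_bar)^2
    """
    n, d = X.shape
    W1 = rng.standard_normal((d, hidden)) * 0.5
    b1 = np.zeros(hidden)
    w2 = rng.standard_normal(hidden) * 0.5
    b2 = 0.0
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)       # hidden activations
        f = h @ w2 + b2                # component output f_k
        f_bar = (ic_sum + f) / k       # ensemble estimate via the IC
        # dE/df_k, noting d(f_bar)/d(f_k) = 1/k for the current net
        g = (f - y) - 2 * lam * (f - f_bar) * (1 - 1 / k)
        # Plain full-batch gradient descent (backprop through the net).
        w2 -= lr * (h.T @ g) / n
        b2 -= lr * g.mean()
        gh = g[:, None] * w2 * (1 - h ** 2)
        W1 -= lr * (X.T @ gh) / n
        b1 -= lr * gh.mean(axis=0)
    return lambda Z: np.tanh(Z @ W1 + b1) @ w2 + b2

M = 5                         # ensemble size
ic_sum = np.zeros(len(y))     # information center: sum of trained nets' outputs
nets = []
for k in range(1, M + 1):
    net = train_component(X, y, ic_sum, k)
    nets.append(net)
    ic_sum += net(X)          # IC absorbs the newly trained net's outputs

ensemble = ic_sum / M         # simple-average ensemble prediction
mse = np.mean((ensemble - y) ** 2)
```

Because each network is trained only once and the IC keeps a running sum, the cost of maintaining diversity stays constant in the number of already-trained networks, which reflects the reduced training-time interaction claimed for pNCL.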