Training of neural network ensemble through progressive interaction

3 Author(s)
Akhand, M.A.H.; Islam, M.M.; Murase, K. (Grad. Sch. of Eng., Univ. of Fukui, Fukui)

This paper presents an interactive training method for neural network ensembles (NNEs). The proposed method trains the component neural networks (NNs) sequentially, one after another, while interactions among the NNs are maintained indirectly via an intermediate space called the information center (IC). The IC manages the outputs of all previously trained NNs. The update rule for training an NN in conjunction with the IC is derived from negative correlation learning (NCL), and the proposed method is therefore named progressive NCL (pNCL). Introducing such an information center into ensemble methods reduces the training-time interaction among component NNs. The effectiveness of the proposed method is evaluated on several benchmark classification problems, and the experimental results show that it can improve the performance of NNEs. pNCL is also incorporated into two popular NNE methods, bagging and boosting; the performance of both algorithms can be further improved by incorporating pNCL into their training processes.
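The abstract describes the core loop: train component networks one at a time, with an information center holding the summed outputs of previously trained networks so that an NCL-style penalty can be computed without all networks training simultaneously. The sketch below is a minimal illustration of that idea, not the paper's implementation: the toy data, the single-layer networks, the penalty weight `lam`, and the simplification of treating the ensemble mean as a constant in the gradient are all assumptions made for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification data (hypothetical; not from the paper).
X = rng.normal(size=(200, 5))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class InformationCenter:
    """Stores the summed outputs of all previously trained networks."""
    def __init__(self, n_samples):
        self.sum_outputs = np.zeros(n_samples)
        self.count = 0

    def add(self, outputs):
        self.sum_outputs += outputs
        self.count += 1

    def ensemble_mean(self, current_output):
        # Mean over previously trained nets plus the net being trained.
        return (self.sum_outputs + current_output) / (self.count + 1)

def train_network(X, y, ic, lam=0.5, lr=0.1, epochs=200):
    """Train one single-layer net with an NCL-style penalty against the IC."""
    w = rng.normal(scale=0.1, size=X.shape[1])
    b = 0.0
    for _ in range(epochs):
        f = sigmoid(X @ w + b)
        fbar = ic.ensemble_mean(f)
        # Gradient of 0.5*(f - y)^2 - lam*(f - fbar)^2 w.r.t. f,
        # treating fbar as constant (a common NCL simplification).
        err = (f - y) - 2.0 * lam * (f - fbar)
        grad = err * f * (1.0 - f)       # chain rule through the sigmoid
        w -= lr * X.T @ grad / len(y)
        b -= lr * grad.mean()
    return w, b

# Progressive training: nets are trained one after another; each finished
# net deposits its outputs into the IC for the next net to interact with.
ic = InformationCenter(len(y))
nets = []
for _ in range(5):
    w, b = train_network(X, y, ic)
    out = sigmoid(X @ w + b)
    ic.add(out)
    nets.append((w, b))

ensemble_pred = (ic.sum_outputs / ic.count > 0.5).astype(float)
accuracy = (ensemble_pred == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

Note that for the first network the IC is empty, so the penalty term vanishes and training reduces to plain MSE; the diversity-encouraging interaction only kicks in from the second network onward.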

Published in:

2008 IEEE International Joint Conference on Neural Networks (IJCNN 2008, IEEE World Congress on Computational Intelligence)

Date of Conference:

1-8 June 2008