Ensemble classifier design by parallel distributed implementation of genetic fuzzy rule selection for large data sets

3 Author(s)
Nojima, Yusuke; Mihara, S.; Ishibuchi, H. (Dept. of Comput. Sci. & Intell. Syst., Osaka Prefecture Univ., Sakai, Japan)

Evolutionary algorithms have been actively applied to knowledge discovery, data mining, and machine learning under the name of genetics-based machine learning (GBML). The main advantage of using evolutionary algorithms in these application areas is their flexibility: various knowledge extraction criteria such as accuracy and complexity can easily be used as fitness functions. Their main disadvantage is their heavy computational load, which makes it difficult to apply them to large data sets. Improving scalability to large data sets is therefore one of the main research issues in GBML. In our previous studies, we proposed a parallel distributed implementation of GBML and examined its effectiveness for genetic fuzzy rule selection. The key idea was to achieve a quadratic speed-up by dividing not only the population but also the training data. Training data subsets were periodically rotated over the sub-populations to prevent each sub-population from over-fitting to a specific training data subset. In this paper, we propose the use of this parallel distributed implementation for the design of ensemble classifiers. An ensemble classifier is constructed by combining base classifiers, each of which is obtained from one sub-population. Through computational experiments on parallel distributed genetic fuzzy rule selection, we examine the generalization ability of the designed ensemble classifiers under various settings of the training data subset size and the rotation frequency.
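
The island-model scheme outlined in the abstract can be sketched as follows. This is a minimal illustrative outline only: the genetic fuzzy rule selection step is replaced by a trivial placeholder classifier, and all function names and parameter values (split_data, train_on_subset, rotation_interval, etc.) are assumptions rather than the authors' implementation. Only the structure (divide the training data as well as the population, rotate the data subsets over the sub-populations, then combine one base classifier per sub-population into an ensemble) follows the description.

```python
import random
from collections import Counter

def split_data(data, n_islands):
    """Divide the training data into n_islands subsets of roughly equal size."""
    random.shuffle(data)
    return [data[i::n_islands] for i in range(n_islands)]

def train_on_subset(state, subset, generations):
    """Placeholder for evolving one sub-population of fuzzy rule sets on its
    current training data subset; here it only accumulates the labels it sees.
    (`generations` is unused in this stub.)"""
    labels = state.get("labels", [])
    labels.extend(label for _, label in subset)
    return {"labels": labels}

def finalize(state):
    """Turn one evolved sub-population into a base classifier (here: majority label)."""
    majority = Counter(state["labels"]).most_common(1)[0][0]
    return lambda x: majority

def parallel_distributed_design(data, n_islands=7, total_gens=700, rotation_interval=100):
    """Divide both the population and the training data; rotate the data subsets
    over the islands every `rotation_interval` generations so that no sub-population
    over-fits to a single subset; return one base classifier per island."""
    subsets = split_data(data, n_islands)
    states = [{} for _ in range(n_islands)]
    for _ in range(0, total_gens, rotation_interval):
        for i in range(n_islands):
            states[i] = train_on_subset(states[i], subsets[i], rotation_interval)
        subsets = subsets[1:] + subsets[:1]  # rotate training data subsets over islands
    return [finalize(s) for s in states]

def ensemble_predict(base_classifiers, x):
    """Combine the base classifiers by majority vote (one plausible combination rule)."""
    votes = Counter(clf(x) for clf in base_classifiers)
    return votes.most_common(1)[0][0]

# Example: toy (features, label) data, 7 islands, rotation every 100 generations.
toy_data = [((i,), i % 2) for i in range(100)]
ensemble = parallel_distributed_design(toy_data)
print(ensemble_predict(ensemble, (3,)))
```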

Published in:

2010 IEEE Congress on Evolutionary Computation (CEC)

Date of Conference:

18-23 July 2010