An ensemble based incremental learning framework for concept drift and class imbalance

2 Author(s)
Ditzler, G.; Polikar, R. (ECE Dept., Rowan Univ., Glassboro, NJ, USA)

We have recently introduced an incremental learning algorithm, Learn++.NSE, designed to learn in nonstationary environments, which has been shown to provide an attractive solution to a number of concept drift problems under different drift scenarios. However, Learn++.NSE relies on each classifier's error on the most recent data to weight the members of the ensemble. For balanced class distributions this approach works very well, but when faced with imbalanced data, error is no longer an acceptable measure of performance. On the other hand, the well-established SMOTE algorithm can address the class imbalance issue; however, it cannot learn in nonstationary environments. While there is some literature available on learning in nonstationary environments and on learning from imbalanced data separately, the combined problem of learning from imbalanced data coming from nonstationary environments is underexplored. Therefore, in this work we propose two modified frameworks for an algorithm that can incrementally learn from imbalanced data coming from a nonstationary environment.
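The abstract does not spell out the algorithmic details, but the combination it describes (training a new ensemble member on each incoming batch, rebalancing that batch with SMOTE-style oversampling, and weighting members by their error on the most recent data) can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' Learn++ framework: the class name, the log-inverse-error weighting, and the use of scikit-learn and imbalanced-learn are assumptions made for the example. Each call to partial_fit consumes one batch from the stream; prediction is a weighted majority vote over all members accumulated so far.

```python
# Minimal sketch (not the authors' implementation) of an incremental ensemble
# for drifting, imbalanced streams: one classifier per batch, trained on
# SMOTE-rebalanced data, with voting weights refreshed from each member's
# error on the newest raw batch.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from imblearn.over_sampling import SMOTE


class DriftImbalanceEnsemble:
    def __init__(self, eps=1e-6):
        self.members = []   # one classifier per processed batch
        self.weights = []   # voting weights, recomputed on every new batch
        self.eps = eps      # keeps the log-weight finite for perfect/terrible members

    def partial_fit(self, X, y):
        # Rebalance the current batch before training the new ensemble member.
        X_bal, y_bal = SMOTE().fit_resample(X, y)
        self.members.append(DecisionTreeClassifier().fit(X_bal, y_bal))

        # Re-weight every member by log((1 - err) / err) on the *raw* new batch,
        # so classifiers that track the current concept dominate the vote.
        weights = []
        for member in self.members:
            err = np.mean(member.predict(X) != y)
            err = min(max(err, self.eps), 1.0 - self.eps)
            weights.append(np.log((1.0 - err) / err))
        # Ignore members that perform worse than chance on the current concept.
        self.weights = np.maximum(weights, 0.0)

    def predict(self, X):
        # Weighted majority vote over all members.
        votes = np.array([m.predict(X) for m in self.members])   # (n_members, n_samples)
        classes = np.unique(votes)
        scores = np.array([
            [np.sum(self.weights[votes[:, i] == c]) for c in classes]
            for i in range(X.shape[0])
        ])
        return classes[np.argmax(scores, axis=1)]
```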

Published in:

The 2010 International Joint Conference on Neural Networks (IJCNN)

Date of Conference:

18-23 July 2010