Incremental learning and model selection under virtual concept drifting environments

Author: Yamauchi, K. (Dept. of Comput. Sci., Chubu Univ., Kasugai, Japan)

This paper presents an incremental learning and model selection method for virtual concept drifting environments, in which the prior distribution of inputs changes over time. In previous work, a statistical model of virtual concept drift was constructed, and a model-selection criterion for radial basis function neural networks (RBFNNs) under such environments was derived from that environmental model (Yamauchi 2009). However, the previous model gave no consideration to reducing the computational complexity or the storage space needed to retain learned samples for future re-learning. This study extends the previous model to one that uses less storage space: instead of retaining the real old learning samples, the extended model uses pseudo-learning samples generated by its RBFNN predecessor.
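The core idea of replacing stored samples with predecessor-generated pseudo-samples can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the Gaussian-RBF model, the least-squares fit, and the way probe inputs are sampled are all assumptions made for the sketch.

```python
import numpy as np

def rbf_features(X, centers, width):
    """Gaussian radial basis activations for each input/center pair."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_rbfnn(X, y, centers, width):
    """Fit the output weights of an RBFNN by linear least squares."""
    Phi = rbf_features(X, centers, width)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def predict(X, centers, width, w):
    return rbf_features(X, centers, width) @ w

rng = np.random.default_rng(0)
centers = rng.uniform(-1, 1, size=(10, 1))  # fixed basis centers (assumed)
width = 0.5

# Old-task data: would normally have to be stored for re-learning.
X_old = rng.uniform(-1, 1, size=(50, 1))
y_old = np.sin(3 * X_old[:, 0])
w_old = fit_rbfnn(X_old, y_old, centers, width)

# Pseudo-learning samples: probe inputs labeled by the *predecessor* RBFNN,
# so the real old samples can be discarded.
X_pseudo = rng.uniform(-1, 1, size=(30, 1))
y_pseudo = predict(X_pseudo, centers, width, w_old)

# New data arriving after the (virtual) drift in the input distribution;
# re-learn on the new samples plus the pseudo-samples only.
X_new = rng.uniform(-1, 1, size=(50, 1))
y_new = np.sin(3 * X_new[:, 0]) + 0.5
X_train = np.vstack([X_new, X_pseudo])
y_train = np.concatenate([y_new, y_pseudo])
w_new = fit_rbfnn(X_train, y_train, centers, width)
```

The storage saving comes from keeping only the compact predecessor network (here, 10 weights) rather than the full set of old training samples.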

Published in:

The 2010 International Joint Conference on Neural Networks (IJCNN)

Date of Conference:

18-23 July 2010