A Machine Learning Feature Reduction Technique for Feature Based Knowledge Systems

Author:

Daniel Yeung; Professor, The Hong Kong Polytechnic University, Kowloon, Hong Kong (csdaniel@inet.polyu.edu.hk)

A generalization error model provides theoretical support for a pattern classifier's performance in terms of prediction accuracy. However, existing models give very loose error bounds, which explains why classification systems generally rely on experimental validation for their claims about prediction accuracy. In this talk we will revisit this problem and explore the idea of developing a new generalization error model based on the assumption that only prediction accuracy on unseen points in a neighbourhood of a training point need be considered, since it is unreasonable to require a pattern classifier to accurately predict unseen points "far away" from the training samples. The new error model makes use of the concept of a sensitivity measure for a multilayer feedforward neural network (Multilayer Perceptron or Radial Basis Function Neural Network). We will demonstrate that any knowledge-based system represented by a set of features may be simplified by reducing its feature set using such a model. A number of experimental results on datasets from the UCI repository and the KDD Cup '99 competition will be presented.
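The feature-reduction idea in the abstract can be illustrated with a simple sketch. Note this is not the talk's exact sensitivity measure (which is derived analytically for MLP and RBF networks); the snippet below uses a generic perturbation-based estimate, where each feature is jittered with small Gaussian noise around the training points (the "neighbourhood" assumption) and the resulting change in the model's output is measured. The function names and the `sigma`/`keep_fraction` parameters are illustrative, not from the source.

```python
import numpy as np

def perturbation_sensitivity(predict, X, sigma=0.1, n_samples=20, seed=0):
    """Estimate per-feature sensitivity of a trained classifier.

    For each feature, perturb it with Gaussian noise of scale `sigma`
    around each training point and record the mean squared change in
    the model's output.  Features whose perturbation barely moves the
    output contribute little locally and are candidates for removal.
    """
    rng = np.random.default_rng(seed)
    base = predict(X)                 # outputs at the training points
    n, d = X.shape
    sens = np.zeros(d)
    for j in range(d):
        for _ in range(n_samples):
            Xp = X.copy()
            Xp[:, j] += rng.normal(0.0, sigma, size=n)
            sens[j] += np.mean((predict(Xp) - base) ** 2)
    return sens / n_samples

def reduce_features(predict, X, keep_fraction=0.5):
    """Rank features by sensitivity and keep the top fraction."""
    sens = perturbation_sensitivity(predict, X)
    k = max(1, int(keep_fraction * X.shape[1]))
    return np.argsort(sens)[::-1][:k]   # indices of retained features

# Toy usage: a linear "model" where only feature 0 matters, so the
# reduction should retain feature 0.
X = np.random.default_rng(1).normal(size=(50, 3))
predict = lambda X: X @ np.array([5.0, 0.01, 0.01])
kept = reduce_features(predict, X, keep_fraction=0.34)
```

For a linear model with weights w, perturbing feature j by noise of variance sigma^2 shifts the output by roughly w_j^2 * sigma^2 in mean square, so the ranking recovers the dominant feature; with a trained MLP or RBFNN the same procedure ranks features by their local influence on the decision surface.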

Published in:

2007 IEEE International Conference on Information Reuse and Integration (IRI 2007)

Date of Conference:

13-15 Aug. 2007