
Spectral Graph Optimization for Instance Reduction

Authors: Nikolaidis, K.; Rodriguez-Martinez, E.; Goulermas, J.Y.; Wu, Q.H. (Dept. of Electr. Eng. & Electron., Univ. of Liverpool, Liverpool, UK)

The operation of instance-based learning algorithms is based on storing a large set of prototypes in the system's database. However, such systems often suffer from large storage requirements, sensitivity to noise, and high computational complexity, which result in long search and response times. In this brief, we introduce a novel framework that employs spectral graph theory to efficiently partition the dataset into border and internal instances. This is achieved by using a diverse set of border-discriminating features that capture the local friend and enemy profiles of the samples. The fused information from these features is then used via a graph-cut modeling approach to generate the final dataset partitions of border and nonborder samples. The proposed method is referred to as the spectral instance reduction (SIR) algorithm. Experiments with a large number of datasets show that SIR performs competitively compared to many other reduction algorithms, in terms of both objectives of classification accuracy and data condensation.
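
To give a concrete feel for the pipeline the abstract describes, the sketch below is a minimal, non-authoritative rendition of it, not the published SIR formulation. It computes a single assumed border-discriminating feature (the ratio of mean friend distance to mean friend-plus-enemy distance among the k nearest neighbours; the paper fuses several such features), builds a similarity graph over that feature, and approximates the graph cut by thresholding the Fiedler vector of the normalized Laplacian. The function name sir_reduce and the parameter k are illustrative.

import numpy as np
from sklearn.neighbors import NearestNeighbors

def sir_reduce(X, y, k=5):
    """Return indices of instances flagged as border samples (kept prototypes).
    Illustrative sketch only; not the published SIR algorithm."""
    n = len(X)
    nn = NearestNeighbors(n_neighbors=min(k + 1, n)).fit(X)
    dist, idx = nn.kneighbors(X)
    dist, idx = dist[:, 1:], idx[:, 1:]               # drop the self-neighbour

    # Friend/enemy profile: mean distance to same-class vs. other-class neighbours.
    same = y[idx] == y[:, None]
    friend = np.nanmean(np.where(same, dist, np.nan), axis=1)
    enemy = np.nanmean(np.where(~same, dist, np.nan), axis=1)
    friend = np.nan_to_num(friend, nan=dist.max())    # no friends among the k neighbours
    enemy = np.nan_to_num(enemy, nan=dist.max())      # no enemies among the k neighbours
    border_score = friend / (friend + enemy + 1e-12)  # high => enemies nearby => border

    # Similarity graph over the border feature, then a spectral bipartition.
    diff = np.subtract.outer(border_score, border_score)
    W = np.exp(-diff ** 2 / 0.1)
    d = W.sum(axis=1)
    L = np.eye(n) - W / np.sqrt(np.outer(d, d))       # normalized graph Laplacian
    _, vecs = np.linalg.eigh(L)
    fiedler = vecs[:, 1]                              # eigenvector of the 2nd smallest eigenvalue
    part = fiedler > np.median(fiedler)

    # Keep whichever side of the cut looks more like the border group.
    keep = part if border_score[part].mean() > border_score[~part].mean() else ~part
    return np.where(keep)[0]

The returned indices would then serve as the condensed prototype set for an instance-based classifier such as k-NN.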

Published in: IEEE Transactions on Neural Networks and Learning Systems (Volume: 23, Issue: 7)