
Considerations about sample-size sensitivity of a family of edited nearest-neighbor rules


F. J. Ferri, J. V. Albert, E. Vidal (Dept. Inf. i Electron., Valencia Univ., Spain)

The edited nearest-neighbor classification rules constitute a valid alternative to k-NN rules and other nonparametric classifiers. Experimental results with synthetic and real data from various domains, and from different researchers and practitioners, suggest that some editing algorithms (especially the optimal ones) are very sensitive to the total number of prototypes considered. This paper investigates the possibility of modifying optimal editing to cope with a broader range of practical situations. Most previously introduced editing algorithms are presented in a unified form, and their different properties (and not just their asymptotic behavior) are intuitively analyzed. The results show the relative limits of applicability of different editing algorithms.
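To make the idea of editing concrete, the sketch below implements Wilson's classic editing rule, the baseline that the family of algorithms studied here generalizes (this is an illustrative example, not the modified optimal editing proposed in the paper): each prototype is removed if a leave-one-out k-NN vote among the remaining prototypes misclassifies it. The data set and function name are invented for illustration.

```python
import math
from collections import Counter

def wilson_edit(prototypes, labels, k=3):
    """Wilson's editing rule (illustrative sketch): keep only the
    prototypes that a k-NN vote over the *other* prototypes labels
    correctly. Returns the indices of the retained prototypes."""
    keep = []
    for i, (x, y) in enumerate(zip(prototypes, labels)):
        # Leave-one-out: distances from x to every other prototype.
        dists = sorted(
            (math.dist(x, p), labels[j])
            for j, p in enumerate(prototypes) if j != i
        )
        # Majority label among the k nearest remaining prototypes.
        vote = Counter(lbl for _, lbl in dists[:k]).most_common(1)[0][0]
        if vote == y:
            keep.append(i)
    return keep

# Two well-separated clusters, plus one mislabeled point sitting
# inside the "a" cluster but tagged "b" (hypothetical toy data).
X = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1),
     (5.0, 5.0), (5.1, 5.2), (4.9, 5.1),
     (0.15, 0.1)]
y = ["a", "a", "a", "b", "b", "b", "b"]  # last label is the noise point

print(wilson_edit(X, y, k=3))  # the mislabeled index 6 is edited out
```

The edited set is then used with a plain 1-NN classifier; as the abstract notes, how aggressively such rules prune depends strongly on the total number of prototypes available, which is the sensitivity this paper examines.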

Published in:

IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics) (Volume: 29, Issue: 5)