A Re-Examination of the Distance-Weighted k-Nearest Neighbor Classification Rule

3 Author(s):
J.E.S. Macleod (Department of Electronics and Electrical Engineering, University of Glasgow, Glasgow G12 8QQ, Scotland, United Kingdom), A. Luk, and D.M. Titterington

Abstract:

It was previously proved by Bailey and Jain that the asymptotic classification error rate of the (unweighted) k-nearest neighbor (k-NN) rule is lower than that of any weighted k-NN rule. Equations are developed for the classification error rate of a test sample when the number of training samples is finite, and it is argued intuitively that a weighted rule may then in some cases achieve a lower error rate than the unweighted rule. This conclusion is confirmed by analytically solving a particular simple problem, and as an illustration, experimental results are presented that were obtained using a generalized form of a weighting function proposed by Dudani.
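To make the distance-weighted rule concrete, here is a minimal sketch of a k-NN classifier using Dudani's original weighting function, in which the nearest neighbor receives weight 1 and the k-th nearest receives weight 0 (the paper's generalized form of this function is not reproduced here). The function names and toy data are illustrative, not from the paper.

```python
import math
from collections import defaultdict

def dudani_weights(dists):
    """Dudani's weights for sorted (ascending) neighbor distances:
    w_i = (d_k - d_i) / (d_k - d_1), or 1 for all i when d_k == d_1."""
    d1, dk = dists[0], dists[-1]
    if dk == d1:
        return [1.0] * len(dists)
    return [(dk - d) / (dk - d1) for d in dists]

def weighted_knn(train, query, k):
    """Classify `query` by a weighted vote among its k nearest
    training samples.  train: list of (point, label) pairs."""
    neighbors = sorted((math.dist(x, query), y) for x, y in train)[:k]
    weights = dudani_weights([d for d, _ in neighbors])
    votes = defaultdict(float)
    for w, (_, y) in zip(weights, neighbors):
        votes[y] += w
    return max(votes, key=votes.get)
```

Note that with these weights the k-th neighbor always contributes zero, so Dudani's rule with k neighbors effectively uses at most k - 1 of them; this is one of the features a generalized weighting function can adjust.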

Published in:

IEEE Transactions on Systems, Man, and Cybernetics (Volume: 17, Issue: 4)