
k_n-nearest neighbor classification



1 Author(s)

The k_n-nearest neighbor classification rule is a nonparametric classification procedure that assigns a random vector Z to one of two populations \pi_1, \pi_2 . Samples of equal size n are taken from \pi_1 and \pi_2 and are ordered separately with respect to their distance from Z = z . The rule assigns Z to \pi_1 if the distance of the k_n-th sample observation from \pi_1 to z is less than the distance of the k_n-th sample observation from \pi_2 to z ; otherwise Z is assigned to \pi_2 . This rule is equivalent to the Fix and Hodges "majority rule" [4] and to the nearest neighbor rule of Cover and Hart [3]. This paper studies some asymptotic properties of this rule, including an expression for a consistent upper bound on the probability of misclassification.
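The decision rule described in the abstract can be sketched in a few lines of NumPy. This is an illustrative implementation under the stated setup (equal-size samples from the two populations, Euclidean distance); the function name `kn_rule` and its signature are hypothetical, not from the paper.

```python
import numpy as np

def kn_rule(sample1, sample2, z, k):
    """Assign z to population 1 if the k-th nearest observation in
    sample1 is closer to z than the k-th nearest observation in
    sample2; otherwise assign it to population 2.

    sample1, sample2 : (n, d) arrays of observations from pi_1, pi_2
    z                : (d,) query point
    k                : order statistic to compare (1 <= k <= n)
    Returns 1 or 2, the assigned population.
    """
    # Sorted Euclidean distances from each sample to z.
    d1 = np.sort(np.linalg.norm(sample1 - z, axis=1))
    d2 = np.sort(np.linalg.norm(sample2 - z, axis=1))
    # Compare the k-th order statistics (k-th nearest neighbors).
    return 1 if d1[k - 1] < d2[k - 1] else 2
```

Comparing the k-th order statistics of the two distance sequences is equivalent to asking which population contributes the majority of the 2k - 1 nearest neighbors of z, which is the sense in which the rule matches the Fix and Hodges majority rule.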

Published in:

IEEE Transactions on Information Theory (Volume: 18, Issue: 5)