A class of new KNN methods for low sample problems

2 Author(s)
Parthasarathy, G.; Chatterji, B.N. (Dept. of Electron. & Electr. Commun. Eng., Indian Inst. of Technol., Kharagpur, India)

The K-nearest-neighbor (KNN) estimates proposed by D.O. Loftsgaarden and C.P. Quesenberry (1965) give unbiased and consistent estimates of the probability density function of a random variable from N observations of that variable when K, the number of nearest neighbors considered, and N, the total number of observations available, both tend to infinity such that K/N → 0. A class of new KNN estimates is proposed, formed as weighted averages of K KNN estimates, and it is shown that in small-sample problems they approximate the true probability density more closely than the traditional KNN estimates. On the basis of some experimental results, the KNN classification rules based on these estimates are shown to be suitable for small-sample classification problems.
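The idea described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' method: it uses the classical one-dimensional Loftsgaarden-Quesenberry estimate f(x) ≈ k / (N · 2r_k), where r_k is the distance from x to its k-th nearest observation, and then averages the estimates for k = 1, …, K with hypothetical uniform weights (the paper proposes a whole class of weightings, whose exact form is not given here).

```python
import numpy as np

def knn_density(x, data, k):
    """Classical 1-D KNN density estimate at point x.

    f_hat(x) = k / (N * 2 * r_k), where r_k is the distance
    from x to its k-th nearest neighbor among the N observations.
    """
    dists = np.sort(np.abs(np.asarray(data) - x))
    r_k = dists[k - 1]               # distance to k-th nearest neighbor
    n = len(data)
    return k / (n * 2.0 * r_k)

def weighted_knn_density(x, data, K, weights=None):
    """Weighted average of the first K KNN estimates at x.

    Uniform weights are an illustrative assumption; the paper
    studies a class of weightings chosen for small-sample behavior.
    """
    if weights is None:
        weights = np.full(K, 1.0 / K)
    ests = np.array([knn_density(x, data, k) for k in range(1, K + 1)])
    return float(np.dot(weights, ests))
```

For example, with 50 samples drawn from a standard normal, `weighted_knn_density(0.0, data, K=10)` combines ten individual estimates; averaging dampens the high variance of small-k estimates, which is the intuition behind the claimed small-sample improvement.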

Published in:

IEEE Transactions on Systems, Man and Cybernetics (Volume: 20, Issue: 3)