Best-case results for nearest-neighbor learning

Authors: S. Salzberg (Dept. of Comput. Sci., Johns Hopkins Univ., Baltimore, MD, USA); A. L. Delcher; D. Heath; S. Kasif

The authors propose a theoretical model for the analysis of classification methods in which the teacher knows the classification algorithm and chooses examples in the best possible way. They apply this model to the nearest-neighbor (NN) learning algorithm and develop upper and lower bounds on sample complexity for several different concept classes. For some concept classes, the sample complexity turns out to be exponential even in this best-case model, which implies that those classes are inherently difficult for the NN algorithm. The authors identify several geometric properties that make certain concepts relatively easy to learn. Finally, they discuss the relation of their work to helpful-teacher models, its application to decision-tree learning algorithms, and some of its implications for experimental work.
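To make the setting concrete, here is a minimal sketch (not the authors' code) of 1-NN classification with teacher-chosen examples. The example concept, a halfspace, is an assumption for illustration: with 1-NN, the decision boundary between two training points is their perpendicular bisector, so a teacher who knows the algorithm can induce a linear boundary with just two well-placed examples, illustrating how geometry can make a concept easy in the best-case model.

```python
import math

def nearest_neighbor_classify(examples, query):
    """Classify `query` by the label of its closest training example.

    `examples` is a list of ((x, y), label) pairs; distance is Euclidean.
    """
    _, closest_label = min(
        examples, key=lambda ex: math.dist(ex[0], query))
    return closest_label

# A "teacher" picks two examples for the (hypothetical) halfspace concept
# x >= 0.5 on the unit square. The 1-NN boundary between these two points
# is their perpendicular bisector, the line x = 0.5 -- exactly the concept.
teacher_examples = [((0.25, 0.5), "negative"), ((0.75, 0.5), "positive")]

print(nearest_neighbor_classify(teacher_examples, (0.9, 0.1)))  # positive
print(nearest_neighbor_classify(teacher_examples, (0.1, 0.9)))  # negative
```

Two examples suffice here; the paper's negative results concern concept classes where no such small teacher-chosen sample exists for NN, so the sample complexity is exponential even with a best-case teacher.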

Published in:

IEEE Transactions on Pattern Analysis and Machine Intelligence (Volume: 17, Issue: 6)