Abstract:
Most nearest neighbor (NN) classifiers employ NN search algorithms for acceleration. However, NN classification does not always require an NN search. Based on this idea, we propose a novel algorithm named k-d decision tree (KDDT). Since KDDT uses Voronoi-condensed prototypes, it consumes less memory than naive NN classifiers. Comparative experiments confirm that KDDT is much faster than NN-search-based classifiers (from 9 to 369 times faster).
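To make the stated idea concrete ("NN classification does not always require an NN search"), below is a minimal, hypothetical Python sketch, not the paper's actual algorithm: a k-d tree built over (possibly condensed) prototypes in which a leaf whose region holds prototypes of only one class returns that label directly, and only mixed leaves fall back to a small NN scan. All names (kddt_fit, kddt_predict, leaf_size) and the purity criterion are illustrative assumptions.

    # Hypothetical sketch: a k-d tree whose "pure" leaves answer the
    # classification directly, so no NN search is needed in those regions.
    import numpy as np

    class _Node:
        __slots__ = ("axis", "threshold", "left", "right",
                     "label", "prototypes", "classes")
        def __init__(self):
            self.axis = self.threshold = self.left = self.right = None
            self.label = None        # set when the leaf region is pure
            self.prototypes = None   # kept only for mixed leaves
            self.classes = None

    def _build(X, y, depth, leaf_size):
        node = _Node()
        # If every prototype in this region shares one class, the decision
        # is already determined: store the label and skip any NN search.
        if len(np.unique(y)) == 1:
            node.label = y[0]
            return node
        if len(X) <= leaf_size:
            node.prototypes, node.classes = X, y
            return node
        axis = depth % X.shape[1]
        threshold = np.median(X[:, axis])
        mask = X[:, axis] <= threshold
        if mask.all() or (~mask).all():   # degenerate split: keep mixed leaf
            node.prototypes, node.classes = X, y
            return node
        node.axis, node.threshold = axis, threshold
        node.left = _build(X[mask], y[mask], depth + 1, leaf_size)
        node.right = _build(X[~mask], y[~mask], depth + 1, leaf_size)
        return node

    def kddt_fit(X, y, leaf_size=8):
        """Build the tree over prototypes X with class labels y."""
        return _build(np.asarray(X, float), np.asarray(y), 0, leaf_size)

    def kddt_predict(node, x):
        """Descend the tree; only mixed leaves need a nearest-neighbor scan."""
        while node.label is None and node.prototypes is None:
            node = node.left if x[node.axis] <= node.threshold else node.right
        if node.label is not None:
            return node.label
        d = np.sum((node.prototypes - x) ** 2, axis=1)
        return node.classes[np.argmin(d)]

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 2))
        y = (X[:, 0] + X[:, 1] > 0).astype(int)   # toy two-class problem
        tree = kddt_fit(X, y)
        print(kddt_predict(tree, np.array([1.0, 1.0])))   # expected: 1

Note that in this sketch a "pure" k-d cell is only a heuristic shortcut; the paper instead builds on Voronoi-condensed prototypes, which is what makes the direct decision exact in the regions where the NN search can be skipped.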
Published in: Third IEEE International Conference on Data Mining
Date of Conference: 22-22 November 2003
Date Added to IEEE Xplore: 19 December 2003
Print ISBN: 0-7695-1978-4