Inducing NNC-trees with the R4-rule

Author: Qiangfu Zhao (University of Aizu, Fukushima, Japan)

Abstract:

An NNC-Tree is a decision tree (DT) in which each nonterminal node contains a nearest neighbor classifier (NNC). Compared with conventional axis-parallel DTs (APDTs), NNC-Trees can be more efficient because the decision boundary formed by an NNC is more complex than an axis-parallel hyperplane. Compared with single-layer NNCs, NNC-Trees classify the given data in a hierarchical structure, which is often useful in many applications. This paper proposes an algorithm for inducing NNC-Trees based on the R4-rule, which the author proposed earlier for finding the smallest nearest-neighbor-based multilayer perceptrons (NN-MLPs). There are two main contributions: 1) a heuristic but effective method for defining the teacher signals (group labels) of the data assigned to each nonterminal node, and 2) a modification of the R4-rule so that an NNC of proper size can be designed automatically at each nonterminal node. Experiments with several public databases show that the proposed algorithm can produce NNC-Trees effectively and efficiently.
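The abstract defines the NNC-Tree structure: a decision tree whose nonterminal nodes each contain a nearest neighbor classifier that routes samples toward subtrees. The Python sketch below is only an illustration of that classification structure under simple assumptions; the node layout, the names NNCNode and classify, and the toy prototypes are hypothetical, and the paper's actual induction procedure (the teacher-signal assignment and the modified R4-rule) is not reproduced here.

import numpy as np

class NNCNode:
    # A node of the illustrative tree. A nonterminal node holds labelled
    # prototype vectors; an incoming sample is routed to the child whose
    # nearest prototype belongs to that child's group. A leaf holds a class label.
    def __init__(self, label=None, prototypes=None, groups=None, children=None):
        self.label = label              # class label if this node is a leaf
        self.prototypes = prototypes    # (k, d) array of prototype vectors
        self.groups = groups            # (k,) array: group index of each prototype
        self.children = children or {}  # group index -> child NNCNode

    def is_leaf(self):
        return self.label is not None

def classify(node, x):
    # Route sample x down the tree until a leaf is reached.
    while not node.is_leaf():
        dists = np.linalg.norm(node.prototypes - x, axis=1)  # 1-NN decision
        group = node.groups[np.argmin(dists)]
        node = node.children[group]
    return node.label

# Toy usage: a root NNC splitting 2-D samples into two groups, each a leaf.
root = NNCNode(
    prototypes=np.array([[0.0, 0.0], [5.0, 5.0]]),
    groups=np.array([0, 1]),
    children={0: NNCNode(label="class A"), 1: NNCNode(label="class B")},
)
print(classify(root, np.array([0.5, -0.2])))   # -> class A
print(classify(root, np.array([4.8, 5.3])))    # -> class B

Because each routing decision is a nearest-prototype test rather than a single-feature threshold, the resulting split boundaries are piecewise linear rather than axis-parallel, which is the efficiency argument the abstract makes against APDTs.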

Published in:

IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), Volume 36, Issue 3