Generalized neural trees for pattern classification

2 Author(s):
Foresti, G.L. (Dept. of Math. & Comput. Sci., Udine Univ., Italy); Micheloni, C.

In this paper, a new neural tree (NT) model, the generalized NT (GNT), is presented. The main novelty of the GNT is a new training rule that performs an overall optimization of the tree: each time the tree grows by a new level, the whole tree is reevaluated. The training rule uses a weight correction strategy that takes the entire tree structure into account, and it applies a normalization procedure to the activation values of each node so that these values can be interpreted as probabilities. The weight updates are computed by minimizing a cost function that represents a measure of the overall probability of correct classification. Significant results on both synthetic and real data have been obtained by comparing the classification performance of multilayer perceptrons (MLPs), NTs, and GNTs. In particular, the GNT model displays good classification performance on training sets with complex distributions. Moreover, its structure provides a straightforward probabilistic interpretation of the pattern classification task and allows small neural trees with good generalization properties to be grown.
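The abstract describes two mechanisms: normalizing each node's activation values so they can be read as probabilities, and updating weights by minimizing a cost tied to the overall probability of correct classification. A common way to realize both ideas at a single node is a softmax-normalized layer trained by gradient descent on the negative log-probability of the correct class. The sketch below is only illustrative of that idea; the paper's actual GNT rule propagates corrections through the entire tree and is not reproduced here, and all names (`SoftmaxNode`, `train_step`) are hypothetical.

```python
import numpy as np

def softmax(z):
    """Normalize raw activations into a probability distribution."""
    z = z - z.max()              # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

class SoftmaxNode:
    """Single decision node: linear map + softmax, trained on -log p(y)."""

    def __init__(self, n_inputs, n_classes, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(scale=0.1, size=(n_classes, n_inputs))
        self.b = np.zeros(n_classes)
        self.lr = lr

    def predict_proba(self, x):
        # Activation values normalized so they sum to 1 (probabilities).
        return softmax(self.W @ x + self.b)

    def train_step(self, x, y):
        """One gradient step minimizing -log p(correct class)."""
        p = self.predict_proba(x)
        grad = p.copy()
        grad[y] -= 1.0           # gradient of -log p_y w.r.t. the logits
        self.W -= self.lr * np.outer(grad, x)
        self.b -= self.lr * grad
        return -np.log(p[y])     # current loss for this sample

# Toy usage: two linearly separable classes.
node = SoftmaxNode(n_inputs=2, n_classes=2)
data = [(np.array([1.0, 0.0]), 0), (np.array([0.0, 1.0]), 1)]
for _ in range(200):
    for x, y in data:
        node.train_step(x, y)
```

A full neural tree would split the training set by each node's predicted class and recursively grow child nodes on the subsets, with the GNT additionally reevaluating all existing weights whenever a new level is added.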

Published in:

IEEE Transactions on Neural Networks (Volume: 13, Issue: 6)