Optimal partitioning for classification and regression trees

1 Author(s)
Chou, P. A.; Dept. of Electr. Eng., Stanford Univ., CA, USA

An iterative algorithm that finds a locally optimal partition for an arbitrary loss function, in time linear in N per iteration, is presented. The algorithm is a K-means-like clustering algorithm that uses as its distance measure a generalization of Kullback's information divergence. Moreover, it is proven that the globally optimal partition must satisfy a nearest-neighbor condition, using divergence as the distance measure. These results generalize similar results of L. Breiman et al. (1984) to an arbitrary number of classes or regression variables and to an arbitrary number of bins. Experimental results on a text-to-speech example are provided, and additional applications of the algorithm, including the design of variable combinations, surrogate splits, composite nodes, and decision graphs, are suggested.
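
As a rough illustration of the K-means-like alternation the abstract describes, the Python sketch below alternates a nearest-neighbor assignment step (with divergence as the distance) and a centroid-update step. It is not the paper's implementation: the names (kl_divergence, divergence_partition, cond_dists) are invented for this example, it uses plain Kullback divergence rather than the generalized divergence for an arbitrary loss function, and initialization is naive. Each iteration costs time linear in the number of feature values, consistent with the linear-in-N claim.

    import numpy as np

    def kl_divergence(p, q, eps=1e-12):
        # Kullback's information divergence D(p || q) for discrete distributions.
        p = np.clip(p, eps, None)
        q = np.clip(q, eps, None)
        return float(np.sum(p * np.log(p / q)))

    def divergence_partition(cond_dists, weights, k, n_iters=100, seed=0):
        # cond_dists: (n, c) array; row x holds the class distribution p(class | x)
        # weights:    (n,) array of feature-value probabilities p(x)
        # k:          number of bins in the partition
        rng = np.random.default_rng(seed)
        n = cond_dists.shape[0]
        # Naive initialization: k distinct conditional distributions as centroids.
        centroids = cond_dists[rng.choice(n, size=k, replace=False)].copy()
        assign = None
        for _ in range(n_iters):
            # Nearest-neighbor step: divergence plays the role of distance.
            new_assign = np.array([
                min(range(k), key=lambda j: kl_divergence(p, centroids[j]))
                for p in cond_dists
            ])
            if assign is not None and np.array_equal(new_assign, assign):
                break  # partition stopped changing: locally optimal
            assign = new_assign
            # Centroid step: each bin's centroid becomes the weighted average
            # of the conditional distributions assigned to it.
            for j in range(k):
                members = assign == j
                if members.any():
                    centroids[j] = np.average(
                        cond_dists[members], axis=0, weights=weights[members])
        return assign, centroids

    # Usage example: partition 10 feature values with 3 classes into k=2 bins.
    dists = np.random.default_rng(1).dirichlet(np.ones(3), size=10)
    bins, cents = divergence_partition(dists, np.full(10, 0.1), k=2)

The termination test on the assignment vector mirrors the structure of Lloyd-style clustering: once no value changes bins, neither step can further decrease the objective, so the partition is locally optimal.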

Published in:

IEEE Transactions on Pattern Analysis and Machine Intelligence (Volume: 13, Issue: 4)