Properties of the k-norm pruning algorithm for decision tree classifiers

Authors: Mingyu Zhong (Sch. of EECS, Univ. of Central Florida, Orlando, FL); M. Georgiopoulos; G.C. Anagnostopoulos

Pruning is one of the key procedures in training decision tree classifiers. It removes trivial rules from the raw knowledge base built from the training examples, so that the model does not over-fit to noisy, conflicting, or fuzzy inputs and the refined tree generalizes better to unseen cases. In this paper, we present a number of properties of k-norm pruning, a recently proposed pruning algorithm with a clear theoretical interpretation. An earlier paper showed that k-norm pruning compares very favorably, in terms of accuracy and tree size, with minimal cost-complexity pruning and error-based pruning, two of the most cited decision tree pruning methods; it also showed that k-norm pruning is more efficient, at times by orders of magnitude, than either of those methods. In this paper, we demonstrate the validity of the k-norm properties through a series of theorems and explain their practical significance.
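
The abstract names minimal cost-complexity pruning as one of the baselines that k-norm pruning is compared against. The sketch below illustrates that baseline only, using scikit-learn's cost_complexity_pruning_path / ccp_alpha interface; it does not implement k-norm pruning itself, whose definition is given in the paper. The dataset and random seed are illustrative assumptions.

# Minimal sketch of minimal cost-complexity pruning, the baseline method
# mentioned in the abstract (NOT the paper's k-norm algorithm).
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Grow the full, unpruned tree and compute its pruning path:
# each ccp_alpha on the path corresponds to one nested subtree.
full_tree = DecisionTreeClassifier(random_state=0)
path = full_tree.cost_complexity_pruning_path(X_train, y_train)

# Refit at each alpha; larger alphas yield smaller (more pruned) trees.
for alpha in path.ccp_alphas:
    pruned = DecisionTreeClassifier(random_state=0, ccp_alpha=alpha)
    pruned.fit(X_train, y_train)
    print(f"alpha={alpha:.5f}  leaves={pruned.get_n_leaves()}  "
          f"test acc={pruned.score(X_test, y_test):.3f}")

In practice the final alpha is chosen by cross-validation; the trade-off between tree size and accuracy that this loop prints is exactly the axis along which the paper compares k-norm pruning to this baseline.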

Published in:

19th International Conference on Pattern Recognition (ICPR 2008)

Date of Conference:

8-11 Dec. 2008