Decision tree learning is one of the most widely used and practical methods for inductive inference. A fundamental issue in decision tree induction is the attribute selection measure applied at each non-terminal node of the tree. However, the existing literature has not adequately accounted for both classification ability and cost sensitivity. In this paper, we present a new attribute selection strategy for cost-sensitive decision tree induction that trades off an attribute's information content against cost-sensitive learning, including misclassification costs and test costs expressed in different units. Experimental results on UCI datasets show that our method outperforms existing methods, such as the information gain method and total-cost methods, in reducing misclassification costs under different missing rates and various cost settings.
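To make the idea of trading off information content against test cost concrete, the sketch below implements one classical cost-sensitive splitting criterion, the EG2-style score (2^Gain − 1)/(TestCost + 1)^w, which rewards information gain while penalizing expensive attributes. This is a well-known heuristic chosen here for illustration only; it is not necessarily the exact measure proposed in the paper, and the toy data, attribute costs, and weight `w` are hypothetical.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(rows, labels, attr):
    """Information gain of splitting on attribute index `attr`."""
    n = len(labels)
    groups = {}
    for row, y in zip(rows, labels):
        groups.setdefault(row[attr], []).append(y)
    return entropy(labels) - sum(len(g) / n * entropy(g) for g in groups.values())

def eg2_score(rows, labels, attr, test_cost, w=1.0):
    """EG2-style cost-sensitive criterion: rewards gain, penalizes test cost.
    One classical heuristic; the paper's own measure may differ."""
    gain = info_gain(rows, labels, attr)
    return (2 ** gain - 1) / ((test_cost + 1) ** w)

# Toy example (hypothetical data): two binary attributes with different test costs.
rows = [(0, 0), (0, 1), (1, 0), (1, 1), (1, 1), (0, 0)]
labels = ["no", "no", "yes", "yes", "yes", "no"]
costs = [5.0, 1.0]  # attribute 0 is perfectly informative but expensive to test
best = max(range(2), key=lambda a: eg2_score(rows, labels, a, costs[a]))
```

In this toy case the gain of attribute 0 is large enough to outweigh its higher test cost; raising the cost-sensitivity weight `w` would shift selection toward the cheaper attribute, which is exactly the kind of trade-off a cost-sensitive splitting measure controls.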