Abstract:
Decision Tree is a well-accepted supervised classifier in machine learning. It splits the given data points based on features, comparing each against a threshold value. In general, a single predefined splitting criterion is used, which may lead to poor performance. To this end, in this paper we investigate a joint splitting criterion using two of the most widely used criteria, i.e., Information Gain and Gini index. We propose to split the data points where Information Gain is maximum and the Gini index is minimum. The proposed approach is rigorously tested and compared by constructing random forests from the resulting decision trees. All experiments are performed on UCI machine learning datasets.
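The joint criterion described in the abstract can be sketched as follows. This is an illustrative assumption, not the paper's exact formulation: the abstract does not state how "Information Gain maximum and Gini index minimum" is resolved when the two criteria disagree, so the sketch below combines them as a simple difference (gain minus weighted child Gini) when scoring candidate thresholds.

```python
import numpy as np

def entropy(y):
    """Shannon entropy of a label array y."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def gini(y):
    """Gini impurity of a label array y."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def split_scores(x, y, threshold):
    """Information Gain and weighted child Gini index for splitting
    feature values x at the given threshold."""
    left, right = y[x <= threshold], y[x > threshold]
    if len(left) == 0 or len(right) == 0:
        return 0.0, gini(y)  # degenerate split: no gain, impurity unchanged
    w_l, w_r = len(left) / len(y), len(right) / len(y)
    info_gain = entropy(y) - (w_l * entropy(left) + w_r * entropy(right))
    gini_split = w_l * gini(left) + w_r * gini(right)
    return info_gain, gini_split

def best_joint_split(x, y):
    """Choose the threshold that jointly maximizes Information Gain and
    minimizes the Gini index. The combination rule (gain - gini) is an
    assumption made for illustration."""
    best_t, best_score = None, -np.inf
    for t in np.unique(x)[:-1]:  # candidate thresholds between distinct values
        gain, g = split_scores(x, y, t)
        score = gain - g
        if score > best_score:
            best_t, best_score = t, score
    return best_t
```

For a perfectly separable feature such as `x = [1, 2, 3, 4]` with labels `y = [0, 0, 1, 1]`, the threshold `2` yields maximal Information Gain (1.0 bit) and zero child Gini impurity, so both criteria agree and the joint rule selects it.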
Published in: TENCON 2018 - 2018 IEEE Region 10 Conference
Date of Conference: 28-31 October 2018
Date Added to IEEE Xplore: 24 February 2019