
Investigation of a Joint Splitting Criteria for Decision Tree Classifier Use of Information Gain and Gini Index


Abstract:

Decision Tree is a well-accepted supervised classifier in machine learning. It splits the given data points based on features, considering a threshold value for each split. In general, a single predefined splitting criterion is used, which may lead to poor performance. To this end, in this paper we investigate a joint splitting criterion that combines two of the most widely used criteria, i.e., Information Gain and the Gini index. We propose to split the data points where the Information Gain is maximum and the Gini index is minimum. The proposed approach is rigorously tested and compared by constructing decision-tree-based random forests. All experiments are performed on UCI machine learning datasets.
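The abstract does not specify how the two criteria are combined when they disagree, so the following is only a minimal sketch of one plausible joint rule: score every candidate (feature, threshold) split by both Information Gain and the weighted child Gini index, then pick the candidate with the best combined rank (high gain, low impurity). All function names, the rank-sum combination, and the synthetic demo data are illustrative assumptions, not the authors' published procedure.

```python
import numpy as np

def entropy(y):
    """Shannon entropy of a label vector."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def gini(y):
    """Gini impurity of a label vector."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def split_scores(X, y, feature, threshold):
    """Information Gain and weighted child Gini index for one candidate split."""
    left = X[:, feature] <= threshold
    right = ~left
    if left.sum() == 0 or right.sum() == 0:
        return -np.inf, np.inf  # degenerate split: worst possible scores
    n = len(y)
    w_l, w_r = left.sum() / n, right.sum() / n
    info_gain = entropy(y) - (w_l * entropy(y[left]) + w_r * entropy(y[right]))
    gini_index = w_l * gini(y[left]) + w_r * gini(y[right])
    return info_gain, gini_index

def best_joint_split(X, y):
    """Assumed joint rule: rank candidates by Information Gain (descending)
    and by Gini index (ascending), and return the split with the smallest
    rank sum, i.e. jointly high gain and low impurity."""
    candidates, scores = [], []
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f])[:-1]:  # drop max value to avoid empty right child
            ig, gi = split_scores(X, y, f, t)
            candidates.append((f, t))
            scores.append((ig, gi))
    scores = np.array(scores)
    ig_rank = np.argsort(np.argsort(-scores[:, 0]))  # rank 0 = highest gain
    gi_rank = np.argsort(np.argsort(scores[:, 1]))   # rank 0 = lowest Gini index
    best = np.argmin(ig_rank + gi_rank)
    return candidates[best], scores[best]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))
    y = (X[:, 1] > 0.2).astype(int)  # label driven by feature 1
    (feat, thr), (ig, gi) = best_joint_split(X, y)
    print(f"feature={feat}, threshold={thr:.3f}, IG={ig:.3f}, Gini={gi:.3f}")
```

In a random forest built on such trees, this joint selection would simply replace the single-criterion split search at each node; other combination rules (e.g., requiring both criteria to agree, or a weighted sum of the two scores) are equally consistent with the abstract.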
Date of Conference: 28-31 October 2018
Date Added to IEEE Xplore: 24 February 2019
Conference Location: Jeju, Korea (South)
