In the literature, multi-class SVMs are constructed using one-against-all, one-against-one, and decision-tree-based schemes, the latter typically built with Euclidean or Mahalanobis distance. To maintain high generalization ability, the most separable classes should be separated at the upper nodes of the decision tree. Among statistical measures, information gain, gini index, and chi-square are commonly used class separability measures in the pattern recognition community. In this paper, we determine the structure of the decision tree SVM using information gain, gini index, and chi-square, and evaluate the resulting classifiers. We show that the decision-tree-based SVM requires less computation time than the conventional one-against-all SVM. Experimental results on UCI repository datasets demonstrate better or equivalent classification accuracy for the proposed decision tree scheme compared with the conventional one-against-all SVM on most datasets. The proposed scheme also outperforms the conventional one-against-all SVM in computation time, for both the training and testing phases, under all three measures used to determine the tree structure.
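The three separability measures named above can be sketched for a candidate binary partition of the training labels, as would arise when deciding which group of classes to split off at a tree node. This is a minimal illustration using the standard textbook formulas; it does not reproduce the paper's exact scoring procedure, and the function names are hypothetical:

```python
import math
from collections import Counter

def entropy(labels):
    # Shannon entropy (base 2) of a list of class labels.
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def gini(labels):
    # Gini index (impurity) of a list of class labels; 0 means pure.
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def information_gain(labels, left, right):
    # Entropy reduction achieved by splitting `labels` into `left` and `right`.
    n = len(labels)
    return (entropy(labels)
            - (len(left) / n) * entropy(left)
            - (len(right) / n) * entropy(right))

def chi_square(left, right):
    # Chi-square statistic of the partition-vs-class contingency table.
    classes = set(left) | set(right)
    n = len(left) + len(right)
    stat = 0.0
    for part in (left, right):
        for c in classes:
            observed = part.count(c)
            expected = len(part) * (left.count(c) + right.count(c)) / n
            if expected > 0:
                stat += (observed - expected) ** 2 / expected
    return stat

# A perfectly separable split: each side contains a single class.
labels = ["a", "a", "b", "b"]
left, right = ["a", "a"], ["b", "b"]
print(information_gain(labels, left, right))  # → 1.0 (full entropy recovered)
print(gini(left))                             # → 0.0 (pure node)
print(chi_square(left, right))                # → 4.0 (equals n for a perfect split)
```

Under such a scoring scheme, a candidate split that cleanly isolates the most separable classes maximizes all three measures, which is what places those classes at the upper nodes of the tree.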