
Improving SVM Learning Accuracy with AdaBoost



Abstract:

The support vector machine (SVM) is based on Vapnik-Chervonenkis (VC) theory and the principle of structural risk minimization. For learning domains that require higher accuracy, the standard SVM can be improved further. This paper describes Boost-SVM, an algorithm that embeds SVM into the AdaBoost framework to improve the learning accuracy of SVM. By changing the weights of the training examples in the re-sampling process of AdaBoost, the boosted SVM becomes more accurate. Experimental results show that the proposed method has competitive learning ability and achieves better accuracy than SVM alone.
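The re-weighting and re-sampling idea summarized in the abstract can be illustrated with a short sketch. The following is a minimal discrete-AdaBoost-style loop with SVM base learners, not the paper's exact Boost-SVM procedure; the RBF-kernel SVC base learner, binary labels in {-1, +1}, and the bootstrap-style weighted re-sampling are assumptions made for illustration.

```python
# Minimal AdaBoost-with-SVM sketch (illustrative; not the paper's exact Boost-SVM).
# Assumes binary labels y in {-1, +1} and an RBF-kernel SVC base learner.
import numpy as np
from sklearn.svm import SVC

def boost_svm(X, y, n_rounds=10, random_state=0):
    rng = np.random.default_rng(random_state)
    n = len(y)
    w = np.full(n, 1.0 / n)            # example weights, initially uniform
    learners, alphas = [], []
    for _ in range(n_rounds):
        # Re-sample the training set according to the current example weights,
        # as in the re-sampling variant of AdaBoost described in the abstract.
        idx = rng.choice(n, size=n, replace=True, p=w)
        clf = SVC(kernel="rbf", gamma="scale").fit(X[idx], y[idx])
        pred = clf.predict(X)
        err = np.sum(w * (pred != y))  # weighted training error on the full set
        if err >= 0.5:                 # base learner no better than chance: stop
            break
        err = max(err, 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        # Increase weights of misclassified examples, decrease the rest.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        learners.append(clf)
        alphas.append(alpha)
    return learners, alphas

def boost_svm_predict(learners, alphas, X):
    # Final hypothesis: weighted vote of the SVM base learners.
    score = sum(a * clf.predict(X) for a, clf in zip(alphas, learners))
    return np.sign(score)
```

In this sketch the weight update concentrates the next round's re-sampled training set on examples the current SVM misclassifies, which is the mechanism the abstract credits for the accuracy gain over a single SVM.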
Date of Conference: 18-20 October 2008
Date Added to IEEE Xplore: 07 November 2008
Print ISBN: 978-0-7695-3304-9


Conference Location: Jinan, China

