Improving the Performance of Support Vector Machines by Learning Feature Maps


Authors:
Wada, K.; Saito, H.; Tsukahara, H.; Chao, J. — Dept. of Electr., Electron. & Commun. Eng., Chuo Univ., Tokyo

Support vector machines are known for their high generalization capability and have been successfully applied to various classification and regression problems by employing kernel techniques, which define nonlinear feature maps from a low-dimensional input space into a very high-dimensional feature space. Kernel techniques make it possible to work in the implicitly introduced feature space without explicitly computing the map. However, kernel functions are typically chosen without specific insight into the problem at hand. Given an explicit feature map, a kernel function can naturally be defined as the inner product between pairs of data points in the feature space. This paper proposes an approach that acquires optimal feature maps, realizing both linear separability and margin maximization, by adaptive learning on training data.
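The relationship the abstract describes — an explicit feature map inducing a kernel via the feature-space inner product — can be illustrated with a minimal sketch. The example below (not the paper's learned feature maps, just the standard textbook correspondence) uses the explicit degree-2 polynomial map for 2-D inputs and checks that its induced kernel matches the implicit polynomial kernel K(x, y) = (x·y)²:

```python
import numpy as np

def feature_map(x):
    # Explicit degree-2 polynomial feature map for a 2-D input x = (x1, x2):
    # phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2)
    x1, x2 = x
    return np.array([x1 ** 2, np.sqrt(2.0) * x1 * x2, x2 ** 2])

def kernel_from_map(x, y):
    # Kernel induced by the explicit map: inner product in feature space
    return feature_map(x) @ feature_map(y)

def poly2_kernel(x, y):
    # The same kernel computed implicitly (the "kernel trick"):
    # K(x, y) = (x . y)^2, with no explicit trip through feature space
    return np.dot(x, y) ** 2

x = np.array([1.0, 2.0])
y = np.array([3.0, 0.5])
print(kernel_from_map(x, y))  # → 16.0
print(poly2_kernel(x, y))     # → 16.0
```

The paper's direction is the reverse of the usual practice: instead of picking a kernel and accepting the implicit map, it learns the map itself so that separability and margin are optimized for the given training data.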

Published in:

2005 International Conference on Neural Networks and Brain (ICNN&B '05), Volume 3

Date of Conference:

13-15 Oct. 2005