
Fast Bayesian support vector machine parameter tuning with the Nystrom method

Authors: Gold, C. (Comput. & Neural Syst., California Inst. of Technol., Pasadena, CA, USA); Sollich, P.

We experiment with speeding up a Bayesian method for tuning the hyperparameters of a support vector machine (SVM) classifier. The Bayesian approach gives the gradients of the evidence as averages over the posterior, which can be approximated using hybrid Monte Carlo (HMC) simulation. By using the Nystrom approximation to the SVM kernel, our method significantly reduces the dimensionality of the space to be simulated in the HMC. We show that this speeds up the running time of the HMC simulation from O(n²) (with a large prefactor) to effectively O(n), where n is the number of training samples. We conclude that the Nystrom approximation has an almost insignificant effect on the performance of the algorithm when compared to the full Bayesian method, and gives excellent performance in comparison with other approaches to hyperparameter tuning.
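The dimensionality reduction referred to in the abstract comes from the Nystrom method itself: an n×n kernel matrix is approximated from a small set of m landmark points, giving a rank-m representation that scales as O(nm) rather than O(n²). The sketch below is not the authors' HMC-based evidence optimization; it only illustrates the underlying Nystrom kernel approximation, assuming an RBF kernel and uniformly sampled landmarks (both choices are illustrative assumptions, not taken from the paper).

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    """RBF kernel matrix between rows of X and Y: exp(-gamma * ||x - y||^2)."""
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def nystrom_features(X, m, gamma, rng=None):
    """Rank-m Nystrom feature map Phi such that Phi @ Phi.T approximates the full kernel."""
    rng = np.random.default_rng(rng)
    n = X.shape[0]
    idx = rng.choice(n, size=m, replace=False)       # landmark subset
    K_nm = rbf_kernel(X, X[idx], gamma)              # n x m cross-kernel
    K_mm = rbf_kernel(X[idx], X[idx], gamma)         # m x m landmark kernel
    # Symmetric inverse square root of K_mm, with a small jitter for numerical stability
    evals, evecs = np.linalg.eigh(K_mm + 1e-10 * np.eye(m))
    K_mm_inv_sqrt = evecs @ np.diag(1.0 / np.sqrt(np.maximum(evals, 1e-12))) @ evecs.T
    return K_nm @ K_mm_inv_sqrt                      # n x m features

# Example: compare the full kernel to its rank-m Nystrom approximation
X = np.random.default_rng(0).normal(size=(500, 5))
K_full = rbf_kernel(X, X, gamma=0.5)                 # O(n^2) storage
Phi = nystrom_features(X, m=50, gamma=0.5, rng=0)    # O(n m) storage
err = np.linalg.norm(K_full - Phi @ Phi.T) / np.linalg.norm(K_full)
print(f"relative approximation error: {err:.3f}")
```

In this sketch the m-dimensional features Phi are what would replace the full n-dimensional kernel representation in a downstream sampler, which is the source of the O(n²) to O(n) speed-up the abstract describes.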

Published in:

Proceedings of the 2005 IEEE International Joint Conference on Neural Networks (IJCNN '05), Vol. 5

Date of Conference:

31 July - 4 August 2005