Empirical error based optimization of SVM kernels: application to digit image recognition

3 Author(s)

We address the problem of optimizing kernel parameters in support vector machine modeling, especially when the number of parameters is greater than one, as in polynomial kernels and KMOD, our newly introduced kernel. The present work is an extended experimental study of the framework proposed by Chapelle et al. (2001) for optimizing SVM kernels using an analytic upper bound of the error. However, our optimization scheme minimizes an empirical error estimate using a quasi-Newton optimization method. To assess our method, the approach is further used to adapt the KMOD, RBF, and polynomial kernels on synthetic data and on the NIST database. The method shows much faster convergence, with satisfactory results, in comparison with the simple gradient descent method.
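The general scheme described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it tunes an RBF kernel's two parameters (gamma and the regularization constant C) by minimizing a cross-validation error estimate with the quasi-Newton L-BFGS-B optimizer. Note that the plain cross-validation error is piecewise constant, so a production method (like the paper's) would use a smoothed, differentiable error estimate; the dataset and all parameter values here are placeholders.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Synthetic two-class problem standing in for the paper's synthetic data.
X, y = make_classification(n_samples=200, n_features=5, random_state=0)

def empirical_error(log_params):
    # Optimize in log-space so gamma and C stay positive.
    gamma, C = np.exp(log_params)
    clf = SVC(kernel="rbf", gamma=gamma, C=C)
    # 5-fold cross-validation error as the empirical error estimate.
    return 1.0 - cross_val_score(clf, X, y, cv=5).mean()

# Quasi-Newton optimization: L-BFGS-B builds an approximate Hessian from
# gradients (estimated here by finite differences).
res = minimize(empirical_error, x0=np.log([0.1, 1.0]), method="L-BFGS-B")
gamma_opt, C_opt = np.exp(res.x)
print(f"gamma={gamma_opt:.4f}, C={C_opt:.4f}, CV error={res.fun:.3f}")
```

The same loop extends directly to multi-parameter kernels (polynomial degree and offset, or KMOD's two parameters) by enlarging the vector passed to the optimizer.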

Published in:

Proceedings of the Eighth International Workshop on Frontiers in Handwriting Recognition, 2002

Date of Conference: