
Pruning error minimization in least squares support vector machines

Authors: B.J. de Kruif and T.J.A. de Vries, Drebbel Institute for Mechatronics, University of Twente, Enschede, Netherlands

The support vector machine (SVM) is a method for classification and for function approximation. This method commonly makes use of an ε-insensitive cost function, meaning that errors smaller than ε are not penalized. As an alternative, a least squares support vector machine (LSSVM) uses a quadratic cost function. When the LSSVM method is used for function approximation, a nonsparse solution is obtained. Sparseness is imposed by pruning, i.e., by recursively solving the approximation problem and subsequently omitting data that had a small error in the previous pass. However, a small approximation error in the previous pass does not reliably predict what the error will be after the sample has been omitted. In this paper, a procedure is introduced that selects from a data set the training sample whose omission will introduce the smallest approximation error. It is shown that this pruning scheme outperforms the standard one.
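To make the pruning idea concrete, the sketch below trains an LS-SVM for function approximation and then repeatedly removes the training sample whose omission increases the approximation error over the full data set the least. This is a brute-force, illustrative version only: it re-solves the LS-SVM for every candidate sample, whereas the paper derives a criterion that avoids retraining; the RBF kernel, the hyperparameters gamma and sigma, and all helper names are assumptions for the sketch, not the authors' code.

```python
# Illustrative sketch of LS-SVM pruning by error minimization (not the
# authors' implementation).  Assumptions: RBF kernel, ridge parameter
# `gamma`, and a brute-force re-solve for every candidate sample.
import numpy as np


def rbf_kernel(X1, X2, sigma=1.0):
    """Gaussian (RBF) kernel matrix between two sample sets."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))


def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    """Solve the LS-SVM regression system:
       [0   1^T        ] [b    ]   [0]
       [1   K + I/gamma] [alpha] = [y]
    """
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[1:], sol[0]          # alpha, b


def lssvm_predict(X_train, alpha, b, X_test, sigma=1.0):
    """f(x) = sum_i alpha_i K(x, x_i) + b."""
    return rbf_kernel(X_test, X_train, sigma) @ alpha + b


def prune_one(X, y, support, gamma=10.0, sigma=1.0):
    """Remove the support sample whose omission gives the smallest
    approximation error over the full training set (brute force)."""
    best_i, best_err = 0, np.inf
    for i in range(len(support)):
        trial = np.delete(support, i)
        alpha, b = lssvm_fit(X[trial], y[trial], gamma, sigma)
        resid = lssvm_predict(X[trial], alpha, b, X, sigma) - y
        err = np.mean(resid ** 2)
        if err < best_err:
            best_i, best_err = i, err
    return np.delete(support, best_i)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-3.0, 3.0, size=(50, 1))
    y = np.sinc(X[:, 0]) + 0.05 * rng.normal(size=50)
    support = np.arange(len(y))
    while len(support) > 10:        # prune down to 10 support vectors
        support = prune_one(X, y, support)
    print("remaining support indices:", support)
```

The contrast with standard pruning is the selection rule: instead of discarding the sample with the smallest error (or support value) in the previous pass, each candidate is judged by the error that actually results once it is left out.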

Published in:

IEEE Transactions on Neural Networks (Volume 14, Issue 3)