
Building Efficient Radial Basis Function Kernel Classifiers using Iterative Methods


5 Author(s)
Barsic, D. (Appl. Phys. Lab., Johns Hopkins Univ., Laurel, MD); Carmen, C.; Renjifo, C.; Norman, K.; and one additional author

Training algorithms for radial basis function kernel classifiers (RBFKCs), such as the canonical support vector machine (SVM), often produce computationally burdensome classifiers when large training data sets are used. Moreover, this complexity is not directly controllable by the developer. A least-squares variant of the SVM is used as the starting point for a proposed algorithm called the incremental asymmetric proximal support vector machine (IAPSVM). IAPSVM employs a greedy search across the training data to select the centers of each RBF transform. This iterative building process yields a final classifier that compares favorably with both the SVM and another available complexity-reduction algorithm, as measured by the number of RBF kernel transforms that must be evaluated to classify an unknown sample. Unlike SVM methods, IAPSVM lets the developer fix the complexity of the classifier a priori, a capability that is often important when building RBFKCs for resource-constrained systems.
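The core idea — greedily selecting training points as RBF centers under a fixed complexity budget, refitting least-squares weights after each addition — can be sketched as follows. This is an illustrative sketch of the iterative-building strategy the abstract describes, not the authors' exact IAPSVM; the function names, the ridge regularizer, and the toy data are assumptions.

```python
import numpy as np

def rbf(X, centers, gamma):
    """Gaussian RBF features: one column per center."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def greedy_rbf_classifier(X, y, n_centers, gamma=1.0, ridge=1e-6):
    """Greedily pick n_centers training points as RBF centers,
    refitting regularized least-squares weights after each addition.
    A sketch of the iterative-building idea, not the exact IAPSVM."""
    chosen, remaining, w = [], list(range(len(X))), None
    for _ in range(n_centers):
        best_err, best_i = np.inf, None
        for i in remaining:
            idx = chosen + [i]
            Phi = rbf(X, X[idx], gamma)
            # Ridge-regularized least-squares fit of the kernel weights
            w_try = np.linalg.solve(Phi.T @ Phi + ridge * np.eye(len(idx)),
                                    Phi.T @ y)
            err = np.mean((Phi @ w_try - y) ** 2)
            if err < best_err:
                best_err, best_i, w = err, i, w_try
        chosen.append(best_i)
        remaining.remove(best_i)
    return X[chosen], w

# Toy two-class problem with labels in {-1, +1} (assumed data)
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.sign(X[:, 0])
centers, w = greedy_rbf_classifier(X, y, n_centers=5, gamma=0.5)
pred = np.sign(rbf(X, centers, 0.5) @ w)
```

Note that the complexity budget (`n_centers`) is fixed up front, so classifying a new sample costs exactly `n_centers` kernel evaluations — the a priori complexity control the abstract emphasizes.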

Published in:

Proceedings of the 2006 16th IEEE Signal Processing Society Workshop on Machine Learning for Signal Processing

Date of Conference:

6-8 Sept. 2006