
Multiple kernels for generalised discriminant analysis

Authors: Z. Liang (School of Computer Science & Technology, China University of Mining & Technology, China); Y. Li

Kernel-based learning methods have been widely used in machine learning tasks such as dimensionality reduction, classification and regression. Because their performance depends on the choice of kernel, optimising the kernel function is an important issue in kernel-based learning. A novel formulation is proposed for automatically learning a linear combination of kernel functions in terms of discriminant criteria: feature extraction and kernel selection are carried out jointly while the discriminant criterion is optimised. The proposed method applies to any discriminant criterion that can be formulated in a pairwise manner as the objective function, and therefore provides a framework for optimising multiple kernel subspace analysis. Extensive experiments on UCI data sets, handwritten numerical characters, face images and gene data sets demonstrate the effectiveness of the proposed method.
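To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of weighting a linear combination of base kernels by maximising a simple kernel-space trace-ratio discriminant criterion. The kernel choices, the random-search optimiser over the weight simplex, and all names (combine_kernels, trace_ratio, search_weights) are illustrative assumptions only; the paper's own criterion and optimisation procedure may differ.

```python
import numpy as np

def rbf_kernel(X, gamma):
    # Gaussian (RBF) kernel matrix for rows of X.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

def combine_kernels(kernels, beta):
    # Weighted sum of base kernel matrices, beta >= 0.
    return sum(b * K for b, K in zip(beta, kernels))

def trace_ratio(K, y):
    # Between-class / within-class scatter in feature space,
    # both computed from the kernel matrix K alone (kernel trick).
    total_mean = K.mean()
    between, within = 0.0, 0.0
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        nc = len(idx)
        Kcc = K[np.ix_(idx, idx)]
        Kc_all = K[idx, :]
        # nc * ||m_c - m||^2
        between += nc * (Kcc.mean() - 2 * Kc_all.mean() + total_mean)
        # sum over i in class c of ||phi(x_i) - m_c||^2
        within += np.trace(Kcc) - nc * Kcc.mean()
    return between / max(within, 1e-12)

def search_weights(kernels, y, n_trials=2000, seed=0):
    # Random search on the simplex for kernel weights maximising the criterion.
    rng = np.random.default_rng(seed)
    best_beta, best_score = None, -np.inf
    for _ in range(n_trials):
        beta = rng.dirichlet(np.ones(len(kernels)))
        score = trace_ratio(combine_kernels(kernels, beta), y)
        if score > best_score:
            best_beta, best_score = beta, score
    return best_beta, best_score

if __name__ == "__main__":
    # Toy two-class data; base kernels are three RBF widths plus a linear kernel.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0.0, 1.0, (30, 5)), rng.normal(1.5, 1.0, (30, 5))])
    y = np.array([0] * 30 + [1] * 30)
    kernels = [rbf_kernel(X, g) for g in (0.01, 0.1, 1.0)] + [X @ X.T]
    beta, score = search_weights(kernels, y)
    print("kernel weights:", np.round(beta, 3), "criterion:", round(score, 3))
```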

Published in: IET Computer Vision (Volume 4, Issue 2)