L_{p} Norm Localized Multiple Kernel Learning via Semi-Definite Programming

3 Author(s)
Han, Y. (School of Marine Engineering, Northwestern Polytechnical University, Xi'an, China); Kunde Yang; Guizhong Liu

Our objective is to train an SVM-based localized multiple kernel learning model under an arbitrary l_{p}-norm constraint by alternating between a standard SVM solver, applied to the localized combination of base kernels, and an update of the associated sample-specific kernel weights. Unfortunately, the latter is a difficult l_{p}-norm-constrained quadratic optimization. In this letter, by approximating the l_{p}-norm with its Taylor expansion, the problem of updating the localized kernel weights is reformulated as a non-convex quadratically constrained quadratic program, which is then solved via its convex semi-definite programming relaxation. Experiments on ten benchmark machine learning datasets demonstrate the advantages of our approach.
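The alternating scheme described in the abstract can be sketched in Python. This is a minimal illustration, not the paper's method: the weight update below uses a simple closed-form l_p-style rescaling in place of the paper's Taylor-expansion/SDP step, and all function names (`localized_mkl`, `combined_kernel`, `lp_normalize`) are hypothetical.

```python
import numpy as np
from sklearn.svm import SVC

def combined_kernel(kernels, beta):
    """Localized combination: K(i,j) = sum_m beta[i,m] * beta[j,m] * K_m(i,j)."""
    K = np.zeros_like(kernels[0])
    for m, Km in enumerate(kernels):
        K += np.outer(beta[:, m], beta[:, m]) * Km
    return K

def lp_normalize(beta, p):
    """Rescale each sample's weight vector onto the l_p unit sphere."""
    norms = np.linalg.norm(beta, ord=p, axis=1, keepdims=True)
    return beta / np.maximum(norms, 1e-12)

def localized_mkl(kernels, y, p=2.0, C=1.0, n_iter=10):
    n, M = len(y), len(kernels)
    beta = lp_normalize(np.ones((n, M)), p)  # uniform initial weights
    svm = None
    for _ in range(n_iter):
        # Step 1: fix beta, train a standard SVM on the locally combined kernel.
        K = combined_kernel(kernels, beta)
        svm = SVC(C=C, kernel="precomputed").fit(K, y)
        alpha = np.zeros(n)
        alpha[svm.support_] = np.abs(svm.dual_coef_.ravel())
        ay = alpha * y
        # Step 2: fix the SVM solution, update sample-specific weights.
        # Simplified stand-in for the paper's SDP-based update: rescale each
        # beta_{i,m} by kernel m's (clipped) per-sample margin contribution,
        # then re-project onto the l_p sphere.
        contrib = np.zeros((n, M))
        for m, Km in enumerate(kernels):
            contrib[:, m] = beta[:, m] * np.maximum(
                ay * (Km @ (ay * beta[:, m])), 1e-12)
        beta = lp_normalize(contrib ** (2.0 / (p + 1.0)), p)
    return svm, beta

# Toy usage on a linearly separable problem with two base kernels.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
K_lin = X @ X.T                                          # linear kernel
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_rbf = np.exp(-sq)                                      # RBF kernel
svm, beta = localized_mkl([K_lin, K_rbf], y, p=1.5)
acc = (svm.predict(combined_kernel([K_lin, K_rbf], beta)) == y).mean()
```

Each pass of the loop mirrors the structure in the abstract: an off-the-shelf SVM solver handles the fixed-weight subproblem, and only the weight update (here a heuristic rescaling, in the paper an SDP relaxation of a non-convex QCQP) differs between variants.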

Published in:

IEEE Signal Processing Letters (Volume 19, Issue 10)