Dimensionality reduction of RKHS model using Reduced Kernel Principal Component Analysis (RKPCA)

3 Author(s)
Ilyes, E.; Okba, T.; Hassani, M. (Res. Unit ATSI, Nat. Eng. Sch. of Monastir, Monastir, Tunisia)

This paper addresses the complexity reduction of models built in a Reproducing Kernel Hilbert Space (RKHS) using statistical learning theory (SLT) for supervised learning problems. The RKHS model obtained in this way suffers from its number of parameters, which equals the number of observations used in the learning phase. In this paper we propose a new way to reduce the number of parameters of the RKHS model. The proposed method, called Reduced Kernel Principal Component Analysis (RKPCA), consists in approximating the principal components retained by the KPCA method with a set of observation vectors that point in the directions of largest variance along those retained components. The proposed method has been tested on a chemical reactor and gave successful results.
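The reduction idea in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a Gaussian kernel, runs standard KPCA (centered kernel matrix, eigendecomposition), and then approximates each retained principal component by the single training observation with the largest expansion coefficient on it, i.e. the observation most aligned with that direction in feature space. The function names `rbf_kernel` and `rkpca_select` and all parameter choices are hypothetical.

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and Y
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-d2 / (2 * sigma**2))

def rkpca_select(X, n_keep=3, sigma=1.0):
    """Hypothetical RKPCA-style sketch: run KPCA, then approximate each
    retained principal direction by the training observation whose
    feature-space image contributes most to it."""
    n = X.shape[0]
    K = rbf_kernel(X, X, sigma)
    # Center the kernel matrix in feature space
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J
    # Eigendecomposition; keep the n_keep largest eigenvalues
    w, V = np.linalg.eigh(Kc)
    order = np.argsort(w)[::-1][:n_keep]
    V = V[:, order]
    # Each kernel principal component is sum_i alpha_ji * phi(x_i);
    # pick, per component, the observation with the largest |alpha_ji|.
    selected = [int(np.argmax(np.abs(V[:, j]))) for j in range(n_keep)]
    return sorted(set(selected))

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))
support = rkpca_select(X, n_keep=3, sigma=2.0)
```

The reduced model is then expanded only over the selected observations (`support`) instead of all `n` training points, which is the parameter reduction the abstract describes.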

Published in:

2010 18th Mediterranean Conference on Control & Automation (MED)

Date of Conference:

23-25 June 2010