
Optimal Learning Rates for Some Principal Component Analysis Algorithms

3 Author(s)

Principal component analysis (PCA) has proven highly effective for extracting the most significant information from a given sequence of observations, and numerous PCA methods can be found in the literature. In this work, we concentrate on deriving optimal learning rates for several well-known adaptive PCA algorithms. A detailed derivation procedure is described that yields closed-form formulae for the optimal learning rates: each rate is the solution of a quadratic or cubic equation that can be solved analytically, so no numerical procedures are needed. The key advantage of the optimal learning rate is that it provides a principled mechanism for automatically adjusting the learning stepsize.
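To make the setting concrete, the sketch below runs Oja's rule, one of the classic adaptive PCA algorithms of the kind the abstract refers to, on synthetic data. The closed-form optimal learning rates derived in the paper are not reproduced here; the simple decaying stepsize `eta0 / (1 + 0.01 t)` is a stand-in assumption, as are the data distribution and all parameter values.

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic data whose leading principal direction is known from its covariance
cov = np.array([[3.0, 1.0],
                [1.0, 1.0]])
X = rng.multivariate_normal(np.zeros(2), cov, size=5000)

def oja_pca(X, eta0=0.1):
    """Estimate the first principal component with Oja's rule.

    The decaying stepsize below is an illustrative assumption, not the
    paper's closed-form optimal learning rate.
    """
    w = np.ones(X.shape[1]) / np.sqrt(X.shape[1])
    for t, x in enumerate(X, start=1):
        eta = eta0 / (1.0 + 0.01 * t)   # hand-picked decaying stepsize
        y = w @ x                        # projection onto current estimate
        w += eta * y * (x - y * w)       # Oja's Hebbian update
        w /= np.linalg.norm(w)           # renormalize for stability
    return w

w_est = oja_pca(X)

# compare with the exact leading eigenvector of the sample covariance
eigvals, eigvecs = np.linalg.eigh(np.cov(X.T))
w_true = eigvecs[:, -1]
print(abs(w_est @ w_true))  # should approach 1.0 as the estimate converges
```

Replacing the hand-picked stepsize with an optimal rate, as derived in the paper, removes the need to tune `eta0` and the decay schedule by hand.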

Published in:

2007 International Joint Conference on Neural Networks (IJCNN 2007)

Date of Conference:

12-17 Aug. 2007