
Principal component extraction using recursive least squares learning

Authors: S. Bannour and M. R. Azimi-Sadjadi, Dept. of Electr. Eng., Colorado State Univ., Fort Collins, CO, USA

A new neural network-based approach is introduced for recursive computation of the principal components of a stationary vector stochastic process. The neurons of a single-layer network are sequentially trained using a recursive least squares (RLS) type algorithm to extract the principal components of the input process. The optimality criterion is based on retaining the maximum information contained in the input sequence so as to be able to reconstruct the network inputs from the corresponding outputs with minimum mean squared error. The proof of the convergence of the weight vectors to the principal eigenvectors is also established. A simulation example is given to show the accuracy and speed advantages of this algorithm in comparison with the existing methods. Finally, the application of this learning algorithm to image data reduction and filtering of images degraded by additive and/or multiplicative noise is considered.
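As an illustration of the kind of update the abstract describes, the sketch below (NumPy) implements a generic RLS-flavoured sequential PCA rule: each neuron's weight vector is corrected with an Oja-type term whose gain is the inverse of a recursively accumulated output energy, and each extracted component is removed by deflation before the next neuron is trained. This is a minimal sketch of the general technique (close to a projection-approximation RLS rule), not the paper's exact algorithm; the function name rls_pca, the forgetting factor lam, and the number of passes are assumptions made for the example.

import numpy as np

def rls_pca(X, n_components, lam=0.99, passes=10, seed=0):
    # Sequentially extract principal components of the zero-mean rows of X
    # using an RLS-flavoured Oja-type rule with deflation. The parameter
    # values here are illustrative choices, not values taken from the paper.
    rng = np.random.default_rng(seed)
    n_samples, dim = X.shape
    W = np.zeros((n_components, dim))
    residual = X.copy()

    for k in range(n_components):
        w = rng.standard_normal(dim)
        w /= np.linalg.norm(w)
        # recursive estimate of the neuron's output energy (RLS gain denominator)
        d = float(np.mean((residual @ w) ** 2)) + 1e-12
        for _ in range(passes):
            for x in residual:
                y = w @ x                    # neuron output
                d = lam * d + y * y          # recursive energy update
                w += (y / d) * (x - y * w)   # reduce reconstruction error ||x - w*y||^2
            w /= np.linalg.norm(w)           # keep the weight vector near the unit sphere
        W[k] = w
        # deflation: remove the extracted component so the next neuron
        # converges toward the next principal eigenvector
        residual = residual - np.outer(residual @ w, w)
    return W

A quick, purely illustrative sanity check against a batch eigendecomposition:

rng = np.random.default_rng(1)
X = rng.standard_normal((2000, 5)) @ rng.standard_normal((5, 5))
X -= X.mean(axis=0)
W = rls_pca(X, n_components=2)
_, V = np.linalg.eigh(np.cov(X, rowvar=False))
print(np.abs(W @ V[:, ::-1][:, :2]))   # should be close to the 2x2 identity (up to sign)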

Published in: IEEE Transactions on Neural Networks (Volume: 6, Issue: 2)