On the Discrete-Time Dynamics of a Class of Self-Stabilizing MCA Extraction Algorithms

3 Author(s)
Xiangyu Kong (Xi'an Research Institute of High Technology, Xi'an, China); Changhua Hu; Chongzhao Han

Minor component analysis (MCA) deals with the recovery of the eigenvector associated with the smallest eigenvalue of the autocorrelation matrix of the input data, and it is an important tool for signal processing and data analysis. This brief analyzes the convergence and stability of a class of self-stabilizing MCA algorithms via a deterministic discrete-time (DDT) method. Sufficient conditions are obtained to guarantee the convergence of these learning algorithms. Simulations are carried out to further illustrate the theoretical results. It can be concluded that these self-stabilizing algorithms can efficiently extract the minor component (MC) and that they outperform some existing MCA methods.
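
For readers unfamiliar with the task, the sketch below illustrates MCA as defined in the abstract: the minor component is the eigenvector associated with the smallest eigenvalue of the (sample) autocorrelation matrix of the input data. The batch eigendecomposition serves as a reference answer; the online update shown is only a generic gradient rule with explicit renormalization, included for illustration, and is not the self-stabilizing class analyzed in this brief. The synthetic signal model, learning rate, and iteration counts are assumptions.

    # Minimal sketch of minor component extraction (illustrative, not the authors' algorithm).
    import numpy as np

    rng = np.random.default_rng(0)

    # Assumed synthetic zero-mean input data with a fixed covariance structure.
    n, dim = 5000, 4
    A = rng.normal(size=(dim, dim))
    X = rng.normal(size=(n, dim)) @ A.T        # rows are input samples x(k)

    R = X.T @ X / n                            # sample autocorrelation matrix

    # Batch reference: the minor component is the eigenvector of the smallest eigenvalue.
    eigvals, eigvecs = np.linalg.eigh(R)       # eigenvalues in ascending order
    mc_ref = eigvecs[:, 0]

    # Generic online MC rule: stochastic gradient descent on w^T R w with
    # explicit normalization to keep ||w|| = 1 (unlike a self-stabilizing rule,
    # which avoids this normalization step).
    w = rng.normal(size=dim)
    w /= np.linalg.norm(w)
    eta = 0.01                                  # assumed learning rate
    for _ in range(20):                         # assumed number of passes over the data
        for x in X:
            w -= eta * (x @ w) * x              # expected step direction is -eta * R @ w
            w /= np.linalg.norm(w)

    # Up to sign, the online estimate should align with the batch minor component.
    print(abs(mc_ref @ w))                      # approaches 1.0 as the rule converges

The comparison at the end checks the absolute cosine between the two estimates, since an eigenvector is only defined up to sign.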

Published in:

IEEE Transactions on Neural Networks (Volume: 21, Issue: 1)