Minor component analysis (MCA) deals with recovering the eigenvector associated with the smallest eigenvalue of the autocorrelation matrix of the input data, and it is an important tool in signal processing and data analysis. This brief analyzes the convergence and stability of a class of self-stabilizing MCA algorithms via a deterministic discrete-time (DDT) method. Sufficient conditions are obtained that guarantee the convergence of these learning algorithms. Simulations are carried out to illustrate the theoretical results. It can be concluded that these self-stabilizing algorithms efficiently extract the minor component (MC) and outperform some existing MCA methods.
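The abstract does not reproduce the brief's self-stabilizing update rules, so the sketch below only illustrates the MC-extraction task itself: recovering the eigenvector of the smallest eigenvalue of a (synthetic, assumed) autocorrelation matrix. It uses classical inverse power iteration as a baseline, not the learning algorithms analyzed in the brief.

```python
import numpy as np

def minor_component(C, n_iter=100):
    """Eigenvector of the smallest eigenvalue of a symmetric
    positive-definite matrix C, via inverse power iteration.
    Illustrative baseline only -- not the self-stabilizing
    learning rules analyzed in the brief."""
    w = np.ones(C.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        w = np.linalg.solve(C, w)   # step toward the minor eigenvector
        w /= np.linalg.norm(w)      # keep unit norm for stability
    return w

# Synthetic autocorrelation matrix with known spectrum {0.5, 1, 2, 3, 4}
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))
C = Q @ np.diag([0.5, 1.0, 2.0, 3.0, 4.0]) @ Q.T

w = minor_component(C)
smallest = w @ C @ w                # Rayleigh quotient, ~0.5
```

The renormalization after each step plays the role that the self-stabilizing terms play in the online algorithms: it prevents the weight norm from drifting, which is precisely the failure mode the brief's DDT analysis addresses.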