
Global analysis of Oja's flow for neural networks


Authors: W.-Y. Yan (Dept. of Systems Engineering, Australian National University, Canberra, ACT, Australia); U. Helmke; J.B. Moore

A detailed study of Oja's learning equation in neural networks is undertaken in this paper. The fundamental issues of existence, uniqueness, and representation of solutions are completely resolved, as is the question of convergence. It is shown that the solution of Oja's equation converges exponentially to an equilibrium from any initial value. Moreover, necessary and sufficient conditions on the initial value are given for the solution to converge to a dominant eigenspace of the associated autocorrelation matrix. As a by-product, this result confirms one of Oja's conjectures: that the solution converges to the principal eigenspace from almost all initial values. Further characteristics of the limiting solution are also revealed, which make it possible to determine the limiting solution in advance from the initial information alone. Two examples are analyzed that demonstrate the explicit dependence of the limiting solution on the initial value. Finally, it is shown that Oja's equation is the gradient flow of a generalized Rayleigh quotient on a Stiefel manifold.
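The abstract's two central claims — exponential convergence to an equilibrium, and convergence of generic initial values to the dominant eigenspace of the autocorrelation matrix — can be checked numerically. The sketch below assumes the standard matrix form of Oja's flow, dW/dt = CW − W(WᵀCW); the dimensions, random seed, and step size are illustrative choices, not taken from the paper.

```python
import numpy as np

# Numerical sketch (not from the paper) of Oja's matrix flow
#     dW/dt = C W - W (W^T C W),
# integrated by forward Euler.  All names (C, W, n, p, dt) are
# illustrative assumptions.
rng = np.random.default_rng(0)

n, p = 6, 2                       # ambient dimension, subspace dimension
A = rng.standard_normal((n, n))
C = A @ A.T                       # symmetric "autocorrelation" matrix
C /= np.linalg.norm(C, 2)         # scale so the largest eigenvalue is 1

W = rng.standard_normal((n, p))   # generic initial value
dt = 1e-2
for _ in range(50_000):
    W = W + dt * (C @ W - W @ (W.T @ C @ W))   # Euler step of the flow

# At equilibrium, W should have orthonormal columns (a point on the
# Stiefel manifold) spanning the dominant p-dimensional eigenspace of C.
orth_err = np.linalg.norm(W.T @ W - np.eye(p))

eigvals, eigvecs = np.linalg.eigh(C)
V = eigvecs[:, -p:]                                 # dominant eigenspace of C
cosines = np.linalg.svd(V.T @ W, compute_uv=False)  # principal-angle cosines
span_err = np.linalg.norm(cosines - 1.0)

print(f"orthonormality error: {orth_err:.2e}")
print(f"subspace alignment error: {span_err:.2e}")
```

Both errors should be small for a generic initial value, consistent with the paper's almost-everywhere convergence result; initial values orthogonal to the dominant eigenspace (a measure-zero set) would converge elsewhere.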

Published in: IEEE Transactions on Neural Networks (Volume 5, Issue 5)