
Global analysis of Oja's flow for neural networks

3 Author(s)
Yan, W.-Y.; Dept. of Syst. Eng., Australian Nat. Univ., Canberra, ACT, Australia; Helmke, U.; Moore, J.B.

This paper undertakes a detailed study of Oja's learning equation in neural networks. Fundamental issues such as the existence, uniqueness, and representation of solutions are completely resolved, as is the question of convergence: the solution of Oja's equation is shown to converge exponentially to an equilibrium from any initial value. Moreover, necessary and sufficient conditions on the initial value are given for the solution to converge to a dominant eigenspace of the associated autocorrelation matrix. As a by-product, this result confirms one of Oja's conjectures that the solution converges to the principal eigenspace from almost all initial values. Further characteristics of the limiting solution are also revealed; these make it possible to determine the limiting solution in advance from the initial information alone. Two examples are analyzed to demonstrate the explicit dependence of the limiting solution on the initial value. In addition, Oja's equation is found to be the gradient flow of generalized Rayleigh quotients on a Stiefel manifold.
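The convergence behavior described in the abstract can be illustrated numerically. The sketch below is a hedged, minimal simulation (not the paper's analysis): it integrates the matrix form of Oja's flow, dW/dt = CW − W(WᵀCW), by forward Euler, with the dimensions, step size, and spectrum of the autocorrelation matrix C chosen purely for illustration. From a generic initial value, W converges to an orthonormal basis of the dominant eigenspace of C, as the abstract asserts.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 6, 2  # ambient dimension and subspace dimension (illustrative)

# Build a symmetric "autocorrelation" matrix with a prescribed spectrum so
# that the dominant p-dimensional eigenspace is well separated (gap 5 - 1 = 4).
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
eigvals = np.array([6.0, 5.0, 1.0, 0.5, 0.3, 0.1])
C = Q @ np.diag(eigvals) @ Q.T

# Forward-Euler integration of Oja's flow dW/dt = C W - W (W^T C W)
# from a generic (random) initial value.
W = rng.standard_normal((n, p))
dt = 1e-3
for _ in range(20_000):  # integrate up to t = 20
    W = W + dt * (C @ W - W @ (W.T @ C @ W))

# At the limiting solution the columns of W are orthonormal ...
orthonormal = np.allclose(W.T @ W, np.eye(p), atol=1e-8)

# ... and span the dominant p-dimensional eigenspace of C: the singular
# values of V^T W (cosines of the principal angles) are all close to 1.
V = Q[:, :p]  # top-p eigenvectors of C by construction
cosines = np.linalg.svd(V.T @ W, compute_uv=False)
subspace_ok = np.allclose(cosines, 1.0, atol=1e-8)
print(orthonormal, subspace_ok)
```

The exponential convergence rate seen in such experiments is governed by the gap between the p-th and (p+1)-th eigenvalues of C, which is why the spectrum above is built with a visible gap.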

Published in:

IEEE Transactions on Neural Networks (Volume: 5, Issue: 5)