Common Nature of Learning Exemplified by BP and Hopfield Neural Networks for Solving Online a System of Linear Equations


Authors:

Yunong Zhang; Zhan Li; Ke Chen; Binghuang Cai (Sun Yat-Sen Univ., Guangzhou)

Abstract:

Many computational problems widely encountered in scientific and engineering applications can ultimately be transformed into the online solution of a system of linear equations. Classic numerical methods for solving linear equations, such as Gaussian elimination and matrix factorization, typically require O(n^3) operations. As important parallel-computational models, both BP (back-propagation) and Hopfield neural networks can be exploited to solve such linear equations. The BP neural network differs evidently from the Hopfield neural network in terms of network definition, architecture, and learning pattern. Nevertheless, the two neural networks share a common nature of learning (i.e., they are governed by the same mathematical iteration formula) during the online solution of linear equations. In addition, computer-simulation results substantiate the theoretical analysis of both BP and Hopfield neural networks for solving such a system of linear equations online.
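The abstract does not reproduce the shared iteration formula itself. The Python sketch below is a hypothetical illustration, not the paper's method: the function name, the step size eta, and the gradient-descent-type update x(k+1) = x(k) - eta * A^T (A x(k) - b) are all assumptions, chosen only to show how a single residual-driven iterative rule of this kind can solve A x = b without explicit matrix inversion.

    import numpy as np

    def solve_linear_system_iteratively(A, b, eta=0.05, max_iter=10000, tol=1e-9):
        """Iteratively solve A x = b with a gradient-descent-type update.

        Hypothetical sketch (not taken from the paper): minimize the energy
        E(x) = 0.5 * ||A x - b||^2 by following its negative gradient.
        """
        A = np.asarray(A, dtype=float)
        b = np.asarray(b, dtype=float)
        x = np.zeros(A.shape[1])
        for _ in range(max_iter):
            residual = A @ x - b      # error of the current estimate
            grad = A.T @ residual     # gradient of 0.5 * ||A x - b||^2
            x = x - eta * grad        # assumed common iteration formula
            if np.linalg.norm(residual) < tol:
                break
        return x

    if __name__ == "__main__":
        A = np.array([[4.0, 1.0], [1.0, 3.0]])
        b = np.array([1.0, 2.0])
        print(solve_linear_system_iteratively(A, b))  # iterative estimate
        print(np.linalg.solve(A, b))                  # reference solution

Under this assumption, both network models would reduce to the same residual-driven update, which converges for a sufficiently small step size eta.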

Published in:

2008 IEEE International Conference on Networking, Sensing and Control (ICNSC 2008)

Date of Conference:

6-8 April 2008