
A neural-type parallel algorithm for fast matrix inversion

2 Author(s)
Polycarpou, M.M.; Ioannou, P.A. (Dept. of Electr. Syst., Univ. of Southern California, Los Angeles, CA, USA)

The paper introduces the orthogonalized back-propagation algorithm (OBA), a training procedure for adjusting the weights of a neural-type network used for matrix inversion. In this framework the adjustable weights correspond to the estimate of the inverse of the matrix. The algorithm is iterative, in the sense that an initial estimate of the solution is chosen and then updated according to some error measure. However, it is also a direct algorithm, since it guarantees exact convergence after n steps, independent of the initial estimate, where n is the dimension of the matrix to be inverted. The method can also be directly applied to solving linear equations and to computing the pseudoinverse of matrices with full row or column rank. From an optimization point of view, it is shown that the OBA is an optimal algorithm for minimizing a quadratic least-squares cost functional.
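The OBA update rule itself is not given in this abstract, so it is not reproduced here. As an illustration of the class of methods the abstract describes — iterative in form, yet exact after n steps, and applicable to pseudoinverses of full-rank matrices — the following sketch uses Greville's classical column-recursive pseudoinverse construction. This is a stand-in with the same convergence properties, not the authors' algorithm; the function name `greville_pinv` and the tolerance are illustrative choices.

```python
import numpy as np

def greville_pinv(A):
    """Build the Moore-Penrose pseudoinverse one column at a time.

    Each step incorporates one column of A into the running estimate;
    after n steps (n = number of columns) the result is exact, an
    iterative-yet-direct scheme in the spirit of the n-step convergence
    described in the abstract (Greville's method, not the OBA itself).
    """
    A = np.asarray(A, dtype=float)
    m, n = A.shape

    # Step 1: pseudoinverse of the first column alone (a 1 x m row).
    a1 = A[:, [0]]
    denom = float(a1.T @ a1)
    Aplus = a1.T / denom if denom > 0 else np.zeros((1, m))

    for k in range(1, n):
        ak = A[:, [k]]                     # next column to incorporate
        d = Aplus @ ak                     # coefficients of ak in current range
        c = ak - A[:, :k] @ d              # component of ak outside that range
        if np.linalg.norm(c) > 1e-12:
            b = c.T / float(c.T @ c)       # c nonzero: b is the pseudoinverse of c
        else:
            b = (d.T @ Aplus) / (1.0 + float(d.T @ d))
        Aplus = np.vstack([Aplus - d @ b, b])   # updated (k+1) x m estimate
    return Aplus
```

For a square nonsingular matrix the result coincides with the ordinary inverse, and for a full-column-rank rectangular matrix it yields the left pseudoinverse, matching the applications listed in the abstract.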

Published in:

Proceedings of the Fifth International Parallel Processing Symposium, 1991

Date of Conference:

30 Apr-2 May 1991