Generalized inverse computations using two-layer feedforward neural networks with pruning

Authors:
Yongfeng Miao; Hua, Y. — Dept. of Electr. & Electron. Eng., Melbourne Univ., Parkville, Vic., Australia

We study the computation of generalized inverses (GIs) of a matrix using two-layer feedforward neural networks (TFNNs), which provide an alternative to traditional methods based on matrix decompositions. The required GI of a given matrix is obtained from the outputs of the converged network, while the minimum point of a cost function reveals useful rank information about the matrix. The stability and convergence of the back-propagation (BP) learning algorithm are shown to be closely related to the spread of the nonzero singular values of an underlying matrix. Also, by identifying the relation between the optimal number of hidden nodes of the TFNN and the matrix rank, we are able to prune the network by removing redundant hidden nodes, so that an optimal network structure is produced upon convergence of learning. Simulation results are presented to confirm the analysis.
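The core idea of the abstract, obtaining a generalized inverse as the minimizer of a cost function via iterative learning rather than via matrix decompositions, can be illustrated with a minimal sketch. This is not the paper's TFNN architecture or its pruning scheme; it is a plain gradient-descent iteration (with an assumed cost J(X) = ½‖AX − I‖²_F, whose minimizer for a full-column-rank A is the Moore-Penrose pseudoinverse), and the step size is bounded using the largest singular value of A, echoing the paper's observation that stability depends on the spread of the nonzero singular values.

```python
import numpy as np

def gi_by_gradient_descent(A, lr=None, iters=5000):
    """Sketch: iteratively approximate the pseudoinverse of a
    full-column-rank matrix A by gradient descent on
    J(X) = 0.5 * ||A X - I||_F^2 (an assumed illustrative cost,
    not the paper's TFNN formulation)."""
    m, n = A.shape
    if lr is None:
        # Step size chosen from the largest singular value of A;
        # the singular-value spread governs the convergence speed.
        lr = 1.0 / np.linalg.norm(A, 2) ** 2
    X = np.zeros((n, m))
    for _ in range(iters):
        # Gradient of J with respect to X is A^T (A X - I).
        grad = A.T @ (A @ X - np.eye(m))
        X -= lr * grad
    return X

A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])  # 3x2, rank 2
X = gi_by_gradient_descent(A)
```

For a rank-deficient A the minimizer of this cost is no longer unique, which is where the rank information and hidden-node pruning discussed in the abstract become relevant.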

Published in:

Proceedings of the 1995 IEEE International Conference on Neural Networks (Volume 6)

Date of Conference:

Nov/Dec 1995