Two-stage parallel partial retraining scheme for defective multi-layer neural networks

Author(s):

Yamamori, K. (Japan Inst. of Sci. & Technol., Ishikawa, Japan); Abe, T.; Horiguchi, S.

We address a high-speed defect compensation method for multi-layer neural networks implemented in hardware devices. To compensate for stuck defects in neurons and weights, we have proposed a partial retraining scheme that adjusts the weights of the neurons affected by stuck defects, between two layers, using a backpropagation (BP) algorithm. Since defect compensation can be achieved with the learning circuits, chip area is saved. Because the number of weights to be adjusted is reduced, the scheme also leads to high-speed defect compensation. In this paper, we propose a two-stage partial retraining scheme to compensate for stuck defects in the input units. Our simulation results show that the two-stage partial retraining scheme can be about 100 times faster than whole-network retraining by the BP algorithm.
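The core idea described above is to retrain only the weights directly affected by a defect instead of the whole network. Below is a minimal NumPy sketch of that idea under assumed details not stated in the abstract (a small two-layer network, one hidden unit stuck at 0, squared-error loss, plain gradient descent); it is an illustration of partial retraining, not the authors' implementation.

```python
# Minimal sketch: compensate a stuck hidden unit by retraining only the
# weights of the layer whose inputs are corrupted by the defect (W2 here),
# while all other weights stay frozen. All sizes/values are assumptions.
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

# Toy data and weights standing in for an already-trained network.
X = rng.normal(size=(64, 8))              # 64 samples, 8 inputs
T = rng.integers(0, 2, size=(64, 1))      # binary targets
W1 = rng.normal(scale=0.5, size=(8, 6))   # input  -> hidden weights (frozen)
W2 = rng.normal(scale=0.5, size=(6, 1))   # hidden -> output weights (retrained)

STUCK_UNIT = 2  # hypothetical: hidden unit 2 is stuck at 0

def forward(X):
    H = sigmoid(X @ W1)
    H[:, STUCK_UNIT] = 0.0                # emulate the stuck-at-0 defect
    Y = sigmoid(H @ W2)
    return H, Y

# Partial retraining with plain backpropagation: only W2 is updated.
lr = 0.5
for _ in range(200):
    H, Y = forward(X)
    dY = (Y - T) * Y * (1.0 - Y)          # output-layer delta (squared error, sigmoid)
    W2 -= lr * H.T @ dY / len(X)          # adjust only the affected layer

_, Y = forward(X)
print("post-compensation error:", np.mean((Y - T) ** 2))
```

Because only the fan-in weights of the affected neurons are updated, each compensation step touches far fewer weights than whole-network BP retraining, which is where the reported speedup comes from.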

Published in:

Proceedings of the Fourth International Conference/Exhibition on High Performance Computing in the Asia-Pacific Region, 2000 (Volume 2)

Date of Conference:

14-17 May 2000