We present a high-speed defect compensation method for multi-layer neural networks implemented in hardware devices. To compensate for stuck defects in neurons and weights, we have proposed a partial retraining scheme that uses a backpropagation (BP) algorithm to adjust only the weights, between two layers, of the neurons affected by a stuck defect. Because the compensation functions can be realized with the network's own learning circuits, chip area is saved; reducing the number of weights to adjust also makes defect compensation fast. We further propose a two-stage partial retraining scheme to compensate for stuck defects in input units. Our simulation results show that the two-stage partial retraining scheme can be about 100 times faster than retraining the whole network with the BP algorithm.
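The core idea of partial retraining can be sketched in a few lines: after a stuck defect freezes a hidden unit's output, only the weights fed by the defective layer are re-adjusted by gradient descent, while all other weights stay frozen. The sketch below is an illustration under assumed conditions, not the paper's implementation: the toy regression task, network size (2-8-1), learning rate, and iteration counts are all hypothetical choices made for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: learn y = x0 + x1 (a stand-in for the paper's benchmarks)
X = rng.uniform(-1, 1, (64, 2))
y = X.sum(axis=1, keepdims=True)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# 2-8-1 network: sigmoid hidden layer, linear output
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

def forward(X, stuck=None):
    h = sigmoid(X @ W1 + b1)
    if stuck is not None:          # simulate a hardware stuck defect:
        idx, val = stuck           # the unit's output is frozen at `val`
        h[:, idx] = val
    return h, h @ W2 + b2

def mse(pred):
    return float(np.mean((pred - y) ** 2))

lr = 0.1

# 1) Train the whole network with standard backpropagation
for _ in range(2000):
    h, out = forward(X)
    d_out = 2 * (out - y) / len(X)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0)

# 2) Inject a stuck-at-0 defect on hidden unit 3
stuck = (3, 0.0)
_, out = forward(X, stuck)
loss_defect = mse(out)

# 3) Partial retraining: adjust only the weights driven by the
#    defective layer (W2, b2); W1 stays frozen, so far fewer
#    weights are updated than in whole-network retraining.
for _ in range(2000):
    h, out = forward(X, stuck)
    d_out = 2 * (out - y) / len(X)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(0)

_, out = forward(X, stuck)
loss_partial = mse(out)
print(loss_defect, loss_partial)  # partial retraining recovers accuracy
```

Because only one layer's weights change, each retraining step touches a small fraction of the network's parameters, which is the source of the speedup the abstract reports.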