An error perturbation for learning and detection of local minima in binary 3-layered neural networks

1 Author(s)
Yatsuzuka, Y.; Res. & Dev. Lab., Kokusai Denshin Denwa Co. Ltd., Kamifukuoka, Japan

In binary multilayer neural networks trained with the backpropagation algorithm, achieving quick and stable convergence in binary space is a major issue for a wide range of applications. We propose a learning technique in which tenacious local minima can be evaded by perturbing the unit output errors in the output layer in both polarity and magnitude. Simulation results showed that a binary 3-layered neural network can converge very rapidly in binary space, is insensitive to the choice of initial weights, and provides high generalization ability. It is also pointed out that tenacious local minima can be detected by monitoring the minimum magnitude of the unit output errors for the erroneous binary outputs, and that overtraining with respect to generalization performance on test inputs can be roughly estimated by monitoring the minimum and maximum magnitudes of the unit output errors for the correct binary outputs.
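The abstract describes the mechanism only at a high level. The following is a minimal sketch of one plausible reading: a plain 3-layer sigmoid network trained by backpropagation, where the output-layer errors are occasionally perturbed in polarity and magnitude, and a tenacious local minimum is flagged when the minimum error magnitude over the erroneous binary outputs stays large. The perturbation rule, the 0.4 threshold, and all function names are hypothetical illustrations, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, W1, W2):
    h = sigmoid(x @ W1)   # hidden-layer activations (biases omitted for brevity)
    y = sigmoid(h @ W2)   # output-layer activations
    return h, y

def train_step(x, t, W1, W2, lr=0.5, perturb=False):
    """One backpropagation step; optionally perturb the output errors."""
    h, y = forward(x, W1, W2)
    err = t - y                      # unit output errors
    if perturb:
        # Hypothetical perturbation: flip the polarity of each output
        # error at random and rescale its magnitude, to kick the
        # weights out of a tenacious local minimum.
        flip = rng.choice([-1.0, 1.0], size=err.shape)
        scale = rng.uniform(0.5, 1.5, size=err.shape)
        err = flip * scale * err
    delta_out = err * y * (1.0 - y)                 # output-layer deltas
    delta_hid = (delta_out @ W2.T) * h * (1.0 - h)  # hidden-layer deltas
    W2 += lr * h.T @ delta_out
    W1 += lr * x.T @ delta_hid
    return y

def stalled(y, t, threshold=0.4):
    """Flag a tenacious local minimum: the minimum error magnitude over
    units whose binarized output is wrong stays large. The threshold
    value is an assumption for illustration."""
    wrong = (y > 0.5).astype(t.dtype) != t
    return bool(wrong.any()) and float(np.abs(t - y)[wrong].min()) > threshold

# Toy usage on XOR: perturb the output errors whenever training stalls.
x = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([[0], [1], [1], [0]], dtype=float)
W1 = rng.normal(scale=0.5, size=(2, 4))
W2 = rng.normal(scale=0.5, size=(4, 1))
perturb = False
for epoch in range(5000):
    y = train_step(x, t, W1, W2, perturb=perturb)
    perturb = stalled(y, t)
```

In this reading, the detector drives the perturbation: each epoch checks whether the wrongly binarized outputs all carry a large residual error, and if so, the next step applies the perturbation. The abstract does not give the paper's actual schedule or thresholds.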

Published in:

Proceedings of the IEEE International Conference on Neural Networks, 1995 (Volume 1)

Date of Conference:

Nov/Dec 1995