Noise injection into inputs in back-propagation learning

1 Author(s)
K. Matsuoka; Div. of Control Eng., Kyushu Inst. of Technol., Kitakyushu, Japan

Back-propagation can be considered a nonlinear regression technique that allows a nonlinear neural network to acquire an input/output (I/O) association from a limited number of samples chosen from a population of input and output patterns. A crucial problem in back-propagation is its generalization capability: a network successfully trained on given samples is not guaranteed to provide the desired associations for untrained inputs as well. Regarding this problem, some authors have shown experimentally that the generalization capability can be remarkably enhanced by training the network with noise-injected inputs. The author mathematically explains why and how noise injection into inputs has this effect.
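The technique the abstract discusses can be sketched as follows: at each training presentation, zero-mean noise is added to the input samples while the targets stay clean, and ordinary back-propagation proceeds on the perturbed inputs. This is a minimal illustrative sketch, not the paper's exact setup; the toy target function, network size, learning rate, and noise level `sigma` are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data (assumed for illustration): learn y = sin(x) from 20 samples.
X = np.linspace(-np.pi, np.pi, 20).reshape(-1, 1)
Y = np.sin(X)

# One-hidden-layer network, 1 -> 10 -> 1, tanh hidden units.
W1 = rng.normal(0, 0.5, (1, 10))
b1 = np.zeros(10)
W2 = rng.normal(0, 0.5, (10, 1))
b2 = np.zeros(1)

lr, sigma = 0.05, 0.1  # learning rate and input-noise std (assumed values)

for epoch in range(2000):
    # Noise injection: perturb the inputs afresh each epoch; targets stay clean.
    Xn = X + rng.normal(0, sigma, X.shape)

    # Forward pass.
    H = np.tanh(Xn @ W1 + b1)
    P = H @ W2 + b2

    # Backward pass for mean-squared error.
    dP = 2 * (P - Y) / len(X)
    dW2 = H.T @ dP
    db2 = dP.sum(0)
    dH = (dP @ W2.T) * (1 - H**2)
    dW1 = Xn.T @ dH
    db1 = dH.sum(0)

    # Gradient-descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# Evaluate on the clean (noise-free) inputs.
pred = np.tanh(X @ W1 + b1) @ W2 + b2
mse = float(np.mean((pred - Y) ** 2))
print(f"clean-input MSE after training: {mse:.4f}")
```

The key point mirrored from the abstract is that the noise is drawn anew at every presentation, so the network effectively sees a smoothed neighborhood of each training sample rather than the isolated points, which is what the paper analyzes as the source of improved generalization.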

Published in:

IEEE Transactions on Systems, Man, and Cybernetics (Volume: 22, Issue: 3)