
On-line learning with minimal degradation in feedforward networks

Authors: V. R. de Angulo (Joint Research Centre, Commission of the European Communities, Ispra, Italy) and C. Torras

Dealing with nonstationary processes requires quick adaptation while avoiding catastrophic forgetting. A neural learning technique that satisfies these requirements, without sacrificing the benefits of distributed representations, is presented. It relies on formalizing the problem as the minimization of the error over the previously learned input-output patterns, subject to the constraint that the new pattern be perfectly encoded. This constrained optimization problem is then transformed into an unconstrained one whose variables are the hidden-unit activations. The new formulation leads to an algorithm for solving the problem, which we call learning with minimal degradation (LMD). Experimental comparisons of LMD with backpropagation are provided which, besides showing the advantages of LMD, reveal how forgetting in backpropagation depends on the learning rate. We also explain why overtraining affects forgetting and fault tolerance, which we view as related problems.
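The abstract's central move can be illustrated with a short sketch: recast the constrained problem (minimize the error on old patterns subject to perfect encoding of the new one) as an unconstrained problem whose variables are the hidden-unit activations for the new input. The following Python/NumPy code is an illustration built only from the abstract, not the paper's exact procedure: the quadratic distance to the network's current hidden response stands in for the degradation term, the hard encoding constraint is softened into a quadratic penalty of weight lam, and the final minimal-norm weight corrections are assumptions of this sketch.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logit(h, eps=1e-6):
    # Inverse of the (assumed) logistic hidden nonlinearity.
    h = np.clip(h, eps, 1.0 - eps)
    return np.log(h / (1.0 - h))

def lmd_sketch(W1, b1, W2, b2, x_new, y_new, lam=10.0, lr=0.05, steps=500):
    """Insert one pattern with (approximately) minimal degradation.

    Variables of the optimization are the hidden activations h for x_new,
    as in the abstract. The degradation proxy ||h - h0||^2 and the penalty
    form are illustrative choices, not the paper's exact formulation.
    Assumes a single hidden layer, a linear output layer, and x_new != 0.
    """
    h0 = sigmoid(W1 @ x_new + b1)          # current hidden response to x_new
    h = h0.copy()
    for _ in range(steps):                 # unconstrained descent in h-space
        residual = (W2 @ h + b2) - y_new   # how far we are from encoding y_new
        grad = (h - h0) + lam * (W2.T @ residual)
        h -= lr * grad

    # Minimal-Frobenius-norm rank-1 weight corrections that make the network
    # produce exactly h (hidden) and y_new (output) on x_new.
    W2 = W2 + np.outer(y_new - (W2 @ h + b2), h) / (h @ h)
    z_target = logit(h)                    # pre-activations that yield h
    W1 = W1 + np.outer(z_target - (W1 @ x_new + b1), x_new) / (x_new @ x_new)
    return W1, W2

# Illustrative usage with random small shapes:
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(5, 3)), np.zeros(5)
W2, b2 = rng.normal(size=(2, 5)), np.zeros(2)
W1, W2 = lmd_sketch(W1, b1, W2, b2, rng.normal(size=3), np.array([0.3, -0.1]))

After the descent fixes the target activations h, the rank-1 updates are the smallest (in Frobenius norm) weight changes that satisfy the encoding of the new pattern, which is one natural reading of "minimal degradation"; the paper's actual weight-adjustment step may differ.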

Published in:

IEEE Transactions on Neural Networks (Volume 6, Issue 3)