An adaptive least squares algorithm for the efficient training of artificial neural networks

Authors: S. Kollias and D. Anastassiou, Dept. of Electrical Engineering, Columbia University, New York, NY, USA

A novel learning algorithm is developed for the training of multilayer feedforward neural networks, based on a modification of the Marquardt-Levenberg least-squares optimization method. The algorithm updates the input weights of each neuron in the network in an efficient, parallel way. An adaptive, distributed selection of the convergence-rate parameter is presented, using suitable optimization strategies. The algorithm has better convergence properties than the conventional backpropagation learning technique. Its performance is illustrated using examples from digital image halftoning and logical operations such as the XOR function.
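
The abstract describes replacing plain gradient backpropagation with a damped least-squares (Marquardt-Levenberg-style) update whose damping, or convergence-rate, parameter is adapted during training. The sketch below illustrates that general idea on the XOR example mentioned in the abstract; it is not the authors' algorithm. The 2-2-1 network size, the finite-difference Jacobian, and the simple doubling/halving damping schedule are illustrative assumptions.

    # Minimal sketch: Levenberg-Marquardt-style training of a 2-2-1 network on XOR.
    # Not the paper's algorithm; network size, Jacobian method, and damping schedule
    # are assumptions made for illustration only.
    import numpy as np

    rng = np.random.default_rng(0)

    # XOR training set
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([0.0, 1.0, 1.0, 0.0])

    def unpack(w):
        """Split the flat parameter vector into the layers of a 2-2-1 network."""
        W1 = w[:6].reshape(2, 3)   # hidden layer: 2 units, 2 inputs + bias
        W2 = w[6:].reshape(1, 3)   # output unit: 2 hidden inputs + bias
        return W1, W2

    def forward(w, x):
        W1, W2 = unpack(w)
        h = np.tanh(W1 @ np.append(x, 1.0))                    # hidden activations
        y = 1.0 / (1.0 + np.exp(-(W2 @ np.append(h, 1.0))))    # sigmoid output
        return y[0]

    def residuals(w):
        return np.array([forward(w, x) for x in X]) - T

    def jacobian(w, eps=1e-6):
        """Central-difference Jacobian of the residuals w.r.t. the weights."""
        J = np.zeros((len(X), len(w)))
        for j in range(len(w)):
            dw = np.zeros_like(w)
            dw[j] = eps
            J[:, j] = (residuals(w + dw) - residuals(w - dw)) / (2 * eps)
        return J

    w = rng.normal(scale=0.5, size=9)
    lam = 1e-2                                     # damping / convergence-rate parameter
    for it in range(200):
        e = residuals(w)
        J = jacobian(w)
        # Damped Gauss-Newton step: (J^T J + lam I) dw = J^T e
        step = np.linalg.solve(J.T @ J + lam * np.eye(len(w)), J.T @ e)
        if np.sum(residuals(w - step) ** 2) < np.sum(e ** 2):
            w, lam = w - step, max(lam * 0.5, 1e-8)   # accept step, relax damping
        else:
            lam *= 2.0                                # reject step, increase damping

    print("XOR outputs:", np.round([forward(w, x) for x in X], 3))

The damping parameter plays the role of an adaptive convergence-rate control: when a step reduces the squared error it is accepted and the damping is relaxed toward a Gauss-Newton step, otherwise the damping grows and the update shrinks toward a small gradient-like step.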

Published in:

IEEE Transactions on Circuits and Systems (Volume: 36, Issue: 8)