
A novel neural network training technique based on a multi-algorithm constrained optimization strategy

Authors: Karras, D.A. (Dept. of Inf., Ioannina Univ., Greece); Lagaris, I.E.

A novel methodology for the efficient offline training of multilayer perceptrons (MLPs) is presented. Training is formulated as an optimization problem subject to box constraints on the weights, so as to enhance the network's generalization capability. An optimization strategy combining variable metric, conjugate gradient, and derivative-free pattern search methods renders the training process robust and efficient. The superiority of this approach over the offline backpropagation algorithm, the RPROP training procedure, and the stand-alone algorithms involved in the proposed composite optimization strategy is demonstrated by direct application to two real-world benchmarks and the parity-4 problem. These problems were obtained from a standard collection of such benchmarks, and special care was taken over the statistical significance of the results by organizing the experimental study to compare the averages and variances of the training and generalization performance of the algorithms involved.
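The core idea of the abstract, offline MLP training posed as a box-constrained optimization over the weight vector, can be sketched as follows. This is a minimal illustration, not the authors' method: a tiny 2-2-1 network is trained on XOR, and SciPy's L-BFGS-B (a variable-metric method that supports bound constraints) stands in for the paper's combined variable-metric / conjugate-gradient / pattern-search strategy. The network size, bound values, and loss function are all assumptions made for the example.

```python
import numpy as np
from scipy.optimize import minimize

# Toy dataset: XOR (the paper uses parity-4 and two real-world benchmarks).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

def unpack(w):
    """Split the flat weight vector into the 2-2-1 network's parameters."""
    W1 = w[:4].reshape(2, 2)   # input -> hidden weights
    b1 = w[4:6]                # hidden biases
    W2 = w[6:8]                # hidden -> output weights
    b2 = w[8]                  # output bias
    return W1, b1, W2, b2

def forward(w, X):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output

def loss(w):
    """Mean squared training error -- the objective being minimized."""
    return np.mean((forward(w, X) - y) ** 2)

rng = np.random.default_rng(0)
w0 = rng.uniform(-1.0, 1.0, size=9)

# Box constraints on every weight, as in the abstract; the bound of 10
# is an arbitrary choice for this sketch.
bounds = [(-10.0, 10.0)] * 9

res = minimize(loss, w0, method="L-BFGS-B", bounds=bounds)
print(f"initial loss {loss(w0):.4f} -> final loss {res.fun:.4f}")
```

The bounds keep the weights in a compact region, which is the mechanism the abstract credits for improved generalization; the multi-algorithm strategy of the paper would additionally switch between methods when one stalls.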

Published in:

Proceedings of the 24th Euromicro Conference, 1998 (Volume 2)

Date of Conference:

25-27 Aug 1998