Evolutionary generation and training of recurrent artificial neural networks

2 Author(s)
J. Santos ; Departamento de Ingenieria Ind., Univ. de La Coruna, Spain ; R. J. Duro

An evolutionary training and design methodology for artificial neural networks is presented, aimed at obtaining optimum or quasi-optimum synchronous recurrent neural networks capable of processing sequential inputs. We show that, using this method with floating-point and integer-valued chromosomes, optimum results can be achieved with very small populations and few generations. To implement the methodology, we have developed GENIAL, a genetic algorithm development environment specifically designed for this type of problem; it offers ways of testing candidate fitness functions and many tools for improving results. Finally, we comment on the sequential introduction of different constraints in genetic algorithms, presenting a classical example in which several design requirements are met simultaneously and which demonstrates the power of the method.
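The GENIAL system itself is not described in this abstract, but the core idea, encoding the weights of a small recurrent network as a real-valued chromosome and evolving it with a small population, can be sketched as follows. All names, parameters, and the one-step-memory task below are illustrative assumptions, not the paper's actual setup.

```python
import math
import random

# Hypothetical sketch (not the paper's GENIAL system): a real-valued
# chromosome encodes the four weights of a one-unit recurrent network,
# evolved by truncation selection with elitism plus Gaussian mutation.

def rnn_run(weights, inputs):
    """Run a single recurrent unit over a sequence.

    h_t = tanh(w_in * x_t + w_rec * h_{t-1} + b); output_t = w_out * h_t
    """
    w_in, w_rec, w_out, b = weights
    h = 0.0
    outputs = []
    for x in inputs:
        h = math.tanh(w_in * x + w_rec * h + b)
        outputs.append(w_out * h)
    return outputs

def fitness(weights, seqs):
    """Negative mean squared error on an 'echo the previous input' task."""
    total = 0.0
    for seq in seqs:
        target = [0.0] + list(seq[:-1])
        out = rnn_run(weights, seq)
        total += sum((o - t) ** 2 for o, t in zip(out, target)) / len(seq)
    return -total / len(seqs)

def evolve(seqs, pop_size=20, generations=50, sigma=0.3, rng=None):
    """Evolve the 4-gene chromosome; returns the best individual and the
    per-generation best-fitness history."""
    rng = rng or random.Random(0)
    pop = [[rng.uniform(-1, 1) for _ in range(4)] for _ in range(pop_size)]
    history = []
    for _ in range(generations):
        pop.sort(key=lambda w: fitness(w, seqs), reverse=True)
        history.append(fitness(pop[0], seqs))
        elite = pop[: pop_size // 4]  # elitism: best quarter survives intact
        children = []
        while len(elite) + len(children) < pop_size:
            parent = rng.choice(elite)
            children.append([w + rng.gauss(0, sigma) for w in parent])
        pop = elite + children
    pop.sort(key=lambda w: fitness(w, seqs), reverse=True)
    history.append(fitness(pop[0], seqs))
    return pop[0], history
```

Because the elite are carried over unchanged, the best fitness per generation is monotone non-decreasing, which mirrors the abstract's claim that small populations and few generations suffice when the encoding and fitness function are well chosen.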

Published in:

Proceedings of the First IEEE Conference on Evolutionary Computation, IEEE World Congress on Computational Intelligence, 1994

Date of Conference:

27-29 Jun 1994