A comparative study between real and discrete genetic algorithms for the design of beta basis function neural networks

3 Author(s)
Aouiti, C. ; REGIM: Res. Group on Intelligent Machines, Univ. of Sfax, Tunisia ; Alimi, A.M. ; Maalej, A.

Classic training algorithms for neural networks start from a predetermined network structure, so the quality of the network's response depends strongly on that structure. In general, the network resulting from such a classical learning approach applied to a predetermined architecture is either insufficient or overcomplicated. This paper describes two genetic learning models for the beta basis function neural network (BBFNN): the first is a continuous (real-coded) genetic model and the second a discrete genetic model. In both cases each network is coded as a variable-length string, and genetic operators are proposed to evolve a population of individuals. A fitness function is proposed to evaluate individual networks. Applications to function approximation problems are considered to demonstrate the performance of the BBFNN and of the two evolutionary algorithms.
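The abstract's scheme (networks encoded as variable-length strings, structural and parametric genetic operators, and a fitness function driving evolution) can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' algorithm: the simplified one-dimensional Beta basis function, the size-penalty term in the fitness, the operator probabilities, and all parameter ranges are assumptions chosen for the example.

```python
import random

def beta(x, x0, x1, p, q):
    # Simplified 1-D Beta basis function (an assumed form): non-zero only
    # on (x0, x1), peaking at c = (p*x1 + q*x0) / (p + q).
    if not (x0 < x < x1):
        return 0.0
    c = (p * x1 + q * x0) / (p + q)
    return ((x - x0) / (c - x0)) ** p * ((x1 - x) / (x1 - c)) ** q

def predict(net, x):
    # A network is a variable-length list of neurons (w, x0, x1, p, q).
    return sum(w * beta(x, x0, x1, p, q) for (w, x0, x1, p, q) in net)

def fitness(net, samples):
    # Mean squared error plus a small size penalty, so evolution favours
    # compact structures (an assumed criterion; lower is better).
    mse = sum((y - predict(net, x)) ** 2 for x, y in samples) / len(samples)
    return mse + 0.001 * len(net)

def random_neuron():
    x0 = random.uniform(-1.0, 0.5)
    return (random.uniform(-2, 2), x0, x0 + random.uniform(0.5, 1.5),
            random.uniform(1, 3), random.uniform(1, 3))

def mutate(net):
    net = list(net)
    op = random.random()
    if op < 0.3 and len(net) > 1:
        net.pop(random.randrange(len(net)))   # structural: remove a neuron
    elif op < 0.6:
        net.append(random_neuron())           # structural: add a neuron
    else:
        i = random.randrange(len(net))        # parametric: perturb a weight
        w, x0, x1, p, q = net[i]
        net[i] = (w + random.gauss(0, 0.2), x0, x1, p, q)
    return net

def crossover(a, b):
    # One-point crossover on variable-length strings: splice neuron sublists.
    return a[:random.randrange(1, len(a) + 1)] + b[random.randrange(len(b)):]

def evolve(samples, pop_size=30, generations=100):
    pop = [[random_neuron() for _ in range(random.randint(1, 4))]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda net: fitness(net, samples))
        survivors = pop[: pop_size // 2]      # truncation selection
        pop = survivors + [mutate(crossover(random.choice(survivors),
                                            random.choice(survivors)))
                           for _ in range(pop_size - len(survivors))]
    return min(pop, key=lambda net: fitness(net, samples))

random.seed(0)
target = lambda x: x * (1 - x)                # toy function to approximate
samples = [(i / 20.0, target(i / 20.0)) for i in range(21)]
best = evolve(samples)
print(len(best), fitness(best, samples))
```

Because both the number of neurons and their real-valued parameters evolve, the search explores network structure and weights jointly, which is the motivation the abstract gives for moving beyond a fixed, predetermined architecture.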

Published in:

2002 IEEE International Conference on Systems, Man and Cybernetics (Volume 3)

Date of Conference:

6-9 Oct. 2002