Classical training algorithms for neural networks start from a predetermined network structure, so the quality of the trained network depends strongly on the architecture chosen in advance. The network produced by such a classical learning approach applied to a fixed architecture is often either insufficient for the task or needlessly complicated. This paper describes two genetic learning models for the Beta basis function neural network (BBFNN): the first is a continuous genetic model and the second is a discrete genetic model. In both cases each network is encoded as a variable-length string, and genetic operators are proposed to evolve a population of such individuals. A function is proposed to evaluate the fitness of individual networks. Applications to function approximation problems demonstrate the performance of the BBFNN and of the two evolutionary algorithms.
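The abstract outlines the key ingredients of the evolutionary approach: a variable-length encoding of each network, genetic operators acting on that encoding, and a fitness function based on approximation error. The sketch below illustrates this scheme in minimal form; it is not the paper's algorithm. All names and parameter values are illustrative assumptions, and a Gaussian basis function stands in for the paper's Beta basis function.

```python
import math
import random

random.seed(0)

# Target function to approximate on [0, 1] (illustrative choice).
def target(x):
    return math.sin(2 * math.pi * x)

XS = [i / 50 for i in range(51)]
YS = [target(x) for x in XS]

# A network is a variable-length list of genes; each gene is one
# (center, width, weight) triple describing a single basis unit.
# The paper uses Beta basis functions; a Gaussian stands in here.
def predict(net, x):
    return sum(w * math.exp(-((x - c) / max(abs(s), 1e-6)) ** 2)
               for c, s, w in net)

def mse(net):
    return sum((predict(net, x) - y) ** 2 for x, y in zip(XS, YS)) / len(XS)

def fitness(net):
    # Higher is better; 1 / (1 + MSE) maps any error into (0, 1].
    return 1.0 / (1.0 + mse(net))

def random_gene():
    return (random.random(), random.uniform(0.05, 0.5), random.uniform(-2, 2))

def random_net():
    return [random_gene() for _ in range(random.randint(2, 6))]

def crossover(a, b):
    # One-point crossover on variable-length gene strings: child length
    # may differ from both parents, so the architecture itself evolves.
    ia, ib = random.randint(0, len(a)), random.randint(0, len(b))
    return a[:ia] + b[ib:]

def mutate(net):
    # Parametric mutation: perturb every gene component slightly.
    net = [tuple(g + random.gauss(0, 0.1) for g in gene) for gene in net]
    # Structural mutation: occasionally grow or shrink the network.
    if random.random() < 0.2:
        net.append(random_gene())
    if len(net) > 1 and random.random() < 0.2:
        net.pop(random.randrange(len(net)))
    return net

def evolve(pop_size=30, generations=40):
    pop = [random_net() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]          # truncation selection
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children               # elitist survival
    return max(pop, key=fitness)

best = evolve()
```

Because the encoding is a variable-length string, crossover and the grow/shrink mutations let the number of basis units change during evolution, so the architecture is learned along with the parameters rather than fixed in advance.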