High-order neural network structure selection for function approximation applications using genetic algorithms

3 Author(s)
Rovithakis, G.A. ; Dept. of Electr. & Comput. Eng., Aristotle Univ. of Thessaloniki, Greece ; Chalkiadakis, I. ; Zervakis, M.E.

The neural network literature on function approximation is by now sufficiently rich. In its complete form, the problem entails both parametric learning (i.e., weight determination) and structural learning (i.e., structure selection). The majority of works deal with parametric uncertainty while assuming knowledge of the appropriate neural structure. In this paper we present an algorithmic approach to determining the structure of high-order neural networks (HONNs) for solving function approximation problems. The method is based on a genetic algorithm (GA) and is equipped with a stable update law to guarantee parametric learning. Simulation results on an illustrative example highlight the performance and give some insight into the proposed approach.
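The approach described in the abstract can be illustrated with a minimal sketch: a GA chromosome encodes which candidate high-order terms the HONN includes, and the weights of each candidate structure are fitted so its approximation error can serve as the fitness. This is an assumption-laden toy (the target function, the sigmoid-power basis, least-squares weight fitting instead of the paper's stable update law, and all GA parameters are illustrative choices, not taken from the paper):

```python
# Hedged sketch: a GA selects which high-order terms a HONN uses;
# weights for each candidate structure are fitted by least squares
# (the paper instead uses a stable online update law).
import numpy as np

rng = np.random.default_rng(0)

# Illustrative target function to approximate (not from the paper)
def f(x):
    return np.sin(3 * x) + 0.5 * x**2

X = np.linspace(-1.0, 1.0, 100)
Y = f(X)

# Candidate high-order terms: powers of the sigmoid of the input
s = 1.0 / (1.0 + np.exp(-X))
PHI = np.stack([s**d for d in range(1, 9)], axis=1)  # one column per term
N_TERMS = PHI.shape[1]

def fitness(mask):
    """Fit weights for the selected terms; lower MSE means fitter."""
    if not mask.any():
        return np.inf
    A = PHI[:, mask]
    w, *_ = np.linalg.lstsq(A, Y, rcond=None)
    return float(np.mean((A @ w - Y) ** 2))

def ga(pop_size=20, gens=40, p_mut=0.1):
    """Binary GA over term-selection masks with elitism, crossover, mutation."""
    pop = rng.integers(0, 2, size=(pop_size, N_TERMS)).astype(bool)
    for _ in range(gens):
        scores = np.array([fitness(m) for m in pop])
        pop = pop[np.argsort(scores)]          # elitist sort: best first
        children = []
        while len(children) < pop_size // 2:
            # Parents drawn from the fitter half; one-point crossover
            a, b = pop[rng.integers(0, pop_size // 2, size=2)]
            cut = rng.integers(1, N_TERMS)
            child = np.concatenate([a[:cut], b[cut:]])
            child ^= rng.random(N_TERMS) < p_mut   # bit-flip mutation
            children.append(child)
        pop = np.vstack([pop[: pop_size - len(children)], children])
    scores = np.array([fitness(m) for m in pop])
    best = pop[np.argmin(scores)]
    return best, scores.min()

mask, mse = ga()
print("selected term degrees:", [d + 1 for d in np.where(mask)[0]])
print("training MSE: %.2e" % mse)
```

The key design point mirrored here is the split the abstract describes: the GA handles structural learning (which terms exist), while a separate parametric procedure fits the weights of each candidate structure before it is scored.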

Published in:

IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics (Volume: 34, Issue: 1)