Hybrid methods using genetic algorithms for global optimization

Authors: J.-M. Renders (Fac. des Sci. Appliquees, Univ. Libre de Bruxelles, Belgium); S. P. Flasse

Abstract: This paper discusses the trade-off between accuracy, reliability, and computing time in global optimization. The particular compromises offered by traditional methods (Quasi-Newton and Nelder-Mead's simplex methods) and by genetic algorithms are addressed and illustrated with an application in nonlinear system identification. Subsequently, new hybrid methods are designed, combining principles from genetic algorithms and “hill-climbing” methods in order to strike a better compromise in this trade-off. Inspired by biology, and especially by the manner in which living beings adapt to their environment, these hybrid methods involve two interwoven levels of optimization, namely evolution (genetic algorithms) and individual learning (Quasi-Newton), which cooperate in a global optimization process. One of these hybrid methods ranks among state-of-the-art global optimization methods: it combines the reliability of genetic algorithms with the accuracy of the Quasi-Newton method, while requiring a computation time only slightly higher than that of the latter.
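For illustration, the sketch below shows a generic hybrid of this kind: a real-coded genetic algorithm whose individuals are periodically refined by a few BFGS (Quasi-Newton) steps, with the refined genotype written back into the population (Lamarckian-style learning). It is not the authors' exact method; the Rastrigin test function, the encoding, and all parameter values are assumptions chosen only to make the example runnable.

```python
# Minimal sketch of a hybrid (memetic) optimizer: a genetic algorithm whose
# individuals are refined each generation by Quasi-Newton (BFGS) local search.
# NOT the paper's exact algorithm; test function and parameters are illustrative.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def rastrigin(x):
    """Multimodal test function (assumed here); global minimum 0 at x = 0."""
    return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def hybrid_ga_bfgs(f, dim=5, pop_size=30, generations=40,
                   bounds=(-5.12, 5.12), mut_sigma=0.3):
    lo, hi = bounds
    pop = rng.uniform(lo, hi, size=(pop_size, dim))      # real-coded individuals
    for _ in range(generations):
        # Individual "learning": a few BFGS iterations per individual
        # (Lamarckian variant: the learned point replaces the genotype).
        for i in range(pop_size):
            res = minimize(f, pop[i], method="BFGS", options={"maxiter": 5})
            pop[i] = np.clip(res.x, lo, hi)
        fitness = np.array([f(x) for x in pop])
        # Evolution: tournament selection, arithmetic crossover, Gaussian mutation.
        new_pop = [pop[fitness.argmin()].copy()]          # elitism
        while len(new_pop) < pop_size:
            a, b = rng.integers(pop_size, size=2)
            p1 = pop[a] if fitness[a] < fitness[b] else pop[b]
            a, b = rng.integers(pop_size, size=2)
            p2 = pop[a] if fitness[a] < fitness[b] else pop[b]
            w = rng.random()
            child = w * p1 + (1 - w) * p2                 # crossover
            child += rng.normal(0, mut_sigma, dim)        # mutation
            new_pop.append(np.clip(child, lo, hi))
        pop = np.array(new_pop)
    best = min(pop, key=f)
    return best, f(best)

if __name__ == "__main__":
    x_best, f_best = hybrid_ga_bfgs(rastrigin)
    print("best point:", np.round(x_best, 3), "objective:", round(f_best, 6))
```

In this sketch the genetic level supplies reliability (global exploration of a multimodal landscape) while the BFGS level supplies accuracy (fast local convergence), which mirrors the division of labour described in the abstract; the cost of the local steps is kept small by capping the BFGS iteration count per individual.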

Published in:

IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics) (Volume: 26, Issue: 2)