Using genetic differential competitive learning for unsupervised training in multispectral image classification systems

Authors: Chih-Cheng Hung (Dept. of Math. & Comput. Sci., Alabama A&M Univ., Normal, AL, USA); T. L. Coleman; P. Scheunders

This paper describes a genetic differential competitive learning algorithm proposed to prevent unsupervised training from becoming trapped in local minima and to improve classification results for remotely sensed data. Differential competitive learning (DCL) combines competitive and differential-Hebbian learning and represents a neural version of adaptive delta modulation. This learning law uses the neural signal velocity as a local unsupervised reinforcement mechanism. The Jeffries-Matusita (J-M) distance, a measure of the statistical separability of pairs of 'trained' clusters, is used to evaluate the proposed algorithm. Landsat Thematic Mapper (TM) data are used in simulations to demonstrate the effectiveness of the algorithm.
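The abstract gives only a high-level description, but the flavour of the DCL update and the J-M separability measure can be shown with a short sketch. The Python code below is an illustrative assumption, not the authors' implementation: it assumes Euclidean winner selection, takes the negative distance as the competitive activation, and gates the update with the sign of the winner's activation change (the "signal velocity"), in the spirit of adaptive delta modulation; the learning rate, competition function, and the genetic component of the proposed algorithm are not reproduced here.

```python
import numpy as np


def dcl_update(weights, x, prev_activations, lr=0.05):
    """One differential competitive learning step (illustrative sketch).

    Only the winning cluster centre is adapted, and the sign of the change
    in its activation (the 'signal velocity') decides whether the centre
    moves toward or away from the input vector x.
    """
    # Competitive stage: the winner is the centre closest to the input.
    distances = np.linalg.norm(weights - x, axis=1)
    winner = int(np.argmin(distances))

    # Differential-Hebbian stage: activation is taken here as the negative
    # distance, so an increase means the winner fits the data better.
    activations = -distances
    velocity = activations[winner] - prev_activations[winner]

    # Delta-modulation-style update: step by the sign of the velocity.
    weights[winner] += lr * np.sign(velocity) * (x - weights[winner])
    return weights, activations


def jm_distance(mean1, cov1, mean2, cov2):
    """Jeffries-Matusita separability J = 2 * (1 - exp(-B)) between two
    Gaussian clusters, where B is the Bhattacharyya distance."""
    diff = mean1 - mean2
    cov = 0.5 * (cov1 + cov2)
    b = (0.125 * diff @ np.linalg.solve(cov, diff)
         + 0.5 * np.log(np.linalg.det(cov)
                        / np.sqrt(np.linalg.det(cov1) * np.linalg.det(cov2))))
    return 2.0 * (1.0 - np.exp(-b))


if __name__ == "__main__":
    # Toy run: cluster 200 synthetic 6-band 'pixels' into 4 centres.
    rng = np.random.default_rng(0)
    pixels = rng.random((200, 6))
    weights = pixels[rng.choice(200, 4, replace=False)].copy()
    activations = np.full(4, -np.inf)
    for x in pixels:
        weights, activations = dcl_update(weights, x, activations)
```

The sign-gated step is the key difference from plain competitive learning: when the winner's fit is deteriorating, the centre is pushed away from the input rather than pulled toward it, which is what the abstract means by using signal velocity as a reinforcement signal.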

Published in:

1998 IEEE International Conference on Systems, Man, and Cybernetics (Volume 5)

Date of Conference:

11-14 Oct 1998