
Constructing neural networks for multiclass-discretization based on information entropy

Authors: Shie-Jue Lee, Mu-Tune Jone, Hsien-Leing Tsai (Dept. of Electr. Eng., Nat. Sun Yat-Sen Univ., Kaohsiung, Taiwan)

Cios and Liu (1992) proposed an entropy-based method to generate the architecture of neural networks for supervised two-class discretization. For multiclass discretization, the inter-relationship among classes is reduced to a set of binary relationships, and an independent two-class subnetwork is created for each binary relationship. This two-class-based approach prevents hidden nodes from being shared among different classes and yields a low recognition rate. We keep the interrelationship among classes when training a neural network: the entropy measure is evaluated in a global sense, not locally in each independent subnetwork. Consequently, our method allows hidden nodes and layers to be shared among classes and achieves higher recognition rates than the two-class-based method.
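
As a rough illustration of the contrast described above, the sketch below scores candidate split thresholds on a toy one-dimensional feature using the Shannon entropy of the full multiclass label set, so a single split is evaluated against all classes at once rather than within one two-class subproblem. This is a minimal, assumption-laden sketch of entropy-based discretization in general, not the authors' network-construction algorithm; the function names (entropy, split_entropy) and the toy data are purely illustrative.

from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy (in bits) of a multiclass label collection."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def split_entropy(xs, ys, threshold):
    """Weighted class entropy remaining after splitting feature values xs at threshold.

    The entropy is measured over all classes together (a "global" view), so one
    split can reduce uncertainty for several classes at once instead of being
    scored inside a single independent two-class subproblem.
    """
    left  = [y for x, y in zip(xs, ys) if x <= threshold]
    right = [y for x, y in zip(xs, ys) if x > threshold]
    n = len(ys)
    return (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)

# Toy one-dimensional data with three classes.
xs = [0.1, 0.2, 0.4, 0.5, 0.7, 0.9]
ys = ['a', 'a', 'b', 'b', 'c', 'c']

# Candidate thresholds are midpoints between consecutive feature values;
# keep the one leaving the least weighted multiclass entropy.
candidates = [(x1 + x2) / 2 for x1, x2 in zip(xs, xs[1:])]
best = min(candidates, key=lambda t: split_entropy(xs, ys, t))
print(f"best threshold: {best:.2f}")  # 0.30 (ties with 0.60; each isolates one class)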

Published in: IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), Volume 29, Issue 3