Toward optimizing a self-creating neural network

Authors: Jung-Hua Wang, Jen-Da Rau, Chung-Yun Peng (Dept. of Electr. Eng., Nat. Taiwan Ocean Univ., Keelung, Taiwan)

This paper optimizes the performance of the growing cell structures (GCS) model in learning topology and vector quantization. Each node in GCS is attached with a resource counter. During the competitive learning process, the counter of the best-matching node is increased by a defined resource measure after each input presentation, and then all resource counters are decayed by a factor α. We show that the sum of all resource counters is conserved. This conservation principle provides useful clues for exploring important characteristics of GCS, which in turn offer insight into how GCS can be optimized. In the context of information entropy, we show that the performance of GCS in learning topology and vector quantization can be optimized by using α=0 in conjunction with a threshold-free node-removal scheme, regardless of whether the input data are stationary or nonstationary. The meaning of optimization is twofold: (1) for learning topology, the information entropy is maximized in terms of the equiprobable criterion, and (2) for learning vector quantization, the mean squared error is minimized in terms of the equi-error criterion.
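To make the counter mechanism concrete, here is a minimal Python sketch of one competitive-learning step. It is an illustration under stated assumptions, not the paper's implementation: the function name and learning rate lr are hypothetical, the squared quantization error is one plausible choice of resource measure, and the decay is written as multiplication by (1 - alpha) so that α=0 corresponds to no decay, consistent with the abstract's optimal setting.

```python
import numpy as np

def gcs_counter_step(weights, counters, x, alpha, lr=0.05):
    """One competitive-learning step with resource-counter bookkeeping.

    weights:  (n, d) array of node weight vectors
    counters: (n,) array of resource counters
    x:        (d,) input vector
    alpha:    decay factor for the resource counters
    """
    # Find the best-matching node: the node whose weight vector
    # is closest to the input.
    dists = np.linalg.norm(weights - x, axis=1)
    bmu = int(np.argmin(dists))

    # Increase the winner's counter by the resource measure
    # (squared quantization error here; an assumed choice).
    counters[bmu] += dists[bmu] ** 2

    # Decay all resource counters; (1 - alpha) is an assumed
    # convention under which alpha = 0 means no decay.
    counters *= (1.0 - alpha)

    # Standard competitive-learning adaptation of the winner.
    weights[bmu] += lr * (x - weights[bmu])
    return bmu
```

Iterating this step over a stream of inputs and tracking counters.sum() is one way to observe the conservation behavior the paper analyzes.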

Published in:

IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), Volume 30, Issue 4