
Distributed coding for data representation of back-propagation neural network classifiers

Authors: Chong, C.C. (Sch. of Electr. & Electron. Eng., Nanyang Technol. Inst.); Jia, J.C.

A new distributed input coding is derived by distributing the feature variables over a number of input nodes based on the distribution of the training data. Using this method of representation, the range of each input node is fully optimised; this enables the network to converge at a higher rate during training. The coding method also enables the network to maintain the generalisation capability of conventional normalisation coding.
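The abstract gives only the outline of the scheme, so the sketch below is one plausible reading rather than the letter's exact algorithm: a single feature is spread over several input nodes whose sub-ranges are set from the training-data distribution (here via quantiles), so each node's output spans its full range. The function names (fit_quantile_edges, distributed_encode), the quantile-based splitting, and the piecewise-linear node response are assumptions for illustration.

```python
import numpy as np

def fit_quantile_edges(train_values, n_nodes):
    """Estimate bin edges from the training-data distribution so that each
    input node covers an equally populated sub-range of the feature.
    (Assumed splitting rule; the letter only states that the split follows
    the distribution of the training data.)"""
    qs = np.linspace(0.0, 1.0, n_nodes + 1)
    return np.quantile(train_values, qs)

def distributed_encode(x, edges):
    """Spread a scalar feature over len(edges)-1 input nodes: each node
    outputs the fraction of its own sub-range covered by x (clipped to
    [0, 1]), so every node's output uses its full range."""
    lo, hi = edges[:-1], edges[1:]
    code = (x - lo) / np.maximum(hi - lo, 1e-12)
    return np.clip(code, 0.0, 1.0)

# Example: encode one skewed feature with 4 input nodes
rng = np.random.default_rng(0)
train_feature = rng.lognormal(mean=0.0, sigma=1.0, size=1000)
edges = fit_quantile_edges(train_feature, n_nodes=4)
print(distributed_encode(1.5, edges))
```

Under this reading, the distributed code replaces a single normalised input with several inputs whose ranges are matched to where the training data actually lie, which is consistent with the claimed faster convergence while keeping the generalisation behaviour of conventional normalisation coding.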

Published in:

Electronics Letters (Volume: 31, Issue: 21)