Learning capability assessment and feature space optimization for higher-order neural networks

Authors: Villalobos, L.; Merat, F.L. (Dept. of Electr. Eng., Case Western Reserve Univ., Cleveland, OH, USA)

A technique for evaluating the learning capability and optimizing the feature space of a class of higher-order neural networks is presented. It is shown that supervised learning can be posed as an optimization problem in which inequality constraints are used to encode the information contained in the training patterns and to specify the degree of accuracy expected from the neural network. The approach establishes: (a) whether the structure of the network can effectively learn the training patterns and, if it can, a connectivity which corresponds to satisfactory learning; (b) those features which can be suppressed from the definition of the feature space without degrading performance; and (c) if the structure is not appropriate for learning the training patterns, the minimum set of patterns which cannot be learned. The technique is tested with two examples and the results are discussed.
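The core idea of posing supervised learning as a constrained feasibility problem can be illustrated with a small sketch. This is not the authors' code: the feature map, margin, and weight bounds below are illustrative assumptions. Because a higher-order network's output is linear in its (fixed) monomial features, each training pattern contributes one inequality constraint on the weights, and learnability reduces to the feasibility of a linear program.

```python
# Hypothetical sketch: learnability of a second-order network as
# linear-programming feasibility. The margin and weight bounds are
# illustrative assumptions, not values from the paper.
import itertools
import numpy as np
from scipy.optimize import linprog

def second_order_features(x):
    """Monomials up to order 2: [1, x1, ..., xn, x1*x2, x1*x3, ...]."""
    x = np.asarray(x, dtype=float)
    feats = [1.0, *x]
    feats += [x[i] * x[j] for i, j in itertools.combinations(range(len(x)), 2)]
    return np.array(feats)

def can_learn(patterns, targets, feature_map, margin=0.5, bound=10.0):
    """Encode each pattern as the constraint t_i * (w . phi(x_i)) >= margin
    and test feasibility. Returns (feasible, weights-or-None)."""
    Phi = np.array([feature_map(x) for x in patterns])
    t = np.array(targets, dtype=float)
    A_ub = -(t[:, None] * Phi)            # rewritten as -t_i*phi(x_i).w <= -margin
    b_ub = -margin * np.ones(len(t))
    c = np.zeros(Phi.shape[1])            # feasibility only: minimize 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(-bound, bound)] * Phi.shape[1])
    return res.success, (res.x if res.success else None)

# XOR is learnable once the product term x1*x2 is in the feature space ...
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
T = [-1, 1, 1, -1]
ok, w = can_learn(X, T, second_order_features)
print(ok)   # True: the second-order structure can learn the patterns

# ... but suppressing that feature (first-order only) makes the LP
# infeasible, flagging that the reduced structure cannot learn XOR.
first_order = lambda x: np.array([1.0, *x])
ok1, _ = can_learn(X, T, first_order)
print(ok1)  # False: XOR is not linearly separable
```

Feasibility of the LP answers question (a) and returns a satisfactory connectivity; re-running it with candidate features removed, as in the first-order case above, addresses (b) and (c).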

Published in:

IEEE Transactions on Neural Networks (Volume: 6, Issue: 1)