The influence of training sets on generalization in feed-forward neural networks

3 Author(s)

A nontrivial computational task, recognizing whether a 5-bit string contains two or more clumps, is used to illustrate the strong influence of training-set selection on the generalization properties of back-propagation networks. For this problem, the input patterns can be clustered into four groups indexed by their distance from the class boundary. Using various combinations of these groups, the authors constructed training sets ranging from those containing only typical patterns of each class to those containing only border patterns. A series of simulation experiments was carried out to study the generalization capability of networks trained with these sets. The results are consistent with the following conclusions: (1) a larger training set does not guarantee better generalization performance; (2) there exists a proper subset of the border patterns that constitutes a critical training set for perfect generalization; and (3) a network trained with an arbitrary subset of the border patterns is not necessarily a better performer than one trained with typical patterns or another collection of input patterns.
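As a concrete illustration of the task (not from the paper itself), the following Python sketch enumerates all 32 five-bit strings, labels each by the two-or-more-clumps predicate (reading a clump as a maximal run of consecutive 1s), and groups patterns by their Hamming distance to the nearest pattern of the opposite class. Keying the groups by the (class, distance) pair is one plausible reading of the four groups mentioned in the abstract; the paper's exact grouping may differ.

```python
from itertools import product

def count_clumps(bits):
    """Count maximal runs of consecutive 1s in a bit tuple."""
    clumps, prev = 0, 0
    for b in bits:
        if b == 1 and prev == 0:
            clumps += 1
        prev = b
    return clumps

# Label each of the 32 five-bit patterns: class 1 if it has two or more clumps.
patterns = list(product((0, 1), repeat=5))
labels = {p: int(count_clumps(p) >= 2) for p in patterns}

def boundary_distance(p):
    """Hamming distance from p to the nearest pattern of the opposite class."""
    return min(sum(a != b for a, b in zip(p, q))
               for q in patterns if labels[q] != labels[p])

# Group patterns by (class, distance-from-boundary): distance-1 patterns
# are the "border" patterns, larger distances the more "typical" ones.
groups = {}
for p in patterns:
    groups.setdefault((labels[p], boundary_distance(p)), []).append(p)
```

Selecting training sets from different `groups` entries (all border patterns, all typical patterns, or mixtures) reproduces the kind of experimental design the abstract describes.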

Published in:

1990 IJCNN International Joint Conference on Neural Networks

Date of Conference:

17-21 June 1990