Bias and variance of validation methods for function approximation neural networks under conditions of sparse data

Authors:
J. M. Twomey, Dept. of Ind. Eng., Wichita State Univ., KS, USA; A. E. Smith

Neural networks must be constructed and validated with strong empirical dependence, which is difficult under conditions of sparse data. This paper examines the most common methods of neural network validation, along with several general validation methods from the statistical resampling literature, as applied to function approximation networks with small sample sizes. It is shown that the increased computation required by the statistical resampling methods produces networks that perform better than those constructed in the traditional manner. The statistical resampling methods also yield lower variance of validation; however, some of them are biased estimators of network error.
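The contrast the abstract draws, between a traditional single train/test split and resampling-based validation, can be illustrated with a minimal sketch. The snippet below is not from the paper: it uses an ordinary least-squares fit as a hypothetical stand-in for training a function approximation network, and compares a one-shot holdout error estimate against a bootstrap (out-of-bag) estimate, one representative resampling method. Function names and parameters are illustrative assumptions.

```python
import numpy as np

def fit_model(X, y):
    # Least-squares fit: a cheap stand-in for training a small network.
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def mse(X, y, coef):
    r = y - X @ coef
    return float(np.mean(r * r))

def holdout_error(X, y, frac=0.5, seed=0):
    # Traditional validation: a single train/test partition of the sparse sample.
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    k = int(len(y) * frac)
    tr, te = idx[:k], idx[k:]
    return mse(X[te], y[te], fit_model(X[tr], y[tr]))

def bootstrap_error(X, y, B=200, seed=0):
    # Resampling-based validation: fit on each bootstrap resample and
    # evaluate on the observations left out of that resample, then average.
    rng = np.random.default_rng(seed)
    n, errs = len(y), []
    for _ in range(B):
        bs = rng.integers(0, n, size=n)          # sample indices with replacement
        oob = np.setdiff1d(np.arange(n), bs)     # out-of-bag test points
        if len(oob) == 0:
            continue
        errs.append(mse(X[oob], y[oob], fit_model(X[bs], y[bs])))
    return float(np.mean(errs))

if __name__ == "__main__":
    # A sparse sample from a noisy linear target, to mirror the small-n setting.
    rng = np.random.default_rng(42)
    X = np.column_stack([np.ones(20), rng.uniform(-1, 1, 20)])
    y = X @ np.array([1.0, 2.0]) + rng.normal(0, 0.1, 20)
    print("holdout error:  ", holdout_error(X, y))
    print("bootstrap error:", bootstrap_error(X, y))
```

The bootstrap estimate averages over many fits, which is the extra computation the abstract refers to; repeating the whole experiment over fresh samples would show its lower variance, while its known pessimistic bias (each fit sees only about 63% of the distinct observations) illustrates the bias concern the paper raises.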

Published in:

IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews) (Volume: 28, Issue: 3)