On the generalization ability of neural network classifiers

Authors: M. T. Musavi (Dept. of Electr. & Comput. Eng., Univ. of Maine, Orono, ME, USA), K. H. Chan, D. M. Hummels, K. Kalantri

This correspondence presents a method for evaluating artificial neural network (ANN) classifiers. To assess the performance of the network over all possible input ranges, a probabilistic input model is defined, and the expected error of the output over this input range is taken as a measure of generalization ability. Two elements are essential to carrying out the proposed evaluation: estimation of the input probability density and numerical integration. A nonparametric method based on the M nearest neighbors is used to locally estimate the distribution around each training pattern, and an orthogonalization procedure determines the covariance matrices of the local densities. A Monte Carlo method performs the numerical integration. The proposed technique has been used to investigate the generalization ability of back-propagation (BP), radial basis function (RBF), and probabilistic neural network (PNN) classifiers on three test problems.
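The evaluation pipeline the abstract describes, local density estimation from the M nearest neighbors of each training pattern followed by Monte Carlo integration of the expected error, can be sketched as follows. This is a simplified illustration, not the authors' implementation: the orthogonalization procedure for the covariance matrices is replaced by a regularized sample covariance of neighbor offsets, and a toy nearest-mean classifier stands in for the BP/RBF/PNN networks studied in the paper. All function names are hypothetical.

```python
import numpy as np

def local_gaussian_mixture(X, M=5):
    """For each training pattern, fit a local Gaussian whose covariance
    is estimated from the offsets of its M nearest neighbors.
    (Simplified stand-in for the paper's nonparametric density model.)"""
    covs = []
    for i in range(len(X)):
        dists = np.linalg.norm(X - X[i], axis=1)
        idx = np.argsort(dists)[1:M + 1]          # M nearest neighbors, excluding the pattern itself
        diff = X[idx] - X[i]
        # Regularize so the covariance is positive definite even for small M.
        covs.append(diff.T @ diff / M + 1e-6 * np.eye(X.shape[1]))
    return covs

def monte_carlo_error(classify, X, y, covs, n_samples=2000, seed=None):
    """Monte Carlo estimate of the expected classification error under the
    probabilistic input model: draw inputs from the mixture of local
    Gaussians and label each draw with its source pattern's class."""
    rng = np.random.default_rng(seed)
    errors = 0
    for _ in range(n_samples):
        i = rng.integers(len(X))                  # pick a mixture component uniformly
        x = rng.multivariate_normal(X[i], covs[i])
        errors += classify(x) != y[i]
    return errors / n_samples

# Toy two-class problem with a nearest-mean classifier (illustrative only).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(4, 1, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
means = np.array([X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)])
classify = lambda x: int(np.linalg.norm(x - means[1]) < np.linalg.norm(x - means[0]))

covs = local_gaussian_mixture(X, M=5)
err = monte_carlo_error(classify, X, y, covs, seed=1)
```

Because the sampled inputs stay near the training patterns, `err` approximates the expected error over the modeled input region rather than only at the training points themselves, which is the generalization measure the correspondence proposes.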

Published in:

IEEE Transactions on Pattern Analysis and Machine Intelligence (Volume: 16, Issue: 6)