A note on least-squares learning procedures and classification by neural network models

P. A. Shoemaker ; US Naval Ocean Syst. Center, San Diego, CA, USA

Abstract: Neural network models are considered as mathematical classifiers whose inputs are random variables generated according to arbitrary stationary class distributions, and the implications of learning by minimization of sum-square classification error over a training set of such observations, for which class assignments are deterministic, are addressed. The expectations of the network outputs in such cases are weighted least-squares approximations to the a posteriori probabilities of the classes, which justifies interpreting network outputs as degrees of confidence in class membership. The author demonstrates this with a straightforward proof in which class probability densities are regarded as primitives and which, for simplicity, does not rely on probability theory or statistics. More detailed results giving conditions for consistency of the estimators are cited, and some issues concerning the suitability of neural network models and back-propagation training for approximating conditional probabilities in classification tasks are discussed.
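The core claim of the abstract can be illustrated numerically (this is a sketch, not the paper's proof): when targets are 0/1 class labels, the function minimizing the sum-square error at each input value is the conditional mean of the target, i.e. the posterior probability of the class given that input. The distributions, priors, and variable names below are illustrative assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two classes with assumed priors and class-conditional distributions
# over a discrete input x in {0, 1, 2}.
prior = np.array([0.4, 0.6])            # P(class 0), P(class 1)
cond = np.array([[0.7, 0.2, 0.1],       # P(x | class 0)
                 [0.1, 0.3, 0.6]])      # P(x | class 1)

# Draw a large training set with deterministic 0/1 class labels.
n = 200_000
labels = rng.choice(2, size=n, p=prior)
x = np.where(labels == 0,
             rng.choice(3, size=n, p=cond[0]),
             rng.choice(3, size=n, p=cond[1]))

# An unconstrained least-squares fit of a lookup table f(x) to the labels:
# the minimizer of sum_i (f(x_i) - y_i)^2 is the per-bin mean of y,
# which we can compute in closed form instead of by gradient descent.
f = np.array([labels[x == v].mean() for v in range(3)])

# Bayes posterior P(class 1 | x) for comparison.
joint0 = prior[0] * cond[0]
joint1 = prior[1] * cond[1]
posterior1 = joint1 / (joint0 + joint1)

print(np.round(f, 3))            # least-squares estimates
print(np.round(posterior1, 3))   # true a posteriori probabilities
```

The fitted values approach the true posteriors as the training set grows, consistent with the abstract's point that least-squares outputs can be read as degrees of confidence in class membership.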

Published in:

IEEE Transactions on Neural Networks (Volume: 2, Issue: 1)