
A neural network approach to statistical pattern classification by `semiparametric' estimation of probability density functions

H. G. C. Traven, Dept. of Numerical Analysis and Computing Science, Royal Institute of Technology, Stockholm, Sweden

A method for designing near-optimal nonlinear classifiers, based on a self-organizing technique for estimating probability density functions when only weak assumptions are made about the densities, is described. The method avoids disadvantages of other existing methods by parametrizing a set of component densities from which the actual densities are constructed. The parameters of the component densities are optimized by a self-organizing algorithm, minimizing the amount of labeling required for the design samples. All the required computations are realized with the simple sum-of-product units commonly used in connectionist models. The density approximations produced by the method are illustrated in two dimensions for a multispectral image classification task. The practical use of the method is illustrated by a small speech recognition problem. Related issues of invariant projections, cross-class pooling of data, and subspace partitioning are discussed.
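The core idea of the abstract — modelling each class density as a sum of parametrized component densities and classifying by the larger estimated density — can be sketched as follows. This is an illustration only, assuming Gaussian components fitted by plain EM in one dimension; the paper's actual self-organizing algorithm and network realization differ.

```python
# Minimal sketch: per-class Gaussian mixture density estimation + classification.
# Assumptions (not from the paper): 1-D data, Gaussian components, EM fitting.
import math
import random

def gaussian(x, mu, var):
    """Value of a Gaussian density with mean mu and variance var at x."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def fit_mixture(samples, k=2, iters=25):
    """Fit a k-component 1-D Gaussian mixture with plain EM."""
    lo, hi = min(samples), max(samples)
    mus = [lo + (i + 0.5) * (hi - lo) / k for i in range(k)]   # spread initial means
    vars_ = [1.0] * k
    weights = [1.0 / k] * k
    for _ in range(iters):
        # E-step: responsibility of each component for each sample
        resp = []
        for x in samples:
            p = [w * gaussian(x, m, v) for w, m, v in zip(weights, mus, vars_)]
            s = sum(p) or 1e-12
            resp.append([pi / s for pi in p])
        # M-step: re-estimate mixing weights, means, and variances
        for j in range(k):
            nj = sum(r[j] for r in resp) or 1e-12
            weights[j] = nj / len(samples)
            mus[j] = sum(r[j] * x for r, x in zip(resp, samples)) / nj
            vars_[j] = sum(r[j] * (x - mus[j]) ** 2
                           for r, x in zip(resp, samples)) / nj + 1e-6
    return weights, mus, vars_

def density(x, model):
    """Mixture density value at x: a sum of weighted component densities."""
    w, mu, var = model
    return sum(wi * gaussian(x, mi, vi) for wi, mi, vi in zip(w, mu, var))

# Two synthetic classes, each with a bimodal (non-Gaussian) density
random.seed(0)
class_a = ([random.gauss(-4, 0.5) for _ in range(200)]
           + [random.gauss(-1, 0.5) for _ in range(200)])
class_b = ([random.gauss(2, 0.5) for _ in range(200)]
           + [random.gauss(5, 0.5) for _ in range(200)])
model_a = fit_mixture(class_a)
model_b = fit_mixture(class_b)

def classify(x):
    """Label x by the class whose estimated density is larger."""
    return "A" if density(x, model_a) > density(x, model_b) else "B"
```

Note how the mixture captures each class's bimodal shape, which a single parametric Gaussian per class could not; this is the "semiparametric" middle ground the abstract describes between fully parametric and fully nonparametric density estimation.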

Published in:

IEEE Transactions on Neural Networks (Volume: 2, Issue: 3)