Presupervised and post-supervised prototype classifier design

2 Author(s)
L.I. Kuncheva (Sch. of Math., Univ. of Wales, Bangor, UK); J.C. Bezdek

We extend the nearest prototype classifier to a generalized nearest prototype classifier (GNPC). The GNPC uses “soft” labeling of the prototypes in the classes, thereby encompassing a variety of classifiers. Based on how the prototypes are found, we distinguish between presupervised and post-supervised GNPC designs. We derive the conditions for optimality of two designs where the prototypes represent: 1) the components of the class-conditional mixture densities (presupervised design); or 2) the components of the unconditional mixture density (post-supervised design). An artificial data set and the “satimage” data set from the ELENA database are used to study the two approaches experimentally. A radial basis function network is used as a representative of each GNPC type. Neither the theoretical nor the experimental results indicate clear reasons to prefer one approach over the other. The post-supervised GNPC design tends to be more robust but less accurate than the presupervised one.
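For intuition, here is a minimal Python sketch of the GNPC decision rule the abstract describes: each prototype carries a soft label over the classes, an input's similarity to every prototype weights those labels, and the class with the largest aggregated support wins. The Gaussian similarity, the summed aggregation, and all names (gnpc_predict, prototypes, soft_labels, sigma) are illustrative assumptions; the paper's framework admits other similarity and aggregation operators.

```python
import numpy as np

def gnpc_predict(x, prototypes, soft_labels, sigma=1.0):
    """Generalized nearest prototype classifier (GNPC) decision sketch.

    x           : (n,) feature vector to classify.
    prototypes  : (c, n) array, one prototype per row.
    soft_labels : (c, k) array; row i gives prototype i's soft
                  membership in each of the k classes.
    """
    # Similarity of x to each prototype (Gaussian kernel assumed here).
    sq_dist = np.sum((prototypes - x) ** 2, axis=1)
    similarity = np.exp(-sq_dist / (2.0 * sigma ** 2))
    # Aggregate the prototypes' soft labels, weighted by similarity.
    class_support = similarity @ soft_labels
    # Crisp decision: the class with the largest support.
    return int(np.argmax(class_support))

# Hypothetical example: 3 prototypes in 2-D, 2 classes.
V = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.0]])
L = np.array([[0.9, 0.1], [0.5, 0.5], [0.2, 0.8]])  # soft labels
print(gnpc_predict(np.array([1.8, 0.2]), V, L))      # -> 1
```

With crisp 0/1 labels and a similarity that selects only the single closest prototype, this reduces to the ordinary nearest prototype classifier; a radial basis function network, as used in the paper's experiments, is another instance of the same scheme.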

Published in:

IEEE Transactions on Neural Networks (Volume: 10, Issue: 5)