On Extensions to Fisher's Linear Discriminant Function

1 Author(s)

This correspondence describes extensions to Fisher's linear discriminant function that allow differences in both class means and class covariances to be systematically included in a feature-reduction process. It is shown how the Fukunaga-Koontz transform can be combined with Fisher's method to reduce a feature space from many dimensions to two. Performance is generally superior to that of the Foley-Sammon method. The technique is further developed to show how a new radius vector (or pair of radius vectors) can be combined with Fisher's vector to produce a classifier with even greater discriminating power. Illustrations of the technique show that good discrimination can be obtained even when the classes overlap considerably in any single projection.
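The combination the abstract describes can be sketched roughly as follows. This is a minimal illustration on synthetic data, assuming the standard formulations of Fisher's discriminant direction and the Fukunaga-Koontz transform; the data, variable names, and the choice of a single most-discriminative FK axis are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic classes differing in both mean and covariance (illustrative only).
X1 = rng.multivariate_normal([0, 0, 0, 0], np.diag([1.0, 1.0, 0.2, 0.2]), 200)
X2 = rng.multivariate_normal([1, 1, 0, 0], np.diag([0.2, 0.2, 1.0, 1.0]), 200)

m1, m2 = X1.mean(0), X2.mean(0)
S1 = np.cov(X1, rowvar=False)
S2 = np.cov(X2, rowvar=False)

# Fisher's linear discriminant direction: separates the class means
# relative to the pooled within-class scatter.
w_fisher = np.linalg.solve(S1 + S2, m1 - m2)
w_fisher /= np.linalg.norm(w_fisher)

# Fukunaga-Koontz transform: whiten the summed covariance, then diagonalize
# one class's covariance in the whitened space. Its eigenvalues lie in [0, 1];
# axes with eigenvalues far from 0.5 capture covariance differences.
vals, vecs = np.linalg.eigh(S1 + S2)
W = vecs @ np.diag(vals ** -0.5)                 # whitening matrix
lam, U = np.linalg.eigh(W.T @ S1 @ W)
fk_axes = W @ U                                  # FK axes in original space
w_fk = fk_axes[:, np.argmax(np.abs(lam - 0.5))]  # most discriminative FK axis
w_fk /= np.linalg.norm(w_fk)

# Two-dimensional feature space: Fisher vector plus Fukunaga-Koontz vector,
# using mean differences and covariance differences together.
P = np.column_stack([w_fisher, w_fk])
Y1, Y2 = X1 @ P, X2 @ P
```

Projecting onto `P` yields the two-dimensional feature space the correspondence discusses: one axis driven by the mean difference, the other by the covariance difference, so classes can separate even when each single projection overlaps.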

Published in:

IEEE Transactions on Pattern Analysis and Machine Intelligence (Volume: PAMI-9, Issue: 2)