
A class of feature extraction criteria and its relation to the Bayes risk estimate


2 Author(s)

Feature extraction criteria of the form f(D_1, \ldots, D_M, \Sigma_1, \ldots, \Sigma_M) are considered, where D_i and \Sigma_i are the conditional means and covariances. The function f is assumed to be invariant under nonsingular linear transformations and coordinate shifts. For the case f(D_1, D_2, \Sigma_0), f is shown to depend only upon the distance (D_2 - D_1)^T \Sigma_0^{-1} (D_2 - D_1) between classes. The M-class case, f(D_1, \ldots, D_M, \Sigma_0), is shown to depend upon the (M-1)M/2 between-class distances (D_j - D_k)^T \Sigma_0^{-1} (D_j - D_k). This criterion is also shown to be equivalent to the mean-square error of the general Bayes risk estimate. The most general f reduces to a function of M sets of between-class distances with metrics induced by the conditional covariances. When M = 2, this dependence reduces to two parameters, which may be regarded as different between-class distance measures. Finally, the linear mapping is reduced to a one-parameter problem, as in the work of Peterson and Mattson.
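To make the central quantity concrete, the sketch below computes the (M-1)M/2 between-class distances (D_j - D_k)^T \Sigma_0^{-1} (D_j - D_k) for a set of class means under a common covariance \Sigma_0. This is an illustrative helper only (the function name and example means are our own, not from the paper), and it assumes \Sigma_0 is nonsingular:

```python
import numpy as np

def between_class_distances(means, sigma0):
    """Pairwise distances (D_j - D_k)^T Sigma_0^{-1} (D_j - D_k)
    for M class means under a common covariance Sigma_0.
    Returns a dict keyed by (j, k) with j < k: (M-1)M/2 entries."""
    inv = np.linalg.inv(sigma0)  # assumes Sigma_0 is nonsingular
    M = len(means)
    dists = {}
    for j in range(M):
        for k in range(j + 1, M):
            d = means[k] - means[j]
            dists[(j, k)] = float(d @ inv @ d)
    return dists

# Hypothetical example: three classes in the plane, identity covariance,
# so the distances reduce to squared Euclidean distances between means.
means = [np.array([0.0, 0.0]), np.array([3.0, 0.0]), np.array([0.0, 4.0])]
print(between_class_distances(means, np.eye(2)))
# {(0, 1): 9.0, (0, 2): 16.0, (1, 2): 25.0}
```

With a non-identity \Sigma_0 these become Mahalanobis-type distances, which is why the criterion is invariant under nonsingular linear transformations: the transformation of the means is absorbed by the induced transformation of \Sigma_0.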

Published in:

IEEE Transactions on Information Theory (Volume: 26, Issue: 1)