The Relationship of the Bayes Risk to Certain Separability Measures in Normal Classification

Authors: Marvin Yablon, Member, IEEE, Department of Mathematics, John Jay College of Criminal Justice, The City University of New York, New York, NY 10019; J. T. Chu

For the problem of classifying an element (e.g., an unknown pattern) into one of two given categories, where the associated observables are distributed according to one of two known multivariate normal populations having a common covariance matrix, it is shown that the minimum Bayes risk is a strictly monotonic function of certain separability or statistical distance measures, regardless of the a priori probabilities and the assigned loss function. For the associated conditional expected losses, however, strict monotonicity holds if and only if a certain condition dependent on these probabilities and the given loss function is satisfied. These results remain valid for classification problems in which the observables can be transformed to normality by a one-to-one differentiable mapping.
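To make the setting concrete, the following is a minimal numerical sketch (not taken from the paper) of the simplest special case: symmetric 0-1 loss and a common covariance matrix, where the minimum Bayes risk depends on the two mean vectors only through their Mahalanobis distance and decreases strictly as that distance grows, whatever the prior probabilities. The function name bayes_risk_equal_cov and the chosen priors are illustrative assumptions, not the authors' code or notation.

```python
import numpy as np
from scipy.stats import norm

def bayes_risk_equal_cov(delta, prior1=0.5, prior2=0.5):
    """Minimum Bayes risk under 0-1 loss for two multivariate normal classes
    sharing a covariance matrix, expressed via the Mahalanobis distance
    `delta` between the class means."""
    t = np.log(prior2 / prior1)  # optimal log-likelihood-ratio threshold
    # Conditional error probabilities of the Bayes rule for each class.
    err_given_1 = norm.cdf((t - delta**2 / 2) / delta)
    err_given_2 = norm.cdf((-t - delta**2 / 2) / delta)
    return prior1 * err_given_1 + prior2 * err_given_2

# The risk falls strictly as the separability (Mahalanobis distance) grows,
# for any choice of priors -- the monotonicity property the abstract describes.
for delta in (0.5, 1.0, 2.0, 3.0):
    print(delta, bayes_risk_equal_cov(delta, prior1=0.3, prior2=0.7))
```

The paper's result is more general, covering arbitrary loss functions and a family of separability measures; this sketch only illustrates the 0-1-loss instance of the monotone relationship.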

Published in: IEEE Transactions on Pattern Analysis and Machine Intelligence (Volume: PAMI-2, Issue: 2)