Statistical discrimination using inaccurate models


1 Author

The performance of a multiclass maximum likelihood decision rule is analyzed when inaccurate versions of the true probability density functions are used. A general bound on the error probability is developed, valid both for finite observation size n and as n → ∞. A necessary and sufficient condition is developed for the bound to be less than one and to converge exponentially to zero, assuming that the case of equality between informational divergence expressions is excluded. The condition is given in terms of the informational divergence per sample, for both the finite-n and the asymptotic case. As long as the inaccurate density lies in a "tolerance region" around the true density of the class, exponential convergence of the error to zero is maintained. Specific expressions for the bounds and the informational divergence are obtained for homogeneous Markov chain observations and for Gaussian stationary process observations in discrete time. The computational complexity of evaluating the asymptotic bounding expression for the s-dimensional Gaussian process case is shown to be O(ns^2 + 2n log_2 n), which is much smaller than the O((sn)^3) required to evaluate the bound for finite sample size n.
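The mismatched decision rule the abstract analyzes can be illustrated with a small sketch: classify an n-sample observation by maximizing the log-likelihood under inaccurate Gaussian class models, and observe that the empirical error stays small when the assumed densities lie close (in KL divergence) to the true ones. All class parameters below are invented for illustration; this is a toy univariate i.i.d. example, not the paper's Markov-chain or stationary-process setting or its actual bound.

```python
import numpy as np

def kl_gaussian(mu_p, var_p, mu_q, var_q):
    # KL divergence D(p || q) between two univariate Gaussians;
    # a proxy for the "informational divergence" measuring model inaccuracy
    return 0.5 * (np.log(var_q / var_p) + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0)

def ml_decide(x, models):
    # Multiclass maximum-likelihood rule: pick the class whose (possibly
    # inaccurate) Gaussian model assigns x the highest log-likelihood.
    def loglik(mu, var):
        return -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var)
    return max(models, key=lambda c: loglik(*models[c]))

rng = np.random.default_rng(0)
true_models = {"A": (0.0, 1.0), "B": (3.0, 1.0)}   # true (mean, variance) per class
used_models = {"A": (0.2, 1.1), "B": (2.8, 0.9)}   # inaccurate models within a "tolerance"

n = 50          # observation size; error decays with n when the divergence condition holds
trials = 500
errors = 0
for _ in range(trials):
    true_class = rng.choice(["A", "B"])
    mu, var = true_models[true_class]
    x = rng.normal(mu, np.sqrt(var), size=n)
    if ml_decide(x, used_models) != true_class:
        errors += 1
error_rate = errors / trials
print(error_rate)
```

Because the assumed densities are only slightly perturbed versions of the true ones, the empirical error rate is essentially zero for n = 50; shrinking the gap between the class means or moving the assumed densities outside the tolerance region makes the error persist.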

Published in: IEEE Transactions on Information Theory (Volume: 28, Issue: 5)