
On the best finite set of linear observables for discriminating two Gaussian signals


2 Author(s)

Consider the problem of discriminating between two Gaussian signals using only a finite number of linear observables. Choosing the set of n observables that minimizes the error probability P_{e} is a difficult problem. Because the Hellinger integral H and its square H^{2} form an upper and a lower bound on P_{e}, we minimize H instead. We find that the set of observables minimizing H is a set of coefficients of the simultaneously orthogonal expansions of the two signals. The same set of observables also maximizes the Hájek J-divergence.
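The construction can be illustrated in the finite-dimensional, zero-mean case (a simplifying assumption; the paper treats general Gaussian signals). Simultaneously diagonalizing the two covariance matrices puts both signals into coordinates with variances 1 and λ_i, so H factors into per-coordinate terms, and the best n observables are the coordinates with the smallest factors. The function name and API below are hypothetical, for illustration only:

```python
import numpy as np

def best_linear_observables(sigma0, sigma1, n):
    """Sketch: pick n linear observables minimizing the Hellinger
    integral H for zero-mean Gaussians N(0, sigma0) vs N(0, sigma1).

    Whiten with respect to sigma0, then eigendecompose the whitened
    sigma1 (simultaneous diagonalization).  In the new coordinates the
    signals have variances 1 and lam[i], so H is a product of
    per-coordinate factors lam^{1/4} / sqrt((1 + lam)/2); keeping the
    n smallest factors minimizes H.
    """
    L = np.linalg.cholesky(sigma0)
    Linv = np.linalg.inv(L)
    M = Linv @ sigma1 @ Linv.T           # whitened sigma1 (symmetric)
    lam, U = np.linalg.eigh(M)           # simultaneous diagonalization
    h = lam**0.25 / np.sqrt((1.0 + lam) / 2.0)  # per-coordinate factor, <= 1
    keep = np.argsort(h)[:n]             # smallest factors -> smallest H
    W = (Linv.T @ U)[:, keep].T          # rows map x to the chosen observables
    return W, float(np.prod(h[keep]))

# Example: one coordinate (lam = 1) carries no discrimination
# information, so with n = 2 it is the one left out.
sigma0 = np.eye(3)
sigma1 = np.diag([4.0, 1.0, 0.25])
W, H = best_linear_observables(sigma0, sigma1, 2)
```

Using all d coordinates recovers the closed-form value H = det(Σ0)^{1/4} det(Σ1)^{1/4} / det((Σ0 + Σ1)/2)^{1/2}, which is a useful sanity check on the factorization.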

Published in:

IEEE Transactions on Information Theory (Volume: 13, Issue: 2)