
Asymptotic analysis of error probabilities for the nonzero-mean Gaussian hypothesis testing problem


Author: R. K. Bahr (Dept. of Electr. & Comput. Eng., Arizona Univ., Tucson, AZ, USA)

Using a large-deviation theory approach, the rate at which the probability of detection error vanishes with increasing sample size is studied for the testing of nonzero-mean Gaussian stochastic processes. After a suitable transformation, the likelihood ratio test statistic is expressed as a sum of independent Gaussian random variables. The precise asymptotic rate at which the tail probability of this sum vanishes is derived using Ellis' theorem in conjunction with the asymptotic analysis of Toeplitz matrices. As a specific example, a signal composed of a deterministic mean component, a zero-mean stochastic component, and a white-noise background is tested against white noise alone. The results confirm intuition: for fixed stochastic signal power, the rate of error decrease grows as the power in the deterministic mean increases; at higher signal-to-noise ratios, the probability of error must vanish more quickly. For a fixed deterministic mean component, there is a curious dip in the rate of error decrease as the stochastic signal power first increases; however, as this power is increased further, the rate of error decrease eventually rises.
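The paper derives its exponents via Ellis' theorem for correlated processes; as a much simpler illustration of the same large-deviation idea, the sketch below computes the classical Chernoff error exponent for an i.i.d. version of the paper's example (deterministic mean plus stochastic signal power versus white noise alone). This i.i.d. simplification, the function name, and its parameterization are illustrative assumptions, not the paper's actual derivation:

```python
import numpy as np

def chernoff_exponent(mu, var_s, var0=1.0, grid=1000):
    """Chernoff error exponent for an i.i.d. Gaussian test (illustrative sketch).

    H0: N(0, var0)            -- white noise alone
    H1: N(mu, var0 + var_s)   -- deterministic mean + stochastic power + noise
    """
    m0, v0 = 0.0, var0
    m1, v1 = mu, var0 + var_s
    # Search the Chernoff parameter s over (0, 1).
    s = np.linspace(1e-4, 1 - 1e-4, grid)
    vs = (1 - s) * v0 + s * v1  # variance interpolated between hypotheses
    # -log of the Chernoff coefficient  integral p0^(1-s) p1^s dx,
    # which has a closed form for a pair of Gaussian densities.
    exponent = (s * (1 - s) * (m1 - m0) ** 2 / (2 * vs)
                + 0.5 * np.log(vs / (v0 ** (1 - s) * v1 ** s)))
    return float(exponent.max())

# Equal variances (var_s = 0) recover the textbook value mu^2 / (8 * var0).
print(chernoff_exponent(1.0, 0.0))  # approximately 0.125
```

Sweeping `var_s` at fixed `mu` with this toy model is one way to probe numerically whether a dip in the exponent appears, though the paper's result concerns correlated processes and need not match the i.i.d. case.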

Published in:

IEEE Transactions on Information Theory (Volume: 36, Issue: 3)