On the performance degradation from one-bit quantized detection

P. Willett (Univ. of Connecticut, Storrs, CT, USA); P. F. Swaszek

It is common signal detection practice to base tests on quantized data, and frequently, as in decentralized detection, this quantization is extreme: to a single bit. Certain cases of the accompanying performance degradation, such as that of an additive signal model and an efficacy measure, are well understood; more general cases, however, have received little treatment. In this correspondence we explore the possible performance loss from two perspectives. We first examine the Chernoff exponent and discover a nontrivial lower bound on the relative efficiency of an optimized one-bit quantized detector as compared to an unquantized one. We then examine the case of finite sample size and discover a family of nontrivial bounds: upper bounds on the probability of detection for an unquantized system given a specified quantized performance, with both systems operating at the same false-alarm rate.
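The kind of comparison the abstract describes can be illustrated numerically. The sketch below (not from the paper; the choice of a Gaussian mean-shift model, the midpoint threshold, and the grid search are illustrative assumptions) computes the Chernoff information for distinguishing N(0,1) from N(theta,1) using the raw observation, and again after one-bit sign quantization, then reports their ratio. For a small shift this ratio approaches the classical 2/pi relative-efficiency figure for sign quantization of Gaussian data.

```python
# Illustrative sketch, not the paper's method: compare the Chernoff
# exponent of an unquantized Gaussian-shift test with that of a
# one-bit (sign) quantized test at a midpoint threshold.
import math

def phi_cdf(x):
    """Standard normal CDF computed via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def chernoff_bernoulli(q0, q1, grid=2000):
    """Chernoff information between Bernoulli(q0) and Bernoulli(q1),
    maximized by a grid search over the tilting parameter s in (0, 1)."""
    best = 0.0
    for i in range(1, grid):
        s = i / grid
        moment = q0**s * q1**(1 - s) + (1 - q0)**s * (1 - q1)**(1 - s)
        best = max(best, -math.log(moment))
    return best

theta = 0.1                      # small mean shift: H0 = N(0,1), H1 = N(theta,1)
c_unquant = theta**2 / 8.0       # Chernoff information for the Gaussian pair
t = theta / 2.0                  # midpoint threshold for the one-bit quantizer
q0 = 1.0 - phi_cdf(t)            # P(X > t) under H0
q1 = 1.0 - phi_cdf(t - theta)    # P(X > t) under H1
c_quant = chernoff_bernoulli(q0, q1)

ratio = c_quant / c_unquant
print(f"relative efficiency ~ {ratio:.3f} (2/pi ~ {2 / math.pi:.3f})")
```

For theta = 0.1 the ratio comes out near 0.64, i.e. close to 2/pi; the paper's contribution is a lower bound on this kind of relative efficiency that holds beyond such special cases.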

Published in: IEEE Transactions on Information Theory (Volume: 41, Issue: 6)