It is well known that quantization cannot increase the Kullback-Leibler divergence, which can be thought of as the expected value, or first moment, of the log-likelihood ratio. In this paper, we investigate the effects of quantization on the second moment of the log-likelihood ratio. It is shown via the convex domination technique that quantization may increase the second moment, but the increase is bounded above by 2/e. The result is then applied to decentralized sequential detection problems, not only to provide simpler sufficient conditions for asymptotic optimality theories in the simplest models, but also to shed new light on more complicated models. In addition, some brief remarks on other higher-order moments of the log-likelihood ratio are provided.
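As a minimal numerical sketch of the first claim, the following illustrates the data-processing inequality for KL divergence under quantization on hypothetical discrete distributions (the distributions and the two-bin quantizer are assumptions chosen for illustration, not taken from the paper); it also computes the second moment of the log-likelihood ratio before and after quantization, whose direction of change is not fixed in general:

```python
import math

# Hypothetical discrete distributions on a 4-symbol alphabet (illustrative only).
P = [0.40, 0.30, 0.20, 0.10]
Q = [0.10, 0.20, 0.30, 0.40]

def kl(p, q):
    """Kullback-Leibler divergence D(p || q): the first moment of log(p/q) under p."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

def second_moment(p, q):
    """Second moment of the log-likelihood ratio under p: E_p[(log p/q)^2]."""
    return sum(pi * math.log(pi / qi) ** 2 for pi, qi in zip(p, q))

def quantize(p):
    """An illustrative quantizer merging symbols {0,1} and {2,3} into two bins."""
    return [p[0] + p[1], p[2] + p[3]]

Pq, Qq = quantize(P), quantize(Q)

# Data-processing inequality: quantization cannot increase KL divergence.
print(kl(P, Q), kl(Pq, Qq))
# The second moment, by contrast, carries no such guarantee in general.
print(second_moment(P, Q), second_moment(Pq, Qq))
```

Running this shows the quantized KL divergence falling below the original, as the data-processing inequality requires; the paper's contribution concerns how far the second moment can move in the opposite direction.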