Page's test for the quick detection of a change in distribution is optimized by using the log-likelihood ratio (LLR) as the detector nonlinearity. For signal detection applications, locally optimal nonlinearities are optimal as the signal strength γ goes to zero; however, for nonzero values of γ, the performance of Page's test may be improved by applying a subtractive bias. The bias that maximizes an asymptotic performance measure (i.e., as the average time between false alarms goes to infinity) for Page's test at a fixed signal strength is derived for a general detector nonlinearity. In addition, the bias that minimizes the signal strength required to achieve a desired asymptotic performance is derived. These two methods for choosing the bias are shown to be equivalent. The asymptotically optimal bias for the LLR nonlinearity is shown to be zero, consistent with the optimality of the LLR. To a first-order approximation, the proposed asymptotically optimal bias is shown to be equivalent to an extension of the bias derived by Dyson (1986), which approximately maximizes the relative efficiency between Page's test with the biased locally optimal nonlinearity and Page's test with the LLR nonlinearity. The use of the asymptotically optimal bias is illustrated through an example.
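For readers unfamiliar with the structure being biased, the following minimal sketch shows the standard Page (CUSUM) recursion with a subtractive bias applied to a general detector nonlinearity, as described above. The function name, parameters, and the simple mean-shift usage are hypothetical illustrations, not the paper's notation: the statistic accumulates the biased nonlinearity output, is clamped at zero, and an alarm is declared when it crosses a threshold.

```python
def page_test(samples, nonlinearity, bias, threshold):
    """Page's (CUSUM) test with a subtractive bias.

    At each step the statistic is updated as
        s_k = max(0, s_{k-1} + g(x_k) - bias)
    and an alarm is raised when s_k >= threshold.
    Returns the 1-based index of the first alarm, or None.
    """
    s = 0.0
    for k, x in enumerate(samples, start=1):
        s = max(0.0, s + nonlinearity(x) - bias)
        if s >= threshold:
            return k
    return None


# Illustrative use: identity nonlinearity with bias 0.5 and threshold 2.0.
# Under a persistent positive shift the statistic drifts up and alarms;
# before the change (negative drift) it stays clamped near zero.
alarm_time = page_test([1.0] * 10, lambda x: x, bias=0.5, threshold=2.0)
no_alarm = page_test([-1.0] * 10, lambda x: x, bias=0.5, threshold=2.0)
```

With the LLR as the nonlinearity, the paper's result corresponds to setting `bias=0` in this recursion; the derived bias matters when a non-LLR (e.g., locally optimal) nonlinearity is used at nonzero signal strength.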