In this paper, a new class of lower bounds on the mean square error (MSE) of unbiased estimators of deterministic parameters is proposed. The proposed class is derived by projecting each entry of the estimation-error vector onto a Hilbert subspace of L2. This Hilbert subspace contains linear transformations of elements in the domain of an integral transform of the likelihood-ratio function. The integral transform generalizes the traditional derivative and sampling operators, which are applied to the likelihood-ratio function to compute performance lower bounds such as the Cramér-Rao, Bhattacharyya, and McAulay-Seidman bounds. It is shown that several well-known lower bounds on the MSE of unbiased estimators can be derived from this class by modifying the kernel of the integral transform. A new lower bound is derived from the proposed class using the kernel of the Fourier transform. In comparison with other existing bounds, the proposed bound is computationally manageable and provides a better prediction of the threshold region of the maximum-likelihood estimator in the problem of single-tone estimation.
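The abstract's exact integral-transform construction appears in the paper body, not here. As a hedged illustration of the kind of bound the proposed class contains as a special case (via the derivative kernel), the sketch below evaluates the well-known asymptotic Cramér-Rao bound on frequency for a single complex tone in white Gaussian noise (the Rife-Boorstyn expression). This is standard background material, not the paper's new Fourier-kernel bound; the point of the paper is precisely that the CRB fails to predict the threshold region that the new bound captures.

```python
import numpy as np

def crb_freq(snr_linear, N):
    """Asymptotic CRB on the frequency (in cycles/sample) of a single
    complex exponential in white Gaussian noise, observed over N samples.

    This is the classical Rife-Boorstyn result, shown only to illustrate
    one member of the family of MSE lower bounds that the paper's
    integral-transform class generalizes; it is NOT the paper's new bound.
    """
    return 6.0 / ((2.0 * np.pi) ** 2 * snr_linear * N * (N ** 2 - 1))

# The CRB tightens rapidly with record length (roughly as 1/N^3) and
# linearly with SNR -- but, being a small-error bound, it says nothing
# about the threshold SNR below which the ML estimator breaks down.
snr_db = 10.0
bound = crb_freq(10.0 ** (snr_db / 10.0), 64)
```

In this notation, `snr_linear` is the per-sample SNR |A|^2/sigma^2 and N is the number of samples; both names are illustrative choices, not symbols from the paper.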