Linear filtering has been extensively studied under the assumption that the noise is Gaussian. The widely used least-mean-square-error (LMSE) solution is optimal when the noise is Gaussian; however, in many practical applications the noise is modeled more accurately by a non-Gaussian distribution. In this paper, we consider the linear filtering problem where the noise follows a generalized Gaussian distribution (GGD) and solve it using maximum likelihood (ML) estimation. To estimate the likelihood, we consider two approaches. The first uses the GGD form explicitly, which leads to an adaptive filtering algorithm identical to the least mean p-norm (LMP) algorithm. The second uses a semi-parametric model in which the entropy of the noise is estimated using entropy bound minimization (EBM), a flexible approach to density estimation. We derive the Cramér-Rao lower bound for the ML-LMP estimators using a second-order Taylor series expansion of the likelihood function. Simulation results show that when the noise comes from a GGD, ML-LMP achieves the best performance, while EBM provides very competitive performance and offers the advantage of not assuming a parametric model for the noise.
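As a rough illustration of the LMP algorithm mentioned above, the sketch below implements the standard stochastic-gradient update that minimizes the p-th power of the error, w ← w + μ·p·|e|^(p−1)·sign(e)·u; the function name, step size, and tap count are illustrative choices, not values from the paper.

```python
import numpy as np

def lmp_filter(x, d, p=1.5, mu=0.05, n_taps=4):
    """Least mean p-norm (LMP) adaptive filter sketch.

    Minimizes E[|e|^p] by stochastic gradient descent:
        w <- w + mu * p * |e|^(p-1) * sign(e) * u
    x: input signal, d: desired (reference) signal.
    Returns the final weights and the filter output.
    """
    w = np.zeros(n_taps)
    y = np.zeros(len(x))
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]   # tap-input vector [x[n], ..., x[n-n_taps+1]]
        y[n] = w @ u                        # filter output
        e = d[n] - y[n]                     # instantaneous error
        # LMP gradient step; reduces to LMS when p = 2
        w += mu * p * np.abs(e) ** (p - 1) * np.sign(e) * u
    return w, y
```

For p = 2 this is exactly the LMS algorithm; choosing 1 < p < 2 down-weights large errors, which is why LMP is better suited to heavy-tailed GGD noise.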