The least absolute deviation (LAD) optimization model, also called the unconstrained minimum L1-norm optimization model, has found extensive application in linear parameter estimation. The L1-norm model is superior to Lp-norm (p > 1) models in non-Gaussian noise environments or even chaotic environments, especially for signals that contain sharp transitions (such as biomedical signals with spiky series or motion artifacts) or chaotic dynamic processes. However, it is more difficult to implement than the least-squares (L2-norm) model because its derivative is discontinuous. In this paper, a neural implementation of the LAD optimization model is presented: a new neural network is constructed and its performance on LAD optimization is evaluated both theoretically and experimentally. The proposed LAD neural network (LADNN) is then applied to time delay estimation (TDE). In TDE, a given signal is modeled using the moving average (MA) model; the MA parameters are estimated with the LADNN, and the time delay corresponds to the time index at which the MA coefficients peak. Compared with higher-order spectra (HOS)-based TDE methods, the LADNN-based method is free of the assumption that the signal is non-Gaussian and the noise is Gaussian, an assumption that rarely holds in real situations. Experiments under three different noise environments, Gaussian, non-Gaussian, and chaotic, are conducted to compare the proposed TDE method with the existing HOS-based method.
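To make the MA-based TDE idea concrete, the sketch below estimates a delay by fitting MA coefficients under an L1 (LAD) criterion and picking the lag with the largest coefficient magnitude. This is a minimal NumPy/SciPy illustration, not the paper's neural network: the LAD fit is solved here as a linear program (a standard reformulation), and all names, the filter order, and the noise model are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def lad_fit(X, y):
    """Least-absolute-deviation regression via a linear program.

    Reformulation: minimize sum(e+ + e-)
    subject to  X b + e+ - e- = y,  e+ >= 0,  e- >= 0,
    where the residual is split into positive and negative parts.
    """
    n, p = X.shape
    c = np.concatenate([np.zeros(p), np.ones(n), np.ones(n)])
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:p]

# --- TDE demo: y(t) = x(t - d) + heavy-tailed noise (parameters assumed) ---
rng = np.random.default_rng(0)
N, d_true, order = 400, 7, 15
x = rng.standard_normal(N)
y = np.zeros(N)
y[d_true:] = x[:N - d_true]                     # delayed copy of x
y += 0.1 * rng.standard_t(df=3, size=N)         # non-Gaussian (Student-t) noise

# Design matrix of lagged inputs: column k holds x(t - k)  (MA model)
X = np.zeros((N, order))
for k in range(order):
    X[k:, k] = x[:N - k]

b = lad_fit(X, y)                 # estimated MA coefficients
d_hat = int(np.argmax(np.abs(b))) # delay = lag of the peak coefficient
print(d_hat)
```

With the fixed seed above the peak coefficient lands at lag 7, recovering the true delay despite the heavy-tailed noise; an L2 fit would also work here, but the LAD criterion degrades far less as the noise tails grow heavier.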