In this paper, we propose a simple and robust algorithm for compressive sensing (CS) signal reconstruction based on the weighted median (WM) operator. The proposed approach addresses the reconstruction problem by solving an l0-regularized least absolute deviation (l0-LAD) regression problem with a tunable regularization parameter, making it suitable for applications where the underlying contamination follows a statistical model with heavier-than-Gaussian tails. The solution to this regularized LAD regression problem is efficiently computed, under a coordinate descent framework, by an iterative algorithm that comprises two stages. In the first stage, an estimate of the sparse signal is found by recasting the reconstruction problem as a location parameter estimation for each entry in the sparse vector, leading to the minimization of a sum of weighted absolute deviations. The solution to this one-dimensional minimization problem turns out to be the WM operator acting on a shifted-and-scaled version of the measurement samples, with weights taken from the entries in the measurement matrix. The resultant estimate is then passed to a second stage that identifies whether the corresponding entry is relevant or not. This stage is implemented by a hard threshold operator whose thresholding parameter is suitably tuned as the algorithm progresses. This two-stage operation, a WM operator followed by a hard threshold operator, adds the desired robustness to the estimation of the sparse signal and, at the same time, ensures the sparsity of the solution. Extensive simulations demonstrate the reconstruction capability of the proposed approach under different noise models. We compare the performance of the proposed approach to those yielded by state-of-the-art CS reconstruction algorithms, showing that our approach achieves better performance for different noise distributions. In particular, as the distribution tails become heavier, the performance gain achieved by the proposed approach increases significantly.
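The two-stage coordinate update described above can be illustrated with a minimal sketch. This is an assumption-laden illustration, not the authors' implementation: the function names (`weighted_median`, `wm_cs_step`), the fixed threshold `tau`, and the single-coordinate update are all hypothetical simplifications. It shows how the WM operator solves the one-dimensional minimization of a sum of weighted absolute deviations over the shifted-and-scaled samples, with the result passed through a hard threshold.

```python
import numpy as np

def weighted_median(values, weights):
    """WM operator: argmin over x of sum_i weights[i] * |values[i] - x|.

    The minimizer is the value at which the cumulative sorted weight
    first reaches half of the total weight.
    """
    order = np.argsort(values)
    v, w = values[order], weights[order]
    cum = np.cumsum(w)
    idx = np.searchsorted(cum, 0.5 * cum[-1])
    return v[idx]

def wm_cs_step(y, A, x, j, tau):
    """One coordinate-descent update for entry j (illustrative sketch).

    y   : measurement vector
    A   : measurement matrix
    x   : current sparse estimate
    tau : hard-threshold parameter (here fixed; the paper tunes it
          adaptively as iterations progress)
    """
    # Residual with coordinate j's contribution removed
    r = y - A @ x + A[:, j] * x[j]
    mask = A[:, j] != 0
    # Stage 1: WM of shifted-and-scaled samples, weights |A_ij|
    est = weighted_median(r[mask] / A[mask, j], np.abs(A[mask, j]))
    # Stage 2: hard threshold keeps only relevant entries
    return est if abs(est) > tau else 0.0
```

Cycling `wm_cs_step` over all coordinates until convergence would yield a sparse, outlier-robust estimate: the WM stage resists heavy-tailed noise in the residual, while the threshold stage enforces sparsity.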