
Analysis of Sparse Regularization Based Robust Regression Approaches

Authors: K. Mitra (Dept. of Electrical & Computer Engineering, Rice University, Houston, TX, USA), A. Veeraraghavan, and R. Chellappa

Regression in the presence of outliers is an inherently combinatorial problem. However, compressive sensing theory suggests that certain combinatorial optimization problems can be solved exactly by polynomial-time algorithms. Motivated by this connection, several research groups have proposed polynomial-time algorithms for robust regression. In this paper, we specifically address the traditional robust regression problem, in which the number of observations exceeds the number of unknown regression parameters and the structure of the regressor matrix is determined by the training dataset (and hence may not satisfy properties such as the Restricted Isometry Property or incoherence). We derive the precise conditions under which the sparse regularization (l0- and l1-norm) approaches solve the robust regression problem. We show that the smallest principal angle between the regressor subspace and all k-dimensional outlier subspaces is the fundamental quantity that determines the performance of these algorithms, and in terms of this angle we provide an estimate of the number of outliers the sparse regularization based approaches can handle. We then empirically evaluate the sparse (l1-norm) regularization approach against other traditional robust regression algorithms to identify accurate and efficient algorithms for high-dimensional regression problems.
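The sparse-regularization view sketched in the abstract models the k outliers as a sparse error vector e in y = Xb + e, and relaxes the combinatorial l0 search over outlier supports to an l1 penalty on e. The block below is an illustrative sketch of this formulation under assumptions not stated in the paper: it uses a simple alternating (block-coordinate-descent) solver for the penalized objective 0.5*||y - Xb - e||^2 + lam*||e||_1, and the penalty lam, problem sizes, and function names are all arbitrary choices, not the authors' algorithm.

```python
# Illustrative sketch: robust regression via l1-regularized sparse outlier
# estimation. NOT the paper's exact method; lam and sizes are assumptions.
import numpy as np

def soft_threshold(z, t):
    """Elementwise soft-thresholding, the proximal operator of the l1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def robust_regression_l1(X, y, lam=1.0, iters=200):
    """Minimize 0.5*||y - X b - e||^2 + lam*||e||_1 by alternating over b and e."""
    e = np.zeros(len(y))
    for _ in range(iters):
        # Least-squares fit after subtracting the current outlier estimate.
        b, *_ = np.linalg.lstsq(X, y - e, rcond=None)
        # Sparse outlier update: soft-threshold the residual.
        e = soft_threshold(y - X @ b, lam)
    return b, e

# Synthetic check: 200 observations, 5 parameters, 20 gross outliers.
rng = np.random.default_rng(0)
n, p, k = 200, 5, 20
X = rng.standard_normal((n, p))
beta_true = np.arange(1.0, p + 1)
y = X @ beta_true + 0.01 * rng.standard_normal(n)
y[rng.choice(n, k, replace=False)] += 10.0 * rng.standard_normal(k)

beta_hat, e_hat = robust_regression_l1(X, y)
```

Because gross outliers produce residuals far above the threshold lam while inlier noise stays below it, the recovered e_hat is supported (approximately) on the outlier indices and beta_hat is close to the ordinary least-squares fit on the clean observations. As the abstract notes, how many outliers such an approach can tolerate is governed by the principal angle between the regressor subspace and the outlier subspaces.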

Published in: IEEE Transactions on Signal Processing, vol. 61, no. 5