
Verifiable and computable ℓ∞ performance evaluation of ℓ1 sparse signal recovery

Authors: Gongguo Tang and Arye Nehorai, Dept. of Electrical & Systems Engineering, Washington University in St. Louis, St. Louis, MO, USA

In this paper, we develop a verifiable and computable performance analysis of the ℓ∞ norms of the recovery errors for ℓ1 minimization algorithms. We define a family of goodness measures for arbitrary sensing matrices as a set of optimization problems, and design algorithms with a theoretical global convergence guarantee to compute these goodness measures. The proposed algorithms solve a series of second-order cone programs or linear programs. As a by-product, we implement an efficient algorithm to verify a sufficient condition for exact ℓ1 recovery in the noise-free case. This implementation performs orders of magnitude faster than state-of-the-art techniques. We derive performance bounds on the ℓ∞ norms of the recovery errors in terms of these goodness measures, and tightly connect other performance criteria (e.g., the ℓ2 norm, the ℓ1 norm, and support recovery) to the ℓ∞ norm. We also analytically demonstrate that the developed goodness measures are non-degenerate for a large class of random sensing matrices, as long as the number of measurements is relatively large. Numerical experiments show that, compared with restricted-isometry-based performance bounds, our error bounds apply to a wider range of problems and are tighter when the sparsity levels of the signals are relatively low.
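To make the setting concrete, the sketch below is a minimal illustration (not the authors' verification algorithm) of the ℓ1 recovery problem the abstract analyzes: basis pursuit, min ‖x‖₁ subject to Ax = y, cast as a linear program, with the ℓ∞ recovery error then measured on a random Gaussian sensing matrix. All sizes and the solver choice are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

# Illustrative sizes: m measurements, ambient dimension n, sparsity k.
m, n, k = 40, 100, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)  # random Gaussian sensing matrix

# A k-sparse ground-truth signal and its noise-free measurements.
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.standard_normal(k)
y = A @ x_true

# Basis pursuit as an LP: write x = u - v with u, v >= 0, so that
# ||x||_1 = sum(u) + sum(v) and Ax = y becomes A u - A v = y.
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * n), method="highs")
x_hat = res.x[:n] - res.x[n:]

# The l_inf recovery error -- the quantity the paper's goodness measures bound.
err_inf = np.max(np.abs(x_hat - x_true))
print(f"l_inf recovery error: {err_inf:.2e}")
```

With these sizes the sparsity level is low relative to the number of measurements, so exact recovery is expected and the ℓ∞ error is near machine precision; the paper's contribution is computable a priori bounds on this error from the sensing matrix alone.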

Published in:

2011 45th Annual Conference on Information Sciences and Systems (CISS)

Date of Conference:

23-25 March 2011