Abstract:
Many applications in signal processing lead to the optimization problems min ‖x‖₁ subject to y = Ax, and min ‖x‖₁ subject to ‖y − Ax‖ ≤ ε, where A is a given d × n matrix, d < n, and y is a given d × 1 vector. In this work we consider ℓ₁ minimization using LARS, Lasso, and homotopy methods (Efron et al., Tibshirani, Osborne et al.). While these methods were first proposed for statistical model selection, we show that under certain conditions they find the sparsest solution rapidly, whereas conventional general-purpose optimizers are prohibitively slow. We define a phase transition diagram that shows how the algorithms behave on random problems as the ratio of unknowns to equations and the ratio of sparsity to equations vary. We find that whenever the number k of nonzeros in the sparsest solution satisfies k < d/(2 log n), LARS/homotopy obtains the sparsest solution in k steps, each of complexity O(d²).
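The regime the abstract describes can be illustrated numerically: an underdetermined system y = Ax with a k-sparse solution, k well below d/(2 log n). The sketch below is not the paper's LARS/homotopy implementation; as an assumption for illustration it substitutes greedy Orthogonal Matching Pursuit (OMP), a simpler method that likewise builds the solution one nonzero per step, so it also terminates in k steps when recovery succeeds. The dimensions and random Gaussian dictionary are illustrative choices, not taken from the paper.

```python
import numpy as np

def omp(A, y, k):
    """Greedily recover a k-sparse x with A @ x = y (OMP, a stand-in
    for the LARS/homotopy solvers studied in the paper)."""
    d, n = A.shape
    residual = y.copy()
    support = []
    x = np.zeros(n)
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        correlations = A.T @ residual
        j = int(np.argmax(np.abs(correlations)))
        if j not in support:
            support.append(j)
        # Least-squares fit on the active columns, then update residual.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        x = np.zeros(n)
        x[support] = coef
        residual = y - A @ x
    return x

rng = np.random.default_rng(0)
d, n, k = 60, 200, 3                 # k is well below d / (2 log n) ≈ 5.7
A = rng.standard_normal((d, n)) / np.sqrt(d)   # random Gaussian dictionary
x0 = np.zeros(n)
x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k) + 3.0
y = A @ x0

x_hat = omp(A, y, k)
# In this sparsity regime, exact recovery holds with overwhelming probability.
print(np.allclose(x_hat, x0, atol=1e-8))
```

Each iteration is dominated by the correlation step A.T @ residual and a small least-squares solve, mirroring the abstract's point that each of the k steps is cheap relative to a general-purpose optimizer applied to the full problem.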
Published in: 2006 IEEE International Conference on Acoustics Speech and Signal Processing Proceedings
Date of Conference: 14-19 May 2006
Date Added to IEEE Xplore: 24 July 2006
Print ISBN: 1-4244-0469-X