Group Lasso is a mixed l1/l2-regularization method for block-wise sparse models that has attracted considerable interest in statistics, machine learning, and data mining. This paper establishes that original signals can be stably recovered from noisy data using the adaptive group Lasso, provided the signals are sufficiently block-sparse and the overcomplete dictionary has a favorable block structure. The corresponding theoretical results on solution uniqueness, support recovery, and the representation error bound are derived from the properties of block-coherence and subcoherence. Compared with the theoretical results for the parametrized quadratic program of conventional sparse representation, our stability results are more general. A comparison with the block-based orthogonal greedy algorithm is also presented. Numerical experiments confirm the theoretical derivation and show that, in noisy settings, the adaptive group Lasso achieves better reconstruction than the quadratic-program approach when the observed sparse signals have a natural block structure.
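To make the block-wise sparsity induced by the mixed l1/l2 penalty concrete, the sketch below implements the block soft-thresholding operator, i.e. the proximal operator of the penalty lam * sum_g ||x_g||_2 that is the core step of standard proximal-gradient solvers for group Lasso. This is a minimal illustration under our own assumptions (the function name and the group encoding as index lists are ours, not from the paper); it shrinks each block as a unit, zeroing blocks whose l2 norm is below the threshold.

```python
import math

def block_soft_threshold(x, groups, lam):
    """Proximal operator of the group Lasso penalty lam * sum_g ||x_g||_2.

    Each block is scaled by max(0, 1 - lam / ||x_g||_2); blocks whose
    l2 norm falls below lam are set exactly to zero, which is what
    produces block-wise (rather than entry-wise) sparsity.
    """
    out = list(x)
    for g in groups:  # g is a list of coordinate indices forming one block
        norm = math.sqrt(sum(x[i] ** 2 for i in g))
        scale = max(0.0, 1.0 - lam / norm) if norm > 0 else 0.0
        for i in g:
            out[i] = scale * x[i]
    return out

# Two blocks: the weak block is zeroed entirely, the strong one shrunk.
x = [0.3, -0.2, 4.0, 3.0]
groups = [[0, 1], [2, 3]]
print(block_soft_threshold(x, groups, lam=1.0))
```

Note the contrast with the entry-wise soft thresholding of the conventional l1 quadratic program: here the whole first block is discarded together, matching the block structure assumed on the signal.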