In this paper, we propose an overcomplete, nonnegative dictionary learning method for the sparse representation of signals, based on nonnegative matrix factorization (NMF) with the ℓ1/2-norm as the sparsity constraint. By introducing the ℓ1/2-norm sparsity constraint into NMF, we show that the problem can be cast as a sequence of optimization problems over quadratic and quartic functions. Each quadratic subproblem is easy to solve since it has a unique closed-form solution. Each quartic subproblem can in turn be reduced to solving a cubic equation, which is handled efficiently by the Cardano formula together with a rule for selecting one of its roots. To implement this nonnegative dictionary learning, we develop an algorithm that employs a coordinate-wise descent strategy, i.e., coordinate-wise descent based nonnegative dictionary learning (CDNDL). Numerical experiments show that the proposed algorithm outperforms nonnegative K-SVD (NN-KSVD) and two other compared algorithms.
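To make the quartic subproblem concrete, the sketch below shows how a one-dimensional quartic of the form f(t) = a·t⁴ + b·t² + c·t (the shape that arises when an ℓ1/2 term is reparameterized via t = √x) can be minimized over t ≥ 0 by solving its stationarity condition, a depressed cubic, with the Cardano/trigonometric root formulas and then selecting a root by comparing objective values. This is only an illustrative sketch under stated assumptions, not the authors' CDNDL implementation; the function name `quartic_min` and its parameters are hypothetical.

```python
import math

def quartic_min(a, b, c):
    """Minimize f(t) = a*t**4 + b*t**2 + c*t over t >= 0, with a > 0.

    Stationary points satisfy f'(t) = 4a*t**3 + 2b*t + c = 0, i.e. the
    depressed cubic  t**3 + p*t + q = 0  with  p = b/(2a),  q = c/(4a).
    Real roots come from the Cardano formula (one real root when the
    discriminant is nonnegative) or its trigonometric form (three real
    roots otherwise).  The selection rule illustrated here: evaluate f
    at every real nonnegative root and at t = 0, keep the minimizer.
    (Illustrative sketch only, not the paper's exact rule.)
    """
    p, q = b / (2.0 * a), c / (4.0 * a)
    disc = (q / 2.0) ** 2 + (p / 3.0) ** 3
    roots = []
    if disc >= 0.0:
        # Cardano: single real root u + v (signed real cube roots).
        s = math.sqrt(disc)
        u = math.copysign(abs(-q / 2.0 + s) ** (1.0 / 3.0), -q / 2.0 + s)
        v = math.copysign(abs(-q / 2.0 - s) ** (1.0 / 3.0), -q / 2.0 - s)
        roots.append(u + v)
    else:
        # Casus irreducibilis (requires p < 0): three real roots,
        # obtained via the trigonometric form of Cardano's solution.
        r = 2.0 * math.sqrt(-p / 3.0)
        phi = math.acos(3.0 * q / (2.0 * p) * math.sqrt(-3.0 / p))
        roots.extend(r * math.cos(phi / 3.0 - 2.0 * math.pi * k / 3.0)
                     for k in range(3))

    f = lambda t: a * t ** 4 + b * t ** 2 + c * t
    candidates = [0.0] + [t for t in roots if t > 0.0]
    return min(candidates, key=f)
```

For example, with f(t) = t⁴ − 2t² the stationary points are t ∈ {−1, 0, 1} and the routine returns t = 1, where f attains its constrained minimum −1.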