Dictionary learning by nonnegative matrix factorization with ℓ1/2-norm sparsity constraint

3 Author(s)
Zhenni Li; Zunyi Tang; Shuxue Ding (School of Computer Science & Engineering, University of Aizu, Aizu-Wakamatsu, Japan)

In this paper, we propose an overcomplete, nonnegative dictionary learning method for sparse representation of signals, based on nonnegative matrix factorization (NMF) with the ℓ1/2-norm as the sparsity constraint. By introducing the ℓ1/2-norm as the sparsity constraint into NMF, we show that the problem can be cast as sequential optimization problems over quadratic and quartic functions. Each quadratic subproblem is easy to solve since it has a unique closed-form solution. Each quartic subproblem can be reformulated as solving a cubic equation, which is efficiently handled by the Cardano formula together with a rule for selecting one of the solutions. To implement this nonnegative dictionary learning, we develop an algorithm based on a coordinate-wise descent strategy, i.e., coordinate-wise descent based nonnegative dictionary learning (CDNDL). Numerical experiments show that the proposed algorithm performs better than nonnegative K-SVD (NN-KSVD) and the two other compared algorithms.
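The abstract states that each quartic subproblem reduces to a cubic equation solved by the Cardano formula. The following is a minimal sketch of Cardano's formula for a depressed cubic t³ + p·t + q = 0 (the function name `cardano_real_roots` and the tolerance handling are illustrative assumptions, not the authors' implementation; the paper's selection rule for choosing among the real roots is not reproduced here).

```python
import cmath
import math

def cardano_real_roots(p, q, tol=1e-9):
    """Real roots of the depressed cubic t^3 + p*t + q = 0 via Cardano's formula.

    Illustrative sketch only; not the CDNDL authors' code.
    """
    if abs(p) < tol and abs(q) < tol:
        return [0.0]                          # degenerate case t^3 = 0
    d = (q / 2) ** 2 + (p / 3) ** 3           # discriminant term under the square root
    u = (-q / 2 + cmath.sqrt(d)) ** (1 / 3)   # principal complex cube root
    if abs(u) < tol:                          # avoid division by zero below
        u = (-q / 2 - cmath.sqrt(d)) ** (1 / 3)
    omega = cmath.exp(2j * math.pi / 3)       # primitive cube root of unity
    roots = []
    for k in range(3):
        w = u * omega ** k                    # the three cube-root branches
        t = w - p / (3 * w)                   # Cardano: t = u + v with v = -p/(3u)
        if abs(t.imag) < tol:                 # keep only (numerically) real roots
            roots.append(t.real)
    return roots

# e.g. t^3 - t = 0 has roots -1, 0, 1; t^3 - 8 = 0 has the single real root 2
```

A sparsity-penalized coordinate update would then pick, among the returned real roots, the one satisfying the paper's selection rule (e.g., nonnegativity and lowest objective value).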

Published in:

2013 IEEE International Conference on Cybernetics (CYBCONF)

Date of Conference:

13-15 June 2013