On l_q Optimization and Matrix Completion

Authors: G. Marjanovic and V. Solo, School of Electrical Engineering & Telecommunications, University of New South Wales, Sydney, NSW, Australia

Rank minimization problems, which consist of finding a matrix of minimum rank subject to linear constraints, have been proposed in many areas of engineering and science. A specific problem is the matrix completion problem, in which a low rank data matrix can be recovered from incomplete samples of its entries by solving a rank penalized least squares problem. The rank penalty is in fact the l_0 "norm" of the matrix singular values. A recent convex relaxation of this penalty is the commonly used l_1 norm of the matrix singular values. In this paper, we bridge the gap between these two penalties and propose the l_q, 0 < q < 1, penalized least squares problem for matrix completion. An iterative algorithm is developed by solving a non-standard optimization problem and a non-trivial convergence result is proved. We illustrate with simulations comparing the reconstruction quality of the three matrix singular value penalty functions: l_0, l_1, and l_q, 0 < q < 1.
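To make the penalized formulation concrete, the sketch below shows a generic iterative scheme for l_q penalized matrix completion: at each step the observed entries are re-imposed and the singular values of the current estimate are shrunk by an (approximate) proximal map of lam*|x|^q. This is an illustrative sketch only, not the algorithm or convergence analysis developed in the paper; the function names, the grid-search proximal step, and the parameter choices are assumptions made for the example.

```python
import numpy as np

def lq_prox(y, lam, q, grid_size=2000):
    """Approximate proximal map of lam*|x|^q at y >= 0.

    Illustrative only: minimizes 0.5*(x - y)^2 + lam*x^q over a dense
    grid on [0, y]. A practical solver would use the scalar optimality
    conditions instead of a grid search.
    """
    if y <= 0.0:
        return 0.0
    xs = np.linspace(0.0, y, grid_size)
    obj = 0.5 * (xs - y) ** 2 + lam * xs ** q
    return xs[np.argmin(obj)]

def lq_matrix_completion(M_obs, mask, lam, q, n_iter=200):
    """Sketch of l_q penalized matrix completion by iterative SVD shrinkage.

    M_obs : matrix holding the observed entries (values elsewhere ignored)
    mask  : boolean array, True where an entry is observed
    """
    X = np.where(mask, M_obs, 0.0)
    for _ in range(n_iter):
        # Enforce the observed entries; let the current estimate fill the rest.
        Z = np.where(mask, M_obs, X)
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        # Shrink each singular value with the l_q proximal map.
        s_shrunk = np.array([lq_prox(si, lam, q) for si in s])
        X = (U * s_shrunk) @ Vt
    return X

if __name__ == "__main__":
    # Toy example: recover a rank-2 matrix from 50% of its entries.
    rng = np.random.default_rng(0)
    M = rng.standard_normal((40, 2)) @ rng.standard_normal((2, 40))
    mask = rng.random(M.shape) < 0.5
    X_hat = lq_matrix_completion(M, mask, lam=0.5, q=0.5)
    print("relative error:", np.linalg.norm(X_hat - M) / np.linalg.norm(M))
```

Setting q = 1 in this sketch reduces the shrinkage step to soft-thresholding of the singular values (the l_1 case), while small q moves the penalty toward the l_0 rank penalty, which is the trade-off the paper studies.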

Published in:

IEEE Transactions on Signal Processing (Volume: 60, Issue: 11)