
Block-Randomized Stochastic Proximal Gradient for Low-Rank Tensor Factorization


Abstract:

This work considers the problem of computing the canonical polyadic decomposition (CPD) of large tensors. Prior works leverage data sparsity to handle this problem, an approach that is not suitable for the dense tensors that often arise in applications such as medical imaging, computer vision, and remote sensing. Stochastic optimization is known for its low memory cost and low per-iteration complexity when handling dense data. However, existing stochastic CPD algorithms are not flexible enough to incorporate the variety of constraints and regularization terms that are of interest in signal and data analytics, and the convergence properties of many such algorithms are unclear. In this work, we propose a stochastic optimization framework for large-scale CPD with constraints and regularization terms. The framework operates in a doubly randomized fashion and can be regarded as a judicious combination of randomized block coordinate descent (BCD) and stochastic proximal gradient (SPG). The algorithm enjoys lightweight updates and a small memory footprint. The framework also offers considerable flexibility: many frequently used regularizers and constraints can be readily handled. The approach is supported by convergence analysis. Numerical results on large-scale dense tensors are presented to showcase the effectiveness of the proposed approach.
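To make the doubly randomized scheme concrete, the following is a minimal NumPy sketch of one plausible instantiation for a third-order tensor with nonnegativity constraints, where the proximal step reduces to projection onto the nonnegative orthant. At each iteration it draws a random mode (the randomized BCD part) and a random minibatch of mode-n fibers on which it forms a stochastic proximal gradient step (the SPG part). The function name brascpd_nonneg, the constant step size alpha, and the batch size are illustrative assumptions based on the abstract, not the authors' implementation.

```python
import numpy as np

def brascpd_nonneg(X, rank, alpha=0.05, batch=64, iters=2000, seed=0):
    """Illustrative doubly randomized stochastic proximal gradient for
    3-way CPD with nonnegativity constraints (a sketch, not the paper's code)."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    # Random nonnegative initialization of the factor matrices.
    A = [rng.random((I, rank)), rng.random((J, rank)), rng.random((K, rank))]
    for _ in range(iters):
        n = rng.integers(3)  # randomized block (mode) choice
        # Sample a minibatch of mode-n fibers by drawing random indices
        # in the other two modes; the matching Khatri-Rao rows are the
        # elementwise products of the corresponding factor rows.
        if n == 0:
            j = rng.integers(J, size=batch); k = rng.integers(K, size=batch)
            H = A[1][j] * A[2][k]          # sampled Khatri-Rao rows, (batch, R)
            Xs = X[:, j, k]                # sampled fibers, (I, batch)
        elif n == 1:
            i = rng.integers(I, size=batch); k = rng.integers(K, size=batch)
            H = A[0][i] * A[2][k]
            Xs = X[i, :, k].T              # (J, batch)
        else:
            i = rng.integers(I, size=batch); j = rng.integers(J, size=batch)
            H = A[0][i] * A[1][j]
            Xs = X[i, j, :].T              # (K, batch)
        # Stochastic gradient of the least-squares loss on the sampled fibers.
        G = (A[n] @ H.T - Xs) @ H / batch
        # Proximal step: for nonnegativity, prox is projection onto R_+.
        A[n] = np.maximum(A[n] - alpha * G, 0.0)
    return A
```

In this sketch the per-iteration cost depends only on the batch size, the block dimension, and the rank, not on the full tensor size once the sampled fibers are gathered, which illustrates the lightweight updates and small memory footprint claimed in the abstract. A constant step size is used here purely for simplicity; the step-size schedule that the paper's convergence analysis requires may differ.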
Published in: IEEE Transactions on Signal Processing (Volume: 68)
Page(s): 2170-2185
Date of Publication: 20 March 2020

