
# IEEE Journal of Selected Topics in Signal Processing

## Issue 6 • Dec. 2018


• ### [Front cover]

Publication Year: 2018, Page(s): C1
| PDF (290 KB)
• ### IEEE Journal of Selected Topics in Signal Processing publication information

Publication Year: 2018, Page(s): C2
| PDF (52 KB)

Publication Year: 2018, Page(s):1125 - 1126
| PDF (255 KB)
• ### Introduction to the Issue on Robust Subspace Learning and Tracking: Theory, Algorithms, and Applications

Publication Year: 2018, Page(s):1127 - 1130
| PDF (433 KB) | HTML
• ### Adaptive L1-Norm Principal-Component Analysis With Online Outlier Rejection

Publication Year: 2018, Page(s):1131 - 1143
Cited by:  Papers (1)
| PDF (2557 KB) | HTML

L1-norm principal-component analysis (L1-PCA) is known to attain sturdy resistance against faulty points (outliers) among the processed data. However, computing the L1-PCA of large datasets, with a high number of measurements and/or dimensions, may be computationally impractical; in such cases, incremental solutions could be preferred. At the same time, in many applications it is desired to track th...
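
For intuition, the rank-1 L1-PCA problem can be solved exactly via the known reduction to a search over sign vectors; a minimal brute-force sketch (illustrative only, not the article's incremental/online algorithm):

```python
import itertools
import numpy as np

def l1_pca_rank1(X):
    """Exact rank-1 L1-PCA: maximize sum_n |w^T x_n| over unit vectors w.

    Uses the known reduction to sign vectors b in {-1, +1}^N: the optimum
    is w = X b* / ||X b*||_2 with b* = argmax_b ||X b||_2.
    Cost is O(2^N), so this is for small toy problems only.
    """
    _, N = X.shape
    best_val, best_b = -np.inf, None
    for signs in itertools.product((-1.0, 1.0), repeat=N):
        b = np.array(signs)
        val = np.linalg.norm(X @ b)      # ||X b||_2
        if val > best_val:
            best_val, best_b = val, b
    w = X @ best_b
    return w / np.linalg.norm(w)

# five points along e1 plus one off-axis point (hypothetical toy data)
X = np.array([[1.0, 2.0, 3.0, 4.0, 5.0, 0.0],
              [0.0, 0.0, 0.0, 0.0, 0.0, 6.0]])
w = l1_pca_rank1(X)
```

The returned direction leans strongly toward the inlier axis e1, illustrating the L1 objective's damped sensitivity to the off-axis point.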

• ### Robust Multinomial Logistic Regression Based on RPCA

Publication Year: 2018, Page(s):1144 - 1154
| PDF (5331 KB) | HTML

Multiclass classification tasks have become ubiquitous in recent years. In this scenario, the class label usually takes more than two possible discrete outcomes. As a simple and successful model, multinomial logistic regression, also known as softmax regression, is widely used in many multiclass classification applications. However, the existing method often experiences significant performance degradatio...
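
As background, the softmax model referenced above is compact enough to sketch; a minimal (non-robust) prediction rule with hypothetical toy weights:

```python
import numpy as np

def softmax(Z):
    """Row-wise softmax with max-subtraction for numerical stability."""
    Z = Z - Z.max(axis=1, keepdims=True)
    E = np.exp(Z)
    return E / E.sum(axis=1, keepdims=True)

def predict_proba(X, W, b):
    """Multinomial logistic regression: P(class k | x) = softmax(x W + b)_k."""
    return softmax(X @ W + b)

# toy 3-class weights that route each input feature to one class
W = np.eye(3)
b = np.zeros(3)
X = np.array([[5.0, 0.0, 0.0],
              [0.0, 0.0, 5.0]])
P = predict_proba(X, W, b)
```

Each row of `P` is a probability vector over the three classes; training would fit `W` and `b` by minimizing cross-entropy.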

• ### Compressed Randomized UTV Decompositions for Low-Rank Matrix Approximations

Publication Year: 2018, Page(s):1155 - 1169
| PDF (2543 KB) | HTML

Low-rank matrix approximations play a fundamental role in numerical linear algebra and signal processing applications. This paper introduces a novel rank-revealing matrix decomposition algorithm termed compressed randomized UTV (CoR-UTV) decomposition along with a CoR-UTV variant aided by the power method technique. CoR-UTV is primarily developed to compute an approximation to a low-rank input mat...
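
Setting CoR-UTV's specifics aside, the randomized-sampling idea such methods build on can be sketched with the standard range-finder recipe plus power iterations (a generic sketch, not the paper's algorithm):

```python
import numpy as np

def randomized_lowrank(A, k, p=5, q=1):
    """Randomized low-rank approximation A ~ Q @ B.

    k: target rank, p: oversampling, q: power iterations
    (power iterations sharpen the captured subspace, analogous to
    power-method-aided variants).
    """
    n = A.shape[1]
    rng = np.random.default_rng(1)
    Omega = rng.standard_normal((n, k + p))  # random test matrix
    Y = A @ Omega                            # sample the range of A
    for _ in range(q):
        Y = A @ (A.T @ Y)                    # power iteration
    Q, _ = np.linalg.qr(Y)                   # orthonormal range basis
    B = Q.T @ A                              # project A onto that range
    return Q, B

# exactly rank-3 test matrix
rng = np.random.default_rng(0)
A = rng.standard_normal((60, 3)) @ rng.standard_normal((3, 40))
Q, B = randomized_lowrank(A, k=3)
err = np.linalg.norm(A - Q @ B) / np.linalg.norm(A)
```

For an exactly rank-3 input, the relative error is at numerical-precision level; a subsequent UTV or SVD factorization of the small matrix `B` would yield a rank-revealing form.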

• ### Low-Rank Matrix Recovery With Simultaneous Presence of Outliers and Sparse Corruption

Publication Year: 2018, Page(s):1170 - 1181
| PDF (1874 KB) | HTML

We study a data model in which the data matrix $\mathbf{D} \in \mathbb{R}^{N_1 \times N_2}$ can be expressed as $\mathbf{D} = \mathbf{L} + \mathbf{S} + \mathbf{C}$, where $\mathbf{L}$ is a l...
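
With the third term $\mathbf{C}$ set to zero, the model reduces to classical robust PCA; a minimal augmented-Lagrangian sketch of that special case (generic principal component pursuit, not this paper's method, which also handles the extra component):

```python
import numpy as np

def soft(X, tau):
    """Entrywise soft-thresholding (prox of the l1 norm)."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def svt(X, tau):
    """Singular value thresholding (prox of the nuclear norm)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def pcp(D, lam=None, iters=100, rho=1.1):
    """Principal component pursuit:
    min ||L||_* + lam ||S||_1  s.t.  D = L + S,
    via an inexact augmented-Lagrangian loop with growing penalty mu."""
    m, n = D.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    mu = 1.25 / np.linalg.norm(D, 2)
    S = np.zeros_like(D)
    Y = np.zeros_like(D)
    for _ in range(iters):
        L = svt(D - S + Y / mu, 1.0 / mu)
        S = soft(D - L + Y / mu, lam / mu)
        Y = Y + mu * (D - L - S)
        mu = mu * rho
    return L, S

# synthetic low-rank plus sparse mixture
rng = np.random.default_rng(0)
L0 = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 30))
S0 = np.where(rng.random((30, 30)) < 0.05, 5.0, 0.0)
L, S = pcp(L0 + S0)
err = np.linalg.norm(L - L0) / np.linalg.norm(L0)
```

On this toy instance the low-rank part is recovered to within a few percent; the point of the paper's three-term model is that a column-sparse outlier component breaks this two-term decomposition.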

• ### Turbo-Type Message Passing Algorithms for Compressed Robust Principal Component Analysis

Publication Year: 2018, Page(s):1182 - 1196
| PDF (1662 KB) | HTML

Compressed robust principal component analysis (RPCA), in which a low-rank matrix L and a sparse matrix S are recovered from an underdetermined amount of noisy linear measurements of their sum L + S, arises in various applications such as face recognition and video foreground/background separation. This problem can be solved by Bayesian inference based iterative algorithms. However, most existing ...

• ### Low-Complexity Adaptive Algorithms for Robust Subspace Tracking

Publication Year: 2018, Page(s):1197 - 1212
| PDF (4951 KB) | HTML

This paper introduces new, low-complexity, adaptive algorithms for robust subspace tracking in certain adverse scenarios of noisy data. First, an adequate weighted least-squares criterion is considered for the design of a robust subspace tracker that is most efficient in the burst noise case. Second, by using data pre-processing and robust statistics estimates, we introduce a second method that is ...

• ### Wasserstein Stationary Subspace Analysis

Publication Year: 2018, Page(s):1213 - 1223
| PDF (1578 KB) | HTML

Learning under nonstationarity can be achieved by decomposing the data into a subspace that is stationary and a nonstationary one [stationary subspace analysis (SSA)]. While SSA has been used in various applications, its robustness and computational efficiency have limits due to the difficulty in optimizing the Kullback-Leibler divergence based objective. In this paper, we contribute by extending ...
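
For intuition about the objective being swapped in, the 2-Wasserstein distance between Gaussians has a closed form; a minimal sketch for the diagonal-covariance case (the general case needs matrix square roots):

```python
import numpy as np

def w2sq_gaussians_diag(m1, v1, m2, v2):
    """Squared 2-Wasserstein distance between N(m1, diag(v1)) and
    N(m2, diag(v2)):
        ||m1 - m2||^2 + sum_i (sqrt(v1_i) - sqrt(v2_i))^2
    """
    m1, v1, m2, v2 = map(np.asarray, (m1, v1, m2, v2))
    return float(((m1 - m2) ** 2).sum()
                 + ((np.sqrt(v1) - np.sqrt(v2)) ** 2).sum())

# distance between two toy 2-D Gaussians
d = w2sq_gaussians_diag([0.0, 0.0], [1.0, 4.0], [3.0, 0.0], [1.0, 1.0])
print(d)  # 9 + (2 - 1)^2 = 10.0
```

Unlike the KL divergence, this distance stays finite and smooth even when the distributions barely overlap, which is part of the computational appeal.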

• ### Subspace Change-Point Detection: A New Model and Solution

Publication Year: 2018, Page(s):1224 - 1239
| PDF (1910 KB) | HTML

Change-point detection has a long history of research in statistical signal processing and remains a fundamental problem in many real-world applications involving information extraction from streaming data. Exploiting low-dimensional structures (subspace in particular) of high-dimensional signals is another research topic of significance, as it improves not only efficiency in computation, storage,...
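
A toy version of the subspace-change idea: monitor each sample's energy outside a reference subspace and flag when it jumps (a hypothetical illustration, not the paper's model or test statistic):

```python
import numpy as np

def residual_energy(x, U):
    """Energy of x outside span(U); U has orthonormal columns."""
    r = x - U @ (U.T @ x)
    return float(r @ r)

d = 10
U = np.eye(d)[:, :2]                      # known 2-D reference subspace
rng = np.random.default_rng(0)
stream = []
for t in range(100):
    x = U @ rng.standard_normal(2) + 0.01 * rng.standard_normal(d)
    if t >= 50:
        x[5] += 1.0                       # subspace shifts at t = 50
    stream.append(x)

res = [residual_energy(x, U) for x in stream]
change = next(t for t, r in enumerate(res) if r > 0.1)
print(change)  # 50
```

Before the change, the residual is at the noise floor; after it, the component off the old subspace makes the statistic jump by orders of magnitude.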

• ### Subspace Estimation From Incomplete Observations: A High-Dimensional Analysis

Publication Year: 2018, Page(s):1240 - 1252
| PDF (751 KB) | HTML | Media

We present a high-dimensional analysis of three popular algorithms, namely, Oja's method, GROUSE, and PETRELS, for subspace estimation from streaming and highly incomplete observations. We show that, with proper time scaling, the time-varying principal angles between the true subspace and its estimates given by the algorithms converge weakly to deterministic processes when the ambient dimension...
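
Of the three algorithms, Oja's method is the simplest to sketch; a minimal streaming estimate of the top principal direction (GROUSE and PETRELS, and the missing-data setting, are not reproduced here):

```python
import numpy as np

def oja(samples, eta=0.01):
    """Oja's rule: streaming estimate of the top principal direction."""
    rng = np.random.default_rng(0)
    w = rng.standard_normal(len(samples[0]))
    w /= np.linalg.norm(w)                # random unit initialization
    for x in samples:
        w += eta * (x @ w) * x            # Hebbian update toward top PC
        w /= np.linalg.norm(w)            # renormalize (Oja's correction)
    return w

# data concentrated along a known direction u (hypothetical toy stream)
d = 10
u = np.zeros(d); u[0] = 1.0
rng = np.random.default_rng(1)
samples = [3.0 * rng.standard_normal() * u + 0.1 * rng.standard_normal(d)
           for _ in range(2000)]
w = oja(samples)
print(abs(w @ u))  # close to 1
```

The cosine of the principal angle between the estimate and the true direction approaches 1 as the stream grows, which is the kind of angle trajectory the paper's analysis characterizes.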

• ### Binary Matrix Factorization via Dictionary Learning

Publication Year: 2018, Page(s):1253 - 1262
| PDF (559 KB) | HTML | Media

Matrix factorization is a key tool in data analysis; its applications include recommender systems, correlation analysis, and signal processing, among others. Binary matrices are a particular case, which has received significant attention for over 30 years, especially within the field of data mining. Dictionary learning refers to a family of methods for learning overcomplete bases (also called frames) ...

• ### Unsupervised Joint Subspace and Dictionary Learning for Enhanced Cross-Domain Person Re-Identification

Publication Year: 2018, Page(s):1263 - 1275
| PDF (2655 KB) | HTML

Person re-identification (Re-ID) has drawn increasing attention from both academia and industry due to its great potential in surveillance applications. Most existing research efforts have attempted to tackle cross-view variation in single-domain person Re-ID. However, there is still a lack of effective approaches to the cross-domain person Re-ID problem. In this paper, an unsupervised joint subspace...

• ### M-Estimation-Based Subspace Learning for Brain Computer Interfaces

Publication Year: 2018, Page(s):1276 - 1285
| PDF (5146 KB) | HTML

Many problems in signal processing, statistical learning, and data science can be posed as the problem of learning a lower dimensional representation of the data. Particularly, we consider the brain computer interface (BCI) application, where electroencephalography (EEG) data are used to determine a user's intent to type letters through stimulation with rapid serial visual presentations (RSVP). Such a ...

• ### Successive Convex Approximation Algorithms for Sparse Signal Estimation With Nonconvex Regularizations

Publication Year: 2018, Page(s):1286 - 1302
| PDF (692 KB) | HTML

In this paper, we propose a successive convex approximation framework for sparse optimization where the nonsmooth regularization function in the objective is nonconvex and can be written as the difference of two convex functions. The proposed framework is based on a nontrivial combination of the majorization-minimization framework and the successive convex approximation framework propo...
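
A classical instance of the majorize-then-solve-convex pattern is reweighted ℓ1 minimization for the nonconvex log-sum penalty; a generic MM sketch for intuition (not the paper's successive convex approximation algorithm):

```python
import numpy as np

def weighted_ista(A, y, w, lam, iters=1000):
    """Proximal gradient for min_x 0.5||Ax - y||^2 + lam * sum_i w_i |x_i|."""
    x = np.zeros(A.shape[1])
    t = 1.0 / np.linalg.norm(A, 2) ** 2        # step 1/L, L = ||A||_2^2
    for _ in range(iters):
        z = x - t * (A.T @ (A @ x - y))        # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - t * lam * w, 0.0)
    return x

def reweighted_l1(A, y, lam=0.05, eps=0.1, outer=5):
    """MM for the nonconvex penalty sum_i log(|x_i| + eps): each outer
    round majorizes it at the current iterate by a weighted l1 norm."""
    w = np.ones(A.shape[1])
    for _ in range(outer):
        x = weighted_ista(A, y, w, lam)        # convex inner problem
        w = 1.0 / (np.abs(x) + eps)            # heavier weight near zero
    return x

# hypothetical 3-sparse recovery problem
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x0 = np.zeros(100)
x0[[3, 30, 70]] = [2.0, -1.5, 1.0]
y = A @ x0
x = reweighted_l1(A, y)
```

Each outer round solves a convex weighted-ℓ1 surrogate of the nonconvex penalty, so the overall objective is nonincreasing; here the true 3-element support is recovered.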

• ### PF-FELM: A Robust PCA Feature Selection for Fuzzy Extreme Learning Machine

Publication Year: 2018, Page(s):1303 - 1312
| PDF (2874 KB) | HTML

Principal component analysis (PCA) is one of the crucial dimensionality reduction (DR) techniques in which the original features are transformed into lower dimensional space. Though the PCA space has orthogonal principal components (PC), it does not provide a real reduction of dimensionality in terms of the original features (variables), as all features including irrelevant and redundant features ...

• ### Moving Object Detection Through Robust Matrix Completion Augmented With Objectness

Publication Year: 2018, Page(s):1313 - 1323
| PDF (5058 KB) | HTML | Media

We present a novel approach for unsupervised detection of moving objects with nonsalient movements (e.g., rodents in their home cage). The proposed approach starts with separating the moving object from its background by modeling the background in a computationally efficient way. The background modeling is based on the assumption that background in natural videos lies on a low-dimensional subspace...

• ### Multi-Attribute Robust Component Analysis for Facial UV Maps

Publication Year: 2018, Page(s):1324 - 1337
| PDF (10755 KB) | HTML | Media

The collection of large-scale three-dimensional (3-D) face models has led to significant progress in the field of 3-D face alignment “in-the-wild,” with several methods being proposed toward establishing sparse or dense 3-D correspondences between a given 2-D facial image and a 3-D face model. Utilizing 3-D face alignment improves 2-D face alignment in many ways, such as alleviating issues with ar...

• ### Enhance Neighbor Reversibility in Subspace Learning for Image Retrieval

Publication Year: 2018, Page(s):1338 - 1350
| PDF (3066 KB) | HTML

Two images that describe similar content usually have the neighbor-reversibility (NR) correlation, i.e., each image is among the neighbors of the other one. This phenomenon can be frequently observed in image retrieval. Some previous works have successfully utilized the NR correlation to improve search accuracy. In these methods, the retrieved images that have the NR correlation with the query ima...
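
The NR correlation itself is easy to compute; a minimal sketch that flags mutually nearest pairs in feature space (hypothetical toy data, not the paper's retrieval pipeline):

```python
import numpy as np

def knn_indices(X, k):
    """Indices of each point's k nearest neighbors (excluding itself)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)          # never pick yourself
    return np.argsort(d2, axis=1)[:, :k]

def mutual_pairs(X, k):
    """Neighbor-reversible pairs (i, j): each is in the other's kNN list."""
    nn = knn_indices(X, k)
    pairs = set()
    for i in range(len(X)):
        for j in nn[i]:
            if i in nn[j] and i < j:
                pairs.add((i, int(j)))
    return pairs

# two tight pairs plus one isolated point
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0], [9.0, 0.0]])
print(mutual_pairs(X, k=1))  # {(0, 1), (2, 3)}
```

The isolated point has a nearest neighbor but is not anyone's nearest neighbor in return, so it forms no NR pair; retrieval methods exploit exactly this asymmetry.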

• ### SULoRA: Subspace Unmixing With Low-Rank Attribute Embedding for Hyperspectral Data Analysis

Publication Year: 2018, Page(s):1351 - 1363
Cited by:  Papers (1)
| PDF (11429 KB) | HTML

To support high-level analysis of spaceborne imaging spectroscopy (hyperspectral) imagery, spectral unmixing has been gaining significance in recent years. However, the inevitable spectral variability caused by illumination and topography changes, atmospheric effects, and so on makes it difficult to accurately estimate abundance maps in spectral unmixing. Classical unmixing methods, e.g., linea...

• ### Tensor Nuclear Norm-Based Low-Rank Approximation With Total Variation Regularization

Publication Year: 2018, Page(s):1364 - 1377
| PDF (5388 KB) | HTML

Some existing low-rank approximation approaches either need to predefine the rank values (such as the matrix/tensor factorization-based methods) or fail to consider local information of data (e.g., spatial or spectral smooth structure). To overcome these drawbacks, this paper proposes a new model called the tensor nuclear norm-based low-rank approximation with total variation regularization (TLR-T...

## Aims & Scope

The Journal of Selected Topics in Signal Processing (J-STSP) solicits special issues on topics that cover the entire scope of the IEEE Signal Processing Society including the theory and application of filtering, coding, transmitting, estimating, detecting, analyzing, recognizing, synthesizing, recording, and reproducing signals by digital or analog devices or techniques.

Full Aims & Scope

## Meet Our Editors

Editor-in-Chief

Lina Karam
School of Electrical, Computer, and Energy Engineering
Arizona State University
Tempe, AZ 85287-5706 USA
karam@asu.edu