A Feature Article Cluster on Exploiting Structure in Data Analytics: Low-Rank and Sparse Structures



By: Namrata Vaswani
Individual feature articles and special issues are the two main mechanisms for publishing full-length tutorial surveys in IEEE Signal Processing Magazine (SPM). Since the May 2016 feature article cluster by Jane Wang et al. on brain signal analytics, SPM’s current and past editors-in-chief and their teams have been exploring a way to complement this existing structure—a feature article cluster (or mini special issue) that allows for a set of three to five solicited articles on a current topic, instead of just one (a feature article) or ten to eleven (a special issue). This issue of SPM offers a feature article cluster on exploiting structure in data analytics: low-rank and sparse structures. It is the second such cluster and the first in SPM’s planned yearly series on data science, and it includes the following four articles:
  • “Harnessing Structures in Big Data via Guaranteed Low-Rank Matrix Estimation” by Chen and Chi
  • “Robust Subspace Learning” by Vaswani et al. 
  • “Correlation-Awareness in Low-Rank Models” by Pal 
  • “Theoretical Foundations of Deep Learning via Sparse Representations” by Papyan et al.
In today’s big data age, data is generated everywhere around us. Examples include text, tweets, network traffic, changing Facebook connections, and video surveillance feeds coming in from one or multiple cameras. Much of this data is high dimensional, and a lot of it is also highly noisy, outlier corrupted, or incomplete. Thus, the first step before processing such data is noise/outlier removal and/or filling in the missing entries, along with dimension reduction. All of these tasks are hard and ill-posed without structural (prior) assumptions on the data. Two of the most commonly used and practically valid structural assumptions for high-dimensional data sets are sparsity and low rank. When sparsity is exploited, a signal can be recovered from a highly undersampled set of its linear projection measurements, under mild assumptions. This idea, referred to as sparse recovery or compressive sensing, together with its various extensions, is by now well known, well studied, and well reviewed.
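The sparse-recovery idea described above can be sketched in a few lines. The following toy example—with illustrative problem sizes and parameters that do not come from any of the cluster articles—recovers a sparse signal from far fewer random linear measurements than its ambient dimension, using iterative soft thresholding (ISTA), one standard solver for the underlying l1-regularized least-squares problem:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative sizes): recover a k-sparse signal x in R^n
# from m << n random Gaussian measurements y = A @ x.
n, m, k = 200, 60, 5
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.standard_normal(k) + 2 * np.sign(rng.standard_normal(k))

A = rng.standard_normal((m, n)) / np.sqrt(m)  # normalized sensing matrix
y = A @ x_true

# ISTA for min_x 0.5*||y - A x||^2 + lam*||x||_1
lam = 0.05
step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / (largest eigenvalue of A^T A)
x = np.zeros(n)
for _ in range(2000):
    z = x - step * (A.T @ (A @ x - y))                    # gradient step
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0)  # soft threshold

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(f"relative recovery error: {rel_err:.3f}")
```

Even though only m = 60 measurements of the n = 200-dimensional signal are available, the sparsity assumption makes exact (up to a small thresholding bias) recovery possible.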
 
In this issue: This feature article cluster focuses on data recovery methods that exploit the other most commonly valid structural assumption on data sets—low rank—as well as on those that exploit both low-rank and sparse structures for different parts of a data set.
 
The first article by Chen and Chi focuses on low-rank matrix recovery from incomplete data, with and without the use of additional constraints. Low-rank modeling plays a pivotal role in signal processing and machine learning, with applications ranging from collaborative filtering to dimensionality reduction and adaptive filtering. In recent years, progress has been made in understanding how to exploit the low-rank assumption while still obtaining computationally efficient and provably correct solutions. This article reviews the literature on convex relaxation approaches such as nuclear norm minimization, as well as more recent nonconvex procedures, such as projected gradient descent or alternating minimization, which are much faster and provably need only slightly more measurements.
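As a concrete illustration of the nonconvex approach mentioned above, here is a minimal projected-gradient-descent sketch for low-rank matrix completion: take a gradient step on the observed entries, then project back to the rank-r set via a truncated SVD. The problem sizes, sampling rate, and step-size choice are illustrative assumptions, not taken from the Chen and Chi article:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setup (illustrative sizes): complete a rank-r matrix M from a
# random subset of roughly half of its entries.
n1, n2, r = 60, 50, 3
M = rng.standard_normal((n1, r)) @ rng.standard_normal((r, n2))
mask = rng.random((n1, n2)) < 0.5  # True where an entry is observed

def project_rank(X, r):
    """Project X onto the set of rank-<=r matrices via truncated SVD."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

# Projected gradient descent on f(X) = 0.5 * ||P_Omega(X - M)||_F^2,
# where P_Omega keeps only the observed entries.
X = np.zeros((n1, n2))
step = 2.0  # ~ 1 / sampling rate (illustrative choice)
for _ in range(300):
    grad = mask * (X - M)              # gradient seen only on observed entries
    X = project_rank(X - step * grad, r)

rel_err = np.linalg.norm(X - M) / np.linalg.norm(M)
print(f"relative completion error: {rel_err:.3f}")
```

Each iteration costs one truncated SVD, which is what makes such nonconvex schemes much cheaper in practice than solving a full nuclear-norm semidefinite program.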
