Abstract

Models or signals exhibiting low-dimensional behavior (e.g., sparse signals) play an important role in signal processing and machine learning. In this talk, we focus on models that have multiple structures simultaneously, e.g., matrices that are both low-rank and sparse, arising in phase retrieval, quadratic compressed sensing, and sparse PCA. We consider estimating such models from observations corrupted by additive Gaussian noise.
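
Concretely, the denoising setup can be written as follows (the notation here is ours and may differ from the talk's):

    Y = X_0 + \sigma Z, \qquad Z_{ij} \overset{\mathrm{i.i.d.}}{\sim} \mathcal{N}(0, 1),

where X_0 is the unknown simultaneously structured (e.g., low-rank and sparse) matrix, sigma is the noise level, and the goal is an estimate X_hat(Y) whose MSE, E||X_hat - X_0||_F^2, is as small as possible.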

We provide tight upper and lower bounds on the mean squared error (MSE) of a convex denoising program that uses a combination of regularizers. In the case of simultaneously low-rank and sparse matrices, we quantify the gap between the MSE of the convex program and the best achievable error, and we present a simple (nonconvex) thresholding algorithm that outperforms its convex counterpart and achieves nearly optimal MSE.
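
As a rough illustration of the two estimators being compared, here is a minimal Python sketch. The combined penalty uses the standard convex surrogates for sparsity (entrywise l1 norm) and low rank (nuclear norm); the specific weights lam_sparse and lam_lowrank, the threshold tau, the target rank r, and the exact two-step form of the nonconvex procedure are illustrative assumptions, not details taken from the talk.

    import numpy as np
    import cvxpy as cp

    def convex_denoise(Y, lam_sparse, lam_lowrank):
        """Combined-regularizer denoiser: least-squares fit plus l1 and nuclear norms.

        A minimal sketch; the regularization weights are illustrative only.
        """
        X = cp.Variable(Y.shape)
        objective = cp.Minimize(
            0.5 * cp.sum_squares(Y - X)           # Frobenius data fit
            + lam_sparse * cp.sum(cp.abs(X))      # entrywise l1 promotes sparsity
            + lam_lowrank * cp.norm(X, "nuc")     # nuclear norm promotes low rank
        )
        cp.Problem(objective).solve()
        return X.value

    def threshold_denoise(Y, tau, r):
        """Simple nonconvex alternative: hard-threshold entries, then truncate the rank.

        The two-step rule and the choices of tau and r are assumptions for illustration.
        """
        X = np.where(np.abs(Y) > tau, Y, 0.0)              # keep only large entries
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        return (U[:, :r] * s[:r]) @ Vt[:r]                  # keep the top-r singular values

The design point is that the nonconvex procedure imposes the two structures directly (by thresholding and rank truncation) rather than through a weighted sum of convex penalties, which is the source of the gap discussed in the talk.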
