Abstract

Recovering a structured object from a small number of linear observations has been well studied in recent years. Examples include sparse vectors, low-rank matrices, and sums of sparse and low-rank matrices, among others. In many applications in signal processing and machine learning, the object of interest has several structures simultaneously, e.g., a matrix that is simultaneously sparse and low-rank. Norms that promote each individual structure are often known and allow recovery from an order-wise optimal number of measurements (e.g., the l1 norm for sparsity, the nuclear norm for low rank), and it is common in practice to minimize a combination of such norms.
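For concreteness, a minimal sketch of this combined-norm approach (notation ours: A denotes the linear measurement operator, y the observations, and lambda_1, lambda_2 nonnegative tuning weights) is

\[
\min_{X} \; \lambda_1 \|X\|_1 + \lambda_2 \|X\|_* \quad \text{subject to} \quad \mathcal{A}(X) = y.
\]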

We show that multi-objective optimization with these norms can do no better, order-wise, than exploiting only one of the present structures. This suggests that to fully exploit the multiple structures, we need an entirely new convex relaxation rather than one that combines the convex relaxations for each structure. We then specialize our result to the case of sparse and low-rank matrices, and show that a nonconvex formulation of the problem (sketched below) can recover the model from very few measurements (on the order of the degrees of freedom of the matrix), whereas the convex problem obtained from a combination of the l1 and nuclear norms requires many more measurements. Our framework applies to arbitrary norms as well as to a wide range of measurement ensembles. This allows us to give performance bounds for problems such as sparse phase retrieval and low-rank tensor completion.
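As an illustrative sketch of such a nonconvex formulation (our notation; the exact program analyzed in the talk may differ), one can replace each norm by the quantity it relaxes:

\[
\min_{X} \; \|X\|_0 + \lambda\,\mathrm{rank}(X) \quad \text{subject to} \quad \mathcal{A}(X) = y,
\]

where \(\|X\|_0\) counts the nonzero entries of X and \(\lambda > 0\) trades off the two objectives.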

This talk is based on joint work with Samet Oymak, Amin Jalali, Yonina Eldar, and Babak Hassibi.
