Abstract

Approximate Message Passing (AMP) refers to a class of iterative algorithms that have been successfully applied to a number of high-dimensional statistical estimation problems, such as linear regression, generalized linear models, and low-rank matrix estimation, as well as to a variety of engineering and computer science applications, including imaging, communications, and deep learning.

AMP algorithms have two features that make them particularly attractive: they can easily be tailored to take advantage of prior information on the structure of the signal, such as sparsity; and, under suitable assumptions on the design matrix, AMP theory provides precise asymptotic guarantees for statistical procedures in the high-dimensional regime.
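For concreteness, here is a minimal sketch of a standard AMP iteration for the linear model $y = Ax + \varepsilon$, assuming an $n \times p$ design matrix $A$ with i.i.d. Gaussian entries, sampling ratio $\delta = n/p$, and a sequence of denoisers $\eta_t$ (e.g., soft thresholding when the signal is sparse); the notation is illustrative rather than tied to any particular result discussed in the talk:

% Sketch of an AMP iteration for y = A x + \varepsilon (i.i.d. Gaussian design).
% The denoiser \eta_t encodes prior structure on the signal.
\begin{align*}
  x^{t+1} &= \eta_t\big(x^{t} + A^{\top} z^{t}\big), \\
  z^{t}   &= y - A x^{t}
             + \frac{1}{\delta}\, z^{t-1}\,
               \big\langle \eta_{t-1}'\big(x^{t-1} + A^{\top} z^{t-1}\big) \big\rangle,
\end{align*}

where $\langle \cdot \rangle$ denotes the average of a vector's entries. The last term, the so-called Onsager correction, is what distinguishes AMP from plain iterative thresholding and is what underlies the precise asymptotic characterization of the iterates via state evolution.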

In this talk, I will present the main ideas of AMP from a statistical perspective to illustrate the power and flexibility of the AMP framework. Time permitting, we will discuss how AMP has been used to obtain lower bounds on the estimation error of first-order methods, and how it plays a fundamental role in understanding the performance gap that arises in some problems between information-theoretically optimal and computationally feasible estimators.
