![Computational Complexity of Statistical Inference_hi-res logo](/sites/default/files/styles/workshop_banner_sm_1x/public/2023-03/Computational%20Complexity%20of%20Statistical%20Inference_hi-res.jpg?h=ccd3d9f7&itok=qCiIzWU5)
Abstract
In many problems in modern statistics and machine learning, one wishes to establish that a first-order method on a non-convex risk function eventually enters a region of local strong convexity. In this talk, I will present an asymptotic comparison inequality, which we call the Sudakov-Fernique post-AMP inequality, that, for a certain class of problems involving a GOE matrix, can probe properties of an optimization landscape locally around the iterates of an approximate message passing (AMP) algorithm. As an example of its use, I will describe a new proof of a result of Celentano, Fan, and Mei (2021), which establishes that the so-called TAP free energy in the ℤ2-synchronization problem is locally convex in the region to which AMP converges. As a consequence, the proof also confirms a conjecture of El Alaoui, Montanari, and Sellke (2022) that their algorithm efficiently samples from the Sherrington-Kirkpatrick Gibbs measure throughout the "easy" regime.
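To make the setting concrete, the following is a minimal sketch of an AMP iteration for ℤ2-synchronization with a spiked GOE observation. The signal-to-noise ratio `lam`, the iteration count `T`, the tanh denoiser, and the weakly informative initialization are illustrative assumptions (a spectral initialization and a Bayes-optimal denoiser would be more standard), not the specific algorithm analyzed in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
n, lam, T = 2000, 2.0, 30   # dimension, SNR, AMP iterations (hypothetical demo values)

# Z2-synchronization observation: Y = (lam/n) x x^T + W/sqrt(n), W a GOE matrix
x_star = rng.choice([-1.0, 1.0], size=n)
G = rng.standard_normal((n, n))
W = (G + G.T) / np.sqrt(2)                     # GOE: symmetric, N(0,1) off-diagonal entries
Y = (lam / n) * np.outer(x_star, x_star) + W / np.sqrt(n)

# AMP with a tanh denoiser and the Onsager (memory) correction term;
# a weakly informative initialization stands in for a spectral start.
x_prev = np.zeros(n)
x_t = 0.2 * x_star + rng.standard_normal(n)
for _ in range(T):
    f_t, f_prev = np.tanh(x_t), np.tanh(x_prev)
    onsager = np.mean(1.0 - f_t**2)            # (1/n) * sum_i f'(x_i^t)
    x_t, x_prev = Y @ f_t - onsager * f_prev, x_t

overlap = abs(np.mean(x_star * np.tanh(x_t)))  # alignment of the AMP estimate with x*
print(f"overlap with ground truth: {overlap:.2f}")
```

The Onsager term distinguishes AMP from naive power iteration: it cancels the correlations that the iterate accumulates with the noise matrix, which is what makes the iterates asymptotically Gaussian around a multiple of the signal, and hence makes the landscape near them analyzable.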