Abstract

In this talk, we discuss a new polynomial-time algorithmic framework for inference based on the celebrated Lenstra-Lenstra-Lovász (LLL) lattice basis reduction algorithm. Perhaps surprisingly, this framework bypasses multiple notions of "computational hardness for inference" in a number of "exponentially-low-noise" inference settings. Examples include the algorithm provably succeeding in previously conjectured "hard" regimes of 1) sparse regression, where the Overlap Gap Property holds and low-degree methods fail, 2) phase retrieval, where AMP fails, and 3) cosine neuron learning, where SQ methods fail. Time permitting, we will present the framework and study some of the mentioned applications in detail. This is based on joint works with David Gamarnik, Eren Kizildag, Min Jae Song, and Joan Bruna.
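For readers unfamiliar with the underlying primitive, the sketch below is a minimal, textbook implementation of LLL lattice basis reduction (with the standard parameter delta = 3/4) in exact rational arithmetic; it is purely illustrative of the reduction step the framework relies on, not the inference framework from the talk itself.

```python
from fractions import Fraction

def _dot(u, v):
    return sum(Fraction(a) * Fraction(b) for a, b in zip(u, v))

def _gram_schmidt(basis):
    """Return orthogonalized vectors b* and projection coefficients mu."""
    n = len(basis)
    bstar, mu = [], [[Fraction(0)] * n for _ in range(n)]
    for i in range(n):
        v = [Fraction(x) for x in basis[i]]
        for j in range(i):
            mu[i][j] = _dot(basis[i], bstar[j]) / _dot(bstar[j], bstar[j])
            v = [vi - mu[i][j] * bj for vi, bj in zip(v, bstar[j])]
        bstar.append(v)
    return bstar, mu

def lll(basis, delta=Fraction(3, 4)):
    """Textbook LLL reduction of an integer basis (rows of `basis`)."""
    basis = [list(map(Fraction, b)) for b in basis]
    n = len(basis)
    bstar, mu = _gram_schmidt(basis)
    k = 1
    while k < n:
        # Size-reduce b_k against b_{k-1}, ..., b_0.
        for j in range(k - 1, -1, -1):
            q = round(mu[k][j])
            if q != 0:
                basis[k] = [bk - q * bj for bk, bj in zip(basis[k], basis[j])]
                bstar, mu = _gram_schmidt(basis)
        # Lovász condition: keep going if satisfied, else swap and backtrack.
        if _dot(bstar[k], bstar[k]) >= (delta - mu[k][k - 1] ** 2) * _dot(bstar[k - 1], bstar[k - 1]):
            k += 1
        else:
            basis[k], basis[k - 1] = basis[k - 1], basis[k]
            bstar, mu = _gram_schmidt(basis)
            k = max(k - 1, 1)
    return [[int(x) for x in b] for b in basis]

# Classic worked example: this basis reduces to short, nearly orthogonal vectors.
reduced = lll([[1, 1, 1], [-1, 0, 2], [3, 5, 6]])
```

Recomputing Gram-Schmidt after every update keeps the sketch short and exact; production implementations (e.g. fplll) instead update the coefficients incrementally and use floating-point arithmetic for speed.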

Video Recording