[Workshop banner: Computational Complexity of Statistical Inference]
Abstract
We consider the task of recovering a pN-sparse vector planted in an n-dimensional random subspace of R^N, given an arbitrary basis for the subspace. We give an improved analysis of (a slight variant of) a spectral method proposed by Hopkins, Schramm, Shi, and Steurer (STOC 2016), showing that it succeeds when np << sqrt(N). This condition improves upon the best known guarantees of any polynomial-time algorithm. Our analysis also applies to the dense case p = 1, provided that the planted vector has non-Gaussian entries. Furthermore, we give a matching lower bound: when np >> sqrt(N), any spectral method from a large class (and, more generally, any low-degree polynomial of the input) fails to detect the planted vector. Specifically, we rule out any method based on thresholding the leading eigenvalue of a polynomial-size matrix whose entries are constant-degree polynomials of the input. This yields a tight characterization of the power of spectral methods and suggests that no polynomial-time algorithm succeeds when np >> sqrt(N). This is joint work with Cheng Mao, available at https://arxiv.org/abs/2105.15081.
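To make the setup concrete, here is a minimal NumPy sketch of the planted-vector model and a leading-eigenvalue spectral statistic in the spirit of the method described above. The parameter values, the choice of +-1 entries for the planted vector, and the exact matrix construction (reweighting basis rows by their centered squared norms) are illustrative assumptions, not the precise construction from the talk or paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N, n, p = 2000, 20, 0.01  # ambient dimension, subspace dimension, sparsity (illustrative)

# Planted p*N-sparse unit vector v: k nonzero entries of magnitude 1/sqrt(k).
k = int(p * N)
v = np.zeros(N)
support = rng.choice(N, size=k, replace=False)
v[support] = rng.choice([-1.0, 1.0], size=k) / np.sqrt(k)

# Random n-dimensional subspace containing v: span of v and n-1 iid Gaussian
# directions. The algorithm only sees an "arbitrary" orthonormal basis B (N x n),
# so we rotate the coordinates by a random orthogonal matrix Q to hide v.
G = rng.standard_normal((N, n - 1))
B, _ = np.linalg.qr(np.column_stack([v, G]))
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
B = B @ Q

# Spectral statistic (illustrative): an n x n matrix whose entries are low-degree
# polynomials of the basis. Rows of B on the support of v have inflated squared
# norms (roughly 1/k + n/N vs. n/N), so reweighting by the centered row norms
# makes the direction of v's coefficient vector stick out.
row_norms_sq = np.sum(B**2, axis=1)              # null rows concentrate near n/N
M = B.T @ ((row_norms_sq - n / N)[:, None] * B)  # degree-4 polynomial in B's entries

vals, vecs = np.linalg.eigh(M)
lam = vals[-1]          # leading eigenvalue: threshold this to detect the planting
u = vecs[:, -1]         # leading eigenvector: B @ u estimates v (up to sign)
corr = abs(v @ (B @ u))
```

With these parameters, np = 0.2 while sqrt(N) is about 45, so the sketch sits well inside the regime np << sqrt(N), and the correlation `corr` between the spectral estimate B @ u and the planted v should be close to 1.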