Abstract

Hardness results for high-dimensional statistical tasks, and in particular for robust estimation tasks, typically fall into two categories: they either hold only against restricted classes of algorithms, such as statistical query (SQ) algorithms, or they start from average-case assumptions such as planted clique (and its relatives). Relatively little work establishes hardness from worst-case complexity-theoretic assumptions, arguably the "gold standard" for lower bounds.

In this talk, we take first steps toward such results. We show that several natural approaches to improving the error rates of current polynomial-time robust mean estimation algorithms would yield efficient algorithms for the small-set expansion problem, refuting Raghavendra and Steurer's small-set expansion hypothesis (assuming P ≠ NP). We also give the first direct reduction to robust mean estimation, starting from a plausible but nonstandard variant of the small-set expansion problem.

Based on joint work with Sam Hopkins.
