Abstract

Recently, there has been some success in using the sum-of-squares hierarchy to solve classical problems in machine learning. However, sum-of-squares algorithms require solving large semidefinite programs and are therefore very slow, making them impractical in machine-learning settings. In this talk, I will show that ideas from sum-of-squares rounding schemes can be used to obtain fast spectral algorithms. I will focus on the problem of decomposing a random overcomplete low-rank tensor into its rank-1 components, showing that ideas from the quasi-polynomial-time sum-of-squares algorithm can be adapted to achieve comparable recovery guarantees in subquadratic time.
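To give a flavor of the spectral approach, here is a minimal sketch (in Python/NumPy, with illustrative names and dimensions of my choosing) of the classic random-contraction idea underlying such spectral decompositions: contracting the tensor against a random Gaussian vector yields a matrix whose top eigenvector aligns with a single component. This toy version assumes orthonormal components (the undercomplete regime); it is not the overcomplete setting of the talk, nor its subquadratic implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 50, 10  # ambient dimension, number of rank-1 components (illustrative)

# Random orthonormal components a_1, ..., a_k (columns of A)
A = np.linalg.qr(rng.standard_normal((d, k)))[0]

# Build the symmetric 3-tensor T = sum_i a_i (x) a_i (x) a_i
T = np.einsum('di,ei,fi->def', A, A, A)

# Contract T along one mode with a random Gaussian vector g:
# M = T(I, I, g) = sum_i <a_i, g> a_i a_i^T, a matrix whose top
# eigenvector (by magnitude) is the component maximizing |<a_i, g>|.
g = rng.standard_normal(d)
M = np.einsum('def,f->de', T, g)

# The top eigenvector of M recovers one component (up to sign).
eigvals, eigvecs = np.linalg.eigh(M)
v = eigvecs[:, np.argmax(np.abs(eigvals))]

# Correlation with the closest true component should be close to 1.0.
print(np.max(np.abs(A.T @ v)))
```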

Based on joint work with Sam Hopkins, Jonathan Shi, and David Steurer.