Abstract

The problem of finding low-rank decompositions of tensors has a wide range of applications, e.g., to learning various mixture models. In the overcomplete regime---when the rank is superlinear in the dimension---most known decomposition methods for third-order tensors break down.
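To make the setting concrete, here is a minimal sketch (illustrative only, not from the talk) of a random overcomplete instance: an n-dimensional third-order tensor T = sum_i a_i ⊗ a_i ⊗ a_i whose rank r exceeds the dimension n. All names and parameters below are assumptions chosen for illustration.

```python
import numpy as np

def random_overcomplete_tensor(n, r, seed=0):
    """Return (T, A) where T = sum_i a_i^{(x)3} for r random unit vectors a_i."""
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((r, n))
    A /= np.linalg.norm(A, axis=1, keepdims=True)  # unit-norm components a_1, ..., a_r
    # Build the symmetric rank-r third-order tensor by summing outer cubes.
    T = np.einsum('ia,ib,ic->abc', A, A, A)
    return T, A

# Overcomplete regime: the rank r is superlinear in the dimension n.
n = 20
r = int(n ** 1.4)
T, A = random_overcomplete_tensor(n, r)
print(T.shape, r > n)  # (20, 20, 20) True
```

The decomposition task is to approximately recover the hidden components a_1, ..., a_r given only T.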
 
In this talk, we describe a polynomial-time algorithm, based on the sum-of-squares method, that approximately decomposes random n-dimensional third-order tensors of rank up to n^{3/2}/polylog(n). The previous best algorithm, by Ge and Ma, also used sum-of-squares but required quasi-polynomial time to achieve such decompositions.
 
Finally, we demonstrate how to reap the benefits of the sum-of-squares method without solving large semidefinite programs: using the sum-of-squares toolkit, we devise new kinds of spectral algorithms that achieve decomposition guarantees similar to those of sum-of-squares, but with running times that are close to linear in the size of the input (exponent 1.08 using fast matrix multiplication).
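The following toy sketch is not the near-linear-time algorithm from the talk; it only illustrates the general flavor of spectral approaches, which replace large semidefinite programs with matrix eigenvector computations. Here the tensor is contracted along one mode with a random vector and a candidate component is read off a top eigenvector; this simple variant is reliable only in easier (e.g., undercomplete) settings.

```python
import numpy as np

def random_contraction_step(T, seed=0):
    """One spectral step: a top eigenvector of M = T(I, I, g) for a random vector g."""
    rng = np.random.default_rng(seed)
    n = T.shape[0]
    g = rng.standard_normal(n)
    M = np.einsum('abc,c->ab', T, g)   # M = sum_i <a_i, g> a_i a_i^T
    M = (M + M.T) / 2                  # symmetrize before eigendecomposition
    eigvals, eigvecs = np.linalg.eigh(M)
    idx = np.argmax(np.abs(eigvals))   # eigenvector of largest-magnitude eigenvalue
    return eigvecs[:, idx]

# Usage with the tensor T and components A from the sketch above:
# v = random_contraction_step(T)
# best_overlap = np.max(np.abs(A @ v))  # correlation of v with the closest component a_i
```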
 
Based on joint works with Sam Hopkins, Tengyu Ma, Tselil Schramm, and Jonathan Shi.
