Abstract

Low-rank tensor decompositions are a powerful tool arising in machine learning, statistics and signal processing. However, tensors pose significant algorithmic challenges, and tensor analogs of much of the matrix algebra toolkit are unlikely to exist because of hardness results. For instance, computing tensor decompositions efficiently in the overcomplete case (where the rank exceeds the dimension) is particularly challenging.
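
For concreteness (standard notation, not spelled out in the abstract itself): a rank-$r$ decomposition of a third-order tensor $T \in \mathbb{R}^{n \times n \times n}$ expresses it as
\[
T = \sum_{i=1}^{r} a_i \otimes b_i \otimes c_i, \qquad a_i, b_i, c_i \in \mathbb{R}^n,
\]
and the overcomplete case corresponds to $r > n$.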

I will introduce a smoothed analysis model for studying tensor decompositions. Smoothed analysis gives an appealing way to analyze non-worst-case instances: the assumption here is that the model is not adversarially chosen, which is formalized by a random perturbation of the model parameters. Using smoothed analysis, we give efficient algorithms for decomposing tensors even in the highly overcomplete case (rank polynomial in the dimension). This yields new polynomial-time algorithms for learning probabilistic models such as mixtures of axis-aligned Gaussians and multi-view models, even in highly overcomplete settings.
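
As a rough sketch of the perturbation model (the precise form is given in the talk; the noise scale $\rho$ and distribution here are illustrative assumptions): each hidden factor $a_i$ is replaced by
\[
\tilde{a}_i = a_i + \rho\, g_i, \qquad g_i \sim \mathcal{N}\!\bigl(0, \tfrac{1}{n} I_n\bigr),
\]
and the algorithm is required to succeed with high probability over this random perturbation, for every choice of the unperturbed parameters.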

Based on joint work with Aditya Bhaskara, Moses Charikar and Ankur Moitra.
