This talk surveys a surprising connection between Matrix Completion and privacy-preserving singular vector computation. Matrix Completion is the problem of finding the dominant singular vectors of an unknown matrix from a subsample of its entries. Alternating Minimization is a widely used and empirically successful heuristic for Matrix Completion. We will cast Alternating Minimization as a noisy instance of the well-known Subspace Iteration algorithm (also known as the Power Method). Building on a new robust convergence analysis of Subspace Iteration, we prove the strongest known sample complexity bounds in the Alternating Minimization framework. Surprisingly, this analysis of Subspace Iteration also leads to an algorithm for approximating the top singular vectors of a given matrix while achieving the privacy guarantee known as Differential Privacy. The algorithm achieves nearly optimal error rates together with a nearly linear running time. But the connection between the two problems goes even further. In both settings the fundamental parameter that enters the picture is the coherence of the underlying matrix, a measure of how delocalized its singular vectors are. In fact, our analysis of Alternating Minimization uses ideas for reasoning about coherence that were developed in the privacy setting.
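To make the two central objects concrete, here is a minimal sketch of Subspace Iteration for the top-k singular vectors, together with the standard coherence measure of a subspace. The function names and parameters are illustrative, not taken from the talk, and this plain version omits both the noise model and the privacy mechanism discussed above.

```python
import numpy as np

def subspace_iteration(A, k, iters=100, seed=None):
    """Approximate an orthonormal basis for the top-k left singular
    subspace of A by repeated multiplication with A A^T."""
    rng = np.random.default_rng(seed)
    m, _ = A.shape
    X = rng.standard_normal((m, k))
    for _ in range(iters):
        Q, _ = np.linalg.qr(X)   # re-orthonormalize for stability
        X = A @ (A.T @ Q)        # one power step with A A^T
    Q, _ = np.linalg.qr(X)
    return Q  # m x k matrix with orthonormal columns

def coherence(U):
    """mu(U) = (m/k) * max_i ||e_i^T U||^2 for an m x k orthonormal U.
    Values near 1 mean the singular vectors are delocalized; values
    near m/k mean they concentrate on few coordinates."""
    m, k = U.shape
    return (m / k) * np.max(np.sum(U ** 2, axis=1))

# Example: an exactly rank-5 matrix, so the iteration should recover
# the top-5 subspace essentially exactly.
rng = np.random.default_rng(0)
A = rng.standard_normal((60, 5)) @ rng.standard_normal((5, 40))
U = subspace_iteration(A, k=5, seed=1)
U_exact = np.linalg.svd(A, full_matrices=False)[0][:, :5]
# Distance between the two subspaces via their projectors.
err = np.linalg.norm(U @ U.T - U_exact @ U_exact.T)
```

The robust analysis in the talk asks how well this iteration tolerates per-step noise, which is exactly the form in which both the missing entries (Matrix Completion) and the privacy-preserving perturbations (Differential Privacy) enter.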
