Abstract

The study of large-scale matrices and networks often relies on high-level algorithmic primitives: tools such as solving linear systems and computing eigenvectors are now taught in courses on machine learning and statistics. On the other hand, the rapid growth in data sizes has exposed significant shortcomings in the scalability of such primitives, and accentuated the need for provably efficient algorithms. Here the Laplacian paradigm for graph algorithms has led to improvements on many fundamental problems in scientific computing and combinatorial optimization, as well as highly efficient code packages that have been applied to image processing and data mining. These works started from the rich mathematical connections between graphs and matrices provided by spectral graph theory, but increasingly rely on fine-grained coupling between combinatorial and numerical components. This talk will discuss some of the key ideas in these algorithms, with a focus on the uses of efficient structure-preserving sampling and high-dimensional concentration.
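To make the central primitive concrete, here is a minimal sketch of the kind of Laplacian linear system these algorithms target: build the graph Laplacian L = D - A of a small weighted graph and solve Lx = b for a demand vector b. The graph, weights, and the dense pseudoinverse solve are illustrative choices only; the point of the Laplacian paradigm is that such systems can be solved in nearly linear time on large sparse graphs.

```python
import numpy as np

# A small undirected weighted graph on 4 vertices (illustrative example).
edges = [(0, 1, 1.0), (1, 2, 2.0), (2, 3, 1.0), (3, 0, 1.0), (0, 2, 0.5)]
n = 4

# Graph Laplacian L = D - A: degrees on the diagonal, negated weights off it.
L = np.zeros((n, n))
for u, v, w in edges:
    L[u, u] += w
    L[v, v] += w
    L[u, v] -= w
    L[v, u] -= w

# L is singular (the all-ones vector spans its nullspace on a connected
# graph), so b must sum to zero: here, one unit of flow from vertex 0 to 3.
b = np.array([1.0, 0.0, 0.0, -1.0])

# Dense pseudoinverse solve, for illustration; nearly-linear-time Laplacian
# solvers produce (approximate) solutions to the same system at scale.
x = np.linalg.pinv(L) @ b

assert np.allclose(L @ x, b)
```

The same system Lx = b underlies computing effective resistances, which in turn drive the structure-preserving edge sampling (spectral sparsification) mentioned in the abstract.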

Video Recording