Semidefinite Programming Hierarchies

Lecture 1: Convex Relaxations for Hard Optimization Problems
Lecture 2: Hierarchies of Relaxations and Their Strengths and Limitations
Lecture 3: Applications of the Sum-of-Squares Method

This series of talks was part of the Algorithmic Spectral Graph Theory Boot Camp. Videos for each talk are available through the links above.


Speaker: David Steurer, Cornell University

For a broad range of computationally hard optimization problems, convex programming relaxations achieve the best known approximation guarantees. For many such problems, stronger relaxations are the most promising approach to obtaining better, and potentially optimal, guarantees. We will discuss systematic methods for devising increasingly stronger relaxations, in both the linear programming and the semidefinite programming setting, and how to reason about their guarantees. We will introduce the sum-of-squares method, a conceptually simple meta-algorithm that unifies all existing relaxation methods. Finally, we will illustrate its connections to spectral methods, proof complexity, and real algebraic geometry, as well as its applications to quantum information, multilinear algebra, sparse recovery, and machine learning.
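
To make the semidefinite relaxations in question concrete, here is a minimal sketch (not part of the lectures) of the basic, degree-2 sum-of-squares relaxation of Max-Cut, written with the cvxpy modeling library; the 5-cycle example graph and all variable names are illustrative choices, not taken from the talks.

import cvxpy as cp

# Illustrative example graph: the 5-cycle C5, as an edge list.
n = 5
edges = [(i, (i + 1) % n) for i in range(n)]

# X[i, j] relaxes the product x_i * x_j of the {-1, +1} cut variables x_i.
X = cp.Variable((n, n), symmetric=True)
constraints = [X >> 0,            # moment matrix must be positive semidefinite
               cp.diag(X) == 1]   # x_i^2 = 1 for each vertex i

# Each edge (i, j) contributes (1 - x_i x_j) / 2 to the cut value.
objective = cp.Maximize(sum((1 - X[i, j]) / 2 for i, j in edges))

prob = cp.Problem(objective, constraints)
prob.solve()
print(f"SDP upper bound for Max-Cut on C5: {prob.value:.3f}")  # about 4.52; the true optimum is 4

Higher levels of the hierarchy would add variables for products of more cut variables, together with the corresponding moment constraints, yielding tighter bounds at the cost of larger semidefinite programs.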