Calvin Lab Auditorium
For a broad range of computationally hard optimization problems, convex programming relaxations achieve the best known approximation guarantees. For many such problems, stronger relaxations are the most promising route to better, and potentially optimal, guarantees. We will discuss systematic methods for devising increasingly stronger relaxations, in both the linear programming and semidefinite programming settings, and how to reason about their guarantees. We will introduce the sum-of-squares method, a conceptually simple meta-algorithm that unifies all existing relaxation methods. Finally, we will illustrate its connections to spectral methods, proof complexity, and real algebraic geometry, as well as applications to quantum information, multilinear algebra, sparse recovery, and machine learning.
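To make the idea concrete, here is a minimal sketch of the kind of certificate the sum-of-squares method searches for, using a toy polynomial chosen for illustration (it is not taken from the talk). A polynomial p is certified nonnegative by a positive semidefinite Gram matrix Q with p(x) = z(x)^T Q z(x) over a monomial vector z; finding such a Q is in general a semidefinite feasibility problem.

```python
import numpy as np

# Toy example (an assumption for illustration): p(x) = x^4 + 2x^2 + 1.
# Monomial vector z(x) = (1, x, x^2); we exhibit a Gram matrix Q with
# p(x) = z(x)^T Q z(x) and Q positive semidefinite.
Q = np.array([[1.0, 0.0, 1.0],
              [0.0, 0.0, 0.0],
              [1.0, 0.0, 1.0]])

# The coefficient of x^k in z^T Q z is the sum of Q[i, j] over i + j = k.
coeffs = np.zeros(5)
for i in range(3):
    for j in range(3):
        coeffs[i + j] += Q[i, j]

assert np.allclose(coeffs, [1, 0, 2, 0, 1])    # matches p(x) = x^4 + 2x^2 + 1
assert np.min(np.linalg.eigvalsh(Q)) >= -1e-9  # Q is positive semidefinite

# Since Q is PSD it factors as Q = L^T L, so p(x) = ||L z(x)||^2 is a sum
# of squares, certifying p(x) >= 0 for all real x. Here the decomposition
# is simply p(x) = (x^2 + 1)^2.
print("SOS certificate verified for p(x) = x^4 + 2x^2 + 1")
```

In this toy case the certificate can be written down by hand; the sum-of-squares method automates the search for Q (and its higher-degree analogues) via semidefinite programming.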