Abstract

Techniques and insights from convex optimization play an important role in the design of algorithms for discrete optimization, as evidenced by the success of convex relaxations in providing a framework for constructing and analyzing approximation algorithms. In the last decade, iterative methods from convex optimization have served as a springboard to a number of algorithmic advances for many discrete optimization tasks, including fundamental ones such as submodular optimization and maximum flow problems.

In this boot camp series, we will provide a walk-through of the continuous approach to algorithm design in discrete optimization. We will consider the algorithmic challenges that arise in the choice of formulation and iterative algorithm by casting first-order methods as discretizations of continuous-time dynamics. We will see that the non-smooth nature of discrete optimization problems leads us to smooth our objectives via regularization and discuss how different choices of regularizers arise in different settings. We will then put these principles to work in the contexts of submodular optimization and competitive analysis of online algorithms.
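To make these two ideas concrete, here is a minimal sketch, not drawn from the course material: the quadratic objective, the step size `eta`, and the regularization weight `mu` are illustrative assumptions. It shows gradient descent arising as the forward-Euler discretization of the gradient-flow dynamics x'(t) = -grad f(x(t)), and entropic regularization smoothing the non-smooth max function into the log-sum-exp function.

```python
import numpy as np

# Continuous-time dynamics: gradient flow  x'(t) = -grad f(x(t)).
# Its forward-Euler discretization with step size eta is gradient descent:
#   x_{k+1} = x_k - eta * grad f(x_k).
# Illustrative smooth objective (an assumption): f(x) = 0.5 * ||A x - b||^2.

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

def grad_f(x):
    """Gradient of f(x) = 0.5 * ||A x - b||^2."""
    return A.T @ (A @ x - b)

def gradient_descent(x0, eta=0.01, steps=500):
    """Run `steps` forward-Euler steps of the gradient flow."""
    x = x0.copy()
    for _ in range(steps):
        x = x - eta * grad_f(x)
    return x

x_hat = gradient_descent(np.zeros(5))

# Smoothing a non-smooth function by regularization: the max function
# max(x_1, ..., x_n) is replaced by the smooth log-sum-exp
#   mu * log(sum_i exp(x_i / mu)),
# obtained by adding an entropy regularizer to the variational form of max;
# it converges to max as mu -> 0.
def smoothed_max(x, mu=0.1):
    m = x.max()  # shift the exponents for numerical stability
    return m + mu * np.log(np.exp((x - m) / mu).sum())

v = np.array([0.3, 1.0, 0.9])
print(v.max(), smoothed_max(v, mu=0.5), smoothed_max(v, mu=0.01))
```

As `mu` decreases, the smoothed maximum approaches the true maximum, but the gradient of the smoothed function becomes steeper (its Lipschitz constant grows like 1/mu); this accuracy-versus-smoothness trade-off is the typical consideration behind the choice of regularizer.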

This mini course will consist of four sessions: two on Tuesday, August 22 (11:00 AM - 12:00 PM and 1:30 - 2:30 PM) and two on Friday, August 25 (9:30 - 10:30 AM and 11:00 AM - 12:00 PM).
