About

Notice:

We experienced some technical difficulties on Wednesday, September 11. The videos for the talks that have taken place so far are viewable by clicking on the schedule & video link. We apologize for any inconvenience.

The Boot Camp is intended to acquaint program participants with the key themes of the program. It will consist of three mini-courses as follows:

Inapproximability of Constraint Satisfaction Problems (5 lectures)
Johan Håstad (KTH Royal Institute of Technology)

Starting from the PCP theorem (which we will take as given), we show how to design and analyze efficient PCPs for NP problems. We describe how an efficient PCP using small amounts of randomness can be turned into an inapproximability result for a maximum constraint satisfaction problem in which each constraint corresponds to the acceptance criterion of the PCP. We then discuss how to design efficient PCPs with perfect completeness in some interesting cases, such as proving the hardness of approximating satisfiable instances of 3-Sat.
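To make the correspondence concrete (a standard illustration, not necessarily the exact formulation used in the lectures): if the verifier reads a few proof bits and accepts exactly when a fixed predicate P holds on them, then the proof bits become variables and each random choice of the verifier becomes one P-constraint, so that

Pr[verifier accepts the proof] = (number of satisfied constraints) / (number of constraints).

Completeness c and soundness s of the PCP therefore translate into an NP-hard gap of c versus s for the value of the corresponding Max-CSP. For instance, Håstad's three-query test checks a linear equation x_i \oplus x_j \oplus x_k = b with completeness 1 - \epsilon and soundness 1/2 + \epsilon, which shows that Max-E3-LIN is NP-hard to approximate within any constant factor better than 1/2.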

We go on to discuss gadget constructions and how to obtain optimal reductions between approximation problems. We present Chan's result on how to take products of PCPs to get hardness for very sparse CSPs, and we give some preliminary new results using these predicates as a basis for a gadget reduction.
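As one classical example of such a gadget (a standard construction, given here only for illustration): a linear equation x \oplus y \oplus z = 1 is replaced by the four clauses

(x \lor y \lor z), (x \lor \bar{y} \lor \bar{z}), (\bar{x} \lor y \lor \bar{z}), (\bar{x} \lor \bar{y} \lor z).

An assignment satisfying the equation satisfies all four clauses, while an assignment violating it satisfies exactly three of them, so the (1 - \epsilon) versus (1/2 + \epsilon) gap for Max-E3-LIN becomes a (1 - \epsilon) versus (7/8 + \epsilon) gap for Max-3-Sat.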

Finally, we discuss approximation in a different measure, and in particular the following problem: given a (2k+1)-CNF formula which admits an assignment that satisfies k literals in each clause, is it possible to efficiently find a standard satisfying assignment?
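To illustrate the promise (an illustrative reading, not a statement from the lectures): for k = 2 a clause has five literals, say (x_1 \lor \bar{x}_2 \lor x_3 \lor x_4 \lor \bar{x}_5), and the promised assignment makes at least two of the five literals true in every clause simultaneously. For k = 1 the promise is just ordinary satisfiability of a 3-CNF, where finding a satisfying assignment is already NP-hard; the question is whether the much stronger promise for larger k makes the search problem tractable.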

Analytic Methods for Supervised Learning (4 lectures)
Adam Klivans (University of Texas, Austin)

In this mini-course we will show how to use tools from analysis and probability (e.g., contraction, surface area, and limit theorems) to develop efficient algorithms for supervised learning problems with respect to well-studied probability distributions (e.g., Gaussians). One area of focus will be understanding the minimal assumptions needed for convex relaxations of certain learning problems (thought to be hard in the worst case) to become tractable.
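A minimal sketch of one instance of this theme, assuming a Gaussian marginal and a halfspace target (the dimension, degree, and sample size below are illustrative choices, not code from the course): fit a low-degree polynomial to the labels by least squares, a convex problem, and predict with its sign.

import numpy as np
from itertools import combinations_with_replacement

rng = np.random.default_rng(0)
n, m, degree = 4, 2000, 2           # dimension, sample size, polynomial degree (illustrative)
w = rng.normal(size=n)              # unknown target halfspace sign(<w, x>)

def monomial_features(X, degree):
    """All monomials of total degree <= degree, including the constant term."""
    cols = [np.ones(len(X))]
    for deg in range(1, degree + 1):
        for idx in combinations_with_replacement(range(X.shape[1]), deg):
            cols.append(np.prod(X[:, list(idx)], axis=1))
    return np.column_stack(cols)

# Training data drawn from the assumed Gaussian marginal, labeled by the halfspace.
X_train = rng.normal(size=(m, n))
y_train = np.sign(X_train @ w)

# Convex relaxation: least-squares fit of a low-degree polynomial to the +/-1 labels.
coef, *_ = np.linalg.lstsq(monomial_features(X_train, degree), y_train, rcond=None)

# Predict with the sign of the fitted polynomial and estimate the test error.
X_test = rng.normal(size=(m, n))
y_test = np.sign(X_test @ w)
predictions = np.sign(monomial_features(X_test, degree) @ coef)
print("estimated test error:", np.mean(predictions != y_test))

Under the Gaussian assumption the error of such a predictor decreases as the polynomial degree grows, which is the kind of distribution-specific guarantee the paragraph above refers to.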

Introduction to Analysis on the Discrete Cube (4 lectures)
Krzysztof Oleszkiewicz (University of Warsaw)

The basic notions and ideas of analysis on the discrete cube will be discussed in an elementary and mostly self-contained exposition. These include the Walsh-Fourier expansion, the random walk and its connection to the heat semigroup, hypercontractivity and related functional inequalities, influences, and the invariance principle with its application to the Majority is Stablest problem. The mini-course will also contain some other applications and examples, as well as several open questions.
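For orientation, the first of these notions in symbols (standard definitions, not specific to this mini-course): every function f : \{-1,1\}^n \to \mathbb{R} has a unique expansion

f(x) = \sum_{S \subseteq [n]} \hat{f}(S) \prod_{i \in S} x_i, \qquad \hat{f}(S) = \mathbb{E}_x\Big[f(x) \prod_{i \in S} x_i\Big],

with x uniform on \{-1,1\}^n. The noise (heat) semigroup acts diagonally in this basis, T_\rho f = \sum_{S} \rho^{|S|} \hat{f}(S) \prod_{i \in S} x_i, which is the link between the random walk on the cube and the Walsh-Fourier expansion.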

Chairs/Organizers