Description

This talk is part of the Advances in Boolean Function Analysis Lecture Series. The series will feature weekly two-hour lectures that aim to address both the broad context of each result and its technical details. Though closely related in theme, each lecture will be self-contained. Join us weekly at 10:00 a.m. PDT, from July 15, 2020 to August 18, 2020. There is a five-minute break at the end of the first hour.

Abstract:
We revisit several classical inequalities that relate the influences of a Boolean function to its variance: the Kahn-Kalai-Linial (KKL) inequality and its generalizations by Friedgut and Talagrand, and the relation between influences and noise stability due to Benjamini-Kalai-Schramm. We introduce a new method for proving these inequalities, based on stochastic calculus and the analysis of jump processes. Our method resolves a 1996 conjecture of Talagrand, yielding a bound that strengthens both Talagrand's sensitivity inequality and the KKL inequality. It also produces robust versions of some of the aforementioned bounds. Joint work with Renan Gross.
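For context (this statement is not part of the original announcement), the KKL inequality asserts that for any Boolean function f: {-1,1}^n -> {0,1} there is a universal constant c > 0 such that

\[
  \max_{1 \le i \le n} \mathrm{Inf}_i(f) \;\ge\; c \,\mathrm{Var}(f)\,\frac{\log n}{n},
\]

where Inf_i(f) denotes the influence of the i-th coordinate. Talagrand's generalization bounds the variance by the influences directly: for a universal constant C,

\[
  \mathrm{Var}(f) \;\le\; C \sum_{i=1}^{n} \frac{\mathrm{Inf}_i(f)}{\log\!\big(e/\mathrm{Inf}_i(f)\big)}.
\]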
