Speaker 1: Sinho Chewi
Title 1: Dimension-free log-Sobolev inequalities for mixture distributions
Abstract 1: We show that if a family of probability distributions is such that (1) each distribution satisfies a log-Sobolev inequality and (2) the pairwise chi-squared divergences are bounded, then any mixture of the distributions also satisfies a log-Sobolev inequality with explicit constant. In particular, we resolve a conjecture of Zimmermann and of Bardet, Gozlan, Malrieu, and Zitt that Gaussian convolutions of measures with bounded support enjoy dimension-free log-Sobolev inequalities. This is joint work with Hong-Bin Chen and Jonathan Niles-Weed.
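A schematic version of the statement, in illustrative notation (the symbols and the exact form of the constant are assumptions for exposition, not taken from the abstract; the paper gives the explicit constant):

```latex
% Illustrative sketch: mixture components mu_i each satisfy a log-Sobolev
% inequality with constant C, and pairwise chi-squared divergences are
% bounded by K; then the mixture satisfies an LSI with a constant C'
% depending only on C and K -- in particular, not on the dimension.
\[
  \mu = \sum_{i} p_i \,\mu_i, \qquad
  \operatorname{Ent}_{\mu_i}(f^2) \le 2C \,\mathbb{E}_{\mu_i}\!\left[\|\nabla f\|^2\right]
  \ \text{for all } i, \qquad
  \max_{i,j}\, \chi^2(\mu_i \,\|\, \mu_j) \le K
\]
\[
  \Longrightarrow \quad
  \operatorname{Ent}_{\mu}(f^2) \le 2C' \,\mathbb{E}_{\mu}\!\left[\|\nabla f\|^2\right],
  \qquad C' = C'(C, K).
\]
```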
Speaker 2: Enric Boix-Adserà
Title 2: The staircase property: How hierarchical structure can guide deep learning
Abstract 2: We identify the “staircase property”, which is a structural property of data distributions that enables deep neural networks to learn hierarchically. Specifically, for functions over the Boolean hypercube, the staircase property posits that high-order Fourier coefficients are reachable from lower-order Fourier coefficients along increasing chains. We provide empirical and theoretical evidence that neural networks learn data satisfying the staircase property by first learning simple features of the data, and then leveraging these to learn more complex features of the data. Joint work with Emmanuel Abbe, Matthew Brennan, Guy Bresler, and Dheeraj Nagaraj.
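A minimal illustration of a function with the staircase property. The specific function below is a standard example chosen for exposition, not taken from the abstract: its Fourier monomials form an increasing chain, with each coefficient's support set extending the previous one by a single coordinate.

```python
# A canonical "staircase" function on the Boolean hypercube {-1, +1}^n:
#   f(x) = x1 + x1*x2 + x1*x2*x3 + ...
# Each Fourier monomial extends the previous one by one coordinate, so the
# support sets form an increasing chain {1} < {1,2} < {1,2,3} < ...
# (illustrative example; the paper's definition is more general).
def staircase(x):
    total, prod = 0, 1
    for xi in x:
        prod *= xi       # running product x1*...*xk: the k-th monomial
        total += prod    # add the k-th Fourier term
    return total

# The Fourier support is the chain of nested sets {0}, {0,1}, {0,1,2}, ...
n = 3
support = [frozenset(range(i + 1)) for i in range(n)]

# Verify the chain property: each set contains the previous and adds
# exactly one new element.
for small, big in zip(support, support[1:]):
    assert small < big and len(big - small) == 1

# Example evaluation at x = (+1, -1, +1):
#   1 + (1)(-1) + (1)(-1)(1) = 1 - 1 - 1 = -1
print(staircase((1, -1, 1)))  # -> -1
```

Intuitively, a network can learn the low-order term `x1` first and then "climb" the chain, using each learned feature to reach the next higher-order one.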