Summer 2019

Generalization IV

Wednesday, May 29th, 2019, 2:00 pm – 3:20 pm

Peter Bartlett (UC Berkeley) and Sasha Rakhlin (Massachusetts Institute of Technology)

We review tools useful for analyzing the generalization performance of deep neural networks on classification and regression problems. We review uniform convergence properties, which show how this performance depends on notions of complexity such as Rademacher averages, covering numbers, and combinatorial dimensions, and how these quantities can be bounded for neural networks. We also review the analysis of nonparametric estimation methods such as nearest-neighbor rules and kernel smoothing. Deep networks raise novel challenges, since they have been observed to perform well even when they fit the training data perfectly. We review some recent efforts to understand the performance of such interpolating prediction rules, and highlight the questions they raise for deep learning.
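As a minimal illustration of an interpolating rule of the kind discussed above, a 1-nearest-neighbor classifier reproduces every training label by construction, achieving zero training error. The sketch below (synthetic data; all names are illustrative, not from the tutorial) demonstrates this:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class data (illustrative only).
X_train = rng.normal(size=(100, 2))
y_train = (X_train[:, 0] + X_train[:, 1] > 0).astype(int)

def one_nn_predict(X_train, y_train, X):
    # For each query point, return the label of its nearest training point.
    dists = np.linalg.norm(X[:, None, :] - X_train[None, :, :], axis=2)
    return y_train[np.argmin(dists, axis=1)]

# 1-NN interpolates: each training point is its own nearest neighbor,
# so every training label is recovered exactly.
train_acc = (one_nn_predict(X_train, y_train, X_train) == y_train).mean()
print(train_acc)  # 1.0
```

Despite this perfect fit, nearest-neighbor rules can still generalize, which is the phenomenon the tutorial's discussion of interpolating predictors addresses.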