This cluster will bring together a small group of computer scientists, statisticians, mathematicians, and electrical engineers with the aim of developing the theoretical foundations of deep learning, particularly the aspects of this methodology that differ markedly from classical statistical approaches. The cluster is supported by the NSF/Simons Foundation Collaboration on the Theoretical Foundations of Deep Learning, which aims to understand the mathematical mechanisms that underpin the practical success of deep learning, both to elucidate the limitations of deep learning methods and to allow their extension beyond the domains where they are currently applicable, and to initiate the study of the mathematical problems that emerge. Specific objectives of the cluster include understanding: the role of overparametrization in efficient optimization; the mechanisms by which interpolation with implicit regularization enables generalization; and how compositionality confers representational richness.
This cluster is supported by the NSF through grant DMS-2031883 and the Simons Foundation through award #814639.
Organizers:
Peter Bartlett (UC Berkeley), Mikhail Belkin (UC San Diego), Nathan Srebro (TTI Chicago), Bin Yu (UC Berkeley)
Long-term participants (including organizers):
Peter Bartlett, Mikhail Belkin, Amit Daniely, Andrea Montanari, Alexander Rakhlin, Nathan Srebro, Roman Vershynin, Bin Yu
Visiting Graduate Students and Postdocs:
Participation in this summer cluster is by invitation only.