About

This cluster will bring together a small group of computer scientists, statisticians, mathematicians, and electrical engineers with the aim of developing the theoretical foundations of deep learning, particularly the aspects of this methodology that differ markedly from classical statistical approaches. The cluster is supported by the NSF/Simons Foundation Collaboration on the Theoretical Foundations of Deep Learning, which aims to understand the mathematical mechanisms that underpin the practical success of deep learning, both to elucidate the limitations of deep learning methods and to allow their extension beyond the domains where they currently apply, and to initiate the study of the mathematical problems that emerge. Specific objectives of the cluster include understanding the role of overparametrization in efficient optimization; the mechanisms by which interpolation with implicit regularization enables generalization; and how compositionality confers representational richness.

This cluster is supported by the NSF through grant DMS-2031883 and the Simons Foundation through award #814639.

Organizers

Nati Srebro (Toyota Technological Institute at Chicago)

Long-Term Participants (including Organizers)

Nati Srebro (Toyota Technological Institute at Chicago)
Nike Sun (Massachusetts Institute of Technology)
Yizhe Zhu (University of California, Irvine)

Visiting Graduate Students and Postdocs

Wei Hu (University of Michigan)
Gal Vardi (TTI Chicago & Hebrew University)