Much progress has been made over the past several years in understanding computational and statistical issues surrounding deep learning, which has led to changes in the way we think about deep learning, and about machine learning theory more broadly. This includes an emphasis on the power of overparameterization, interpolation learning, the importance of algorithmic regularization, insights derived using methods from statistical physics, and more. The summer school and workshop will consist of tutorials on these developments, workshop talks presenting current and ongoing research in the area, and panel discussions on these topics and more. Details on tutorial speakers and topics will be confirmed shortly.

We welcome applications from researchers interested in the theory of deep learning. The summer school has funding for a small number of participants. If you would like to be considered for funding, we request that you apply to be a Supported Workshop & Summer School Participant by June 8th. The application for funding and details on its requirements can be found on the workshop registration page.

If funding from the Collaboration is not necessary for your attendance, you may apply to be a workshop participant on the workshop registration page and check the box indicating that you are not seeking funding.

Everyone is welcome to attend this workshop. Registration is required. Space may be limited, and you are advised to register early. To submit your name for consideration, please register and await confirmation of your acceptance before booking your travel.

Nati Srebro (Toyota Technological Institute at Chicago; chair)
Invited Participants

Niladri Chatterji (Stanford University), Ethan Dyer (Stanford University), Ahmed El Alaoui (Cornell University), Chelsea Finn (Stanford University), Dan Hendrycks (UC Berkeley), Chloe Hsu (UC Berkeley), Brandon McKinzie (Apple), Subhabrata Sen (Harvard University), Pragya Sur (Harvard University), Matus Telgarsky (University of Illinois at Urbana-Champaign)