Abstract
We survey recent developments in the optimization and learning of deep neural networks. The three focus topics are: 1) geometric results for the optimization of neural networks, 2) overparametrized neural networks in the kernel regime (the Neural Tangent Kernel), along with its implications and limitations, and 3) potential strategies for proving that SGD improves on kernel predictors.