Description

Large-scale Machine Learning: Theory and Practice

Large-scale machine learning requires blending computational thinking with statistical frameworks. Designing fast, efficient, and distributed learning algorithms with statistical guarantees is an outstanding grand challenge. I will present perspectives from theory and practice. I will demonstrate how spectral optimization can reach the globally optimal solution for many learning problems despite their non-convexity. These include unsupervised learning of latent variable models, training neural networks, and reinforcement learning of partially observable Markov decision processes. In practice, tensor methods yield enormous gains in both running time and learning accuracy over traditional methods such as variational inference. I will also discuss the broad challenges in non-convex optimization and recent progress in this area.
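
As a concrete illustration of the kind of spectral/tensor machinery the abstract refers to, below is a minimal sketch (assuming NumPy) of the tensor power method with deflation, run on a synthetic orthogonally decomposable third-order tensor. The helper names (tensor_from_components, tensor_apply, tensor_power_method) and all parameters are hypothetical choices for this toy example, not the implementation discussed in the talk.

```python
# Toy sketch: recover latent components from a symmetric order-3 tensor
# via power iteration with deflation. Illustrative only.
import numpy as np

def tensor_from_components(weights, vectors):
    """Form T = sum_i w_i * v_i (x) v_i (x) v_i (symmetric, order 3)."""
    d = vectors.shape[1]
    T = np.zeros((d, d, d))
    for w, v in zip(weights, vectors):
        T += w * np.einsum('i,j,k->ijk', v, v, v)
    return T

def tensor_apply(T, u):
    """Contract T along two modes: return the vector T(I, u, u)."""
    return np.einsum('ijk,j,k->i', T, u, u)

def tensor_power_method(T, n_components, n_iters=100, n_restarts=10, seed=None):
    """Estimate components of an (approximately) orthogonally decomposable tensor."""
    rng = np.random.default_rng(seed)
    d = T.shape[0]
    weights, vectors = [], []
    for _ in range(n_components):
        best_u, best_lam = None, -np.inf
        for _ in range(n_restarts):
            u = rng.standard_normal(d)
            u /= np.linalg.norm(u)
            for _ in range(n_iters):
                # Power update: u <- T(I, u, u), renormalized.
                u = tensor_apply(T, u)
                u /= np.linalg.norm(u)
            lam = np.einsum('ijk,i,j,k->', T, u, u, u)
            if lam > best_lam:
                best_u, best_lam = u, lam
        weights.append(best_lam)
        vectors.append(best_u)
        # Deflate: subtract the recovered rank-1 term before the next component.
        T = T - best_lam * np.einsum('i,j,k->ijk', best_u, best_u, best_u)
    return np.array(weights), np.array(vectors)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, k = 8, 3
    # Orthonormal ground-truth components (QR of a random matrix).
    V, _ = np.linalg.qr(rng.standard_normal((d, k)))
    w_true = np.array([3.0, 2.0, 1.0])
    T = tensor_from_components(w_true, V.T)
    w_est, V_est = tensor_power_method(T, k, seed=1)
    print("true weights:     ", w_true)
    print("estimated weights:", np.round(w_est, 3))
```

In this idealized orthogonal setting the power iteration converges to one true component per deflation step, which is the sense in which such spectral procedures avoid the spurious local optima that plague generic non-convex objectives; practical methods for latent variable models first whiten empirical moment tensors to reach this form.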

 
