Description
What Non-Convex Functions Can We Optimize?
Many machine learning problems require optimizing a non-convex objective. In this talk we identify a class of non-convex functions in which every local minimum is also globally optimal. For such functions, stochastic gradient descent efficiently converges to the global optimum. Several interesting problems are known to have this property; in particular, we will show that matrix completion has no spurious local minima.
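The "no spurious local minima" property can be illustrated numerically on the matrix completion objective. The sketch below is a minimal numpy illustration, not the speaker's code: it fits a rank-`r` factorization `U U^T` to a partially observed symmetric matrix using plain full-batch gradient descent (the talk's result concerns SGD, but the landscape property is the same objective). The problem sizes, step size, and iteration count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic instance (assumed sizes): ground-truth rank-r matrix
# M = U* U*^T, with roughly half of the entries observed.
n, r = 20, 2
U_star = rng.normal(size=(n, r))
M = U_star @ U_star.T
mask = rng.random((n, n)) < 0.5
mask = mask | mask.T  # symmetric observation pattern


def loss(U):
    """Squared error on the observed entries of U U^T."""
    return np.sum((mask * (U @ U.T - M)) ** 2)


def grad(U):
    """Gradient of the loss: 4 R U, where R is the masked residual."""
    return 4 * (mask * (U @ U.T - M)) @ U


# Plain gradient descent from a random start.  If the objective has no
# spurious local minima, a generic run should drive the loss toward
# zero instead of stalling at a bad local optimum.
U = rng.normal(size=(n, r))
initial_loss = loss(U)
for _ in range(20000):
    U -= 1e-3 * grad(U)  # illustrative step size

print(loss(U) / initial_loss)  # relative loss; near zero at a global optimum
```

Rerunning from different random initializations gives the same outcome, which is the empirical signature of a landscape whose local minima are all global.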