What Non-Convex Functions Can We Optimize?

Many machine learning problems require optimizing a non-convex objective. In this talk we identify a class of non-convex functions for which all local minima are also globally optimal. For such functions, stochastic gradient descent converges efficiently to the global optimum. Several interesting problems are known to have this property; in particular, we will show that matrix completion has no spurious local minima.
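As a toy illustration of the kind of landscape the talk describes, the sketch below runs plain gradient descent on a symmetric low-rank matrix completion objective. All names, dimensions, and step sizes here are hypothetical choices for illustration, not the talk's actual setup; the point is only that, on this benign landscape, a random initialization descends to a global minimizer rather than getting stuck at a spurious local one.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy instance: a rank-2 symmetric ground-truth matrix M.
n, r = 20, 2
U_true = rng.standard_normal((n, r))
M = U_true @ U_true.T

# Observe a random symmetric subset of entries (the mask Omega).
mask = rng.random((n, n)) < 0.5
mask = mask | mask.T

def loss(U):
    # f(U) = 1/2 * sum over observed (i, j) of ((U U^T - M)_{ij})^2
    R = (U @ U.T - M) * mask
    return 0.5 * np.sum(R ** 2)

def grad(U):
    # Gradient of f; R is symmetric here, so the gradient is 2 R U.
    R = (U @ U.T - M) * mask
    return 2 * R @ U

# Plain gradient descent from a random start.
U = rng.standard_normal((n, r))
init_loss = loss(U)
lr = 0.005
for _ in range(5000):
    U -= lr * grad(U)

print(init_loss, loss(U))  # the final loss is far below the initial loss
```

Since the factorization is only identified up to rotation (U and UQ give the same objective for orthogonal Q), one checks recovery of M = U U^T rather than of U itself.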
