Abstract

Following up on Ashia's introduction to optimization in Euclidean spaces, I will show that many of the core tools extend, with little friction, to optimization on Riemannian manifolds. We will spend some time reviewing basic concepts from differential geometry, building up to Riemannian gradient descent. I will then give pointers to further theoretical tools, algorithms, and resources. Time permitting, I will highlight the intriguing prevalence of smooth geometry and symmetry in benign non-convexity. This talk also sets the stage for Suvrit's introduction to geodesically convex optimization.
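
To make the central algorithm concrete, here is a minimal sketch of Riemannian gradient descent on the unit sphere: compute the Euclidean gradient, project it onto the tangent space at the current point, take a step, and retract back onto the manifold by normalizing. The choice of manifold, the Rayleigh-quotient objective, the step size rule, and the name `riemannian_gradient_descent` are illustrative assumptions, not material from the talk itself.

```python
# A minimal, self-contained sketch of Riemannian gradient descent on the
# unit sphere (an illustrative choice of manifold; the talk may use others).
# We minimize the Rayleigh quotient f(x) = x^T A x over the sphere, whose
# minimizer is an eigenvector of A for its smallest eigenvalue.
import numpy as np

def riemannian_gradient_descent(A, x0, step, iters=2000):
    x = x0 / np.linalg.norm(x0)                # start on the manifold
    for _ in range(iters):
        egrad = 2.0 * A @ x                    # Euclidean gradient of f
        rgrad = egrad - (x @ egrad) * x        # project onto the tangent space at x
        x = x - step * rgrad                   # descend along the tangent direction
        x = x / np.linalg.norm(x)              # retract back onto the sphere
    return x

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = (M + M.T) / 2                              # symmetric test matrix
step = 0.25 / np.linalg.norm(A, 2)             # conservative step from the spectral norm
x = riemannian_gradient_descent(A, rng.standard_normal(5), step)
print(x @ A @ x)                               # ~ smallest eigenvalue of A
print(np.linalg.eigvalsh(A)[0])                # reference value
```

Normalization is the standard retraction on the sphere; other manifolds require their own tangent projections and retractions, which libraries such as Manopt and Pymanopt implement for many common cases.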
