Spring 2022

Learning & Games Visitor Speaker Series: Universal Acceleration for Minimax Optimization

Thursday, Feb. 10, 2022, 1:30 pm to 2:30 pm PST


Niao He (ETH)


Calvin Lab Room 116 or Zoom

Title: Universal Acceleration for Minimax Optimization

Abstract: We present a generic acceleration recipe for smooth minimax optimization. By simply combining it with existing solvers, such as the extra-gradient method, as the workhorse for subproblems, one can achieve the best-known convergence rates for minimax optimization in various regimes, such as the strongly-convex-(strongly-)concave and nonconvex-(strongly-)concave settings. Our key idea is largely inspired by the Catalyst framework of [Lin, Mairal, and Harchaoui, 2015] and can be framed as an inexact accelerated proximal point algorithm. The framework extends to finite-sum minimax problems and to special classes of nonconvex-nonconcave minimax problems, again with the best-known rates.
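To make the inexact proximal point idea concrete, here is a minimal NumPy sketch, not the algorithm from the talk: it solves a toy strongly-convex-strongly-concave quadratic minimax problem by repeatedly applying an approximate proximal step, with each regularized subproblem solved by a fixed number of extra-gradient iterations. The Nesterov-style extrapolation that Catalyst adds between proximal steps is omitted for brevity, and all parameter values (`mu`, `tau`, step size, iteration counts) are illustrative choices, not ones from the paper.

```python
import numpy as np

# Toy problem: min_x max_y  (mu/2)||x||^2 + x^T A y - (mu/2)||y||^2.
# The unique saddle point is (0, 0).
rng = np.random.default_rng(0)
d, mu, tau = 5, 0.1, 1.0          # dimension, strong convexity, prox parameter
A = rng.standard_normal((d, d))

def grad(x, y, xb, yb):
    """Gradients of the proximally regularized subproblem centered at (xb, yb)."""
    gx = mu * x + A @ y + tau * (x - xb)
    gy = A.T @ x - mu * y - tau * (y - yb)   # ascent direction in y
    return gx, gy

def extragradient(xb, yb, eta=0.05, iters=200):
    """Approximately solve the regularized subproblem by extra-gradient steps."""
    x, y = xb.copy(), yb.copy()              # warm start at the prox center
    for _ in range(iters):
        gx, gy = grad(x, y, xb, yb)
        xh, yh = x - eta * gx, y + eta * gy  # exploratory (half) step
        gx, gy = grad(xh, yh, xb, yb)
        x, y = x - eta * gx, y + eta * gy    # update using gradient at midpoint
    return x, y

# Outer loop: inexact proximal point iterations.
x, y = rng.standard_normal(d), rng.standard_normal(d)
for _ in range(50):
    x, y = extragradient(x, y)

print(np.linalg.norm(x) + np.linalg.norm(y))  # should be small: iterates approach the saddle point
```

Each outer iteration adds the terms (tau/2)||x - xb||^2 - (tau/2)||y - yb||^2, which makes the subproblem better conditioned for the inner solver; the full Catalyst-style scheme would additionally extrapolate the prox centers to obtain acceleration.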

Bio: Niao He is an Assistant Professor in the Department of Computer Science at ETH Zurich, where she leads the Optimization and Decision Intelligence (ODI) Group. She is an ELLIS Scholar and a core faculty member of the ETH AI Center. Previously, she was an assistant professor at the University of Illinois at Urbana-Champaign from 2016 to 2020. Before that, she received her Ph.D. in Operations Research from the Georgia Institute of Technology in 2015. Her research interests lie at the intersection of optimization and machine learning, with a primary focus on minimax optimization and reinforcement learning.