
Recent advances in optimization, such as fast iterative methods based on gradient descent, have enabled many breakthroughs in algorithm design. This workshop focuses on these advances and their implications for algorithm design, exploring both progress and open problems within optimization itself as well as improvements in other areas of algorithm design that leverage optimization results as a key subroutine.
Specific topics to cover include gradient descent methods for convex and non-convex optimization problems; algorithms for solving structured linear systems; algorithms for graph problems such as maximum flows and cuts, connectivity, and graph sparsification; and submodular optimization.
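As a minimal illustration of the first two topics, the sketch below runs gradient descent on a convex quadratic whose minimizer solves a linear system; the matrix, step size, and iteration count are illustrative choices, not drawn from the workshop description:

```python
import numpy as np

# Minimize the convex quadratic f(x) = 0.5 * x^T A x - b^T x,
# whose minimizer satisfies the linear system A x = b
# (A symmetric positive definite).

def gradient_descent(A, b, steps=500):
    x = np.zeros_like(b)
    # grad f(x) = A x - b; step size 1/L, where L is the
    # largest eigenvalue of A (the smoothness constant of f).
    step = 1.0 / np.linalg.eigvalsh(A).max()
    for _ in range(steps):
        x = x - step * (A @ x - b)
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0]])  # symmetric positive definite
b = np.array([1.0, 1.0])
x = gradient_descent(A, b)
print(np.allclose(A @ x, b, atol=1e-6))  # prints True
```

For well-conditioned systems like this one, plain gradient descent converges geometrically; the fast iterative methods the workshop highlights improve on this baseline, e.g. via acceleration or by exploiting the structure of A.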