The design of efficient online learning algorithms is a central topic in machine learning. Researchers have investigated a wide variety of algorithmic templates that lead to efficient algorithms provided the learning problem has the right structure. Two examples of such algorithmic templates are Follow the Regularized Leader (FTRL) and Follow the Perturbed Leader (FTPL). In this talk, I will briefly describe some recent work that sheds light on the connections between FTRL and FTPL. I will highlight the role that stability plays in the theoretical analyses of these algorithms. I will also try to connect stability to two topics outside of machine learning: the Littlewood-Offord theorem and the theory of Differential Privacy. (This talk is based on joint work with Jacob Abernethy, Chansoo Lee, Audra McMillan and Zifan Li.)
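As a concrete illustration of the FTRL/FTPL connection mentioned above (a minimal sketch, not the construction from the talk): in the experts setting, FTRL with an entropic regularizer yields the exponential-weights distribution, and FTPL with i.i.d. Gumbel perturbations samples from exactly that distribution (the "Gumbel-max trick"). The learning rate `eta` and the specific losses below are illustrative choices.

```python
import math
import random

def ftrl_weights(cum_losses, eta):
    # FTRL with entropic regularizer over the simplex has the closed form
    # w_i ∝ exp(-eta * L_i), i.e. exponential weights.
    ws = [math.exp(-eta * L) for L in cum_losses]
    total = sum(ws)
    return [w / total for w in ws]

def ftpl_choice(cum_losses, eta, rng):
    # FTPL: add i.i.d. Gumbel(0, 1) noise to the (scaled, negated) cumulative
    # losses, then follow the perturbed leader (the argmax).
    gumbel = lambda: -math.log(-math.log(rng.random()))
    scores = [-eta * L + gumbel() for L in cum_losses]
    return max(range(len(scores)), key=scores.__getitem__)

rng = random.Random(0)
losses = [3.0, 1.0, 2.0]   # illustrative cumulative losses of three experts
eta = 1.0

w = ftrl_weights(losses, eta)

# Empirical FTPL choice frequencies should approximate the FTRL weights.
n = 20000
counts = [0, 0, 0]
for _ in range(n):
    counts[ftpl_choice(losses, eta, rng)] += 1
freqs = [c / n for c in counts]
```

Here the stochasticity of FTPL's argmax reproduces FTRL's smooth distribution in expectation, which is one simple face of the stability connection the talk develops in greater generality.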
