Fall 2016

Learning as a Tool for Algorithm Design and Beyond-Worst-Case Analysis

Wednesday, November 16th, 2016, 12:00 pm – 12:40 pm


All known algorithms for solving NP-complete problems require exponential time in the worst case; nevertheless, these algorithms solve many problems of practical importance astoundingly quickly, and are hence relied upon in a broad range of applications. This talk is built around the observation that “Empirical Hardness Models”—statistical models that predict algorithm runtime on novel instances from a given distribution—work surprisingly well. These models can serve as powerful tools for algorithm design, specifically by facilitating automated methods for algorithm design and for constructing algorithm portfolios. They also offer a statistical alternative to beyond-worst-case analysis and a starting point for theoretical investigations.
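To make the idea concrete, an empirical hardness model in its simplest form is a regression from instance features to runtime, fit on instances the algorithm has already solved and then applied to novel instances. The sketch below is illustrative only: the single feature, the runtime measurements, and the choice of ordinary least squares on log-runtime are all assumptions for the example, not details from the talk (models in practice use many features and richer learners).

```python
import math

# Hypothetical training data: for each already-solved instance, one feature
# (say, the clause-to-variable ratio of a SAT instance) and its measured
# runtime in seconds. All numbers here are invented for illustration.
features = [2.0, 3.0, 4.0, 5.0, 6.0]
runtimes = [0.5, 1.1, 2.3, 4.9, 10.2]

# Runtimes span orders of magnitude, so it is common to model log-runtime.
# Fit y = a + b * x by ordinary least squares (closed form, one feature).
ys = [math.log(t) for t in runtimes]
n = len(features)
mean_x = sum(features) / n
mean_y = sum(ys) / n
b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(features, ys))
     / sum((x - mean_x) ** 2 for x in features))
a = mean_y - b * mean_x

def predict_runtime(x):
    """Predicted runtime (seconds) for a novel instance with feature value x."""
    return math.exp(a + b * x)

# Predict the runtime of an unseen instance from its feature alone,
# without actually running the solver on it.
print(predict_runtime(4.5))
```

A model like this is what enables the applications the abstract mentions: an algorithm portfolio can query each solver's predicted runtime on an incoming instance and dispatch to the one predicted to be fastest.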