Description

Computational complexity theory measures the complexity of a problem by the best possible asymptotic growth rate of the running time of a correct algorithm for its exact or approximate solution. The phenomena of NP-completeness and hardness of approximation often lead to a pessimistic conclusion. In practice, one seeks algorithms that perform well on typical instances, as measured by computational experiments or probabilistic analysis. This latter approach often yields an assessment very different from the complexity-theoretic yardstick. We will compare and contrast the two approaches for several canonical problems: satisfiability solving, linear programming, integer programming, the traveling salesman problem, bin packing, matching, and number partitioning.
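As a toy illustration of the "typical instance" viewpoint mentioned above, the sketch below (not drawn from the lecture; the problem size, trial count, and uniform random instances are illustrative assumptions) runs a simple greedy heuristic for number partitioning on random instances and reports its average discrepancy, the kind of measurement a computational experiment would supply, in contrast to a worst-case guarantee.

```python
import random

def greedy_partition(numbers):
    # Largest-first greedy heuristic: place each number, in decreasing
    # order, on the currently lighter side; return the final discrepancy.
    sums = [0.0, 0.0]
    for x in sorted(numbers, reverse=True):
        if sums[0] <= sums[1]:
            sums[0] += x
        else:
            sums[1] += x
    return abs(sums[0] - sums[1])

def average_discrepancy(n=1000, trials=200, seed=0):
    # "Typical instance" assessment: average the heuristic's discrepancy
    # over random instances rather than bounding its worst case.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        instance = [rng.random() for _ in range(n)]
        total += greedy_partition(instance)
    return total / trials

if __name__ == "__main__":
    print(f"average discrepancy on random instances: {average_discrepancy():.5f}")
```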

In celebration of the 50th anniversary of computer science at UC Berkeley and the university’s sesquicentennial, EECS and the Simons Institute are launching a special series of lectures by winners of the ACM A.M. Turing Award, considered the field’s equivalent of a Nobel Prize. In addition to their technical talks, the lecturers will reflect on their time at UC Berkeley and look toward the future of research and technological development in their fields.

Light refreshments will be served before the lecture at 3:30 p.m.
