The articles in this Spring 2016 issue of the Simons Institute Newsletter exhibit the theory of computation’s broad reach into the physical, biological and social sciences and its use of, and contribution to, deep mathematics. The authors describe connections to statistical physics, quantum physics, molecular biology and economics; applications of combinatorics, algebra, probability theory, statistics, machine learning, game theory and mathematical logic; and new insights into the power and limits of algorithms.
Our Research Vignette series showcases interesting research results that emerged during the most recent semester’s programs. Constantinos Daskalakis’s Vignette, “Simplicity, Optimality and Efficiency Tradeoffs in Economics,” arising from the Fall 2015 program on Economics and Computation, introduces economic mechanism design, the science of creating incentive structures to induce desired economic behavior. Applications include the design of markets, auctions, online advertising platforms, voting rules, and road and information networks. The essay focuses on combinatorial auctions, in which a set of indivisible items is to be sold to a group of buyers who have personal valuations for different subsets of the items. Daskalakis demonstrates that the problem of setting prices to achieve maximum total welfare is theoretically difficult, but shows that, in the case of repetitions of the same auction (such as the sale of online advertising slots), a certain “no-envy learning” mechanism achieves at least half the maximum possible total welfare.
Amir Abboud’s Research Vignette, “Strengthening SETH Lower Bounds,” on a result from the Fall 2015 program on Fine-Grained Complexity and Algorithm Design, notes that for many central computation problems arising in pattern matching, graph theory, computational geometry, data mining and economics, certain algorithms are believed, but not proven, to be essentially the most efficient possible. For example, there is a classical quadratic-time dynamic programming algorithm for computing the number of edit operations (insertions, deletions and substitutions) required to transform one sequence of n symbols into another. It is conjectured that no algorithm can achieve a significantly better worst-case running time. This conjecture would follow from the Strong Exponential Time Hypothesis (SETH), which states that every possible algorithm for the NP-complete satisfiability problem of propositional logic is no better in the worst case than brute-force enumeration of all possible solutions. While the truth of SETH is in doubt, Abboud describes a new result giving a more plausible hypothesis that yields the same conclusion.
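For readers curious about the algorithm in question, the classical quadratic-time dynamic program is the textbook edit-distance recurrence; here is a minimal Python sketch (an illustration only, not code from the Vignette):

```python
def edit_distance(a: str, b: str) -> int:
    """Classical O(n*m) dynamic program: d[i][j] is the minimum number
    of insertions, deletions and substitutions turning a[:i] into b[:j]."""
    n, m = len(a), len(b)
    d = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        d[i][0] = i  # delete all i symbols of a[:i]
    for j in range(m + 1):
        d[0][j] = j  # insert all j symbols of b[:j]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution or match
    return d[n][m]
```

The two nested loops over sequences of length n are exactly the quadratic running time that, under SETH (or Abboud’s more plausible hypothesis), cannot be significantly improved in the worst case.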
Alan Sly’s From the Inside article on the current program on Counting Complexity and Phase Transitions addresses problems of counting combinatorial arrangements subject to constraints. Well-known examples include counting the number of ways of coloring the vertices of a graph with three colors so that no two adjacent vertices are assigned the same color, and counting the number of satisfying assignments to a propositional formula. It turns out that such constraint satisfaction problems (CSPs) fall into three classes: the #P-complete problems, which are the hardest counting problems; problems that are solvable in polynomial time; and those problems on graphs that are solvable in polynomial time in the case of planar graphs, but not in general. Miraculously, every polynomial-time problem in the latter class is solvable by a holographic algorithm that reduces it to the efficiently solvable problem of counting perfect matchings in a planar graph.
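To make the counting problem concrete, here is a brute-force Python sketch that counts proper 3-colorings of a small graph (an illustrative example only; its exponential running time is precisely what the holographic and other polynomial-time algorithms discussed in the program avoid):

```python
from itertools import product

def count_3_colorings(n: int, edges: list) -> int:
    """Count assignments of colors {0, 1, 2} to vertices 0..n-1 such that
    no edge joins two vertices of the same color. Tries all 3**n
    assignments, so this is only feasible for tiny graphs."""
    return sum(
        all(c[u] != c[v] for u, v in edges)  # is this assignment proper?
        for c in product(range(3), repeat=n)
    )
```

For example, a triangle has six proper 3-colorings (the 3! ways of giving its vertices distinct colors), while a path on three vertices has twelve.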
Sly also discusses applications of methods from statistical physics. These include the analysis of random CSPs using methods originally used to study phase transitions such as the evaporation and freezing of water, and the Markov Chain Monte Carlo method of sampling from complicated multidimensional distributions.
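As one small illustration of the Markov Chain Monte Carlo approach in this setting, the following Python sketch performs a step of Glauber dynamics on proper q-colorings of a graph (an assumed example for exposition, not code from the program): repeatedly recoloring a random vertex with a color unused by its neighbors yields a random walk on proper colorings, and such chains are known to mix rapidly when q is sufficiently large relative to the maximum degree.

```python
import random

def glauber_step(colors: list, adj: dict, q: int) -> list:
    """One step of Glauber dynamics for proper q-colorings:
    pick a uniformly random vertex and recolor it with a uniformly
    random color not currently used by any of its neighbors."""
    v = random.randrange(len(colors))
    forbidden = {colors[u] for u in adj[v]}
    allowed = [c for c in range(q) if c not in forbidden]
    if allowed:  # if no color is available, leave the vertex unchanged
        colors[v] = random.choice(allowed)
    return colors
```

Each step preserves properness, so iterating from any proper coloring walks through the space of proper colorings, which is the sampling task the program connects to phase transitions in statistical physics.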
Our From the Inside series showcases programs currently underway at the Institute. Cenk Sahinalp’s From the Inside article on the parallel program on Algorithmic Challenges in Genomics gives a historical overview of genomics, which has its roots in the 1970s and has moved into a new era of Big Data science in the wake of the completion of the Human Genome Project and the emergence of next-generation sequencing technologies. Current research areas include cancer biology, the analysis of gene expression in cells, and network biology, which explores the web of interactions among cellular components. In pursuing these problems, the interactions among molecular biology, theoretical computer science, statistics and machine learning continue to deepen.
Finally, Luca Trevisan’s regular column, Looking Ahead, gives a preview of the programs planned for the coming semester. The Fall 2016 program on Logical Structures in Computation will involve researchers interested in logical models of probabilistic and quantum systems, in database theory, in algorithmic model theory and in reasoning about quantum and probabilistic systems. The program on Algorithms Under Uncertainty will explore the design and analysis of algorithms that do not have complete knowledge of their input, for example because the input is probabilistic or because it is revealed over time as the execution of the algorithm progresses.
I invite you to read further about our work, to watch the online videos from our workshops and public lectures, and to visit the Institute if your travels take you to Berkeley.
Richard Karp, Director
From the Inside: Counting Complexity and Phase Transitions
From the Inside: Algorithmic Challenges in Genomics
Research Vignette: Strengthening SETH Lower Bounds
Research Vignette: Simplicity, Optimality and Efficiency Tradeoffs in Economics
Looking Ahead: Logic and Uncertainty