News

In this month’s newsletter, we’re highlighting a 2015 talk by Chris Umans on some of the then state-of-the-art approaches to bound the matrix...

Greetings from Berkeley! We are gearing up for a busy wrap-up of the spring semester, with five back-to-back workshop weeks at the Simons Institute...

Ten years ago, researchers proved that adding a full memory can theoretically aid computation. They’re just now beginning to understand the implications...

This week we say goodbye to the participants in our programs on Computational Complexity of Statistical Inference, and on Geometric Methods in Optimization and Sampling. It’s been a gift to share Calvin Lab with them this fall for our first in-person programs since early 2020.

In August 2014, a significant advance in computing made the cover of the journal Science. It was IBM’s 5.4 billion-transistor chip that had a million hardware neurons and 256 million synapses. Algorithms running on this “neuromorphic” chip, when fed a video stream, could identify multiple objects, such as people, bicycles, trucks, and buses. Crucially, the hardware neural network consumed a mere 63 milliwatts, about 176,000 times less energy per synaptic event than the same network simulated on a general-purpose microprocessor.

Radcliffe fellows with divergent backgrounds came together with the goal of deciphering the clicks of sperm whales.

The L'Oréal-UNESCO For Women in Science Festival will take place December 7, 2021. Join us there!

It is natural to believe that an accurate model for a certain phenomenon can always be found, given enough data. How much data is "enough"? Somewhat tautologically: the data must contain enough information to identify the right model. This intuition can be made precise using statistics and information theory.
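As a rough illustration of how that intuition becomes quantitative (this sketch is my own, not from the article): to distinguish two candidate models reliably, the number of samples needed scales inversely with the KL divergence between them. The function names and the 0.05 error target below are illustrative choices, using a Chernoff/Stein-style estimate for two coin-bias models.

```python
import math

def kl_bernoulli(p, q):
    """KL divergence KL(Ber(p) || Ber(q)) in nats."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def samples_needed(p, q, delta=0.05):
    """Rough sample count to tell Ber(p) from Ber(q) with error
    probability about delta (Chernoff/Stein-style estimate)."""
    return math.ceil(math.log(1 / delta) / kl_bernoulli(p, q))

# Distinguishing a fair coin from a 60%-heads coin takes on the
# order of 150 flips; a 51%-heads coin takes vastly more data.
print(samples_needed(0.5, 0.6))
print(samples_needed(0.5, 0.51))
```

The closer the two models are in KL divergence, the more data "enough" turns out to be, which is the tautology in the blurb made precise.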

This week, an article showing that sleeping longer than 6.5 hours a night may be bad for your health popped up on my Google feed. It made me realize yet again why theory and mathematics are so satisfying — at least the soundness of theorems doesn’t change every decade.

“More and more, I’m starting to wonder whether P equals NP,” says Toniann Pitassi, a computer scientist at the University of Toronto and a former PhD student of Stephen Cook’s. Her approach is to circle the problem by studying both scaled-up and scaled-down analogues: harder and easier models. “Sometimes generalizing the question makes it clearer,” she says. But overall, she hasn’t achieved clarity: “Most people think P doesn’t equal NP. And I don’t know. Maybe it’s just me, but I feel like it’s become less and less clear that that’s the truth.” (Published with permission from MIT Technology Review.)

A computer scientist at the University of Pennsylvania and a consultant to the Algorand Foundation, Tal Rabin conducts research on cryptography. In this episode of Polylogues, Simons Institute Journalist in Residence Siobhan Roberts sits down with Rabin to discuss her career and key developments in cryptographic research.

Erik and Martin Demaine, a father-and-son team of “algorithmic typographers,” have confected an entire suite of mathematically inspired typefaces.