The ubiquity of large data sets has had a significant impact on the design of algorithms and has led to the emergence of computational models that capture various aspects of massive-data computation, most notably that the data is too large to store or view in its entirety. Such models include streaming algorithms, sublinear time algorithms, sublinear query time algorithms with preprocessing, property testing algorithms, and distributed and massively parallel algorithms. Though each of these models has somewhat different properties, what is common to all of them is that algorithms compute and make decisions based only on small local portions of the input.

In the streaming setting, large data streams go by and the algorithm has limited space in which to store information and intermediate computations; sketching algorithms compress the input in such a way that certain operations can still be supported on the sketches; sublinear time algorithms approximate a parameter of the data after viewing only a minuscule fraction of it; local algorithms compute and make decisions on parts of the output after considering only a portion of the input; property testing algorithms quickly determine which properties are held by the data; local distributed algorithms require each processor to output a decision after only a small number of rounds of interaction with nearby processors, and hence based on inputs within a small local radius; the Massively Parallel Computation model theoretically abstracts parallel cluster computing in practical settings, where many commodity-grade computers interact to solve large-scale problems; and distribution testing algorithms infer statistical properties of the data from a number of samples sublinear in the size of the domain.
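To make the streaming model concrete, here is a minimal illustrative sketch (not part of the program description itself) of reservoir sampling, a classical streaming algorithm that maintains a uniform random sample of a stream using memory independent of the stream's length; the function and parameter names below are our own choices.

```python
import random

def reservoir_sample(stream, k, rng=None):
    """Maintain a uniform random sample of k items from a stream
    of unknown length, using only O(k) memory."""
    rng = rng or random.Random(0)
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            # Fill the reservoir with the first k items.
            reservoir.append(item)
        else:
            # Replace a stored item with probability k / (i + 1),
            # which keeps every stream item equally likely to survive.
            j = rng.randrange(i + 1)
            if j < k:
                reservoir[j] = item
    return reservoir

# The algorithm sees each of the million items exactly once,
# yet stores only five of them at any time.
sample = reservoir_sample(range(10**6), 5)
```

The same space-versus-accuracy tradeoff, here trading a full copy of the data for a small representative sample, is the central concern in each of the models above.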

Sublinear algorithms in the above models have been studied for at least two decades, and a variety of very powerful tools have been developed in each. The past few years have seen a flurry of exciting developments and results, as well as the establishment of new connections between these sublinear models and common algorithmic and analytic techniques. Further, new paradigms have emerged that incorporate new resources, constraints, or objectives into the sublinear setting, ranging from robustness, privacy, and data erasure to interactive proofs and oracle-aided algorithms. The program will bring together researchers from the different communities of sublinear algorithms. After a bootcamp that will foster a common language, there will be three workshops that focus on trends cutting across the communities: (1) Sublinear graph simplification, (2) Sublinear algorithms with help, and (3) Extroverted sublinear algorithms.

Organizers:
Clément Canonne (University of Sydney), Artur Czumaj (University of Warwick), Piotr Indyk (Massachusetts Institute of Technology), Jelani Nelson (UC Berkeley), Noga Ron-Zewi (University of Haifa), Ronitt Rubinfeld (Massachusetts Institute of Technology), Asaf Shapira (Tel-Aviv University)

Long-Term Participants (tentative, including organizers):
Jayadev Acharya (Cornell University), Sepehr Assadi (University of Waterloo), Arnab Bhattacharyya (National University of Singapore), Clément Canonne (University of Sydney), Artur Czumaj (University of Warwick), Jacob Fox (Stanford University), Mohsen Ghaffari (Massachusetts Institute of Technology), Elena Grigorescu (Purdue University), Tom Gur (University of Cambridge), Venkat Guruswami (UC Berkeley), Piotr Indyk (Massachusetts Institute of Technology), Michael Kapralov (EPFL), Sanjeev Khanna (University of Pennsylvania), Robert Krauthgamer (Weizmann Institute of Science), Ravi Kumar (Google), Jelani Nelson (UC Berkeley), Ilan Newman (University of Haifa), Seth Pettie (University of Michigan), Sofya Raskhodnikova (Boston University), Noga Ron-Zewi (University of Haifa), Guy Rothblum (Weizmann Institute), Ronitt Rubinfeld (Massachusetts Institute of Technology), Rocco Servedio (Columbia University), Asaf Shapira (Tel-Aviv University), Madhu Sudan (Harvard University), Li-Yang Tan (Stanford University), David Woodruff (Carnegie Mellon University), Ning Xie (Florida International University), Yuichi Yoshida (National Institute of Informatics)