Communication and computation are intrinsically intertwined. To compute, one must often communicate; to communicate reliably, one must compute. It is no wonder that information theory, founded by Claude Shannon in 1948 as the mathematical underpinning of communication engineering, has long-standing ties to the theory of computing. From an engineering point of view, information theory focuses on how much information a system can extract, providing a viewpoint complementary to that of the theory of computing, which focuses on how efficiently such information can be extracted.
Traditionally, much of the research in this area has been done by two separate communities, one in electrical engineering and one in theoretical computer science. The purpose of the program is to strengthen the intellectual ties between these two communities and to exploit their complementary viewpoints to solve engineering and fundamental mathematical problems of current interest. In particular, the program will explore three main themes:

(1) Information theoretic techniques in complexity theory and combinatorics, focusing particularly on areas where there have been significant recent advances, such as information theoretic lower bounds in communication complexity and streaming.

(2) Coding theory and applications, focusing on new requirements driven by modern applications, as well as the cross-fertilization of concepts recently developed in one of the two communities (e.g., polar codes, spatial coupling, sublinear decoding, computationally bounded channels).

(3) Information theory, machine learning, and big data, focusing on the problem of learning high-dimensional structures from multiple points of view: high-dimensional statistics, large-alphabet compression, compressed sensing, and sublinear sampling.
The program will benefit from significant participation by the Center for Science of Information (an NSF Science and Technology Center), including its sponsorship of the workshop on Information Theory, Learning and Big Data.