Abstract

Ran Raz: We will talk about the interactive channel capacity of a noisy channel: the minimal ratio between the communication complexity of a problem over a noiseless channel and the communication complexity of the same problem over the noisy channel (as the communication complexity tends to infinity).
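
One rough way to write this quantity in symbols (the notation C(eps), CC(f), and CC_eps(f) is ours, not from the abstract): with CC(f) the communication complexity of a problem f over the noiseless channel and CC_eps(f) its communication complexity over the noisy channel with error rate eps,

    C(\varepsilon) \;=\; \liminf_{\mathrm{CC}(f)\to\infty} \frac{\mathrm{CC}(f)}{\mathrm{CC}_{\varepsilon}(f)}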

We will discuss a lower bound on communication complexity over the binary symmetric channel that shows a separation between interactive and non-interactive channel capacity (for a small enough error rate).


Bernhard Haeupler: This talk presents a new, natural, and extremely simple way to obtain coding schemes for interactive communication that avoid the use of tree codes and lead to the first coding scheme with a good communication rate against adversarial noise.

The protocol essentially consists of both parties carrying out their original conversation as-is (without any coding), interspersed only with short exchanges of hash values. When the hash values do not match, the parties backtrack. We feel this protocol gives the simplest explanation of how and why interactive coding schemes for adversarial noise are possible.
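
As a toy illustration of this hash-and-backtrack idea, the sketch below (in Python, with invented names such as simulate and short_hash; it is not the paper's actual scheme) has Alice and Bob run their conversation uncoded in blocks, exchange a short hash of their transcripts after each block, and rewind one block whenever the hashes disagree. For simplicity it assumes the hash exchange itself arrives intact, which a scheme for adversarial noise cannot assume.

    import hashlib

    def short_hash(transcript):
        # Short hash of the transcript so far.  A SHA-256 prefix is only a
        # stand-in; the actual scheme uses carefully chosen short hashes.
        return hashlib.sha256("".join(transcript).encode()).hexdigest()[:4]

    def simulate(target_len=12, block=3, corrupted_transmission=5):
        # Toy simulation: Alice and Bob keep their own views of the transcript.
        # The adversary flips one transmitted bit, making the views diverge; a
        # later hash mismatch makes both parties discard the last block and redo it.
        alice, bob = [], []
        transmissions = 0
        while len(alice) < target_len:
            # Run `block` rounds of the original conversation, with no coding.
            for _ in range(block):
                bit = str(len(alice) % 2)        # stand-in for the real next message
                received = bit
                if transmissions == corrupted_transmission:
                    received = "1" if bit == "0" else "0"   # one adversarial bit flip
                alice.append(bit)
                bob.append(received)
                transmissions += 1
            # Short hash exchange; on mismatch both parties backtrack one block.
            if short_hash(alice) != short_hash(bob):
                del alice[-block:]
                del bob[-block:]
        return alice == bob

    print(simulate())   # True: both parties end with consistent transcripts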

Our coding scheme achieves a communication rate of 1 - Theta(sqrt(eps log log(1/eps))), even outperforming the rate achieved for random noise by the coding scheme of [Kol, Raz, STOC'13]. For random, oblivious, or computationally bounded noise, the communication rate becomes 1 - Theta(sqrt(eps)). Despite their odd form, we conjecture these rates to be optimal in their respective settings.
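
In display form, the two communication rates stated above (only typesetting the abstract's expressions, no new claims):

    1-\Theta\!\left(\sqrt{\varepsilon\,\log\log(1/\varepsilon)}\right) \quad \text{(adversarial noise)}, \qquad
    1-\Theta\!\left(\sqrt{\varepsilon}\right) \quad \text{(random, oblivious, or computationally bounded noise)}.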

This talk is based on the FOCS'14 paper "Interactive Channel Capacity Revisited".

 
