Abstract

Bogdanov and Mossel (2011) consider the following problem.
 
"Suppose Alice receives a string of unbiased independent random bits and Bob receives a noisy copy of the same bits, where each bit is flipped with probability eps < 1/2. Alice and Bob wish to extract a common sequence of k random bits."
 
We study the relationship between communication and the probability of agreement. Suppose Alice wishes to send Bob a message of delta k bits in order to ensure that their k-bit outputs agree with probability 2^{-gamma k}. How big must delta be as a function of gamma? We show the following:
 
delta(gamma) >= C (1 - gamma) - 2 sqrt{C (1 - C) gamma},
 
where C = 4 eps (1 - eps).
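As a quick numeric sketch of this lower bound (the function name below is ours, introduced only for illustration), one can evaluate the right-hand side for a fixed eps and several values of gamma:

```python
import math

def delta_lower_bound(gamma: float, eps: float) -> float:
    """Evaluate C*(1 - gamma) - 2*sqrt(C*(1 - C)*gamma), where C = 4*eps*(1 - eps)."""
    C = 4 * eps * (1 - eps)
    return C * (1 - gamma) - 2 * math.sqrt(C * (1 - C) * gamma)

eps = 0.1  # each bit flipped with probability 0.1
for gamma in (0.0, 0.05, 1 / 9, 0.5):
    print(f"gamma={gamma:.3f}  delta >= {delta_lower_bound(gamma, eps):+.4f}")
```

At gamma = 0 the bound equals C (here 0.36), it decreases as gamma grows, and it crosses zero exactly at gamma = eps/(1-eps) (here 1/9); beyond that point the bound is negative, i.e. vacuous, so no communication is needed.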
 
This implies that for delta(gamma) = 0 we must have gamma >= eps/(1-eps), recovering the original result of Bogdanov and Mossel.
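The delta(gamma) = 0 case can be checked directly from the bound above: squaring and substituting C = 4 eps (1 - eps), so that 1 - C = (1 - 2 eps)^2, the inequality factors (a short verification using only the definitions already stated):

```latex
\begin{align*}
0 &\ge C(1-\gamma) - 2\sqrt{C(1-C)\gamma}
  &&\text{(set } \delta(\gamma)=0\text{)}\\
C^2(1-\gamma)^2 &\le 4\,C(1-C)\,\gamma
  &&\text{(square both sides)}\\
\varepsilon(1-\varepsilon)(1-\gamma)^2 &\le (1-2\varepsilon)^2\,\gamma
  &&\text{(divide by } 4C;\ 1-C=(1-2\varepsilon)^2\text{)}\\
\bigl((1-\varepsilon)\gamma-\varepsilon\bigr)\bigl(\varepsilon\gamma-(1-\varepsilon)\bigr) &\le 0.
  &&\text{(rearrange and factor)}
\end{align*}
```

Since eps < 1/2 and gamma <= 1, the second factor is strictly negative, so the first factor must be nonnegative, which is exactly gamma >= eps/(1-eps).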
 
In this talk, we will describe the above trade-off, whose proof is based on the standard hypercontractivity inequality. We also present strategies showing that this trade-off between communication and the probability of agreement is asymptotically tight.
 
(This is part of joint work with Venkat Guruswami.)