Abstract

In the reliable transmission problem, a sender, Alice, wishes to transmit a bit-string x to a remote receiver, Bob, over a binary channel with adversarial noise. The solution to this problem is to encode x using an error-correcting code. Since it has long been known that the relative distance of binary codes is at most 1/2, reliable transmission is possible only if the channel corrupts (flips) at most a 1/4-fraction of the communicated bits.
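For background, the 1/4 barrier follows from the standard unique-decoding argument; the sketch below writes $\delta$ for the relative distance of the code and $\rho$ for the fraction of bits the adversary may flip (notation introduced here, not taken from the abstract):

\[
  \rho < \frac{\delta}{2}
  \quad\text{and}\quad
  \delta \le \frac{1}{2}
  \quad\Longrightarrow\quad
  \rho < \frac{1}{4}.
\]

Unique decoding requires $\rho < \delta/2$, since an adversary flipping a $\delta/2$-fraction of bits could produce a received string equidistant from two codewords; combined with the 1/2 bound on the relative distance of binary codes, this caps one-way resilience below 1/4.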

We revisit the reliable transmission problem in the two-way setting, where both Alice and Bob can send bits to each other. Our main result is the construction of two-way error-correcting codes that are resilient to a constant fraction of corruptions strictly larger than 1/4. Moreover, our code has constant rate and requires Bob to send only one short message.

Curiously, our new two-way code requires a fresh perspective on classical error-correcting codes: while classical codes have only one distance guarantee for all pairs of codewords (i.e., the minimum distance), we construct codes where the distance between a pair of codewords depends on the "compatibility" of the messages they encode. We also prove that such codes are necessary for our result.
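One hypothetical way to formalize this notion (an illustration only, not the paper's definition; the predicate and thresholds below are placeholders introduced here): for a code $C\colon \{0,1\}^k \to \{0,1\}^n$, a symmetric "compatibility" predicate on message pairs, and two thresholds $\delta_1, \delta_2$, one could require

\[
  \Delta\bigl(C(x), C(y)\bigr) \;\ge\;
  \begin{cases}
    \delta_1 \cdot n, & \text{if } x \text{ and } y \text{ are compatible},\\[2pt]
    \delta_2 \cdot n, & \text{otherwise},
  \end{cases}
\]

where $\Delta$ denotes Hamming distance, in place of a single minimum-distance guarantee that holds uniformly for all pairs.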
