Abstract
A basic combinatorial interpretation of Shannon's entropy function is via the ``20 questions'' game.
This cooperative game is played by two players, Alice and Bob:
Alice picks a distribution $\pi$ over the numbers $\{1,\ldots,n\}$,
and announces it to Bob.
She then chooses a number $x$ according to $\pi$,
and Bob attempts to identify $x$ using as few
Yes/No queries as possible, on average.
An optimal strategy for the ``20 questions'' game is given by a Huffman code for $\pi$:
Bob's questions reveal the codeword for $x$ bit by bit.
This strategy finds $x$ using fewer than $H(\pi)+1$ questions on average.
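More precisely, since Bob asks exactly one question per codeword bit, the expected number of questions equals the expected Huffman codeword length; writing $\ell(x)$ for the length of the Huffman codeword of $x$ (notation introduced here only for illustration), the standard Huffman bound gives
\[
\sum_{x=1}^{n} \pi(x)\,\ell(x) \;<\; H(\pi)+1,
\qquad\text{where } H(\pi)=\sum_{x=1}^{n}\pi(x)\log_2\frac{1}{\pi(x)}.
\]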
However, the questions asked by Bob could be arbitrary.
In this paper, we investigate the following question:
\emph{Are there restricted sets of questions that match the performance of Huffman codes, either exactly or approximately?}
Our first main result shows that for every distribution $\pi$,
Bob has a strategy that uses only questions of the form ``$x < c$?'' and ``$x = c$?'',
and uncovers $x$ using at most $H(\pi)+1$ questions on average,
matching the performance of Huffman codes in this sense.
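As a point of reference for this restricted query model, the following minimal sketch (in Python; it is \emph{not} the strategy from the paper, and the names \texttt{ask\_less\_than} and \texttt{weighted\_binary\_search} are chosen only for this illustration) uses comparison questions alone: Bob repeatedly asks ``$x < c$?'' at a cut $c$ splitting the remaining probability mass roughly in half. Strategies of this simple kind are known to come within an additive constant of $H(\pi)$; the $H(\pi)+1$ bound above relies on a more careful strategy that also uses the equality questions.
\begin{verbatim}
# A minimal sketch (not the strategy from the paper): Bob uses only
# comparison questions "x < c?", cutting the remaining probability
# mass roughly in half at every step.

def ask_less_than(x, c):
    """Oracle for the question "x < c?"; x is Alice's hidden number."""
    return x < c

def weighted_binary_search(pi, x):
    """Identify x in {0, ..., n-1} using only "x < c?" questions.

    pi -- list of probabilities, pi[i] = Pr[x = i]
    x  -- Alice's hidden number
    Returns the number of questions asked.
    """
    lo, hi = 0, len(pi)          # invariant: x lies in [lo, hi)
    questions = 0
    while hi - lo > 1:
        # pick a cut c in (lo, hi) splitting the remaining mass roughly in half
        total, acc = sum(pi[lo:hi]), 0.0
        for i in range(lo, hi - 1):
            acc += pi[i]
            if acc >= total / 2:
                c = i + 1
                break
        else:
            c = hi - 1
        questions += 1
        if ask_less_than(x, c):
            hi = c               # "yes": x lies below the cut
        else:
            lo = c               # "no": x lies at or above the cut
    return questions
\end{verbatim}
For the dyadic distribution $\pi = (\tfrac12, \tfrac14, \tfrac18, \tfrac18)$, for instance, this sketch asks $0.5\cdot 1 + 0.25\cdot 2 + 0.125\cdot 3 + 0.125\cdot 3 = 1.75 = H(\pi)$ questions on average.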
We also give a natural set of $O(rn^{1/r})$ questions with which Bob can find $x$ using at most $H(\pi)+r$ questions on average,
and show that $\Omega(rn^{1/r})$ questions are required to achieve such a guarantee.
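For instance, taking $r=2$ gives $O(\sqrt{n})$ questions with a guarantee of $H(\pi)+2$, while taking $r=\log n$ gives $O(\log n)$ questions with a guarantee of $H(\pi)+\log n$.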
Our second main result gives a set $\Q$ of $1.25^{n+o(n)}$ questions such that
for every distribution $\pi$, Bob can implement an \emph{optimal} strategy for $\pi$ using only questions from $\Q$.
We also show that $1.25^{n-o(n)}$ questions are needed, for infinitely many $n$.
Here, the set $\Q$ is derived via a random construction.
Time permitting, we will discuss the challenge of derandomizing
this construction, and a connection with hitting maximal antichains.
Based on joint work with Yuval Dagan, Yuval Filmus, and Ariel Gabizon.