Estimating the entropy of a distribution from independent samples is an important problem with varied applications, including neural signal processing, extraction of secret keys from noisy observations, and image segmentation. The estimation of Shannon entropy in particular has received considerable attention, and recent results establish sharp bounds on the number of samples needed for its reliable estimation. In this talk, we will consider the sample complexity of estimating a generalization of Shannon entropy, namely the Rényi entropy, and will establish almost tight lower and upper bounds on it. As a consequence, we will note that estimating the Rényi entropy of order 2 requires a number of samples that grows roughly as the square root of the support size of the underlying distribution, the least among Rényi entropies of all orders.
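As an illustrative sketch (not necessarily the estimator analyzed in the talk), the Rényi entropy of order 2 is H₂(P) = −log Σᵢ pᵢ², and the sum Σᵢ pᵢ² is the collision probability, which the standard collision-counting estimator approximates: the fraction of coincident unordered sample pairs is an unbiased estimate of Σᵢ pᵢ². A minimal Python version, with a hypothetical uniform test distribution:

```python
import math
import random
from collections import Counter

def renyi2_entropy(probs):
    """Exact Rényi entropy of order 2: H2 = -log(sum_i p_i^2)."""
    return -math.log(sum(p * p for p in probs))

def estimate_renyi2(samples):
    """Estimate H2 from i.i.d. samples via collision counting.

    The fraction of coincident unordered pairs among the samples is an
    unbiased estimate of the collision probability sum_i p_i^2.
    """
    n = len(samples)
    counts = Counter(samples)
    collisions = sum(c * (c - 1) // 2 for c in counts.values())
    p2_hat = collisions / (n * (n - 1) / 2)
    return -math.log(p2_hat)

# Hypothetical example: uniform distribution on a support of size 100,
# for which H2 = log(100) exactly.
random.seed(0)
k = 100
probs = [1.0 / k] * k
samples = [random.randrange(k) for _ in range(5000)]
print(renyi2_entropy(probs))     # exactly log(100) ≈ 4.605
print(estimate_renyi2(samples))  # should be close to log(100)
```

The square-root sample complexity noted above is intuitive here: by a birthday-paradox argument, collisions start to appear once the number of samples is on the order of the square root of the support size, which is when the collision count becomes informative.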
