This series of talks is part of the Information Theory Boot Camp. Videos for each talk will be available through the links above.
Speaker: Alon Orlitsky, UC San Diego
Distribution estimation underlies many scientific and engineering endeavors and has therefore been studied extensively for centuries. Yet faced with modern challenges in compression, learning, and statistics, researchers have recently revisited this topic with new perspectives and emphases. Taking an information-theoretic perspective, we will explore this exciting and rapidly evolving area by addressing several related questions: What is the best estimation rate? When can structure help? Why may few samples suffice? Which distribution properties can be quickly evaluated? And how do these results apply to basic learning and compression tasks? Where possible, examples will accompany explanations, and we will close with several open problems.
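As a concrete illustration of the estimation problem the abstract poses (this sketch is not drawn from the talk itself, and all function names are our own), the snippet below compares the classical empirical (maximum-likelihood) estimator with a simple add-one (Laplace) smoothed estimator, measuring L1 distance from an assumed uniform ground-truth distribution:

```python
import random
from collections import Counter

def empirical_estimate(samples, support):
    """Empirical (maximum-likelihood) estimator: relative frequencies."""
    counts = Counter(samples)
    n = len(samples)
    return {x: counts[x] / n for x in support}

def laplace_estimate(samples, support):
    """Add-one (Laplace) smoothed estimator: never assigns probability zero."""
    counts = Counter(samples)
    n, k = len(samples), len(support)
    return {x: (counts[x] + 1) / (n + k) for x in support}

def l1_error(p, q):
    """Total variation-style L1 distance between two distributions."""
    return sum(abs(p[x] - q[x]) for x in p)

random.seed(0)
support = list(range(10))
true_p = {x: 1 / 10 for x in support}      # assumed uniform ground truth
samples = random.choices(support, k=50)    # few samples relative to support

print("empirical L1 error:", l1_error(true_p, empirical_estimate(samples, support)))
print("laplace   L1 error:", l1_error(true_p, laplace_estimate(samples, support)))
```

With few samples, symbols that happen to be unseen get probability zero under the empirical estimator, which is one reason the talk's question of "why may few samples suffice" is subtle; smoothed estimators such as Laplace's trade a small bias for better behavior in that regime.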