About

Organizer:
Lalitha Sankar (Arizona State University)

Information-theoretic measures and methods play a vital role in quantifying the leakage of sensitive information in a variety of applications. In ‘context-aware’ data-sharing settings, i.e., settings wherein data features and distributions can be implicitly learned and exploited, information theory can: (i) serve as a design driver for mechanisms that process datasets while providing algorithm-independent (or algorithm-dependent, if so desired) privacy/fairness guarantees; (ii) delineate fundamental privacy-utility trade-offs and achieve better trade-offs than worst-case notions of privacy allow; and (iii) provide a rigorous foundation for quantifying and controlling information leakage over multiple disclosures (composition) and under adversarial side knowledge.
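
As one concrete instance of the trade-off in (ii) (an illustrative formulation from the literature, not part of the symposium program), consider the well-known privacy funnel: a release Y of data X correlated with a sensitive attribute S (Markov chain S - X - Y) is designed to leak as little about S as possible while retaining a prescribed level of utility,

\[
\min_{P_{Y \mid X}} \; I(S;Y) \quad \text{subject to} \quad I(X;Y) \ge \theta,
\]

where I(·;·) denotes mutual information and θ is the required utility level. Sweeping θ traces out exactly the kind of privacy-utility frontier described in (ii).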

This two-day symposium will review emerging research that uses information theory to understand context-aware approaches to privacy and fairness. First, recent results on operationally motivated information measures will be presented, followed by a collection of theoretical tools that capture the role of adversarial side information in privacy and the limits of privacy assurance. These tools will then be used to demonstrate the design of private and fair data representations with theoretical guarantees using generative models. Discussions will also cover the relationship between the information bottleneck and privacy/fairness trade-off problems. Finally, the symposium will highlight connections between information-theoretic measures for privacy and differential privacy.
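
As a concrete illustration of an operationally motivated measure (an example included here for clarity; the specific measures covered in the talks may differ), maximal leakage upper-bounds the multiplicative gain in an adversary's probability of correctly guessing any function of the data X after observing the release Y. For finite alphabets it admits the closed form

\[
\mathcal{L}(X \to Y) \;=\; \log \sum_{y} \max_{x \,:\, P_X(x) > 0} P_{Y \mid X}(y \mid x),
\]

a quantity that depends on the joint distribution only through the channel P_{Y|X} and the support of X, and that satisfies convenient composition bounds over multiple disclosures.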

Invited Speakers:
Flavio du Pin Calmon (Harvard University), Kamalika Chaudhuri (UCSD), Peter Kairouz (Google AI), Lalitha Sankar (Arizona State University), Aaron Wagner (Cornell University)