Differential privacy protects individuals against privacy harms, but that protection comes at a cost to the accuracy of released data. We consider settings in which released data are used to decide who (i.e., which groups of people) will receive a desired resource or benefit. We show that if decisions are made using privately released data, the noise added to achieve privacy may disproportionately impact some groups over others. Thus, while differential privacy offers equal privacy protection to participating individuals, it may result in disparities in utility across groups, with potentially serious consequences for affected individuals. The talk will explain how disparities in accuracy can be caused by commonly used privacy mechanisms and will highlight some of the social choices that arise in the design and configuration of privacy mechanisms.
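As a simple illustration of how uniform noise can produce unequal accuracy (a sketch for intuition, not an example from the talk itself): the standard Laplace mechanism adds noise whose scale depends only on the query's sensitivity and the privacy budget, not on the size of the group being counted, so smaller groups incur larger relative error.

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_relative_error(true_count, epsilon, sensitivity=1.0, n_trials=10_000):
    """Release a count via the Laplace mechanism many times and
    return the mean relative error of the noisy releases."""
    noise = rng.laplace(0.0, sensitivity / epsilon, size=n_trials)
    noisy_counts = true_count + noise
    return float(np.mean(np.abs(noisy_counts - true_count)) / true_count)

epsilon = 0.1
# Hypothetical group sizes, e.g. allocating a benefit by region:
large_group = mean_relative_error(10_000, epsilon)  # populous region
small_group = mean_relative_error(100, epsilon)     # sparsely populated region

print(f"relative error, large group: {large_group:.4f}")
print(f"relative error, small group: {small_group:.4f}")
```

Both counts receive noise of the same absolute magnitude (expected |noise| = sensitivity/epsilon = 10 here), so the small group's relative error is roughly 100 times the large group's; decisions thresholded on the noisy counts are correspondingly less reliable for the smaller group.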
Based on joint work with Ashwin Machanavajjhala, Michael Hay, Ryan McKenna, David Pujol, and Satya Kuppam.