Alex Wood (Harvard)
Lawyers and computer scientists hold very different notions of privacy. Notably, privacy laws rely on narrower and less formal conceptions of risk than those described in the computer science literature. As a result, the law often creates uncertainty and fails to protect against the full range of data privacy risks. Moreover, demonstrating that formal privacy models such as differential privacy satisfy legal requirements for privacy protection is a significant challenge due to conceptual gaps between the legal and technical definitions.
This presentation illustrates the gaps between legal and technical approaches to privacy and discusses how the use of differential privacy can be understood to be sufficient to satisfy a wide range of legal and policy requirements despite these definitional gaps. It draws on specific examples of privacy requirements from a selection of laws, including the Family Educational Rights and Privacy Act (FERPA), the Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule, Title 13 of the U.S. Code (governing the U.S. Census Bureau), and the Confidential Information Protection and Statistical Efficiency Act (CIPSEA). Key concepts from these regulatory requirements that are found to be relevant to privacy in statistical analysis include, among others, personally identifiable information, de-identification, singling out, linkage, inference, identification risk, expert determination, consent and opting out, and purpose and access limitations.
While none of these legal and policy concepts refer directly to differential privacy, the differential privacy guarantee can be interpreted to address these concepts while accommodating differences in how they are defined and interpreted. The presentation also describes work towards an approach to formally proving that a technological method for privacy protection satisfies the requirements of a particular law. The approach involves two steps: first, translating a legal standard into a formal mathematical requirement of privacy and, second, constructing a rigorous proof that a technique satisfies the mathematical requirement derived from the law. This approach is demonstrated with an example bridging the requirements of FERPA and differential privacy. Additionally, a series of examples shows how policymakers and privacy practitioners can interpret the differential privacy guarantee as sufficient to satisfy legal and policy requirements that rely on normative privacy concepts.
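For readers unfamiliar with the technical side of this translation exercise, the formal guarantee at issue is the standard definition of differential privacy: a randomized mechanism's output distribution must be nearly unchanged when any single individual's record is added to or removed from the data. In symbols (using the usual notation; the talk itself may present this differently):

```latex
\textit{A randomized mechanism } M \textit{ is } \varepsilon\textit{-differentially private if,
for all pairs of datasets } D, D' \textit{ differing in one individual's record,
and for all sets of outputs } S \subseteq \mathrm{Range}(M),
\[
  \Pr[M(D) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[M(D') \in S].
\]
```

It is this quantified, worst-case bound over all neighboring datasets that must be reconciled with legal standards phrased in terms of concepts such as identifiability and linkage.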