Differential Privacy: A Primer for a Non-Technical Audience


Wood A, Altman M, Bembenek A, Bun M, Gaboardi M, Honaker J, O'Brien DR, Steinke T, Vadhan S. Differential Privacy: A Primer for a Non-Technical Audience. Vanderbilt Journal of Entertainment and Technology Law (JETlaw) [Internet]. 2018;21 :209-276.
Differential privacy is a formal mathematical framework for quantifying and managing privacy risks. It provides provable privacy protection against a wide range of potential attacks, including those currently unforeseen. Differential privacy is primarily studied in the context of the collection, analysis, and release of aggregate statistics. These range from simple statistical estimations, such as averages, to machine learning. Tools for differentially private analysis are now in early stages of implementation and use across a variety of academic, industry, and government settings. Interest in the concept is growing among potential users of the tools, as well as within legal and policy communities, as it holds promise as a potential approach to satisfying legal requirements for privacy protection when handling personal information. In particular, differential privacy may be seen as a technical solution for analyzing and sharing data while protecting the privacy of individuals in accordance with existing legal or policy requirements for de-identification or disclosure limitation.
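To make the idea of a differentially private aggregate statistic concrete, the following is a minimal sketch (not taken from the paper) of the standard Laplace mechanism applied to a mean. The function names `laplace_noise` and `private_mean` and the parameters are illustrative assumptions; the key idea is that calibrated random noise, scaled to the query's sensitivity and a privacy parameter epsilon, is added to the true statistic before release.

```python
import random


def laplace_noise(scale):
    # The difference of two i.i.d. exponential draws with rate 1/scale
    # follows a Laplace(0, scale) distribution.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)


def private_mean(values, lower, upper, epsilon):
    """Release a differentially private mean of `values`.

    Values are clamped to [lower, upper] so that any one individual's
    contribution to the mean is bounded, which bounds the sensitivity.
    Smaller epsilon means more noise and stronger privacy.
    """
    clamped = [min(max(v, lower), upper) for v in values]
    true_mean = sum(clamped) / len(clamped)
    # Changing one person's record shifts the mean by at most this amount.
    sensitivity = (upper - lower) / len(clamped)
    return true_mean + laplace_noise(sensitivity / epsilon)
```

With a large dataset or a large epsilon the released mean is close to the true mean; with a small epsilon the added noise dominates, which is the formal trade-off between accuracy and privacy that the paper discusses.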



