Differential Privacy and the Power of (Formalizing) Negative Thinking
Differential privacy is a promise, made by a data curator to a data subject: you will not be affected, adversely or otherwise, by allowing your data to be used in any study, no matter what other studies, data sets, or information from other sources is, or may become, available. This talk describes the productive role played by negative results in the formulation of differential privacy and the development of techniques for achieving it, concluding with a new negative result having implications related to participation in multiple, independently operated, differentially private databases.
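As context for the promise described above, the canonical way to achieve differential privacy for a numeric query is to add calibrated noise to the true answer. The sketch below is illustrative only and is not from the talk: it implements the standard Laplace mechanism for a counting query (sensitivity 1), with the function names `laplace_noise` and `private_count` chosen here for exposition.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Sample from Laplace(0, scale) by inverse-CDF transform:
    # draw u uniform on (-0.5, 0.5) and invert the Laplace CDF.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon: float) -> float:
    """Release an epsilon-differentially private count.

    A counting query has sensitivity 1: adding or removing one
    person's record changes the true answer by at most 1, so
    Laplace noise with scale 1/epsilon suffices.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Example: privately count the even numbers among 0..99 (true answer: 50).
answer = private_count(range(100), lambda r: r % 2 == 0, epsilon=0.5)
```

Each such release consumes epsilon of privacy budget, and the losses compose additively across releases. This additive accumulation is exactly why an individual's *lifetime* privacy loss across multiple, independently operated, differentially private databases is a concern, as the talk's concluding negative result addresses.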
Keywords: differential privacy · foundations of private data analysis · lifetime privacy loss · independently operated differentially private databases