Differential Privacy: On the Trade-Off between Utility and Information Leakage

  • Mário S. Alvim
  • Miguel E. Andrés
  • Konstantinos Chatzikokolakis
  • Pierpaolo Degano
  • Catuscia Palamidessi
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7140)

Abstract

Differential privacy is a notion of privacy that has become very popular in the database community. Roughly, the idea is that a randomized query mechanism provides sufficient privacy protection if the ratio between the probabilities that two adjacent datasets give the same answer is bounded by e^ε. In the field of information flow there is a similar concern for controlling information leakage, i.e. limiting the possibility of inferring the secret information from the observables. In recent years, researchers have proposed to quantify the leakage in terms of min-entropy leakage, a concept closely related to the Bayes risk. In this paper, we show how to model the query system in terms of an information-theoretic channel, and we compare the notion of differential privacy with that of min-entropy leakage. We show that differential privacy implies a bound on the min-entropy leakage, but not vice versa. Furthermore, we show that our bound is tight. Then, we consider the utility of the randomization mechanism, which measures how close, on average, the randomized answers are to the real ones. We show that the notion of differential privacy implies a bound on utility, also tight, and we propose a method that, under certain conditions, builds an optimal randomization mechanism, i.e. a mechanism which provides the best utility while guaranteeing ε-differential privacy.
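For orientation only, the two standard notions compared in the abstract can be recalled as follows (this is a generic recap, not the paper's exact formulation; the symbols K, D, D', S, X, Y and p are introduced here purely for illustration). For a randomized mechanism K applied to adjacent datasets D and D', a secret input X, and an observable output Y, in LaTeX notation:

    % epsilon-differential privacy (Dwork): for all adjacent D, D' and all sets S of answers
    \Pr[\mathcal{K}(D) \in S] \;\le\; e^{\varepsilon}\,\Pr[\mathcal{K}(D') \in S]

    % min-entropy leakage (Smith): difference between prior and posterior min-entropy of the secret
    L(X;Y) \;=\; H_\infty(X) - H_\infty(X \mid Y)
           \;=\; \log_2 \frac{\sum_{y}\max_{x} p(x,y)}{\max_{x} p(x)}

    % with H_\infty(X) = -\log_2 \max_x p(x) and H_\infty(X \mid Y) = -\log_2 \sum_y \max_x p(x,y)

The paper's contribution, as stated above, is to relate the parameter ε in the first definition to a (tight) bound on the leakage L in the second, and to a bound on utility.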

Keywords

Shannon Entropy · Side Information · Information Leakage · Gain Function · Input Distribution

Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Mário S. Alvim¹
  • Miguel E. Andrés¹
  • Konstantinos Chatzikokolakis¹
  • Pierpaolo Degano²
  • Catuscia Palamidessi¹

  1. INRIA and LIX, École Polytechnique, France
  2. Dipartimento di Informatica, Università di Pisa, Italy
