Bounds on the Sample Complexity for Private Learning and Private Data Release

  • Amos Beimel
  • Shiva Prasad Kasiviswanathan
  • Kobbi Nissim
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5978)

Abstract

Learning is a task that generalizes many of the analyses that are applied to collections of data, and in particular, collections of sensitive individual information. Hence, it is natural to ask what can be learned while preserving individual privacy. [Kasiviswanathan, Lee, Nissim, Raskhodnikova, and Smith; FOCS 2008] initiated such a discussion. They formalized the notion of private learning, as a combination of PAC learning and differential privacy, and investigated what concept classes can be learned privately. Somewhat surprisingly, they showed that, ignoring time complexity, every PAC learning task could be performed privately with polynomially many samples, and in many natural cases this could even be done in polynomial time.
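
For reference, the privacy notion used is ε-differential privacy [7] (the rendering below is ours): a randomized learner $\mathcal{A}$ is ε-differentially private if for every pair of sample sets $S, S'$ differing in a single entry, and for every set $T$ of possible outputs,

  \Pr[\mathcal{A}(S) \in T] \;\le\; e^{\varepsilon} \cdot \Pr[\mathcal{A}(S') \in T].

A private learner must satisfy this guarantee over its own coin tosses, in addition to the usual $(\alpha, \beta)$ accuracy guarantee of PAC learning [17].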

While these results seem to equate non-private and private learning, there is still a significant gap: the sample complexity of (non-private) PAC learning is crisply characterized in terms of the VC-dimension of the concept class, whereas this relationship is lost in the constructions of private learners, which generally exhibit higher sample complexity.
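
Concretely, the characterization in question is the following pair of classical bounds (stated up to constant factors): every concept class $C$ is non-privately PAC learnable from

  m = O\left(\frac{\mathrm{VC}(C)\log(1/\alpha) + \log(1/\beta)}{\alpha}\right)

samples [5, 18], and every PAC learner for $C$ requires

  m = \Omega\left(\frac{\mathrm{VC}(C) + \log(1/\beta)}{\alpha}\right)

samples [9], where $\alpha$ is the accuracy parameter and $\beta$ the failure probability. Notably, both bounds are independent of the size of the instance space.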

Looking into this gap, we examine several private learning tasks and give tight bounds on their sample complexity. In particular, we show strong separations between the sample complexities of proper and improper private learners (no such separation exists for non-private learners), and between the sample complexities of efficient and inefficient proper private learners. Our results show that VC-dimension is not the right measure for characterizing the sample complexity of proper private learning.
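
To make the generic (and generally inefficient) proper private learner concrete, the sketch below instantiates the exponential mechanism of McSherry and Talwar [14], which underlies the generic construction of [10]: a hypothesis is drawn from the class with probability decaying exponentially in its empirical error. This is a minimal illustration, not the authors' exact construction; all names are ours, and `hypotheses` is assumed to be a finite class given as callables.

    import math
    import random

    def exponential_mechanism_learner(samples, hypotheses, epsilon):
        """Privately select a hypothesis with probability proportional to
        exp(-epsilon * empirical_error / 2). Changing one labeled sample
        changes each error count by at most 1 (sensitivity 1)."""
        # Empirical error: number of labeled samples each hypothesis misclassifies.
        scores = [sum(1 for x, y in samples if h(x) != y) for h in hypotheses]
        # Exponential mechanism weights.
        weights = [math.exp(-epsilon * s / 2.0) for s in scores]
        # Draw a hypothesis with probability proportional to its weight.
        r = random.uniform(0.0, sum(weights))
        acc = 0.0
        for h, w in zip(hypotheses, weights):
            acc += w
            if r <= acc:
                return h
        return hypotheses[-1]  # guard against floating-point round-off

    # Usage: point functions over {0, ..., 7}; the hypothesis space is the
    # concept class itself, so the learner is proper.
    hypotheses = [lambda x, t=t: 1 if x == t else 0 for t in range(8)]
    samples = [(3, 1), (3, 1), (5, 0), (0, 0)]
    h = exponential_mechanism_learner(samples, hypotheses, epsilon=1.0)

Standard analysis shows this selection rule is ε-differentially private and succeeds with roughly $O(\log|C|)$ samples (suppressing the dependence on α, β, ε); the point of the separations above is that for proper private learning this $\log|C|$ can be far larger than $\mathrm{VC}(C)$.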

We also examine the task of private data release (as initiated by [Blum, Ligett, and Roth; STOC 2008]) and give new lower bounds on its sample complexity. Our results show that the logarithmic dependence on the size of the instance space is essential for private data release.
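
For context, the (computationally inefficient) release mechanism of [3] is α-useful for a concept class $C$ over an instance space $X$ given a database of size roughly

  n = \tilde{O}\left(\frac{\mathrm{VC}(C) \cdot \log|X|}{\alpha^{3}\,\varepsilon}\right),

where the $\tilde{O}$ suppresses factors logarithmic in $1/\alpha$ and in the failure probability (the exact statement is in [3]; this rendering is ours). The lower bounds in this paper show that the $\log|X|$ factor in such bounds cannot be removed.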

References

  1. Beimel, A., Kasiviswanathan, S., Nissim, K.: Bounds on the Sample Complexity for Private Learning and Private Data Release (full version) (2009)
  2. Blum, A., Dwork, C., McSherry, F., Nissim, K.: Practical privacy: The SuLQ framework. In: PODS, pp. 128–138. ACM, New York (2005)
  3. Blum, A., Ligett, K., Roth, A.: A learning theory approach to non-interactive database privacy. In: STOC, pp. 609–618. ACM, New York (2008)
  4. Blum, A., Ligett, K., Roth, A.: Private communication (2008)
  5. Blumer, A., Ehrenfeucht, A., Haussler, D., Warmuth, M.K.: Learnability and the Vapnik-Chervonenkis dimension. Journal of the Association for Computing Machinery 36(4), 929–965 (1989)
  6. Dwork, C.: The differential privacy frontier (extended abstract). In: Reingold, O. (ed.) TCC 2009. LNCS, vol. 5444, pp. 496–502. Springer, Heidelberg (2009)
  7. Dwork, C., McSherry, F., Nissim, K., Smith, A.: Calibrating noise to sensitivity in private data analysis. In: Halevi, S., Rabin, T. (eds.) TCC 2006. LNCS, vol. 3876, pp. 265–284. Springer, Heidelberg (2006)
  8. Dwork, C., Naor, M., Reingold, O., Rothblum, G., Vadhan, S.: On the complexity of differentially private data release. In: STOC, pp. 381–390. ACM, New York (2009)
  9. Ehrenfeucht, A., Haussler, D., Kearns, M.J., Valiant, L.G.: A general lower bound on the number of examples needed for learning. Inf. Comput. 82(3), 247–261 (1989)
  10. Kasiviswanathan, S.P., Lee, H.K., Nissim, K., Raskhodnikova, S., Smith, A.: What can we learn privately? In: FOCS, pp. 531–540. IEEE Computer Society, Los Alamitos (2008)
  11. Kasiviswanathan, S.P., Smith, A.: A note on differential privacy: Defining resistance to arbitrary side information. CoRR, arXiv:0803.3946 [cs.CR] (2008)
  12. Kearns, M.J.: Efficient noise-tolerant learning from statistical queries. Journal of the ACM 45(6), 983–1006 (1998); preliminary version in Proceedings of STOC 1993
  13. Kearns, M.J., Vazirani, U.V.: An Introduction to Computational Learning Theory. MIT Press, Cambridge (1994)
  14. McSherry, F., Talwar, K.: Mechanism design via differential privacy. In: FOCS, pp. 94–103. IEEE, Los Alamitos (2007)
  15. Mishra, N., Sandler, M.: Privacy via pseudorandom sketches. In: PODS, pp. 143–152. ACM, New York (2006)
  16. Pitt, L., Valiant, L.G.: Computational limitations on learning from examples. Journal of the ACM 35(4), 965–984 (1988)
  17. Valiant, L.G.: A theory of the learnable. Communications of the ACM 27, 1134–1142 (1984)
  18. Vapnik, V.N., Chervonenkis, A.Y.: On the uniform convergence of relative frequencies of events to their probabilities. Theory of Probability and its Applications 16, 264–280 (1971)

Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Amos Beimel (1)
  • Shiva Prasad Kasiviswanathan (2)
  • Kobbi Nissim (1, 3)
  1. Dept. of Computer Science, Ben-Gurion University, Israel
  2. CCS-3, Los Alamos National Laboratory, USA
  3. Microsoft Audience Intelligence, Israel