Privately Solving Linear Programs

  • Justin Hsu
  • Aaron Roth
  • Tim Roughgarden
  • Jonathan Ullman
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8572)

Abstract

In this paper, we initiate the systematic study of solving linear programs under differential privacy. The first step is simply to define the problem: to this end, we introduce several natural classes of private linear programs that capture the different ways sensitive data can be incorporated into a linear program. For each class of linear programs, we either give an efficient, differentially private solver based on the multiplicative weights framework or prove an impossibility result.
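The solvers described in the abstract build on the multiplicative weights update method. As a purely illustrative sketch of how multiplicative weights can drive an LP feasibility solver with noise injected for privacy, consider the snippet below. The feasibility formulation (find x in the probability simplex with Ax ≥ b), the placement of Laplace noise on the constraint slacks, and every parameter (rounds, eta, eps) are assumptions made here for exposition; they are not the paper's algorithms or its privacy analysis.

```python
# Illustrative sketch only (not the paper's algorithm): multiplicative
# weights for an LP feasibility problem, with Laplace noise added to the
# data-dependent constraint evaluations. All parameters are placeholders.
import numpy as np

def private_mw_feasibility(A, b, rounds=200, eta=0.1, eps=1.0, rng=None):
    """Approximately find x in the probability simplex with A @ x >= b.

    A (m x n) and b (m,) are treated as derived from sensitive data; each
    round's slacks A @ x - b are perturbed before they drive the update.
    The noise scale rounds / eps is a crude per-round budget, not a
    calibrated sensitivity bound.
    """
    rng = np.random.default_rng() if rng is None else rng
    m, n = A.shape
    weights = np.ones(m)               # one weight per constraint
    x_sum = np.zeros(n)

    for _ in range(rounds):
        p = weights / weights.sum()    # distribution over constraints
        # Best response to the single averaged constraint (p @ A) x >= p @ b:
        # put all simplex mass on the coordinate with the largest coefficient.
        x = np.zeros(n)
        x[np.argmax(p @ A)] = 1.0
        # Noisy slacks: violated constraints have negative slack.
        slack = A @ x - b + rng.laplace(scale=rounds / eps, size=m)
        # Violated constraints gain weight so future rounds focus on them.
        weights *= np.exp(-eta * np.clip(slack, -1.0, 1.0))
        x_sum += x

    return x_sum / rounds              # averaged play is the candidate solution
```

Averaging the per-round plays is the standard way multiplicative weights turns a sequence of best responses into an approximately feasible point; a realistic private variant would calibrate the noise to the sensitivity of the constraint evaluations, which this sketch does not attempt.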

Copyright information

© Springer-Verlag Berlin Heidelberg 2014

Authors and Affiliations

  • Justin Hsu (1)
  • Aaron Roth (1)
  • Tim Roughgarden (2)
  • Jonathan Ullman (3)
  1. University of Pennsylvania, USA
  2. Stanford University, USA
  3. Harvard University, USA
