Encyclopedia of Algorithms

Living Edition
Editors: Ming-Yang Kao

Private Analysis of Graph Data

  • Sofya Raskhodnikova
  • Adam Smith
Living reference work entry
DOI: https://doi.org/10.1007/978-3-642-27848-8_549-1

Years and Authors of Summarized Original Work

  • 2013; Blocki, Blum, Datta, Sheffet

  • 2013; Kasiviswanathan, Nissim, Raskhodnikova, Smith

  • 2013; Chen, Zhou

  • 2015; Raskhodnikova, Smith

  • 2015; Borgs, Chayes, Smith

Problem Definition

Many datasets can be represented by graphs, where nodes correspond to individuals and edges capture relationships between them. On one hand, such datasets contain potentially sensitive information about individuals; on the other hand, there are significant public benefits from allowing access to aggregate information about the data. Thus, analysts working with such graphs face two conflicting goals: protecting the privacy of individuals and publishing accurate aggregate statistics. This article describes algorithms for releasing accurate graph statistics while preserving a rigorous notion of privacy, called differential privacy.

Differential privacy was introduced by Dwork et al. [6]. It puts a restriction on the algorithm that processes sensitive data and...
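As a concrete illustration (a minimal sketch, not part of the original entry), consider edge differential privacy, where two graphs are neighbors if they differ in a single edge. The edge count then changes by at most 1 between neighbors, so its global sensitivity is 1, and the Laplace mechanism of Dwork et al. [6] releases it privately by adding noise of scale 1∕ε. The function names below are illustrative assumptions.

```python
import math
import random


def laplace_noise(scale: float) -> float:
    """Sample from the Laplace(0, scale) distribution via inverse CDF."""
    u = random.random() - 0.5  # uniform on (-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))


def private_edge_count(edges: list, epsilon: float) -> float:
    """Release the number of edges with epsilon-edge-differential privacy.

    Adding or removing one edge changes the count by at most 1 (global
    sensitivity 1), so Laplace noise with scale 1/epsilon suffices.
    """
    return len(edges) + laplace_noise(1.0 / epsilon)


# Example: a triangle graph with 3 edges.
edges = [(0, 1), (1, 2), (2, 0)]
noisy = private_edge_count(edges, epsilon=1.0)
```

Under node differential privacy, by contrast, neighboring graphs differ in one node and all its incident edges; even the edge count then has unbounded sensitivity on general graphs, which is why the works summarized here (e.g., [5, 11, 17]) rely on projections to bounded-degree graphs and Lipschitz extensions.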

Keywords

Graphs; Privacy; Subgraph counts; Degree distribution

Acknowledgements

The authors were supported in part by NSF award IIS-1447700, Boston University’s Hariri Institute for Computing and Center for Reliable Information Systems and Cyber Security, and, while visiting the Harvard Center for Research on Computation & Society, by a Simons Investigator grant to Salil Vadhan.

Recommended Reading

  1. Backstrom L, Dwork C, Kleinberg J (2007) Wherefore art thou r3579x? Anonymized social networks, hidden patterns, and structural steganography. In: Proceedings of the 16th international World Wide Web conference, Banff, pp 181–190
  2. Blocki J, Blum A, Datta A, Sheffet O (2012) The Johnson-Lindenstrauss transform itself preserves differential privacy. In: 53rd annual IEEE symposium on foundations of computer science (FOCS 2012), New Brunswick, 20–23 Oct 2012. IEEE Computer Society, pp 410–419. doi:10.1109/FOCS.2012.67
  3. Blocki J, Blum A, Datta A, Sheffet O (2013) Differentially private data analysis of social networks via restricted sensitivity. In: Innovations in theoretical computer science (ITCS), Berkeley, pp 87–96
  4. Borgs C, Chayes JT, Smith A (2015) Private graphon estimation for sparse graphs. arXiv:1506.06162 [math.ST]
  5. Chen S, Zhou S (2013) Recursive mechanism: towards node differential privacy and unrestricted joins. In: ACM SIGMOD international conference on management of data, New York, pp 653–664
  6. Dwork C, McSherry F, Nissim K, Smith A (2006) Calibrating noise to sensitivity in private data analysis. In: Halevi S, Rabin T (eds) Theory of cryptography conference (TCC), New York. Lecture notes in computer science, vol 3876, pp 265–284
  7. Gehrke J, Lui E, Pass R (2011) Towards privacy for social networks: a zero-knowledge based definition of privacy. In: Ishai Y (ed) Theory of cryptography conference (TCC), Providence. Lecture notes in computer science, vol 6597. Springer, pp 432–449
  8. Gupta A, Roth A, Ullman J (2012) Iterative constructions and private data release. In: Theory of cryptography conference (TCC), Taormina
  9. Hay M, Li C, Miklau G, Jensen D (2009) Accurate estimation of the degree distribution of private networks. In: International conference on data mining (ICDM), Miami, pp 169–178
  10. Karwa V, Slavkovic A (2014) Inference using noisy degrees: differentially private β-model and synthetic graphs. arXiv:1205.4697v3 [stat.ME]
  11. Kasiviswanathan SP, Nissim K, Raskhodnikova S, Smith A (2013) Analyzing graphs with node-differential privacy. In: Theory of cryptography conference (TCC), Tokyo, pp 457–476
  12. Kifer D, Machanavajjhala A (2011) No free lunch in data privacy. In: Sellis TK, Miller RJ, Kementsietsidis A, Velegrakis Y (eds) SIGMOD conference. ACM, Athens, pp 193–204
  13. Lin BR, Kifer D (2013) Information preservation in statistical privacy and Bayesian estimation of unattributed histograms. In: ACM SIGMOD international conference on management of data, New York, pp 677–688
  14. Lu W, Miklau G (2014) Exponential random graph estimation under differential privacy. In: 20th ACM SIGKDD international conference on knowledge discovery and data mining, New York, pp 921–930
  15. Narayanan A, Shmatikov V (2009) De-anonymizing social networks. In: IEEE symposium on security and privacy, Oakland, pp 173–187
  16. Nissim K, Raskhodnikova S, Smith A (2007) Smooth sensitivity and sampling in private data analysis. In: Symposium on theory of computing (STOC), San Diego, pp 75–84. Full paper on authors' web sites
  17. Raskhodnikova S, Smith A (2015) High-dimensional Lipschitz extensions and node-private analysis of network data. arXiv:1504.07912
  18. Zhang J, Cormode G, Procopiuc CM, Srivastava D, Xiao X (2015) Private release of graph statistics using ladder functions. In: ACM SIGMOD international conference on management of data, Melbourne, pp 731–745

Copyright information

© Springer Science+Business Media New York 2015

Authors and Affiliations

  1. Computer Science and Engineering Department, Pennsylvania State University, PA, USA