
Privacy by Design for Mobility Data Analytics

Abstract

Privacy is an ever-growing concern in our society and has become a fundamental requirement whenever one wants to use, publish, or analyze data involving sensitive personal information, such as data describing individual mobility. Unfortunately, it is increasingly hard to transform the data in a way that protects sensitive information: we live in the era of big data, characterized by unprecedented opportunities to sense, store, and analyze social data describing human activities in great detail and at high resolution. This is especially true for mobility data, where there is no longer a clear distinction between quasi-identifiers and sensitive attributes, which makes protecting privacy a significant challenge. As a result, privacy preservation simply cannot be accomplished by de-identification alone. In this chapter, we propose the Privacy by Design paradigm to develop technological frameworks that counter the threats of undesirable and unlawful privacy violations without obstructing the knowledge discovery opportunities of social mining and big data analytics. Our main idea is to inscribe privacy protection into the knowledge discovery technology by design, so that the analysis incorporates the relevant privacy requirements from the start. We show three applications of the Privacy by Design principle to mobility data analytics. First, we present a framework based on data-driven spatial generalization, which is suitable for the privacy-aware publication of movement data in order to enable clustering analysis. Second, we present a method for sanitizing semantic trajectories that generalizes visited places using a taxonomy of locations; the resulting private data can then be used for extracting frequent sequential patterns. Lastly, we show how to apply the idea of Privacy by Design in a distributed setting, in which movement data from individual vehicles are made private through differential privacy before being collected, aggregated, and analyzed by a central station.
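
As a rough illustration of the third scenario, the Python sketch below shows the general shape of such a distributed pipeline: each vehicle coarsens its GPS points into grid cells, perturbs its per-cell visit counts with Laplace noise before transmitting anything, and a central station simply sums the noisy reports. This is a minimal sketch of the generic differential-privacy idea only; the cell size, the unit per-cell sensitivity, and all names (to_cell, local_noisy_counts, aggregate) are illustrative assumptions, not the specific mechanism developed in the chapter.

```python
import math
import random
from collections import Counter

def laplace_sample(scale: float) -> float:
    """Zero-mean Laplace sample via inverse CDF (illustrative, not hardened for production use)."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def to_cell(lat: float, lon: float, cell_size: float = 0.01) -> tuple:
    """Coarsen an exact GPS point to a grid cell: a crude stand-in for spatial generalization."""
    return (round(lat / cell_size), round(lon / cell_size))

def local_noisy_counts(trajectory, epsilon: float = 1.0) -> dict:
    """Run on each vehicle: count visits per cell, then add Laplace noise
    (calibrated here to an assumed per-cell sensitivity of 1) before anything leaves the device."""
    counts = Counter(to_cell(lat, lon) for lat, lon in trajectory)
    scale = 1.0 / epsilon  # sensitivity / privacy budget
    return {cell: c + laplace_sample(scale) for cell, c in counts.items()}

def aggregate(noisy_reports) -> Counter:
    """Run at the central station: sum the already-perturbed per-vehicle counts."""
    total = Counter()
    for report in noisy_reports:
        for cell, value in report.items():
            total[cell] += value
    return total

if __name__ == "__main__":
    # Two hypothetical vehicle trajectories as (lat, lon) sequences.
    vehicle_a = [(43.7201, 10.4081), (43.7205, 10.4090), (43.7301, 10.4201)]
    vehicle_b = [(43.7203, 10.4085), (43.7302, 10.4199)]
    reports = [local_noisy_counts(t, epsilon=0.5) for t in (vehicle_a, vehicle_b)]
    print(aggregate(reports))  # noisy spatio-temporal density per cell
```

Because the noise is injected on the vehicles themselves, the central station never observes exact per-vehicle counts, which is the essence of the distributed setting sketched above.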

Notes

  1. This definition of de-identified data is compliant with the General Data Protection Regulation (GDPR) [18], in particular with its Recital 26. Indeed, through the de-identification process we transform identified persons into identifiable persons.

References

  1. Privacy by Design Resolution, International Conference of Data Protection and Privacy Commissioners, Jerusalem, Israel, October 27–29, 2010
  2. Article 29 Data Protection Working Party and Working Party on Police and Justice, The Future of Privacy: Joint Contribution to the Consultation of the European Commission on the Legal Framework for the Fundamental Right to Protection of Personal Data, 02356/09/EN, WP 168, December 1, 2009
  3. Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016, Official Journal of the European Union (2016)
  4. O. Abul, F. Bonchi, M. Nanni, Never Walk Alone: Uncertainty for Anonymity in Moving Objects Databases, in ICDE 2008, pp. 376–385
  5. O. Abul, F. Bonchi, M. Nanni, Anonymization of Moving Objects Databases by Clustering and Perturbation, Inf. Syst. 35(8), 884–910 (Elsevier, December 2010), https://doi.org/10.1016/j.is.2010.05.003
  6. G. Ács, C. Castelluccia, A case study: privacy preserving release of spatio-temporal density in Paris, in KDD 2014, pp. 1679–1688
  7. M.E. Andrés, N.E. Bordenabe, K. Chatzikokolakis, C. Palamidessi, Geo-indistinguishability: differential privacy for location-based systems, in ACM Conference on Computer and Communications Security (CCS) 2013, pp. 901–914
  8. F. Bonchi, L.V.S. Lakshmanan, H.W. Wang, Trajectory anonymity in publishing personal mobility data, SIGKDD Explor. Newsl. (2011), pp. 30–42, https://doi.org/10.1145/2031331.2031336
  9. R. Chen, B.C.M. Fung, B.C. Desai, N.M. Sossou, Differentially private transit data publication: a case study on the Montreal transportation system, in KDD 2012, pp. 213–221
  10. G. Cormode, C. Procopiuc, D. Srivastava, E. Shen, T. Yu, Differentially private spatial decompositions, in ICDE 2012, pp. 20–31
  11. G. Cormode, C. Procopiuc, D. Srivastava, T. Tran, Differentially private summaries for sparse data, in ICDT 2012, pp. 299–311
  12. J. Domingo-Ferrer, R. Trujillo-Rasua, Microaggregation- and permutation-based anonymization of movement data, Inf. Sci. 208, 55–80 (2012)
  13. C. Dwork, Differential privacy: a survey of results, in International Conference on Theory and Applications of Models of Computation (Springer, 2008), pp. 1–19
  14. C. Dwork, Differential Privacy, in ICALP 2006, Lecture Notes in Computer Science, vol. 4052, https://doi.org/10.1007/11787006_1
  15. C. Dwork, F. McSherry, K. Nissim, A. Smith, Calibrating noise to sensitivity in private data analysis, in Proceedings of the 3rd Theory of Cryptography Conference (Springer, 2006), pp. 265–284
  16. C. Dwork, K. Kenthapadi, F. McSherry, I. Mironov, M. Naor, Our Data, Ourselves: Privacy Via Distributed Noise Generation, in Advances in Cryptology - EUROCRYPT 2006 (Springer, Berlin, Heidelberg, 2006), pp. 486–503
  17. European Data Protection Supervisor, Opinion of the European Data Protection Supervisor on Promoting Trust in the Information Society by Fostering Data Protection and Privacy, March 18, 2010
  18. European Parliament and Council, General Data Protection Regulation (2016), OJ L119, 4/5/2016
  19. Federal Trade Commission (Bureau of Consumer Protection), Preliminary Staff Report, Protecting Consumer Privacy in an Era of Rapid Change: A Proposed Framework for Business and Policy Makers, December 2010
  20. F. Giannotti, D. Pedreschi, Mobility, Data Mining and Privacy: A Vision of Convergence, in Mobility, Data Mining and Privacy - Geographic Knowledge Discovery (2008), https://doi.org/10.1007/978-3-540-75177-9_1
  21. S. Ho, S. Ruan, Preserving Privacy for Interesting Location Pattern Mining from Trajectory Data, Transactions on Data Privacy 6(1), 87–106 (2013)
  22. A. Monreale, D. Pedreschi, R.G. Pensa, Anonymity Technologies for Privacy-Preserving Data Publishing and Mining, in Privacy-Aware Knowledge Discovery: Novel Applications and New Techniques (2010), pp. 3–33
  23. A. Monreale, G.L. Andrienko, N.V. Andrienko, F. Giannotti, D. Pedreschi, S. Rinzivillo, S. Wrobel, Movement Data Anonymity through Generalization, Transactions on Data Privacy 3(2), 91–121 (2010)
  24. A. Monreale, Privacy by Design in Data Mining, PhD thesis, Dept. of Computer Science, University of Pisa, 2011
  25. A. Monreale, R. Trasarti, D. Pedreschi, C. Renso, V. Bogorny, C-safety: a framework for the anonymization of semantic trajectories, Transactions on Data Privacy 4(2), 73–101 (2011)
  26. A. Monreale, W.H. Wang, F. Pratesi, S. Rinzivillo, D. Pedreschi, G. Andrienko, N. Andrienko, Privacy-preserving Distributed Movement Data Aggregation, in AGILE (Springer, 2013), https://doi.org/10.1007/978-3-319-00615-4_13
  27. A. Monreale, S. Rinzivillo, F. Pratesi, F. Giannotti, D. Pedreschi, Privacy-by-design in big data analytics and social mining, EPJ Data Science (2014)
  28. R.G. Pensa, A. Monreale, F. Pinelli, D. Pedreschi, Pattern-Preserving k-Anonymization of Sequences and its Application to Mobility Data Mining, in PiLBA 2008
  29. W.H. Qardaji, W. Yang, N. Li, Differentially private grids for geospatial data, in ICDE 2013, pp. 757–768
  30. P. Samarati, L. Sweeney, Protecting privacy when disclosing information: k-anonymity and its enforcement through generalization and suppression, Technical report, SRI International, 1998
  31. S. Spaccapietra, C. Parent, M.L. Damiani, J. Macedo, F. Porto, C. Vangenot, A conceptual view on trajectories, Data & Knowledge Engineering 65(1), 126–146 (2008)
  32. L. Sweeney, Computational disclosure control: a primer, PhD thesis, Dept. of Electrical Engineering and Computer Science, MIT, 2001
  33. L. Sweeney, Simple Demographics Often Identify People Uniquely, Data Privacy Working Paper 3, Carnegie Mellon University, Pittsburgh, 2000
  34. W.K. Wong, D.W. Cheung, E. Hung, B. Kao, N. Mamoulis, Security in Outsourcing of Association Rule Mining, in VLDB 2007, pp. 111–122
  35. Y. Xiao, L. Xiong, Protecting Locations with Differential Privacy under Temporal Correlations, in ACM Conference on Computer and Communications Security (CCS) 2015, pp. 1298–1309
  36. R. Yarovoy, F. Bonchi, L.V.S. Lakshmanan, W.H. Wang, Anonymizing moving objects: how to hide a mob in a crowd?, in International Conference on Extending Database Technology (EDBT) 2009, pp. 72–83

Author information

Correspondence to Francesca Pratesi.

Copyright information

© 2018 Springer Nature Switzerland AG

About this chapter

Cite this chapter

Pratesi, F., Monreale, A., Pedreschi, D. (2018). Privacy by Design for Mobility Data Analytics. In: Gkoulalas-Divanis, A., Bettini, C. (eds) Handbook of Mobile Data Privacy. Springer, Cham. https://doi.org/10.1007/978-3-319-98161-1_10

  • DOI: https://doi.org/10.1007/978-3-319-98161-1_10

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-98160-4

  • Online ISBN: 978-3-319-98161-1

  • eBook Packages: Computer Science, Computer Science (R0)
