
On the Anonymity of Home/Work Location Pairs

  • Philippe Golle
  • Kurt Partridge
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5538)

Abstract

Many applications benefit from user location data, but location data raises privacy concerns. Anonymization can protect privacy, but identities can sometimes be inferred from supposedly anonymous data. This paper studies a new attack on the anonymity of location data. We show that if the approximate locations of an individual’s home and workplace can both be deduced from a location trace, then the median size of the individual’s anonymity set in the U.S. working population is 1, 21, and 34,980 for locations known at the granularity of a census block, census tract, and county, respectively. The location data of people who live and work in different regions can be re-identified even more easily. Our results show that the threat of re-identification for location data is much greater when the individual’s home and work locations can both be deduced from the data. To preserve anonymity, we offer guidance for obfuscating location traces before they are disclosed.
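The anonymity-set computation behind these figures can be illustrated with a small sketch: each person is reduced to a (home region, work region) pair, and a person's anonymity set is everyone in the population who shares that pair. The code below is a toy illustration of this idea, not the paper's implementation; the population, region labels, and function name are made up for the example.

```python
from collections import Counter
from statistics import median

# Hypothetical toy population: each person is a (home_region, work_region)
# pair. In the paper the regions are census blocks, tracts, or counties;
# the labels here are invented purely for illustration.
population = [
    ("tract_A", "tract_X"),
    ("tract_A", "tract_X"),
    ("tract_A", "tract_Y"),
    ("tract_B", "tract_X"),
    ("tract_B", "tract_X"),
    ("tract_B", "tract_X"),
]

def anonymity_set_sizes(pairs):
    """For each person, the number of people in the population who share
    the same (home, work) region pair, including the person themselves."""
    counts = Counter(pairs)
    return [counts[p] for p in pairs]

sizes = anonymity_set_sizes(population)
print(sizes)          # [2, 2, 1, 3, 3, 3]
print(median(sizes))  # 2.5
```

A size of 1 means the home/work pair is unique in the population and the trace is fully re-identifiable; coarsening the regions (e.g. from block to county) merges pairs and grows the anonymity sets, which is the trade-off the paper quantifies.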

Keywords

Census Tract, Census Block, Location Privacy, Location Pair, Work Location



Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Philippe Golle¹
  • Kurt Partridge¹
  1. Palo Alto Research Center, USA
