Privacy Enhancing Profile Disclosure

  • Péter Dornbach
  • Zoltán Németh
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2482)


To offer personalized services on the web and on mobile devices, service providers want to have as much information about their users as possible. In the ideal case, the user controls how much of this information is revealed during a transaction. This is a tradeoff between privacy and personalization: if the disclosed profile is too complex, it may become a pseudonym for the user, making it possible to recognize the user at a later time and to link different revealed profile parts into one comprehensive profile of the individual. This paper introduces a model for profiles and analyzes it with the methods of probability theory: how much information is revealed, and what the user's probability of staying anonymous is. The paper examines how likely it is that a provider can link different disclosed profiles and recommends algorithms to avoid a possible privacy compromise.
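The tradeoff sketched in the abstract can be made concrete with a toy calculation (this is an illustration under a simplifying independence assumption, not the paper's actual model): if each disclosed attribute value occurs with a known frequency in the population, the expected size of the set of users matching the whole disclosed profile shrinks multiplicatively with every extra attribute, and once it approaches one user the profile effectively becomes a pseudonym.

```python
from math import prod

def expected_matches(population: int, attr_freqs: list[float]) -> float:
    """Expected number of users whose profile matches all disclosed
    attribute values, assuming the attributes are statistically
    independent (a simplifying assumption for illustration only)."""
    return population * prod(attr_freqs)

# Hypothetical frequencies in a population of 1,000,000 users:
# each additional (or rarer) disclosed attribute shrinks the
# anonymity set multiplicatively.
print(expected_matches(1_000_000, [0.5]))                   # ~500000 match
print(expected_matches(1_000_000, [0.5, 0.1, 0.02]))        # ~1000 match
print(expected_matches(1_000_000, [0.5, 0.1, 0.02, 0.001])) # ~1: a pseudonym
```

In this reading, the user stays anonymous as long as the expected match set remains well above one; the algorithms the paper recommends aim to keep disclosed profiles on the safe side of that threshold.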


Keywords: Service Provider, Collaborative Filter, Personalized Service, Match Factor, Extra Element




References

  1. Pfitzmann, A., Köhntopp, M.: Anonymity, Unobservability, and Pseudonymity: A Proposal for Terminology. Designing Privacy Enhancing Technologies, 2000.
  2. Clarke, R.: Identified, Anonymous and Pseudonymous Transactions: The Spectrum of Choice. User Identification & Privacy Protection Conference, Stockholm, 1999.
  3. Rao, J.R., Rohatgi, P.: Can Pseudonymity Really Guarantee Privacy? Proc. of the Ninth USENIX Security Symposium, Denver, Colorado, 2000.
  4. Alamäki, T., Björksten, M., Dornbach, P., Gripenberg, C., Győrbíró, N., Márton, G., Németh, Z., Skyttä, T., Tarkiainen, M.: Privacy Enhancing Service Architectures. Submission to Privacy Enhancing Technologies 2002.
  5. Customer Profile Exchange (CPExchange) Specification v1.0, 2000.
  6. Goldberg, I.A.: A Pseudonymous Communications Infrastructure for the Internet. Ph.D. dissertation, University of California at Berkeley, Fall 2000, page 7.
  7. Chaum, D.: Untraceable Electronic Mail, Return Addresses, and Digital Pseudonyms. Communications of the ACM, February 1981.
  8. Jakobsson, M.: A Practical Mix. Eurocrypt '98.
  9. Sutherland, E.: Bluetooth Security: An Oxymoron? M-Commerce Times web magazine, November 2000.
  10. Breese, J., Heckerman, D., Kadie, C.: Empirical Analysis of Predictive Algorithms for Collaborative Filtering. Proc. of the 14th Conference on Uncertainty in Artificial Intelligence, July 1998.
  11. Resnick, P., Iacovou, N., Suchak, M., Bergstrom, P., Riedl, J.: GroupLens: An Open Architecture for Collaborative Filtering of Netnews. Proc. of the 1994 Computer Supported Cooperative Work Conference.
  12. Salton, G., McGill, M.: Introduction to Modern Information Retrieval. McGraw-Hill, New York, 1983.
  13. Chickering, D., Heckerman, D., Meek, C.: A Bayesian Approach to Learning Bayesian Networks with Local Structure. Proc. of the 13th Conference on Uncertainty in Artificial Intelligence, 1997.
  14. Dempster, A., Laird, N., Rubin, D.: Maximum Likelihood from Incomplete Data via the EM Algorithm. Journal of the Royal Statistical Society, Series B, 39:1–38, 1977.

Copyright information

© Springer-Verlag Berlin Heidelberg 2003

Authors and Affiliations

  • Péter Dornbach¹
  • Zoltán Németh¹

  1. Nokia Research Center, Software Technology Laboratory, Budapest, Hungary
