The Way Forward

  • Bart Custers
  • Toon Calders
  • Tal Zarsky
  • Bart Schermer
Part of the Studies in Applied Philosophy, Epistemology and Rational Ethics book series (SAPERE, volume 3)

Abstract

The growing use of data mining by both government and commercial entities brings great promise as well as new challenges. Data mining practices hold the promise of an information environment that is fair, accurate and efficient. At the same time, they may lead to practices that are invasive and discriminatory in ways the law has yet to grasp. This point is demonstrated by showing how common measures for mitigating privacy concerns, in particular a priori limiting measures such as access controls, anonymity and purpose specification, are increasingly failing as safeguards against privacy and discrimination problems in this novel context.
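
The weakness of anonymity as an a priori safeguard can be made concrete with a minimal, hypothetical sketch: records released without names can often be re-identified by joining them with a public dataset on shared quasi-identifiers. All names, records, and field names below are invented for illustration.

```python
# Hypothetical linkage attack: an "anonymized" dataset is joined with a
# public register on quasi-identifiers (birth year and postcode),
# linking a name back to a sensitive attribute. All data is invented.

anonymized = [  # released without names
    {"birth_year": 1975, "postcode": "2311", "diagnosis": "asthma"},
    {"birth_year": 1982, "postcode": "5611", "diagnosis": "diabetes"},
]

public_register = [  # e.g. a voter roll, with names
    {"name": "J. Jansen", "birth_year": 1975, "postcode": "2311"},
    {"name": "P. de Vries", "birth_year": 1990, "postcode": "1012"},
]

def reidentify(anon_rows, public_rows):
    """Join the two datasets on the shared quasi-identifiers."""
    matches = []
    for a in anon_rows:
        for p in public_rows:
            if (a["birth_year"], a["postcode"]) == (p["birth_year"], p["postcode"]):
                matches.append((p["name"], a["diagnosis"]))
    return matches

print(reidentify(anonymized, public_register))
```

A single unique match suffices to undo the anonymization for that record, which is why removing direct identifiers alone offers little protection once auxiliary datasets are available.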

Instead, a focus on (a posteriori) accountability and transparency may be more useful. This calls for improved detection of discrimination and privacy violations, as well as for the design and implementation of discrimination-free and privacy-preserving techniques, which in turn requires further technological research.

But even with further technological research, new situations and new mechanisms may emerge through which privacy violations or discrimination take place. Novel predictive models may prove to be little more than sophisticated tools that mask "classic" forms of discrimination by hiding it behind new proxies. Discrimination may also shift to new kinds of population segments: groups dispersed throughout society and connected only by attributes they happen to share. Such groups will lack the political force to defend their interests; they may not even know what is happening to them.
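
How a proxy can mask discrimination can be shown with a minimal, hypothetical sketch: a decision rule that never receives the protected attribute still produces starkly different outcomes for the two groups, because a seemingly neutral attribute (here an invented "postcode") correlates with group membership. The population, attribute names, and rule are all synthetic.

```python
# Hypothetical proxy discrimination: the decision rule uses only
# "postcode", never "group", yet outcomes diverge by group because the
# two attributes are correlated. All data is synthetic.

import random

random.seed(0)

# Synthetic population: group B lives in postcode district 9 with
# probability 0.8, group A with probability 0.2.
people = [
    {"group": g, "postcode": 9 if random.random() < (0.8 if g == "B" else 0.2) else 1}
    for g in (random.choice("AB") for _ in range(10_000))
]

def approve(person):
    # A "redlining" rule: it consults only the proxy, never the group.
    return person["postcode"] != 9

def approval_rate(group):
    members = [p for p in people if p["group"] == group]
    return sum(approve(p) for p in members) / len(members)

print(approval_rate("A"), approval_rate("B"))
```

Because the protected attribute never appears as an input, a naive audit of the model's features would find nothing to object to, which is why detection techniques must look at outcomes rather than inputs alone.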

With regard to privacy, the adequacy of the envisaged European legal framework is discussed in the light of data mining and profiling. The European Union is currently revising its data protection legislation; whether these new proposals adequately address the issues raised in this book is examined.

Keywords

Data Mining, Access Control, Personal Data, Data Protection, Data Subject



Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Bart Custers (1)
  • Toon Calders (2)
  • Tal Zarsky (3)
  • Bart Schermer (1)
  1. eLaw, Institute for Law in the Information Society, Leiden University, Leiden, The Netherlands
  2. Eindhoven University of Technology, Eindhoven, The Netherlands
  3. Faculty of Law, University of Haifa, Haifa, Israel
