
Taming Artificial Intelligence: “Bots,” the GDPR and Regulatory Approaches

Robotics, AI and the Future of Law

Part of the book series: Perspectives in Law, Business and Innovation (PLBI)

Abstract

Bots and AI have the potential to revolutionize the way that personal data is processed. Unlike processing performed by traditional methods, they have an unprecedented ability (and patience) to gather, analyze and combine information. However, the introduction of “smarter” computers does not always mean that the nature of the processing will change; often, the result will be substantially similar to processing by a human. We cannot, then, regulate processing by bots and AI as a sui generis concept. This chapter examines the different regulatory approaches that exist under the new General Data Protection Regulation (the GDPR)—the general regulatory approach (which treats all processing in the same way), the specific regulatory approach (which imposes specific rules for automated processing) and the co-regulatory approach (where data controllers are required to analyze and mitigate the risks on their own). It then considers how these approaches interact and makes some recommendations for how they should be interpreted and implemented in the future.


Notes

  1.

    Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1.

  2.

    GDPR, Recital 15.

  3.

    Sartor (1993), p. 15.

  4.

    Russell and Norvig (2016), p. 2.

  5.

    See, e.g., Botnik’s chapter written in the style of Harry Potter. Available at: http://botnik.org/content/harry-potter.html. Accessed 3 January 2018.

  6.

    See, e.g., the Reddit bot AutoTLDR. Available at: https://www.reddit.com/r/autotldr/comments/31b9fm/faq_autotldr_bot/?st=j041su3v&sh=295425b7. Accessed 3 January 2018.

  7.

    Nwana (1996), Part 4.

  8.

    This conclusion was reached at the FRII Analysis Workshop held at Aalto University, Helsinki, Finland on 14 June 2017 and reinforced at the FRII Seminar held at the Hanken School of Economics, Helsinki, Finland on 11 September 2017.

  9.

    GDPR, Recital 10.

  10.

    GDPR, Recital 4.

  11.

    See, e.g., Maxwell (2015), pp. 212 et seq.

  12.

    C-582/14 Breyer v Bundesrepublik Deutschland (Second Chamber, 19 October 2016). Published in the electronic Report of Cases.

  13.

    C-582/14 Breyer v Bundesrepublik Deutschland, paras. 41–43.

  14.

    C-582/14 Breyer v Bundesrepublik Deutschland, para. 46.

  15.

    See, e.g., Googlebot. Available at: https://support.google.com/webmasters/answer/182072?hl=en. Accessed 26 November 2017.

  16.

    See, e.g., Mayer-Schönberger and Cukier (2013), pp. 154 et seq.

  17.

    See, e.g., Council of the European Union (2014).

  18.

    See, e.g., Pearce (2015).

  19.

    See, e.g., Rubinstein (2013).

  20.

    See, e.g., Míšek (2015), or Article 29 Working Party (2011), p. 7.

  21.

    Pearce (2015), p. 151.

  22.

    One issue with such a bot is that it will require appropriate data security measures to, e.g., ensure that a withdrawal of consent actually comes from the relevant data subject. The depth of this security will depend on the nature of the data and the processing involved, but should be manageable in most scenarios.

  23.

    See, e.g., the MyData project. Available at: https://www.lvm.fi/documents/20181/859937/MyData-nordic-model/. Accessed 29 November 2017.

  24.

    For a wider discussion of this Article, see Kamarinou et al. (2016).

  25.

    Article 29 Working Party (2017a), as revised by Article 29 Working Party (2018).

  26.

    Article 29 Working Party (2017a), p. 10.

  27.

    Charter of Fundamental Rights of the European Union, Article 8.

  28.

    Article 29 Working Party (2018), p. 21.

  29.

    Article 29 Working Party (2017a), p. 11.

  30.

    See Footnote 28.

  31.

    See, e.g., Terms of Service: Didn’t Read. Available at: https://tosdr.org/. Accessed 3 January 2018; biggestlie.com. Available at: http://biggestlie.com/. Accessed 3 January 2018; and Obar and Oeldorf-Hirsch (2016).

  32.

    European Commission (1992), p. 26.

  33.

    Bygrave (2001), p. 26.

  34.

    Such a test could draw inspiration from the rules of judicial review in England and Wales: see Associated Provincial Picture Houses v Wednesbury Corporation [1948] 1 KB 223 and subsequent case law for more details.

  35.

    Article 29 Working Party (2018), p. 25.

  36.

    European Parliament (2011), p. 21.

  37.

    European Parliament (2011), p. 9.

  38.

    European Commission (2012), p. 9.

  39.

    Council of Europe (2010).

  40.

    Council of Europe (2010), p. 6.

  41.

    Binns (2017), p. 25.

  42.

    See Binns (2017).

  43.

    Parker (2007), pp. 14 and 98.

  44.

    See Information & Privacy Commissioner of Ontario (2013).

  45.

    Article 29 Working Party (2017b).

  46.

    Koops (2014), pp. 254–255.

  47.

    GDPR, Sect. 4.

  48.

    Koops (2014), p. 259.

  49.

    For a list of companies who have self-certified under the Privacy Shield, see https://www.privacyshield.gov/list. Accessed 12 September 2017. For a list of companies who were self-certified under Safe Harbour, see https://www.export.gov/safeharbor_eu. Accessed 12 September 2017.

  50.

    See, e.g., https://www.iso.org/standards.html. Accessed 12 September 2017.

  51.

    See, e.g., Leonard (2014), p. 57. There is also great public fascination with the idea of “creepy” AI, even where the bots only seem creepy because of misinformation or inaccurate reporting; see Baraniuk (2017).

  52.

    Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data [1995] OJ L281/31.

  53.

    GDPR, Recital 6.

  54.

    See, e.g., de Hert (2016), p. 464.

  55.

    Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications) [2002] OJ L201/37.

References

  • Article 29 Working Party. (2011). Opinion 15/2011 on the definition of consent.

  • Article 29 Working Party. (2017a). Guidelines on automated individual decision-making and profiling for the purposes of Regulation 2016/679.

  • Article 29 Working Party. (2017b). Guidelines on data protection impact assessment (DPIA) and determining whether processing is “likely to result in a high risk” for the purposes of Regulation 2016/679.

  • Article 29 Working Party. (2018). Guidelines on automated individual decision-making and profiling for the purposes of Regulation 2016/679 (revised).

  • Baraniuk, C. (2017). The ‘Creepy Facebook AI’ story that captivated the media. BBC News, August 1, 2017.

  • Binns, R. (2017). Data protection impact assessments: A meta-regulatory approach. International Data Privacy Law, 7(1), 22.

  • Bygrave, L. (2001). Automated profiling: Minding the machine: Article 15 of the EC Data Protection Directive and automated profiling. Computer Law & Security Report, 17(1), 17.

  • Council of Europe. (2010). The protection of individuals with regard to automatic processing of personal data in the context of profiling: Recommendation CM/Rec(2010)13 and explanatory memorandum.

  • Council of the European Union. (2014). Right to be forgotten and the Google Judgment. Interinstitutional File 2012/0011 (COD).

  • de Hert, P. (2016). The future of privacy. Addressing singularities to identify bright-lines that speak to us. European Data Protection Law Review, 3(4), 461.

  • European Commission. (2012). Proposal for a regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation). COM(2012) 11 final.

  • European Parliament. (2011). Report on a comprehensive approach on personal data protection in the European Union 2011/2025(INI). A7-0244/2011.

  • Information & Privacy Commissioner of Ontario. (2013). Privacy by design. Available at: https://www.ipc.on.ca/resource/privacy-by-design/. Accessed December 8, 2017.

  • Kamarinou, D., Millard, C., & Singh, J. (2016). Machine learning with personal data. Queen Mary Legal Studies Research Paper 247/2016.

  • Koops, B. J. (2014). The trouble with European data protection law. International Data Privacy Law, 4(4), 250.

  • Leonard, P. (2014). Customer data analytics: Privacy settings for ‘big data’ businesses. International Data Privacy Law, 4(1), 53.

  • Maxwell, W. (2015). Principles-based regulation of personal data: The case of ‘fair processing’. International Data Privacy Law, 5(3), 205.

  • Mayer-Schönberger, V., & Cukier, K. (2013). Big data: A revolution that will transform how we live, work and think. John Murray, Kindle edition.

  • Míšek, J. (2015). Consent to personal data processing—The panacea or the dead end? Masaryk University Journal of Law and Technology, 8, 69.

  • Nwana, H. (1996). Software agents: An overview. Knowledge Engineering Review, 11(3), 1.

  • Obar, J., & Oeldorf-Hirsch, A. (2016). The biggest lie on the internet: Ignoring the privacy policies and terms of service policies of social networking services. In TPRC 44 Conference on Communication, Information and Internet Policy, Virginia, 2016. Available at: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2757465. Accessed January 3, 2018.

  • Parker, C. (2007). Meta-regulation: Legal accountability for corporate social responsibility. In D. McBarnet, A. Voiculescu, & T. Campbell (Eds.), The new corporate accountability: Corporate social responsibility and the law. Cambridge: Cambridge University Press.

  • Pearce, H. (2015). Online data transactions, consent and big data: Technological solutions to technological problems. Computer and Telecommunications Law Review, 21(6), 149.

  • Rubinstein, I. (2013). Big data: The end of privacy or a new beginning. International Data Privacy Law, 3(2), 74.

  • Russell, S., & Norvig, P. (2016). Artificial intelligence: A modern approach. Upper Saddle River, New Jersey: Pearson.

  • Sartor, G. (1993). Artificial intelligence and law: Legal philosophy and legal theory. Norwegian Research Centre for Computers and Law, CompLex 1/93, Tano, Oslo.


Acknowledgements

This chapter was written as part of the Future Regulation of Industrial Internet project at the IPR University Center, Helsinki. I wish to express my gratitude to Olli Pitkänen for comments on an earlier draft of this chapter.

Author information

Correspondence to Sam Wrigley.

Copyright information

© 2018 Springer Nature Singapore Pte Ltd.

About this chapter


Cite this chapter

Wrigley, S. (2018). Taming Artificial Intelligence: “Bots,” the GDPR and Regulatory Approaches. In: Corrales, M., Fenwick, M., Forgó, N. (eds) Robotics, AI and the Future of Law. Perspectives in Law, Business and Innovation. Springer, Singapore. https://doi.org/10.1007/978-981-13-2874-9_8


  • DOI: https://doi.org/10.1007/978-981-13-2874-9_8

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-13-2873-2

  • Online ISBN: 978-981-13-2874-9

  • eBook Packages: Law and Criminology (R0)
