
Informational Privacy and Trust in Autonomous Intelligent Systems

Chapter in: Towards Trustworthy Artificial Intelligent Systems

Part of the book series: Intelligent Systems, Control and Automation: Science and Engineering (ISCA, volume 102)

Abstract

For a successful societal deployment of Autonomous Intelligent Systems (AIS), citizens’ trust is crucial. The central theme of this chapter is the relationship between informational privacy protection, trust, and the acceptance of autonomous intelligent technology. This contribution is written from a legal perspective. It analyses rules of the EU General Data Protection Regulation (GDPR) that may apply to AIS solutions, and investigates to what extent the application of these rules may influence citizens’ trust that their informational privacy is well protected.

The themes and ideas presented in this contribution are also covered in de Bruin’s dissertation, which was published in 2022 [1].


Notes

  1.

    See inter alia Chopra, S. and White, L.F., A Legal Theory for Autonomous Intelligent Agents, Ann Arbor: University of Michigan Press 2011, p. 10 (autonomy) and Davies, C.R., ‘An evolutionary step in intellectual property rights—Artificial intelligence and intellectual property’, Computer Law & Security Review 27, 2011, p. 601–619 (intelligence); and Cock Buning, M. de, Belder, L., & Bruin, R.W. de, “Mapping the Legal Framework for the introduction into Society of Robots as Autonomous Intelligent Systems”, in: Muller S., et al., The Law of the Future and the Future of Law: Volume II, TOAP 2012, p. 198, and further references there.

  2.

    Ibidem.

  3.

    See for example Williams, A.P., and Scharre, P.D., Autonomous Systems—Issues for Defence Policymakers, The Hague: NATO Communications and Information Agency 2015, p. 4; Scharre, P., Army of None: Autonomous Weapons and the Future of War, New York: W. W. Norton & Company 2018; http://www.futureforall.org/transportation/future_of_transportation.htm; Strickland, E., “Autonomous Robot Surgeon Bests Humans in World First”, IEEE Spectrum 4 May 2016, via http://spectrum.ieee.org/the-human-os/robotics/medical-robots/autonomous-robot-surgeon-bests-human-surgeons-in-world-first; Bhorat, Z., ‘Do we still need human judges in the age of Artificial Intelligence?’, Opendemocracy.net, 9 August 2017, available via https://www.opendemocracy.net/transformation/ziyaad-bhorat/do-we-still-need-human-judges-in-age-of-artificial-intelligence.

  4.

    See for example, on the relationship between innovation in the field of autonomous vehicles and its societal acceptance: Rezvani, Z., Jansson, J., and Bodin, J., “Advances in consumer electric vehicle adoption research: A review and research agenda”, Transportation Research Part D 34, 2015, pp. 122–136 (Rezvani, Jansson & Bodin 2015). See also the other contributions in this volume.

  5.

    Ajzen, I., “The theory of planned behavior”, Organizational Behavior and Human Decision Processes 50, 1991, pp. 179–211, cited in Rezvani, Jansson & Bodin 2015, pp. 126–128.

  6.

    Rogers, E.M., Diffusion of Innovations, New York: Free Press 2003 (5th edition), which builds upon the first edition published in 1962.

  7.

    Carter, L., and Bélanger, F., “The utilization of e-government services: citizen trust, innovation and acceptance factors”, Information Systems Journal 2005, no. 15, pp. 5–25 (Carter and Bélanger 2005); Van Slyke et al. 2004.

  8.

    See also Rousseau, D.M., Burt, R.S., Sitkin, S., Camerer, C.F., “Not So Different After All: A Cross-discipline View of Trust”, The Academy of Management Review, 1998, p. 395, cited in Cock Buning, M. de & Senden, L. (eds.), Private Regulation and Enforcement in the EU, p. 20, and their reference to Six, F., and Verhoest, K. (eds.), Trust in Regulatory Regimes, Cheltenham: Edward Elgar Publishing 2017, p. 3.

  9.

    Ibidem Rousseau et al., 1998, p. 395.

  10.

    See Carter & Bélanger 2005, p. 9, 21; Van Slyke, C., Belanger, F., and Comunale, C.L., “Factors influencing the adoption of web-based shopping: the impact of trust”, ACM SIGMIS Database: the DATABASE for Advances in Information Systems, Volume 35, Issue 2, Spring 2004, pp. 32–49 (Van Slyke et al. 2004); Balboni, P., Trustmarks: Third-party liability of trustmark organisations in Europe, Tilburg (diss.) 2008, available online via https://pure.uvt.nl/ws/portalfiles/portal/1063399/Trustmarks.PDF, pp. 9–10. See also regarding autonomous vehicles: Fagnant, D.J., & Kockelman, K., “Preparing a nation for autonomous vehicles: opportunities, barriers and policy recommendations”, Transportation Research Part A, 77 (2015), pp. 177–178; and Glancy, D.J., “Privacy in Autonomous Vehicles”, Santa Clara Law Review 2012, vol. 52, no. 4, p. 1225 (Glancy 2012).

  11.

    Glancy 2012, p. 1225.

  12.

    Ibidem, p. 1225–1226.

  13.

    Recital 10 to the GDPR.

  14.

    Recitals 6, 7, 10, 12, 13 to the GDPR.

  15.

    Ibidem recital 7.

  16.

    See Article 3(1) GDPR.

  17.

    See Article 3(2) GDPR.

  18.

    Article 4(1) GDPR.

  19.

    Article 4(2) GDPR.

  20.

    See Kulk S., & Van Deursen, S. (eds.), Juridische aspecten van algoritmen die besluiten nemen – een verkennend onderzoek, UU/WODC 2020, p. 74–75; Surden H., & Williams, M.A., “Technological Opacity, Predictability, and Self-Driving Cars”, Cardozo Law Review 2016 vol 36, no. 1, p. 121–181.

  21.

    See for example Malone, B., Simovski, B., Moliné, C. et al., “Artificial intelligence predicts the immunogenic landscape of SARS-CoV-2 leading to universal blueprints for vaccine designs”, Scientific Reports, 23 December 2020, 10, no. 22375; Waltz, E., “What AI Can-and Can’t-Do in the Race for a Coronavirus Vaccine”, IEEE Spectrum, 20 September 2020, via https://spectrum.ieee.org/artificial-intelligence/medical-ai/what-ai-can-and-cant-do-in-the-race-for-a-coronavirus-vaccine (last accessed 15 January 2021).

  22.

    See the post by the De Montfort University, “Researchers argue that Artificial Intelligence can help decide who gets a Covid-19 vaccine first”, 12 January 2021, via https://www.dmu.ac.uk/about-dmu/news/2021/january/researchers-argue-that-artificial-intelligence-can-help-decide-who-gets-a-covid-19-vaccine-first.aspx (last accessed 15 January 2021).

  23.

    See for example Ahuja, A.S., “The impact of artificial intelligence in medicine on the future role of the physician”, PeerJ 2019; 7: e7702, https://doi.org/10.7717/peerj.7702, who observes more generally, on the basis of a literature study, that AI can be used to decide which medicine to prescribe.

  24.

    Article 5(1)(a) GDPR.

  25.

    Ibidem, sub b.

  26.

    Ibidem, sub c.

  27.

    Ibidem, sub d.

  28.

    Ibidem, sub e.

  29.

    Ibidem, sub f.

  30.

    Article 5(2) GDPR.

  31.

    Other obligations include, for example, keeping a comprehensive record of data processing activities (Article 30 GDPR), and implementing privacy-by-design and privacy-by-default principles in organisations and innovations (for example, software solutions) before data processing activities begin (Article 25 GDPR).

  32.

    Article 32(1)(a–d) GDPR.

  33.

    The GDPR distinguishes between “normal” personal data and “special category data”, which are defined in Article 9(1) as data: “revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person's sex life or sexual orientation”.

  34.

    As occurred, for instance, in the recent Dutch VoetbalTV case: Rechtbank (District Court) Midden-Nederland 23 November 2020, ECLI:NL:RBMNE:2020:5111.

  35.

    European Data Protection Board, Guidelines 1/2020 on processing personal data in the context of connected vehicles and mobility related applications, version 1.0, 28 January 2020 (EDPB 01/2020).

  36.

    Ibidem, p. 19–20.

  37.

    Ibidem.

  38.

    Ibidem, p. 20.

  39.

    Ibidem, p. 29.

  40.

    European Data Protection Board, Guidelines 03/2020 on the processing of data concerning health for the purpose of scientific research in the context of the COVID-19 outbreak, 21 April 2020 (EDPB 03/2020).

  41.

    Article 35(1) GDPR.

  42.

    Article 35(3)(a–c).

  43.

    Article 35(7) GDPR.

  44.

    Article 36 GDPR.

  45.

    Article 29 Data Protection Working Party, “Guidelines on Data Protection Impact Assessment (DPIA) and determining whether processing is ‘likely to result in a high risk’ for the purposes of Regulation 2016/679”, 17/EN WP 248 rev.01.

  46.

    Article 29 Data Protection Working Party, “Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679”, 3 October 2017, revised on 6 February 2018, 17/EN WP 251 rev.01, p. 19–21.

  47.

    Article 22(2) GDPR.

  48.

    Article 22(4) GDPR.

  49.

    Article 44 GDPR.

  50.

    Article 45(1) GDPR.

  51.

    Article 47 GDPR.

  52.

    Article 46(1) GDPR.

  53.

    Commission Implementing Decision (EU) 2016/1250 of 12 July 2016, pursuant to Directive 95/46/EC of the European Parliament and of the Council on the adequacy of the protection provided by the EU-U.S. Privacy Shield, OJ L 207/1.

  54.

    CJEU 6 October 2015, C-362/14, ECLI:EU:C:2015:650 (Schrems I).

  55.

    CJEU 16 July 2020, C-311/18, ECLI:EU:C:2020:559 (Max Schrems II), no. 192.

  56.

    CJEU Max Schrems II, para. 134; ruling no. 2.

  57.

    European Data Protection Board, Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data, 10 November 2020 (EDPB R 01/2020); and Recommendations 02/2020 on the European Essential Guarantees for surveillance measures, 10 November 2020 (EDPB R 02/2020).

  58.

    EDPB R 01/2020, p. 22–23 (on backup and hosting); 24–25 (on transit).

  59.

    Ibidem, p. 23–24.

  60.

    Ibidem, p. 26–27.

  61.

    Article 40 GDPR.

  62.

    Article 42 GDPR.

Reference

  1. Bruin, R.W. de, Regulating Innovation of Autonomous Vehicles: Improving Liability & Privacy in Europe (diss.), Amsterdam: deLex 2022.


Corresponding author

Correspondence to Roeland de Bruin.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG


Cite this chapter

de Bruin, R. (2022). Informational Privacy and Trust in Autonomous Intelligent Systems. In: Ferreira, M.I.A., Tokhi, M.O. (eds) Towards Trustworthy Artificial Intelligent Systems. Intelligent Systems, Control and Automation: Science and Engineering, vol 102. Springer, Cham. https://doi.org/10.1007/978-3-031-09823-9_3
