
Facial Recognition for Preventive Purposes: The Human Rights Implications of Detecting Emotions in Public Spaces

Chapter in: Investigating and Preventing Crime in the Digital Era

Abstract

Police departments are increasingly relying on surveillance technologies to tackle public security issues in smart cities. Automated facial recognition is deployed in public spaces for the real-time identification of suspects and wanted individuals. In some cases, law enforcement goes even further by also exploiting emotion recognition technologies. Indeed, in preventive operations, emotion facial recognition (EFR) is used to infer individuals’ inner affective states from traits such as facial muscle movements. In this way, law enforcement aims to obtain insightful hints about unknown persons acting suspiciously in public or strategic venues (e.g. train stations, airports). While the use of such tools may still seem relegated to dystopian scenarios, it is already a reality in some parts of the world. Hence, a need emerges to explore their compatibility with the European human rights framework. The Chapter undertakes this task and examines whether and how EFR can be considered compliant with the rights to privacy and data protection, the freedom of thought and the presumption of innocence.

Isadora Neroni Rezende is a PhD Candidate in Law, Science and Technology—Rights of the Internet of Everything.


Notes

  1. 1.

    Albino et al. (2015), p. 2 ff.; Kummitha and Crutzen (2017), pp. 43, 45.

  2. 2.

    Marat and Sutton (2021), p. 248.

  3. 3.

    On the potentialities of AI applications in law enforcement and criminal justice, see Lasagni, in this volume; Caianiello (2021).

  4. 4.

    Adapted from Article 29 Data Protection Working Party (2012), p. 2. For an overview of face recognition technologies, their functioning, issues and implications see Berle (2020), pp. 1–17.

  5. 5.

    Facial recognition technologies indeed process biometric data, a special category of personal data. On the notion of biometric data, see generally Kindt (2018).

  6. 6.

    See, e.g., BBC News (2018).

  7. 7.

    Notably, in China AFR is used to catch jaywalkers. See Liao (2018).

  8. 8.

    Heilweil (2020).

  9. 9.

Clearview AI is a US-based tech company which provides facial recognition services. Notably, the app developed by Clearview runs its software not only on government-held images, but also on people’s pictures scraped from social media networks. On the matter see Neroni Rezende (2020).

  10. 10.

    O’Flaherty (2020), p. 170.

  11. 11.

    See, e.g., Raab (2019).

  12. 12.

    Peeters (2020).

  13. 13.

    Affective computing comprises both “the creation of and interaction with machine systems that sense, recognize, respond to, and influence emotions”. See Daily et al. (2017), p. 213.

  14. 14.

McStay (2020), p. 1.

  15. 15.

    Article 19 (2021), p. 15.

  16. 16.

On different applications beyond the security domain, see McStay (2020).

  17. 17.

    On lie detectors and their implications in criminal proceedings, see Lasagni (2021).

  18. 18.

    Article 19 (2021), p. 21.

  19. 19.

    Crawford (2021).

  20. 20.

Article 19 (2021), p. 19.

  21. 21.

    Id.

  22. 22.

    Id.

  23. 23.

iBorderCtrl (2016). Critically assessed by Sánchez-Monedero and Dencik (2020).

  24. 24.

    Article 19 (2021), p. 19. See also Gallagher and Jona (2019).

  25. 25.

Proposal for a Regulation of the European Parliament and of the Council Laying Down Harmonised Rules on Artificial Intelligence (Artificial Intelligence Act) and Amending Certain Union Legislative Acts, COM/2021/206 final. Critically assessed by Veale and Borgesius (2021); Papakonstantinou and De Hert (2021).

  26. 26.

Malgieri and Ienca indeed note that the classification scheme for high-risk AI systems seems to revolve around three main criteria: (i) the type of AI system; (ii) its domain of application; and (iii) its human target. This implies that AI systems featuring limited risks would be prohibited if they were employed in very sensitive contexts and used for practices falling under the unacceptable-risk list. This mechanism emerges clearly in the case of EFR, which is labelled as low-risk when employed, for instance, in the commercial context, and as high-risk when used in law enforcement or education. See Malgieri and Ienca (2021). The Consultative Committee of Convention 108+ has also highlighted the sensitivity of the law enforcement context, also in light of the power asymmetries between public authorities and data subjects. See Consultative Committee of the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (2021).

  27. 27.

    It should be noted that biometric identification systems should not be conflated with biometric classification ones. Generally speaking, facial recognition technologies may have three different purposes: (i) verification/authentication; (ii) identification; (iii) classification/categorization. For an overview, see Castelvecchi (2020).

  28. 28.

    Art. 3(36) of the Proposal.

  29. 29.

    See Recital 9 of the Proposal for a notion of “publicly accessible place”.

  30. 30.

See Art. 2(2) of the Framework Decision 2002/584/JHA: Council Framework Decision of 13 June 2002 on the European arrest warrant and the surrender procedures between Member States—Statements made by certain Member States on the adoption of the Framework Decision, OJ L 190, 18.7.2002, pp. 1–20.

  31. 31.

Cf. Article 10 of Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, and repealing Council Framework Decision 2008/977/JHA, OJ L 119, 4.5.2016, pp. 89–131 (the Police Directive).

  32. 32.

    Art. 3(34) of the Proposal.

  33. 33.

    See note 27 for an overview of different facial recognition systems.

  34. 34.

In this case emotion facial recognition technologies are also referred to as “soft biometrics”. See McStay (2020), p. 4. Examples of this kind of application include EFR embedded in billboards and shopping-mall cameras to register people’s emotional reactions to adverts displayed in public venues.

  35. 35.

    Kotsoglou and Oswald (2020), p. 87; Neroni Rezende (2020), pp. 382–383.

  36. 36.

    Below, Sect. 5.

  37. 37.

Art. 52(1) of the Proposal. Under Art. 52(2), this applies also to biometric classification.

  38. 38.

    On the position of the international agreements concluded by the Union within the hierarchy of the sources of EU law, see Adam and Tizzano (2014), pp. 149–156.

  39. 39.

    See, e.g., European Parliament (2021).

  40. 40.

    The European framework has been rightly described as a multilevel system of protection of fundamental rights. See Kostoris (2018), p. 68 ff.

  41. 41.

    Cf. Neroni Rezende (2021), p. 375, note 63. On the qualification of data stemming from EFR processing as personal data and thus the applicability of the EU data protection framework, see Ienca and Malgieri (2021).

  42. 42.

    See, e.g., Art. 1(1) of the Police Directive.

  43. 43.

    Rouvroy and Poullet (2009), p. 50; Hildebrandt (2010), pp. 36–37.

  44. 44.

    In philosophy, see Nussbaum (2001), p. 33. In cognitive research, see Feldman Barrett (2017), pp. 1–23; Science Daily (2017).

  45. 45.

Notably, Recital 41 of the Proposal for the AI Regulation makes clear that the latter cannot be understood as providing a legal basis for the use of the technologies and related data processing operations addressed in the text.

  46. 46.

    Art. 4(1)(b) of the Police Directive.

  47. 47.

    For a very detailed reconstruction of how the case law of the ECtHR and CJEU evolved in this respect, see De Hert and Malgieri (2020).

  48. 48.

    See, e.g. ECtHR, Roman Zakharov v Russia, judgement of 4 December 2015, Appl. No.47143/06, para. 229; ECtHR, Big Brother Watch and others v the United Kingdom, judgement of 13 September 2018, Appl. Nos. 58170/13, 62322/14 and 24960/15, para. 306.

  49. 49.

ECtHR, Grand Chamber, Big Brother Watch and Others v United Kingdom, judgement of 25 May 2021, Appl. Nos. 58170/13, 62322/14 and 24960/15, para. 333; ECtHR, Zakharov v Russia, para. 229; ECtHR, Malone v the United Kingdom, judgement of 2 August 1984, Appl. No. 8691/79, para. 67; ECtHR, Huvig v France, judgement of 24 April 1990, Appl. No. 11105/84, para. 29; ECtHR, Kruslin v France, judgement of 24 April 1990, Appl. No. 11801/85, para. 30; ECtHR, Rotaru v Romania, judgement of 4 May 2000, Appl. No. 28341/95, para. 55; ECtHR, Weber and Saravia v Germany, judgement of 29 June 2006, Appl. No. 54934/00, para. 93.

  50. 50.

    Cf. CJEU, Digital Rights Ireland and Others, judgement of 8 April 2014, Joined Cases C-293/12 and C-594/12, paras. 41–42; CJEU, La Quadrature du Net and Others, judgement of 6 October 2020, Joined Cases C-511/18, C-512/18, C-520/18, para. 122.

  51. 51.

    Brkan (2018), p. 333.

  52. 52.

    Id., p. 339; Ojanen (2016), p. 324.

  53. 53.

    CJEU, Nold v Commission, judgement of 14 May 1974, Case C-4/73, para. 14 [emphasis added].

  54. 54.

    Brkan (2018), p. 347.

  55. 55.

    Id., pp. 341–344.

  56. 56.

    Id., pp. 348–349 (discussing the inconsistency of the interpretation and application of the notion in the ECtHR’s jurisprudence).

  57. 57.

    Id., p. 336.

  58. 58.

    Id.

  59. 59.

    Brkan (2018), p. 360. As for the ECHR, a similar stance is proposed by Rivers (2006), pp. 184–185.

  60. 60.

    Compare CJEU, Digital Rights, paras. 39–40; CJEU, Tele2 Sverige and Watson and Others, judgement of 21 December 2016, Joined Cases C-203/15 and C-698/15, para. 101; CJEU, Opinion 1/15, Opinion of the Court (Grand Chamber) of 26 July 2017, para 150.

  61. 61.

    CJEU, Maximillian Schrems v Data Protection Commissioner, judgement of 6 October 2015, Case C-362/14, paras. 94–95. For a thorough analysis, see Ojanen (2016).

  62. 62.

    Ojanen (2016), p. 326. Christofi and Verdoodt (2019). Tzanou (2017), p. 43. See also Brkan (2018), p. 363 ff.

  63. 63.

    Koops et al. (2017), pp. 531–532; Mantovani (2013), p. 588, note 6.

  64. 64.

    Koops et al. (2017), p. 531.

  65. 65.

    Schabas (2017), p. 420.

  66. 66.

The scientific community is quite divided on whether EFR technologies are accurate and can actually “read our minds”. As reported by Murgia (2021), the EFR company 4LittleTrees claims around 85% accuracy, while Affectiva claims more than 90%, as indicated by Heaven (2020), p. 504. Nonetheless, these results should be taken with a grain of salt. Indeed, one of the major underlying issues concerning the accuracy of these technologies seems to be data annotation. Before an EFR system is trained, datasets need to be labelled by humans choosing whether a given individual in a picture is expressing feelings of fear, happiness etc., often without any context. Even in this case, experts disagree about whether humans are always able to correctly read others’ facial expressions. In this sense, a panel of experts led by psychologist Lisa Feldman Barrett has recently reviewed more than 1000 contributions on the matter, concluding that there is little to no evidence that people can reliably infer someone else’s emotional state from a set of facial movements. See Heaven (2020), p. 503. Cf. also Chen et al. (2018).

  67. 67.

    Prosser (1984, original work published in 1960), p. 107; Schoeman (1984), p. 16.

  68. 68.

    Brkan (2018), p. 365.

  69. 69.

    See Explanation on Article 1. Explanations relating to the Charter of Fundamental Rights OJ C 303, 14.12.2007, pp. 17–35.

  70. 70.

McStay (2020), p. 3 (citing Wiewiórowski (2019)).

  71. 71.

The case may be different where the user voluntarily decides to interact with emotional AI; see McStay (2018).

  72. 72.

    See Crawford (2021); Thomas (2018); Kelion (2019).

  73. 73.

    Article 19 (2021), p. 6; McStay (2020), p. 2.

  74. 74.

    Crawford (2021); Article 19 (2021), pp. 15–16; Sedenberg and Chuang (2017), p. 2; Korte (2020). For empirical evidence, see Chen et al. (2018).

  75. 75.

    EDPS (2017), p. 5.

  76. 76.

    Schwartz (2019).

  77. 77.

    Id.

  78. 78.

    Ackerman (2017).

  79. 79.

    US Government Accountability Office (2013). However, the US government has not completely given up emotional biometrics initiatives in the aviation security field, see Hogdson (2019).

  80. 80.

    European Union Agency for Fundamental Rights (2019), p. 3.

  81. 81.

    Cf. CJEU, Digital Rights, para. 48. Within the ECtHR’s case law see ECtHR, Segerstedt-Wiberg and Others v Sweden, judgement of 6 June 2006, Appl. No. 62332/00, para. 88.

  82. 82.

    Id. para. 54. See also Ienca and Malgieri (2021), pp. 9–10.

  83. 83.

    The notion of uncontrolled environments covers “places freely accessible to individuals, where they can also pass through, including public and quasi-public spaces such as shopping malls, hospitals, or schools”. Consultative Committee (2021), p. 5.

  84. 84.

    Information Commissioner’s Office (ICO) (2021), p. 9; see Consultative Committee (2021), p. 6 (discussing the role of consent in the use of AFR by public authorities).

  85. 85.

    CJEU, Digital Rights, para. 37.

  86. 86.

While the ECtHR has acknowledged that subsequent notification is a relevant factor when assessing the effectiveness of remedies (see ECtHR, Roman Zakharov, para. 234; see also ECtHR, Klass and Others v Germany, judgement of 6 September 1978, Appl. No. 5029/71, paras. 68–71; ECtHR, Weber and Saravia, para. 135), it has also considered that in bulk interception systems remedies that do not depend on prior individual notification may even provide better guarantees (see ECtHR, Big Brother Watch, para. 358). On notification in the ECtHR’s surveillance case law, see De Hert and Malgieri (2020), pp. 26–29.

  87. 87.

    CJEU, Tele2/Watson, para. 121.

  88. 88.

The same happens when social media databases are integrated into AFR software, enabling the identification of people who have not been inserted in watchlists. See Neroni Rezende (2020), p. 385.

  89. 89.

    2021/0106(COD) Artificial Intelligence Act, Legislative Observatory.

  90. 90.

    EDPB-EDPS (2021), p. 3.

  91. 91.

    Id. The same opinion is shared by the Consultative Committee (2021), p. 5 [emphasis added].

  92. 92.

    For instance, it has emerged that controllers often give insufficient consideration to necessity and proportionality issues tied to the deployment of such systems. See ICO (2021), p. 11.

  93. 93.

    Above, Sect. 2.

  94. 94.

The same categories of offences are listed as constituting serious crime in Annex II of Directive (EU) 2016/681 of the European Parliament and of the Council of 27 April 2016 on the use of passenger name record (PNR) data for the prevention, detection, investigation and prosecution of terrorist offences and serious crime, OJ L 119, 4.5.2016, pp. 132–149.

  95. 95.

    On the issues that the growing proximity between intelligence and law enforcement has raised, see generally Vervaele (2005); De Hert (2005).

  96. 96.

    CJEU, La Quadrature du Net, paras. 135–137.

  97. 97.

    Ienca and Malgieri identify “mental data” with emotions or other thoughts that are not “related to health status, sexuality or political/religious beliefs”. See Ienca and Malgieri (2021), p. 1.

  98. 98.

CJEU, Digital Rights, para. 57; CJEU, Maximillian Schrems v Data Protection Commissioner, para. 93; Tele2/Watson, para. 110; CJEU, Opinion 1/15, para. 191; CJEU, La Quadrature du Net, para. 133; CJEU, Privacy International v Secretary of State for Foreign and Commonwealth Affairs and Others, judgement of 6 October 2020, Case C-623/17, para. 78. With regard to the application of these criteria to the case of the AFR app Clearview, see Neroni Rezende (2020), p. 385 ff.

  99. 99.

This approach has also made inroads outside the EU legal system with the recent decision R (Bridges) v the Chief Constable of South Wales Police [2020] EWCA Civ 1058. Specifically, the Court of Appeal stated that two concerns arose within the legal framework of AFR Locate, namely the “who question” and the “where question”. Indeed, in relation to the people who could be inserted in the watchlists and the locations where the technology could be deployed, legal rules were too generic and left an excessive margin of appreciation to public authorities.

  100. 100.

Starting from the Huvig judgement, the ECtHR elaborated a set of foreseeability criteria against which surveillance laws need to be assessed. These criteria were later refined in the Weber and Saravia case, and have since been called the “Huvig” or “Weber” criteria. According to De Hert and Malgieri, these criteria have been implicitly integrated into the CJEU case law since the Digital Rights Ireland case. See De Hert and Malgieri (2020), p. 32.

  101. 101.

    See Art. 11 of the Police Directive and Art. 22 GDPR.

  102. 102.

    A similar mechanism is already provided in Art. 6(5) of the PNR Directive.

  103. 103.

The need to differentiate storage regimes according to the specific situation of data subjects emerges clearly in the case law of the CJEU. See CJEU, Opinion 1/15, paras. 196–203.

  104. 104.

The AFR Locate program, implemented by the Welsh police and censured in the Bridges case, provided that where the facial data processing of passers-by did not lead to any positive match, such data were to be immediately erased. See R (Bridges) v Chief Constable of the South Wales Police [2019] EWHC 2341, para. 16.

  105. 105.

    See paragraph below.

  106. 106.

    De Hert and Malgieri (2020), p. 10. See also Malgieri and De Hert (2017).

  107. 107.

    See De Hert and Malgieri (2020), pp. 26–29.

  108. 108.

    Big Brother Watch, para. 358.

  109. 109.

    See, e.g., Caianiello (2019); Hadjimatheou (2017), p. 40; De Hert (2005), p. 85.

  110. 110.

As indicated by Sayers (2014), p. 1305 ff. In 2012, the CJEU recognized the presumption of innocence as “a feature of the constitutional traditions common to the Member States”. See CJEU, Criminal proceedings against Marcello Costa and Ugo Cifone, judgement of 16 February 2012, Joined Cases C-72/10 and C-77/10, para. 86.

  111. 111.

All EU Member States are party to the International Covenant on Civil and Political Rights, whose Art. 14(2) explicitly refers to the accused’s “right to be presumed innocent until proved guilty according to law”.

  112. 112.

In primary EU law, the presumption of innocence is enshrined in Art. 48 of the Charter, whose explanations equate its content to that of Art. 6(2) of the Convention. Even before the entry into force of the Charter, however, the CJEU had already recognized the presumption of innocence as one of the fundamental rights protected in Union law (see CJEU, Montecatini S.p.A, judgement of 8 July 1999, Case C-235/92, para. 175). At the level of secondary law, this right is explicitly recalled in Art. 2 of the Directive (EU) 2016/343 of the European Parliament and of the Council of 9 March 2016 on the strengthening of certain aspects of the presumption of innocence and of the right to be present at the trial in criminal proceedings, OJ L 65, 11.3.2016, pp. 1–11. On the protection of the presumption of innocence in the EU legal system, see generally Sayers (2014); Balsamo (2018), pp. 253–255; Manes and Caianiello (2020), pp. 249–255.

  113. 113.

    ECtHR, Konstas v. Greece, judgement of 24 May 2011, Appl. No.53466/07, para. 29. It is no surprise that the ECtHR frequently examines complaints of violations of the presumption of innocence with joint reference to both the first and second paragraph of Art. 6.

  114. 114.

This means that the burden of proof is placed on the prosecution, and any doubt as to the criminal responsibility of the accused should benefit the latter. Cf. ECtHR, John Murray v. United Kingdom, judgement of 8 February 1996, Appl. No. 18731/91, para. 54; ECtHR, Telfner v. Austria, judgement of 20 March 2001, Appl. No. 33501/96.

  115. 115.

This rule prohibits the accused person from being considered or treated as guilty before her responsibility has been established by a court of law. Cf. ECtHR, Shyti v. Romania, judgement of 19 November 2013, Appl. No. 12042/05.

  116. 116.

    De Hert (2005), p. 85.

  117. 117.

In this regard, it should also be noted that the Charter uses more neutral language compared to the Convention. Indeed, while Art. 6(2) ECHR employs the expression “charged with a criminal offence”—which should nonetheless be interpreted in light of the so-called ‘Engel criteria’—the Charter only uses the term ‘charged’, avoiding any explicit reference to criminal offences.

  118. 118.

    Hadjimatheou (2017), pp. 41, 43 ff.

  119. 119.

    Ashworth and Zedner (2014), p. 66.

  120. 120.

Id., p. 130 (citing Floud and Young (1982)).

  121. 121.

    Hadjimatheou (2017), p. 41.

  122. 122.

    On the preventive turn taken by policing and the role played by digital technologies, see van Brakel and De Hert (2011); Brayne (2017); Ferguson (2017).

  123. 123.

Wrongful criminalization can be defined as “treating someone as if they have a particular propensity towards criminality or indeed are already involved in criminal activity, without proper grounds for doing so”. See Hadjimatheou (2017), p. 45.

  124. 124.

    Balsamo (2018), p. 116.

  125. 125.

    Above, Sect. 4.

  126. 126.

    CJEU, Digital Rights, para. 37.

  127. 127.

    CJEU, La Quadrature du Net, para. 143 [emphasis added].

  128. 128.

    CJEU, La Quadrature du Net, para. 188.

  129. 129.

ECtHR, Salabiaku v. France, judgement of 7 October 1988, Appl. No. 10519/83, para. 28. More recently, see also ECtHR, Iasir v. Belgium, judgement of 26 January 2016, Appl. No. 21614/12, para. 30. In EU law, this approach was confirmed in the Directive on the presumption of innocence. Its Recital (22) indicates that the principle is not infringed by the use of presumptions, provided that these are “rebuttable”, “used only where the rights of the defence are respected”, and “confined within reasonable limits”, also considering the proportionality of the means employed in relation to the aims pursued.

  130. 130.

    Above, Sect. 4.

  131. 131.

On the issues brought about by the use of AI systems with regard to defence rights, especially in adversarial systems, see Contissa and Lasagni (2020); Quattrocolo (2019).

References


Acknowledgement

This contribution was written in the framework of the European Union’s Horizon 2020 research and innovation programme under the Marie Sklodowska-Curie ITN EJD ‘Law, Science and Technology Rights of Internet of Everything’ grant agreement no. 814177.

Author information


Corresponding author

Correspondence to Isadora Neroni Rezende.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter


Cite this chapter

Neroni Rezende, I. (2022). Facial Recognition for Preventive Purposes: The Human Rights Implications of Detecting Emotions in Public Spaces. In: Bachmaier Winter, L., Ruggeri, S. (eds) Investigating and Preventing Crime in the Digital Era. Legal Studies in International, European and Comparative Criminal Law, vol 7. Springer, Cham. https://doi.org/10.1007/978-3-031-13952-9_4


  • DOI: https://doi.org/10.1007/978-3-031-13952-9_4


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-13951-2

  • Online ISBN: 978-3-031-13952-9

  • eBook Packages: Law and Criminology, Law and Criminology (R0)
