Abstract
Police departments are increasingly relying on surveillance technologies to tackle public security issues in smart cities. Automated facial recognition is deployed in public spaces for real-time identification of suspects and wanted individuals. In some cases, law enforcement goes even further by also exploiting emotion recognition technologies. In preventive operations, emotion facial recognition (EFR) is used to infer individuals’ inner affective states from traits such as facial muscle movements. In this way, law enforcement aims to obtain insightful hints about unknown persons acting suspiciously in public or strategic venues (e.g. train stations, airports). While the employment of such tools may still seem relegated to dystopian scenarios, it is already a reality in some parts of the world. Hence the need to explore their compatibility with the European human rights framework. The Chapter undertakes this task and examines whether and how EFR can be considered compliant with the rights to privacy and data protection, the freedom of thought and the presumption of innocence.
Isadora Neroni Rezende is a PhD Candidate in Law, Science and Technology—Rights of the Internet of Everything.
Notes
- 1.
- 2.
Marat and Sutton (2021), p. 248.
- 3.
On the potentialities of AI applications in law enforcement and criminal justice, see Lasagni, in this volume; Caianiello (2021).
- 4.
- 5.
Facial recognition technologies indeed process biometric data, a special category of personal data. On the notion of biometric data, see generally Kindt (2018).
- 6.
See, e.g., BBC News (2018).
- 7.
Notably, in China AFR is used to catch jaywalkers. See Liao (2018).
- 8.
Heilweil (2020).
- 9.
Clearview AI is a US-based tech company which provides facial recognition services. Notably, the app developed by Clearview runs its software not only on government-held images, but also on people’s pictures scraped from social media networks. On the matter see Neroni Rezende (2020).
- 10.
O’Flaherty (2020), p. 170.
- 11.
See, e.g., Raab (2019).
- 12.
Peeters (2020).
- 13.
Affective computing comprises both “the creation of and interaction with machine systems that sense, recognize, respond to, and influence emotions”. See Daily et al. (2017), p. 213.
- 14.
McStay (2020), p. 1.
- 15.
Article 19 (2021), p. 15.
- 16.
On different applications beyond the security domain, see McStay (2020).
- 17.
On lie detectors and their implications in criminal proceedings, see Lasagni (2021).
- 18.
Article 19 (2021), p. 21.
- 19.
Crawford (2021).
- 20.
Article 19 (2021), p. 19.
- 21.
Id.
- 22.
Id.
- 23.
- 24.
- 25.
Proposal for a Regulation of the European Parliament and of the Council Laying Down Harmonised Rules on Artificial Intelligence (Artificial Intelligence Act) and Amending Certain Union Legislative Acts, COM/2021/206 final. Critically assessed by Veale and Zuiderveen Borgesius (2021); Papakonstantinou and De Hert (2021).
- 26.
Malgieri and Ienca note that the classification scheme for high-risk AI systems seems to revolve around three main criteria: (i) the type of AI system; (ii) its domain of application; and (iii) its human target. This implies that AI systems featuring limited risks would be prohibited if employed in very sensitive contexts and used for practices falling under the unacceptable-risk list. This mechanism emerges clearly in the case of EFR, which is labelled as low-risk when employed, for instance, in the commercial context, and as high-risk when used in law enforcement or education. See Malgieri and Ienca (2021). The Consultative Committee on the 108+ Convention has also highlighted the sensitivity of the law enforcement context, also in light of the power asymmetries between public authorities and data subjects. See Consultative Committee of the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (2021).
- 27.
It should be noted that biometric identification systems should not be conflated with biometric classification ones. Generally speaking, facial recognition technologies may have three different purposes: (i) verification/authentication; (ii) identification; (iii) classification/categorization. For an overview, see Castelvecchi (2020).
- 28.
Art. 3(36) of the Proposal.
- 29.
See Recital 9 of the Proposal for a notion of “publicly accessible place”.
- 30.
See Art. 2(2) of the Framework Decision 2002/584/JHA: Council Framework Decision of 13 June 2002 on the European arrest warrant and the surrender procedures between Member States—Statements made by certain Member States on the adoption of the Framework Decision, OJ L 190, 18.7.2002, pp. 1–20.
- 31.
Cf. Article 10 the Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, and repealing Council Framework Decision 2008/977/JHA, OJ L 119, 4.5.2016, pp. 89–131 (the Police Directive).
- 32.
Art. 3(34) of the Proposal.
- 33.
See note 27 for an overview of different facial recognition systems.
- 34.
In this case, emotion facial recognition technologies are also referred to as “soft biometrics”. See McStay (2020), p. 4. Examples of this kind of application involve EFR embedded in billboards and shopping mall cameras to register people’s emotional reactions to adverts displayed in public venues.
- 35.
- 36.
Below, Sect. 5.
- 37.
Art. 52(1) of the Proposal. Under Art. 52(2), this applies also to biometric classification.
- 38.
On the position of the international agreements concluded by the Union within the hierarchy of the sources of EU law, see Adam and Tizzano (2014), pp. 149–156.
- 39.
See, e.g., European Parliament (2021).
- 40.
The European framework has been rightly described as a multilevel system of protection of fundamental rights. See Kostoris (2018), p. 68 ff.
- 41.
- 42.
See, e.g., Art. 1(1) of the Police Directive.
- 43.
- 44.
- 45.
Notably, Recital 41 of the Proposal for the AI Regulation excludes that the latter can be understood as providing a legal basis for the use of the technologies and related data processing operations addressed in the text.
- 46.
Art. 4(1)(b) of the Police Directive.
- 47.
For a very detailed reconstruction of how the case law of the ECtHR and CJEU evolved in this respect, see De Hert and Malgieri (2020).
- 48.
See, e.g. ECtHR, Roman Zakharov v Russia, judgement of 4 December 2015, Appl. No.47143/06, para. 229; ECtHR, Big Brother Watch and others v the United Kingdom, judgement of 13 September 2018, Appl. Nos. 58170/13, 62322/14 and 24960/15, para. 306.
- 49.
ECtHR, Grand Chamber, Big Brother Watch and Others v United Kingdom, judgement of 25 May 2021, Appl. Nos. 58170/13, 62322/14 and 24960/15, para. 333; ECtHR, Roman Zakharov v Russia, para. 229; ECtHR, Malone v the United Kingdom, judgement of 2 August 1984, Appl. No.8691/79, para. 67; ECtHR, Huvig v France, judgement of 24 April 1990, Appl. No.11105/84, para. 29; ECtHR, Kruslin v France, judgement of 24 April 1990, Appl. No.11801/85, para. 30; ECtHR, Rotaru v Romania, judgement of 4 May 2000, Appl. No. 28341/95, para. 55; ECtHR, Weber and Saravia v Germany, judgement of 29 June 2006, Appl. No.54934/00, para. 93.
- 50.
Cf. CJEU, Digital Rights Ireland and Others, judgement of 8 April 2014, Joined Cases C-293/12 and C-594/12, paras. 41–42; CJEU, La Quadrature du Net and Others, judgement of 6 October 2020, Joined Cases C-511/18, C-512/18, C-520/18, para. 122.
- 51.
Brkan (2018), p. 333.
- 52.
Id., p. 339; Ojanen (2016), p. 324.
- 53.
CJEU, Nold v Commission, judgement of 14 May 1974, Case C-4/73, para. 14 [emphasis added].
- 54.
Brkan (2018), p. 347.
- 55.
Id., pp. 341–344.
- 56.
Id., pp. 348–349 (discussing the inconsistency of the interpretation and application of the notion in the ECtHR’s jurisprudence).
- 57.
Id., p. 336.
- 58.
Id.
- 59.
- 60.
Compare CJEU, Digital Rights, paras. 39–40; CJEU, Tele2 Sverige and Watson and Others, judgement of 21 December 2016, Joined Cases C-203/15 and C-698/15, para. 101; CJEU, Opinion 1/15, Opinion of the Court (Grand Chamber) of 26 July 2017, para. 150.
- 61.
CJEU, Maximillian Schrems v Data Protection Commissioner, judgement of 6 October 2015, Case C-362/14, paras. 94–95. For a thorough analysis, see Ojanen (2016).
- 62.
- 63.
- 64.
Koops et al. (2017), p. 531.
- 65.
Schabas (2017), p. 420.
- 66.
The scientific community is quite divided on whether EFR technologies are accurate and can actually “read our minds”. As reported by Murgia (2021), the EFR company 4LittleTrees claims around 85% accuracy, while Affectiva claims more than 90%, as indicated by Heaven (2020), p. 504. Nonetheless, these results should be taken with a grain of salt. Indeed, one of the major underlying issues concerning the accuracy of these technologies seems to be data annotation. Before an EFR system is trained, datasets need to be labelled by humans choosing whether a given individual in a picture is expressing feelings of fear, happiness etc., often without any context. Even in this case, experts disagree about whether humans are always able to correctly read others’ facial expressions. In this sense, a panel of experts led by psychologist Lisa Feldman Barrett has recently reviewed more than 1000 contributions on the matter, concluding that there is little to no evidence that people can reliably infer someone else’s emotional state from a set of facial movements. See Heaven (2020), p. 503. Cf. also Chen et al. (2018).
- 67.
- 68.
Brkan (2018), p. 365.
- 69.
See Explanation on Article 1. Explanations relating to the Charter of Fundamental Rights OJ C 303, 14.12.2007, pp. 17–35.
- 70.
- 71.
The case might be different where the user voluntarily decides to interact with emotional AI; see McStay (2018).
- 72.
- 73.
- 74.
- 75.
EDPS (2017), p. 5.
- 76.
Schwartz (2019).
- 77.
Id.
- 78.
Ackerman (2017).
- 79.
- 80.
European Union Agency for Fundamental Rights (2019), p. 3.
- 81.
Cf. CJEU, Digital Rights, para. 48. Within the ECtHR’s case law see ECtHR, Segerstedt-Wiberg and Others v Sweden, judgement of 6 June 2006, Appl. No. 62332/00, para. 88.
- 82.
Id. para. 54. See also Ienca and Malgieri (2021), pp. 9–10.
- 83.
The notion of uncontrolled environments covers “places freely accessible to individuals, where they can also pass through, including public and quasi-public spaces such as shopping malls, hospitals, or schools”. Consultative Committee (2021), p. 5.
- 84.
- 85.
CJEU, Digital Rights, para. 37.
- 86.
While the ECtHR has acknowledged that subsequent notification is a relevant factor when assessing the effectiveness of remedies (see ECtHR, Roman Zakharov, para. 234; see also ECtHR, Klass and Others v Germany, judgement of 6 September 1978, Appl. No.5029/71, paras. 68–71; ECtHR, Weber and Saravia, para. 135), it has also considered that in bulk interception systems remedies that do not depend on prior individual notification may even provide better guarantees (see ECtHR, Big Brother Watch, para. 358). On notification in the ECtHR’s surveillance case law, see De Hert and Malgieri (2020), pp. 26–29.
- 87.
CJEU, Tele2/Watson, para. 121.
- 88.
The same happens when social media databases are integrated into AFR software, enabling the identification of people who have not been inserted in watchlists. See Neroni Rezende (2020), p. 385.
- 89.
2021/0106(COD) Artificial Intelligence Act, Legislative Observatory.
- 90.
EDPB-EDPS (2021), p. 3.
- 91.
Id. The same opinion is shared by the Consultative Committee (2021), p. 5 [emphasis added].
- 92.
For instance, it has emerged that controllers often give insufficient consideration to necessity and proportionality issues tied to the deployment of such systems. See ICO (2021), p. 11.
- 93.
Above, Sect. 2.
- 94.
The same categories of offences are listed as constituting serious crime in the Annex II of the Directive (EU) 2016/681 of the European Parliament and of the Council of 27 April 2016 on the use of passenger name record (PNR) data for the prevention, detection, investigation and prosecution of terrorist offences and serious crime, OJ L 119, 4.5.2016, pp. 132–149.
- 95.
- 96.
CJEU, La Quadrature du Net, paras. 135–137.
- 97.
Ienca and Malgieri identify “mental data” with emotions or other thoughts that are not “related to health status, sexuality or political/religious beliefs”. See Ienca and Malgieri (2021), p. 1.
- 98.
CJEU, Digital Rights, para. 57; CJEU, Maximilian Schrems v Data Protection Commissioner, para. 93; Tele2/Watson, para. 110; CJEU, Opinion 1/15, para. 191; CJEU, La Quadrature du Net, para. 133; CJEU, Privacy International v Secretary of State for Foreign and Commonwealth Affairs and Others, judgement of 6 October 2020, Case C-623/17, para. 78. With regard to the application of these criteria to the case of the AFR app Clearview, see Neroni Rezende (2020), p. 385 ff.
- 99.
This approach has also made inroads outside the EU legal system with the recent decision R (Bridges) v the Chief Constable of South Wales Police [2020] EWCA CIV 1058. Specifically, the Court of Appeal stated that two concerns arose within the legal framework of AFR Locate, namely the “who question” and the “where question”. Indeed, in relation to the people that could be inserted in the watchlists and the locations where the technology could be deployed, legal rules were too generic and left an excessive margin of appreciation to public authorities.
- 100.
Starting from the Huvig judgement, the ECtHR elaborated a set of foreseeability criteria against which surveillance laws need to be assessed. These criteria have been later refined in the Weber and Saravia case, and have been thus called “Huvig” or “Weber” criteria since then. According to De Hert and Malgieri, these criteria have been implicitly integrated in the CJEU case law since the Digital Rights Ireland case. See De Hert and Malgieri (2020), p. 32.
- 101.
See Art. 11 of the Police Directive and Art. 22 GDPR.
- 102.
A similar mechanism is already provided in Art. 6(5) of the PNR Directive.
- 103.
The need to establish differences in the storage regime according to the specific situation of the data subjects emerges clearly in the case law of the CJEU. See CJEU, Opinion 1/15, paras. 196–203.
- 104.
The AFR Locate program, implemented by the Welsh police and censured in the Bridges case, provided that, when the processing of passers-by’s facial data did not lead to a positive match, such data had to be immediately erased. See R (Bridges) v Chief Constable of the South Wales Police [2019] EWHC 2341, para. 16.
- 105.
See paragraph below.
- 106.
- 107.
See De Hert and Malgieri (2020), pp. 26–29.
- 108.
Big Brother Watch, para. 358.
- 109.
- 110.
As indicated by Sayers (2014), p. 1305 ff. In 2012, the CJEU recognized the presumption of innocence as “a feature of the constitutional traditions common to the Member States”. See CJEU, Criminal proceedings against Marcello Costa and Ugo Cifone, judgement of 16 February 2012, Joined Cases C-72/10 and C-77/10, para. 86.
- 111.
All EU Member States are party to the International Covenant on Civil and Political Rights, whose Art. 14(2) explicitly refers to the accused’s right “to be presumed innocent until proved guilty according to law”.
- 112.
In primary EU law, the presumption of innocence is enshrined in Art. 48 of the Charter, whose explanations equate its content to that of Art. 6(2) of the Convention. Even before the entry into force of the Charter, however, the CJEU had already recognized the presumption of innocence as one of the fundamental rights protected in Union law (see CJEU, Montecatini S.p.A, judgement of 8 July 1999, Case C-235/92, para. 175). At the level of secondary law, this right is explicitly recalled in Art. 2 of the Directive (EU) 2016/343 of the European Parliament and of the Council of 9 March 2016 on the strengthening of certain aspects of the presumption of innocence and of the right to be present at the trial in criminal proceedings, OJ L 65, 11.3.2016, pp. 1–11. On the protection of the presumption of innocence in the EU legal system, see generally Sayers (2014); Balsamo (2018), pp. 253–255; Manes and Caianiello (2020), pp. 249–255.
- 113.
ECtHR, Konstas v. Greece, judgement of 24 May 2011, Appl. No.53466/07, para. 29. It is no surprise that the ECtHR frequently examines complaints of violations of the presumption of innocence with joint reference to both the first and second paragraph of Art. 6.
- 114.
This means that the burden of proof is placed on the prosecution, and any doubt as to the criminal responsibility of the accused should benefit the latter. Cf. ECtHR, John Murray v. United Kingdom, judgement of 8 February 1996, Appl. No.18731/91, para. 54; ECtHR, Telfner v. Austria, judgement of 20 March 2001, Appl. No.33501/96.
- 115.
This rule prohibits considering or treating the accused person as guilty before her responsibility has been established by a court of law. Cf. ECtHR, Shyti v. Romania, judgement of 19 November 2013, Appl. No. 12042/05.
- 116.
De Hert (2005), p. 85.
- 117.
In this regard, it should also be noted that the Charter uses more neutral language compared to the Convention. Indeed, while Art. 6(2) ECHR employs the expression “charged with a criminal offence”—which should nonetheless be interpreted in light of the so-called ‘Engel criteria’—the Charter only uses the term ‘charged’, avoiding any explicit reference to criminal offences.
- 118.
Hadjimatheou (2017), pp. 41, 43 ff.
- 119.
Ashworth and Zedner (2014), p. 66.
- 120.
Id., p. 130 (citing Floud, Young (1982)).
- 121.
Hadjimatheou (2017), p. 41.
- 122.
- 123.
Wrongful criminalization can be defined as “treating someone as if they have a particular propensity towards criminality or indeed are already involved in criminal activity, without proper grounds for doing so”. See Hadjimatheou (2017), p. 45.
- 124.
Balsamo (2018), p. 116.
- 125.
Above, Sect. 4.
- 126.
CJEU, Digital Rights, para. 37.
- 127.
CJEU, La Quadrature du Net, para. 143 [emphasis added].
- 128.
CJEU, La Quadrature du Net, para. 188.
- 129.
ECtHR, Salabiaku v. France, judgement of 7 October 1988, Appl. No.10519/83, para. 28. More recently, see also ECtHR, Iasir v. Belgium, judgement of 26 January 2016, Appl. No. 21614/12, para. 30. In EU law, this approach was confirmed in the Directive on the presumption of innocence. Its Recital (22) indicates that the principle is not impinged by the use of presumptions, provided that these are “rebuttable”, “used only where the rights of the defence are respected”, and “confined within reasonable limits”, also considering the proportionality of the means employed in relation to the aims pursued.
- 130.
Above, Sect. 4.
- 131.
References
2021/0106(COD) Artificial Intelligence Act, Legislative Observatory. https://oeil.secure.europarl.europa.eu/oeil/popups/ficheprocedure.do?reference=2021/0106(COD)&l=en. Accessed 9 July 2021
Ackerman S (2017) TSA screening program risks racial profiling amid shaky science – study. The Guardian. https://www.theguardian.com/us-news/2017/feb/08/tsa-screening-racial-religious-profiling-aclu-study. Accessed 3 July 2021
Adam R, Tizzano A (2014) Manuale di Diritto dell’Unione Europea. Giappichelli Editore, Torino
Albino V et al (2015) Smart cities: definitions, dimensions, performance and initiatives. J Urban Technol 22:1–19
Article 19 (2021) Emotional Entanglement: China’s emotion recognition market and its implications for human rights. https://www.article19.org/wp-content/uploads/2021/01/ER-Tech-China-Report.pdf. Accessed 22 June 2021
Article 29 Data Protection Working Party (2012) Opinion 02/2012 on facial recognition in online and mobile services, 00727/12/EN, WP 192, Brussels, 22 March 2012
Ashworth A, Zedner L (2014) Preventive justice. Oxford monographs on criminal law and justice. Oxford University Press
Balsamo A (2018) The content of fundamental rights. In: Kostoris RE (ed) Handbook of European criminal procedure. Springer, pp 99–170
BBC News (2018) 2,000 wrongly matched with possible criminals at Champions League. https://www.bbc.com/news/uk-wales-south-west-wales-44007872. Accessed 11 July 2021
Berle I (2020) Face recognition technology. Springer, Law, Governance and Technology Series
Brayne S (2017) Big data surveillance: the case of policing. Am Sociol Rev 82(5):977–1008
Brkan M (2018) The concept of essence of fundamental rights in the EU legal order: peeling the onion to its Core. Eur Const Law Rev 14:332–368
Caianiello M (2019) Criminal process faced with the challenges of scientific and technological development. Eur J Crime Crim Law Crim Just 27(4):267–291
Caianiello M (2021) Dangerous liaisons. Potentialities and risks deriving from the interaction between artificial intelligence and preventive justice. Eur J Crime Crim Law Crim Just 29(1):1–23
Castelvecchi D (2020) Is facial recognition too biased to be let loose? Nature 587:347–349. https://www.nature.com/articles/d41586-020-03186-4. Accessed 28 June 2020
Chen C et al (2018) Distinct facial expressions represent pain and pleasure across cultures. Proc Natl Acad Sci U S A 115(43):E10013–E10021. https://www.pnas.org/content/pnas/115/43/E10013.full.pdf. Accessed 2 July 2021
Christofi A, Verdoodt V (2019) Exploring the essence of the right to data protection and smart cities. CiTiP Working Paper. Available at SSRN: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3483616. Accessed 2 July 2021
Consultative Committee of the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (2021) Guidelines on facial recognition. https://rm.coe.int/guidelines-on-facial-recognition/1680a134f3. Accessed 8 July 2021
Contissa G, Lasagni G (2020) When it is (also) algorithms and AI that decide on criminal matters: in search of an effective remedy. Eur J Crime Crim Law Crim Just 28:280–304
Crawford K (2021) Time to regulate AI that interprets human emotions. Nature 592(7853):167. https://www.nature.com/articles/d41586-021-00868-5. Accessed 15 Aug 2022
Daily SB, James MT, Cherry D, Porter JJ, Darnell SS, Isaac J, Roy T (2017) Affective computing: historical foundations, current applications, and future trends. In: Jeon M (ed) Emotions and affect in human factors and human-computer interaction. Academic Press, pp 213–231
De Hert P (2005) Balancing security and liberty within the European human rights framework. A critical reading of the Court’s case law in the light of surveillance and criminal law enforcement strategies after 9/11. Utrecht Law Rev 1(1):68–96
De Hert P, Malgieri G (2020) Article 8 ECHR Compliant and Foreseeable Surveillance: the ECtHR’s expanded legality requirement copied by the CJEU. A Discussion of European Surveillance Case Law. Brussels Privacy Hub Working Paper 6(1):1–40
EDPB-EDPS (2021) Joint Opinion 5/2021 on the proposal for a Regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence (Artificial Intelligence Act). https://edpb.europa.eu/system/files/2021-06/edpb-edps_joint_opinion_ai_regulation_en.pdf. Accessed 6 July 2021
EDPS (2017) Assessing the necessity of measures that limit the fundamental right to the protection of personal data: a toolkit. https://edps.europa.eu/sites/default/files/publication/17-06-01_necessity_toolkit_final_en.pdf. Accessed 16 Aug 2022
European Parliament (2021) Artificial Intelligence in policing: safeguards needed against mass surveillance. Press Release. https://www.europarl.europa.eu/news/en/press-room/20210624IPR06917/artificial-intelligence-in-policing-safeguards-needed-against-mass-surveillance. Accessed 30 June 2021
European Union Agency for Fundamental Rights (2019) Facial Recognition Technology: Fundamental Rights Considerations in the Context of Law Enforcement. https://fra.europa.eu/en/publication/2019/facial-recognition-technology-fundamental-rights-considerations-context-law. Accessed 6 July 2021
Feldman Barrett L (2017) The theory of constructed emotion: an active inference account of interoception and categorization. Soc Cogn Affect Neurosci 12(11):1–23
Ferguson AG (2017) Big data surveillance: the convergence of big data and law enforcement. In: Gray D, Henderson SE (eds) The Cambridge handbook of surveillance law. Cambridge University Press
Gallagher R, Jona L (2019) We Tested Europe’s New Lie detector for travelers — and immediately triggered a false positive. The Intercept. https://theintercept.com/2019/07/26/europe-border-control-ai-lie-detector/. Accessed 8 July 2021
Hadjimatheou K (2017) Surveillance technologies, wrongful criminalisation, and the presumption of innocence. Philos Technol 30:39–54
Heaven D (2020) Expression of doubt. Nature 578:502–504
Heilweil R (2020) Big tech companies back away from selling facial recognition to police. That’s progress. The Vox. https://www.vox.com/recode/2020/6/10/21287194/amazon-microsoft-ibm-facial-recognition-moratorium-police. Accessed 11 July 2021
Hildebrandt M (2010) Some caveats on profiling. In: Gutwirth S, Poullet Y, de Hert P (eds) Data protection in a profiled world. Springer
Hodgson C (2019) AI lie detector developed for airport security. The Financial Times. https://www.ft.com/content/c9997e24-b211-11e9-bec9-fdcab53d6959. Accessed 8 July 2021
iBorderCtrl (2016) The Project. https://www.iborderctrl.eu/The-project. Accessed 11 July 2021
Ienca M, Malgieri G (2021) Mental Data Protection and the GDPR. Available at SSRN.: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3840403. Accessed 8 July 2021
Information Commissioner’s Office (ICO) (2021) The use of live facial recognition technology by law enforcement in public places. https://ico.org.uk/media/fororganisations/documents/2619985/ico-opinion-the-use-of-lfr-in-public-places-20210618.pdf. Accessed 5 July 2021
Kelion L (2019) Emotion-detecting tech should be restricted by law – AI Now. BBC. https://www.bbc.com/news/technology-50761116. Accessed 15 Aug 2022
Kindt EJ (2018) Having yes, using no? About the new legal regime for biometric data. Comput Law Secur Rev 34(3):523–538
Koops BJ, Newell BC, Timan T, Škorvánek I, Chokrevski T, Galič M (2017) A typology of privacy. U Pa J Int Law 38(2):483–575
Korte A (2020) Facial-Recognition Technology Cannot Read Emotions, Scientists Say. American Association for the Advancement of Science. https://www.aaas.org/news/facial-recognition-technology-cannot-read-emotions-scientists-say. Accessed 8 July 2020
Kostoris RE (2018) The protection of fundamental rights. In: Kostoris RE (ed) Handbook of European criminal procedure. Springer
Kotsoglou KN, Oswald M (2020) The long arm of the algorithm? Automated facial recognition as evidence and trigger for police intervention. Forensic Sci Int Synergy 2:86–89
Kummitha RKR, Crutzen N (2017) How do we understand smart cities? An evolutionary perspective. Cities 67:43–52
Lasagni G (2021) La Falsa Confessione come Causa di Errori Giudiziari. In: Lupària Donati L (ed) L’Errore Giudiziario. Giuffrè, pp 189–225
Liao S (2018) Chinese facial recognition system mistakes a face on a bus for a jaywalker. The Verge. https://www.theverge.com/2018/11/22/18107885/china-facial-recognition-mistaken-jaywalker. Accessed 11 July 2021
Malgieri G, De Hert P (2017) European human rights, criminal surveillance, and intelligence surveillance: towards “good enough” oversight, preferably but not necessarily by judges. In: Gray DC, Henderson SE (eds) The Cambridge handbook of surveillance law. Cambridge University Press, New York, pp 509–532
Malgieri G, Ienca M (2021) The EU regulates AI but forgets to protect our mind. European Law Blog. https://europeanlawblog.eu/2021/07/07/the-eu-regulates-ai-but-forgets-to-protect-our-mind/#more-7784. Accessed 7 July 2021
Manes V, Caianiello M (2020) Manuale di Diritto Penale Europeo. Giappichelli
Marat E, Sutton D (2021) Technological solutions for complex problems: emerging electronic surveillance regimes in Eurasian cities. Eur Asia Stud 73(1):243–267
Mantovani F (2013) Diritto Penale: Parte Speciale I. Delitti Contro la Persona, 5th edn. Cedam
McStay A (2018) The Right to Privacy in the Age of Emotional AI. https://www.ohchr.org/Documents/Issues/DigitalAge/ReportPrivacyinDigitalAge/AndrewMcStayProfessor%20of%20Digital%20Life,%20BangorUniversityWalesUK.pdf. Accessed 2 July 2021
McStay A (2020) Emotional AI, soft biometrics and the surveillance of emotional life: an unusual consensus on privacy. Big Data Soc January-June 2020:1–12
Murgia M (2021) Emotion recognition: can AI detect human feelings from a face?. Financial Times. https://www.ft.com/content/c0b03d1d-f72f-48a8-b342-b4a926109452. Accessed 30 January 2022
Neroni Rezende I (2020) Facial recognition in police hands: assessing the ‘Clearview case’ from a European perspective. New J Eur Crim Law 11(3):375–389
Neroni Rezende I (2021) Predictive policing: safeguards for the choice of data and automated processing in the preventive context. In: Barona Vilar S (ed) Justicia Algorítmica y Neuroderecho: Una mirada multidisciplinaria. Tirant lo Blanch, Valencia, pp 361–387
Nussbaum MC (2001) Upheavals of thought: the intelligence of emotions. Cambridge University Press
O’Flaherty M (2020) Facial recognition technology and fundamental rights. European Data Protection Law Review 6(2):170–173
Ojanen T (2016) Making the essence of fundamental rights real: the court of justice of the European Union clarifies the structure of fundamental rights under the charter. Eur Const Law Rev 12(2):318–329
Papakonstantinou V, De Hert P (2021) EU lawmaking in the Artificial Intelligent Age: Act-ification, GDPR Mimesis, and Regulatory Brutality. European Law Blog. https://europeanlawblog.eu/2021/07/08/eu-lawmaking-in-the-artificial-intelligent-age-act-ification-gdpr-mimesis-and-regulatory-brutality/#more-7788. Accessed 8 July 2021
Peeters B (2020) Facial recognition at Brussels Airport: face down in the mud. CiTiP Blog. https://www.law.kuleuven.be/citip/blog/facial-recognition-at-brussels-airport-face-down-in-the-mud/. Accessed 11 July 2021
Prosser WL (1984, original work published in 1960) Privacy [A legal analysis]. In: Schoeman F (ed) Philosophical dimensions of privacy. An anthology. Cambridge University Press
Quattrocolo S (2019) Equità del processo penale e automated evidence alla luce della Convenzione europea dei diritti dell’uomo. Revista Ítalo-Española de Derecho Procesal 2:1–17
Raab T (2019) Germany. Video surveillance and face recognition: current developments. Eur Data Protect Law Rev 5(4):544–547
Rivers J (2006) Proportionality and variable intensity of review. Cambridge Law J 65(1):174–207
Rouvroy A, Poullet Y (2009) The right to informational self-determination and the value of self-developments: reassessing the importance of privacy for democracy. In: Gutwirth S, Poullet Y, De Hert P, de Terwangne C, Nouwt S (eds) Reinventing data protection? Springer, Dordrecht, pp 45–76
Sánchez-Monedero J, Dencik L (2020) The politics of deceptive borders: ‘biomarkers of deceit’ and the case of iBorderCtrl. Inf Commun Soc. https://doi.org/10.1080/1369118X.2020.1792530
Sayers D (2014) Article 48 (criminal law). In: Peers S, Hervey T, Kenner J, Ward A (eds) The Eu Charter of fundamental rights. A commentary. Bloomsbury, pp 1305–1350
Schabas WA (2017) The European convention of human rights. A commentary. Oxford University Press
Schoeman F (1984) Privacy. Philosophical dimensions of the literature. In: Schoeman F (ed) Philosophical dimensions of privacy. An anthology. Cambridge University Press
Schwartz O (2019) Don’t look now: why you should be worried about machines reading your emotions. The Guardian. https://www.theguardian.com/technology/2019/mar/06/facial-recognition-software-emotional-science. Accessed 3 July 2021
Science Daily (2017) Emotions are cognitive, not innate, researchers conclude. https://www.sciencedaily.com/releases/2017/02/170215121100.htm. Accessed 12 July 2021
Sedenberg E, Chuang J (2017) Smile for the Camera: Privacy and Policy Implications of Emotion AI. http://arxiv.org/abs/1709.00396. Accessed 3 July 2021
Thomas D (2018) The Cameras that Know if You’re Happy – or a Threat. BBC. https://www.bbc.com/news/business-44799239. Accessed 2 July 2021
Tzanou M (2017) Data protection as a fundamental right. In: Tzanou M (ed) The fundamental right to data protection: normative value in the context of counter-terrorism surveillance. Hart Publishing, Oxford
US Government Accountability Office (2013) Aviation Security: TSA Should Limit Future Funding for Behavior Detection Activities. https://www.gao.gov/products/gao-14-159. Accessed 3 July 2021
Veale M, Zuiderveen Borgesius F (2021) Demystifying the Draft EU Artificial Intelligence Act. https://osf.io/preprints/socarxiv/38p5f. Accessed 8 July 2021
van Brakel R, De Hert P (2011) Policing, surveillance and law in a pre-crime society: understanding the consequences of technology based strategies. Cahiers Politiestudies 20:163–192
Vervaele JAE (2005) Terrorism and information sharing between the intelligence and law enforcement communities in the US and the Netherlands: emergency criminal law. Utrecht Law Rev 1(1):1–27
Wiewiórowski W (2019) Facial recognition: A solution in search of a problem? European Data Protection Supervisor. edps.europa.eu/node/5551. Accessed 2 July 2021
Acknowledgement
This contribution was written in the framework of the European Union’s Horizon 2020 research and innovation programme under the Marie Sklodowska-Curie ITN EJD ‘Law, Science and Technology Rights of Internet of Everything’ grant agreement no. 814177.
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this chapter
Cite this chapter
Neroni Rezende, I. (2022). Facial Recognition for Preventive Purposes: The Human Rights Implications of Detecting Emotions in Public Spaces. In: Bachmaier Winter, L., Ruggeri, S. (eds) Investigating and Preventing Crime in the Digital Era. Legal Studies in International, European and Comparative Criminal Law, vol 7. Springer, Cham. https://doi.org/10.1007/978-3-031-13952-9_4
DOI: https://doi.org/10.1007/978-3-031-13952-9_4
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-13951-2
Online ISBN: 978-3-031-13952-9
eBook Packages: Law and Criminology; Law and Criminology (R0)