1 Introduction

Technological advances in image analysis and algorithmic processing have generated a significant change in the capability of facial recognition. Automated facial recognition (AFR) is a biometric technology that uses cameras to match live footage of individuals with images from a database [1]. As a computer-based security system, it helps to verify the identity of a person based on the individual’s biometric data. AFR is rapidly being deployed by private actors as well as law enforcement agencies, permeating nearly every aspect of society. Given its dual-use nature, AFR can serve beneficial or malicious purposes. The technology has sparked an intense debate on its potential impact on fundamental rights, of which the protection of privacy is at the core. Despite its widespread use, AFR operates in a legal grey area. Its risk-intensive processes expose the limits of current legal and ethical frameworks, posing a global challenge to alleviate the impending threats to privacy. This article explores potential legal and regulatory approaches in selected jurisdictions and addresses these challenges from multiple perspectives.
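To make the underlying mechanism concrete, the sketch below illustrates the matching step at the core of a typical AFR pipeline: faces are reduced to fixed-length numeric vectors (“embeddings”) and a live probe image is compared against a database of enrolled images under a similarity threshold. This is a minimal, hypothetical illustration; the embed function stands in for a trained face-embedding model, and no vendor’s actual system is depicted.

```python
import numpy as np

def embed(face_image: np.ndarray) -> np.ndarray:
    # Placeholder for a trained face-embedding model: real systems use a
    # learned network; all the matching logic needs is a fixed-length,
    # L2-normalised feature vector.
    vec = face_image.astype(float).ravel()[:128]
    return vec / (np.linalg.norm(vec) + 1e-9)

def identify(probe_image, gallery, threshold=0.6):
    # 1:N identification: compare the live probe against every enrolled
    # image and return the best candidate only if it clears the threshold.
    probe = embed(probe_image)
    scores = {name: float(np.dot(probe, embed(img)))  # cosine similarity
              for name, img in gallery.items()}
    best = max(scores, key=scores.get)
    return (best, scores[best]) if scores[best] >= threshold else (None, scores[best])
```

The threshold choice matters legally as well as technically: a lower threshold produces more matches, but also more false matches of innocent passers-by.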

This article proceeds as follows: Part I starts with a benefit–cost analysis and ascertains whether the use of AFR is viable or untenable in terms of privacy protection. Part II looks at privacy provisions under the General Data Protection Regulation (GDPR) and a ruling by the European Court of Human Rights (ECtHR) that exemplifies how privacy is therein protected. Part III discusses the latest U.S. statutory approaches to the AFR-driven challenges, with cases reflecting the laws’ application at both state and federal levels. Part IV compares the UK’s and China’s efforts in governing AFR-related practice, by the police and by a private entity respectively. It indicates that neither country is equipped with an adequate governance regime; arguably, there is little difference between their current approaches, despite the plausibly considerable divergences between the two legal systems. Part V draws on social licence theory, which concerns AFR’s social acceptability, and examines AFR’s role when privacy implications arise. Although a public–private partnership (PPP) could improve AFR’s performance, further measures, such as impact assessments, need to be embedded into practice by both public and private actors. Part VI puts forward a paradoxical game in surveillance and data collection and seeks to address ethical and legal challenges. This part challenges a long-standing assumption of moral and ethical supremacy and inferiority in the context of AFR deployment: amid intense competition for AI supremacy, divergences are narrowing and lines are blurring between the West and China. Part VII analyses the principles of necessity and proportionality to ensure that AFR systems meet relevant legal requirements, and offers proposals in furtherance of the debate on AFR’s global governance, with particular regard to its safeguard infrastructure and institutions. The concluding remarks affirm a momentous duty for both public and private actors to get the most out of AFR while still protecting privacy. Given the balance to be struck between privacy protection and crime deterrence, and the current lack of critical qualitative and quantitative data analysis, further research needs to be undertaken. The pressing inquiries can only be addressed from a multifaceted perspective.

2 AFR’s benefits and privacy challenges

As a dual-use technology, AFR has a wide range of benefits and potential privacy implications. Even well-intended uses can have serious social consequences [2]. One should take account of AFR’s potential costs while benefiting from the use of the tool. AFR, if used properly, can enhance law enforcement capabilities and protect public safety. However, its detrimental effects should not be ignored, since the intrusive technology could destroy people’s privacy rights and force them to change their behaviour.

2.1 Benefits

AFR has the potential to bring about enormous benefits, such as crime prevention and counterterrorism [3]. It can help deter terrorism and crime by analysing past offences to predict the chances that criminals will reoffend. The technology helps to create reliable evidence from video footage and to keep track of criminals and potential law-breakers, enabling law enforcement to react effectively [4, 5]. Given its “sense-enhancing” function, AFR enables enforcement agencies to do more than ordinary surveillance [5] and can aggregate and assess vast quantities of data that are beyond human capacity to analyse unaided [6]. Evaluations of 127 algorithms from 39 developers undertaken between 2014 and 2018 showed that “facial recognition software got twenty times better at searching a database to find a matching photograph” [7]; illustratively, a twentyfold improvement corresponds to a search failure rate falling from, say, 4% to 0.2% of searches. These algorithms can work more accurately than their human counterparts. Thus, AFR improves the efficiency of law enforcement and enhances a state’s national security. Private actors employ AFR for commercial profit as well. Even if these benefits sound appealing, there are many unexpected privacy concerns associated with the use of the tool [8].

2.2 Privacy: a cornerstone for the enjoyment of fundamental rights

Privacy is a core value inherent to a liberal democratic and pluralist society, and a cornerstone for the enjoyment of fundamental rights [9]. Given its high degree of intrusion into privacy, AFR is considered more concerning than other biometric techniques [10]. Once stored, data are difficult to delete completely, leading to so-called “data persistence” [11]. Privacy is challenged when anyone can run an online search that recognises a person across vast sets of facial data in real time [12]. First of all, privacy is not an abstract concept, but a contextual one. A reasonable expectation of privacy refers to the extent to which people can expect privacy in public spaces without being subjected to surveillance [11]. AFR deployment in a particular context may violate such reasonable expectations [13]. The threat of perpetual surveillance erodes fundamental rights, because there is a significant gap between AFR and the laws regulating its use [4]. Secondly, privacy is not an absolute but a qualified right, which inherently allows for permissible restrictions and protects only against arbitrary or unlawful interference [14]. Even the attainment of privacy can be subject to limitations; privacy values are modulated by circumstances, with AFR used circumstantially in both private and public spheres [15]. It is more sensible to explore whether AFR uses can be justified by the needs of the surrounding context; privacy might be less critical where other important values are at stake. However, any interference needs to be adequately justified [16] and cannot compromise the essential, inalienable core of the right [17]. AFR should operate in an adaptive manner to ensure that the digital world has places where law-abiding people can enjoy privacy [18].

2.3 Is an outright ban a panacea?

AFR presents far too serious a threat to privacy interests, and triggers many debates on the parameters of privacy [5]. For instance, the storage of facial measurements in code makes people’s facial identity easy to transpose. As such, the question arises whether the use of AFR should be banned until the right legal framework, along with privacy and security safeguards, is in place [19, 20]. Arguably, a blanket ban on AFR is not the answer to these concerns; it would deny consumers the convenience that AFR entails. As discussed above, AFR creates innovative benefits for society and should continue to be developed. The central concern is that the deployment of AFR needs to be adequately regulated to preserve privacy. Precision regulation could fill the gap where there is greater risk of societal harm [21]. However, it takes time to enact new laws and relevant guidance on trial protocols.

3 Provisions under the General Data Protection Regulation (GDPR)

EU data protection rules clearly cover the processing of biometric data. Under EU law, biometric data is defined as “personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images”.Footnote 1 Facial images constitute biometric data, as they can be used to identify individuals. The EU protects people from the threats of facial recognition by enforcing the General Data Protection Regulation (GDPR), which restricts the processing and sharing of biometric data without consent.

3.1 Applying the GDPR in the deployment of AFR

The GDPR generally forbids the processing of biometric data for uniquely identifying purposes unless one can rely on one of the 10 exemptions.Footnote 2 The law provides that the collection and processing of biometric data, including facial recognition data, is valid when “the data subject has given explicit consent to the processing of personal data”.Footnote 3 The national and EU legislators have the discretion to decide the cases where the use of this technology guarantees a proportionate and necessary interference with human rights.Footnote 4 The GDPR marks the beginning of resistance to untrammelled data collection: it requires that organisations collect and process data only with the clear and informed consent of individuals. Notably, the use of AFR by law enforcement agencies is subject not to the GDPR but to the Law Enforcement Directive (LED),Footnote 5 the most relevant instrument for policing, which establishes a comprehensive system of personal data protection. The LED specifically refers to facial images as ‘biometric data’ when used for biometric matching for the purposes of the unique identification of a natural person.Footnote 6 Pursuant to the European model, law enforcement may not engage in a particular investigative method until it has been fully authorised by statute [22].
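The default-prohibition structure of Article 9 can be pictured schematically. The sketch below is an illustrative model only, not legal advice: the listed grounds merely paraphrase two of the ten Article 9(2) exemptions, biometric processing is refused by default, and processing by a law enforcement authority is routed to the LED rather than the GDPR.

```python
from enum import Enum, auto

class Ground(Enum):
    # Two of the ten Article 9(2) GDPR grounds, paraphrased for illustration
    EXPLICIT_CONSENT = auto()             # Art. 9(2)(a)
    SUBSTANTIAL_PUBLIC_INTEREST = auto()  # Art. 9(2)(g)

def biometric_processing_status(grounds: set, law_enforcement: bool) -> str:
    if law_enforcement:
        # Police use is assessed under the Law Enforcement Directive instead
        return "assess under LED"
    if not grounds:
        return "prohibited"  # Art. 9(1): default ban on biometric processing
    return "permitted under exemption"

# e.g. biometric_processing_status({Ground.EXPLICIT_CONSENT}, False)
#      -> "permitted under exemption"
```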

3.2 Exemptions under the GDPR

The GDPR includes exemptions for the collection of biometric data, such as facial recognition data, by authorities, even though such information is considered “sensitive” and highly restricted in the hands of private companies. Regarding the applicability of Article 8 of the European Convention on Human Rights (ECHR) on the right to respect for private life, a government may interfere with these rights if the interference is sufficiently justified by legality and necessity. The legality requirement demands that personal data be processed only for specified purposes, which must be explicitly defined by law. Such purposes need to be “specified, explicit and legitimate” and transparently communicated to the data subject.Footnote 7 It is essential to examine exceptional circumstances from the perspectives of proportionality, necessity and the balancing of interests. A three-pronged test developed by the ECtHR requires that:

Any rights interference has to pursue a legitimate aim; be in accordance with the law, i.e. have an appropriate legal basis meeting qualitative requirements,Footnote 8 and be necessary in a democratic society.Footnote 9

These principles will be discussed in more detail in the seventh part of this article. In judicial practice, as legal challenges arise against the use of mass surveillance, courts are increasingly siding with people’s rights and limiting the use of privacy-invading policing tactics such as AFR surveillance [23]. The ECtHR found in Peck v. the United Kingdom that video surveillance of public places in which the visual data are recorded, stored and disclosed to the public falls within the scope of Article 8.Footnote 10 The ECtHR has also ruled that UK laws enabling mass surveillance violated human rights,Footnote 11 and more specifically the right to privacy protected by Article 8 of the ECHR.Footnote 12

As the most prominent legislative act related to AFR, the GDPR raises questions about the legality of new storage regimes and mechanisms for the transfer of biometric data. Many social challenges presented by AFR are not wholly addressed by the GDPR, largely because its underlying concepts and theoretical framework have been slow to evolve. The European Commission is planning to impose stricter limits on facial recognition usage to give EU citizens explicit rights over the use of their facial data [24]. Notably, the EU does not censor online content, nor does it grant law enforcement agencies access to personal data without a court order.

4 AFR-related laws in the U.S.

Law enforcement use of face recognition affects over 117 million American adults, who have been captured in a “virtual, perpetual lineup” [25]. Like the EU, the U.S. takes a similar stance against AFR-related violations of privacy, which is reflected not only in its statutes but also in some high-profile precedents. Although the U.S. has so far opted for minimal regulation, the statutory approaches have far-reaching implications given the enactment of AFR-specific laws at both state and federal levels.

4.1 Precedents related to emerging technologies and privacy

The Court has been concerned about systems of digital surveillance and their potential privacy-invading power [26]. In Katz v. United States, the Supreme Court adopted a two-part test to determine whether a person has a reasonable expectation of privacy, assessing:

(i) Whether the person exhibited an actual, subjective expectation of privacy

(ii) Whether that expectation is one that society recognises as reasonable.Footnote 13

The Katz test provides a framework for analysing Fourth Amendment issues. In United States v. Jones, Justice Sonia Sotomayor, concurring, observed that the court’s jurisprudence might not be adequate in “cases of electronic or other novel modes of surveillance that do not depend upon a physical invasion on property”.Footnote 14 Justice Alito highlighted the need for new legislation:

In circumstances involving dramatic technological change, the best solution to privacy concerns may be legislative … to balance privacy and public safety in a comprehensive way.Footnote 15

In Carpenter v. United States, the Supreme Court held that the government’s warrantless access to an extensive compilation of cell phone user data violated the Fourth Amendment.Footnote 16 However, the Court declined to address whether short-term, limited or real-time access raises the same concerns under the Fourth Amendment.Footnote 17 A fundamental question embedded in the above three cases is whether the surveillance system being used leads to a violation of a reasonable expectation of privacy.

There is a circuit split on the extent to which plaintiffs must show harm in order to bring privacy and data breach causes of action [27]. The Supreme Court in Spokeo v. Robins held that “a statutory violation is not in itself sufficient to create standing if the injuries are not concrete”.Footnote 18 As Ohm noted, the Court is concerned with systems of digital surveillance along analytical dimensions such as depth, scope and breadth [28]. In Facebook v. Patel, it was alleged that Facebook misled tens of millions of users about their ability to control facial recognition within their accounts.Footnote 19 The privacy-protective case helps frame the analysis, and the ruling could have shaped how courts view AFR; the Supreme Court, however, declined to hear the case. Although the Supreme Court’s quantitative and qualitative analysis in Riley v. California applies to the challenges of AFR surveillance,Footnote 20 there has so far been no developed case law or constitutional precedent upholding the police use of facial recognition without a warrant. The Court has not even decided whether facial recognition constitutes a search under the Fourth Amendment. Critics have argued that AFR may also implicate privacy interests grounded in the First Amendment.

4.2 Statutory approaches to addressing AFR’s implications for privacy

The California Consumer Privacy Act (CCPA 2019) is likely to set the legal hurdle high for businesses deploying the technology.Footnote 21 The latest legislation to limit AFR is the Body Camera Accountability Act.Footnote 22 The City of San Francisco has banned AFR because of its excessively intrusive nature and the potential for abuse by law enforcement agencies [29,30,31]. CCPA 2019 has a substantial impact on privacy rights and consumer protection and is sometimes considered a model for a federal data privacy law. On 14 March 2019, the Commercial Facial Recognition Privacy Act (CFRPA) was introduced by senators to offer legislative oversight of AFR’s commercial application [32]. CFRPA 2019 prohibits the use of AFR in the absence of affirmative consent from a data subject [33]. It seeks legal changes that would require companies to inform consumers before facial recognition data is acquired. The law sets general limits on what information businesses can collect from individuals and what can be done with it. These laws represent an important step toward protecting privacy. They strengthen consumer protections by prohibiting commercial users of AFR from collecting and re-sharing data for identifying or tracking consumers without their consent.

5 Legal challenges against the use of AFR in the UK and China

The use of AFR for policing purposes has in recent years emerged as an acute controversy, pitting privacy concerns against the protection of the public. AFR is routinely used not only by law enforcement agencies but also by private actors, yet there is no legislation specifically designed to regulate its use. Underlining the controversial nature of AFR, its use has brought legal challenges in both China and the UK. With AFR integrated into China’s rapidly expanding networks of surveillance, a claimant accused a wildlife park of compulsorily collecting visitors’ biometric data via facial recognition. The UK is also one of the most surveilled countries in the world [34]; there, a claimant challenged the police on the grounds that the use of AFR breaches the right to privacy and data protection laws. Both cases call the legal basis of AFR use into question.

5.1 R (Bridges) v. CCSWP and SSHD

AFR has the potential to become an epidemic of intrusiveness [35]. The technology presents obvious questions over whether police are violating citizens’ privacy protections. A challenge to the South Wales Police’s use of AFR on the basis of data protection and human rights infringements was unsuccessful before the High Court of England and Wales, although the decision is being appealedFootnote 23 [36]. At the heart of this case lies a dispute about the privacy and data protection implications of AFR. The tool involves the processing of sensitive personal data, which requires users to comply with the Data Protection Act (DPA 2018). It remains uncertain whether the trials demonstrate full compliance with the law. In their judgment, Lord Justice Haddon-Cave and Mr Justice Swift found that AFR did interfere with the right to private life under Article 8 of the ECHR. Nevertheless, Mr Justice Swift held that:

… South Wales police’s use to date of AFR has been consistent with the requirements of the Human Rights Act and the data protection legislation.Footnote 24

The court agreed that although AFR amounted to an interference with privacy rights, there was a lawful basis for it and the legal framework used by the police was proportionate. The decision was considered a green light for widespread deployment of AFR as a crime-fighting panacea [37].

The primary arguments before the Court of Appeal on 23 June 2020 were whether the Divisional Court erred in its analysis of the application of Article 8(2) and whether the legal framework governing the use of AFR has the requisite quality.Footnote 25 The judgment remains reserved. The case is significant not just for the ongoing use of AFR but for its future governance, and it has profound implications for the way that society is policed. It is noteworthy that the judgment itself considers solely the use of AFR by the police rather than by any other public or private bodies [38].

The current legislative and regulatory framework on AFR use is insufficient. Real-time facial recognition cameras are biometric checkpoints, identifying members of the public without their knowledge [39]. There is neither a regulatory framework limiting AFR’s law enforcement applications, nor legislation regulating its use by private actors for commercial purposes. The legal framework that currently applies to the use of AFR by law enforcement agencies and private actors does not ensure that those rights are sufficiently protected. Clearly, AFR has profound consequences for privacy and data protection rights. The lack of legislation surrounding the use of AFR has called into question the legal basis of the trials [40].

In RMC and FJ v. MPS, the court found that the “indefinite retention of the claimant’s [custody photographs] was an unjustified interference with their rights” under Article 8 of the ECHR.Footnote 26 The High Court ruling indicates that retaining the custody images of unconvicted people amounted to a breach of human rights, the court observing that “the algorithms of the law must keep pace with new and emerging technologies”.Footnote 27 The Bridges litigation is the first case of its kind anywhere in the world and will likely influence the approach taken by jurists in this developing area of law across jurisdictions [41]. As Ruhrmann observed:

Even in mature democracies with a strong commitment to protecting civil liberties, establishing policy safeguards for the use of AFR in law enforcement in accordance with human rights remains challenging [42].

Due to the absence of a legal basis and the risks inherent in AFR, it is vital to create a framework within which state agencies can work to ensure security and privacy.

It remains unanswered whether there should be a specific legal framework for the police and other actors to routinely deploy AFR, although the court held that the current national legal regime is adequate to ensure the appropriate use of AFR.Footnote 28 The DPA 2018 is the primary UK legislation controlling how personal data is used by the public and private sectors and contains extensive regulation of the processing and control of data [43]. Police deploying AFR must comply with the DPA 2018 and the Surveillance Camera Code of Practice [44]. Relevant to the retention of images for comparison against faces viewed through AFR is the DPA 2018’s classification of “custody images” as personal data [48]. In addition, the Criminal Justice and Public Order Act (CJPOA 1994) confers on police the power to require the removal of facial coverings in England and Wales if they believe the coverings are being worn to conceal identity and that incidents involving violence may take place in the locality.Footnote 29 Furthermore, the Protection of Freedoms Act 2012 applies only indirectly to the use of AFR, by mandating a code of practice for surveillance camera systems [45]. However, these legal regimes do not provide guidelines or rules specifically regulating the use of AFR by the police [46].

The UK government has not yet passed new regulations in this specific arena, and the regulatory framework gives little indication or guidance about the proper threshold at which AFR can lawfully be used. Law normally lags behind technology, and AFR currently exists in a regulatory vacuum, so at present it is being used in ways that undermine public confidence in the systems [47]. Private entities can start using it without declaring the move publicly or notifying authorities [48]. Given the lack of a regime regulating the use of AFR and biometric tracking capabilities, legislation needs to be brought forward to govern current and future biometric technologies [49]. A statutory code of practice is needed to govern how the police should use this technology.

5.2 Bing Guo v. Hangzhou Safari Park (Hangzhou Fuyang District Court, China, 2019)

In a landmark ruling, a Chinese court held that it was illegal for an entity to collect consumers’ facial biometric data without their consent. It is the first case to challenge the commercial use of AFR and was brought to the Hangzhou Fuyang District People’s Court in October 2019. Bing Guo sued Hangzhou Safari Park after being required by the Park to scan his face to gain entrance. Guo claimed that the biometric system infringes consumers’ privacy rights and jeopardises consumers’ safety if abused. He further alleged that the Park had violated China’s consumer protection law. On 20 November 2020, the Hangzhou court handed down its judgment, holding that the defendant had breached the principle of necessity and had altered the contract unilaterally [50]. The defendant was ordered to compensate Guo RMB 1,038 (£118). Nevertheless, although the ruling was made in Guo’s favour on the basis that the Park’s behaviour was a breach of contract, the court did not confirm explicitly whether the defendant had infringed privacy [51]. Nor did the court order the defendant to delete all his biometric information, i.e. facial and fingerprint data. The implication is that an entity should gain consent ex ante and comply with the principles of “legality, legitimacy and necessity” when collecting personal information. The decision could help rein in what has been a Wild West of mass data collection for commercial purposes [52].

5.2.1 Increasing awareness of privacy protection

The Guo case has triggered a growing debate about privacy and the abuse of personal data in an increasingly digitised society. Despite the nominal compensation, the high-profile case upended the notion that Chinese people do not care about privacy. As Chinese people become increasingly aware of their privacy rights, the concept of data privacy is gaining ground in China, and people are increasingly concerned about how their data is collected and used via AFR. In a recent survey, a main concern of respondents was that their biometric data might be unduly leaked: 80% of respondents said they were concerned that AFR system operators had lax safeguard measures [53], and 74% said they would prefer traditional identification (ID) methods over AFR for verifying their identity [54]. However, the survey also suggested that around 60–70% of Chinese citizens believed AFR made public places safer, indicating a broad public willingness to surrender privacy in exchange for the safety and convenience that AFR entails [55]. In comparison, approximately 65% of UK interviewees expressed discomfort about police use of AFR with regard to privacy, surveillance, consent and ethics [15]. Despite the intense debate over the deployment of AFR in security and law enforcement applications, the surveys indicate little difference in privacy awareness between the two countries’ citizens: people in both countries seek to strike a balance between increased security and interference with their privacy [9]. The survey indicates growing pushback against AFR in China, although it is far from clear whether this rising discomfort will give way to policy changes [56]. The fundamental solutions will have to come from within.

5.2.2 Regulations

The Ministry of Science and Technology (MST) issued the Governance Principles for a New Generation of Artificial Intelligence (AI): Develop Responsible AI on 17 June 2019, which encompass a principle of respect for privacy.Footnote 30 In June 2019, the Office of the Cyberspace Administration (OCA), China’s highest administrative internet regulator, issued the “Data Protection Regulatory Guideline”, which highlights the protection of personal biometric information. It provides directions on how to collect and use customer data, effectively setting personal data protection standards in China [57]. As China’s first major digital privacy guideline, the Personal Information Security Specification (PISS 2018) took effect on 1 May 2018. It lays out guidelines for consent and for how personal data should be collected, used and shared.Footnote 31 Although the Cyber Security Law (CSL 2017) is considered the most authoritative law protecting personal information, the PISS 2018 is the effective centrepiece of an emerging system around personal data [58]. Its 2020 version strengthens privacy protection, revising the “exceptions to soliciting consent”Footnote 32 and adding refinements for personal biometric information.Footnote 33 In particular, PISS 2020 provides that “Personal biometric information is a type of Personal Sensitive Information and includes facial recognition features, which may not be shared or transferred in principle.”Footnote 34 Despite lacking penalties and legally binding effect, the PISS serves as an important reference for enforcement agencies.

5.2.3 Existing legal framework

There is a fragmented legal and regulatory landscape of privacy and data protection laws. Under Article 253 of the Criminal Law of the Peopleʼs Republic of China (PRC), illegally selling or providing citizens’ personal information is subject to criminal penalties.Footnote 35 The Supreme People’s Court (SPC) and the Supreme People’s Procuratorate (SPP) jointly issued the Judicial Interpretation on Several Issues Concerning the Application of Law in Criminal Cases of Infringing on Personal Information.Footnote 36 Being legally binding, the Judicial Interpretation specifies criminal penalties for misusing citizens’ personal information. Even where personal information has been legally collected, it cannot be provided to a third party without the consent of the individuals concerned; doing so would be criminally actionable under the PRC Criminal Law.Footnote 37 The General Provisions of the Civil Law (GPCL) of China stipulate that “natural persons’ personal information shall be protected by law”.Footnote 38 Taking effect on 1 June 2017, the China Cyber Security Law (CSL 2017) bans online service providers from collecting and selling citizens’ personal information without consent.Footnote 39 The law imposes legal obligations on operators by stating the requirements for the collection, use and protection of personally identifiable information (PII), which includes biometric data [59]. The consent requirement is echoed in China’s Consumer Protection Law (CPL 2019), which provides that consumers’ personal information can only be collected for legitimate purposes with consent.Footnote 40 As to the legal basis in Guo, the defendant’s use of AFR had not gained the plaintiff’s consent, let alone been accompanied by proper safeguards. China’s Personal Data Protection Law (PDPL 2020) leads to a more comprehensive framework for individual data rights and protection, of which Article 16 highlights compulsory consent. More importantly, PDPL 2020 has a strong focus on biometric data protection to curb facial recognition abuses.Footnote 41 The latest PRC Civil Code 2020 provides that an individual’s biometric data is protected.Footnote 42 A victim could refer to this provision for remedies, despite the lack of AFR-specific provisions. As such, Guo could have sued the Park for the unauthorised use of his biometric data,Footnote 43 in lieu of claiming breach of contract. Notably, there are currently neither laws governing the specific use of AFR nor overarching principles of data protection set at the national level. These legal and regulatory efforts mark a major step in China’s tentative progress towards protecting Chinese citizensʼ personal data, although the law continues to evolve in this area.

5.3 Far-reaching implications in both the UK and China

The cases have been the first of their kind in China and the UK respectively, amid increasing concerns over the indiscriminate use of AFR, due largely to the development of AFR outpacing legal safeguards. They are bound to have wide-reaching implications for the use of AFR by businesses as well as law enforcement agencies, and they have triggered a heated debate on the legitimacy and morality of adopting AFR [60]. Chinese people in general are far less suspicious of AFR and consider it a positive source of convenience. Despite growing attention to the issue, China’s personal data protection system is still made up of a patchwork of laws and standards in which users of AFR lack clear guidance [61]. Relevant laws and regulations are scattered and unsystematic, and can barely provide effective or substantial legal protection for privacy. Legislators are generally not good at predicting future problems. It is a system short of the adequate checks, balances and disclosure requirements that are routinely part of western European or U.S. surveillance frameworks [62]. Nevertheless, there is a global rise in authoritarianism, and even in countries with strong rule-of-law traditions AFR gives rise to legal and ethical challenges. In Bridges, police and intelligence agencies were using the same surveillance tools to solve and deter crimes and prevent terrorism. Notably, the judgment does not relate to AFR use by the private sector.

Any rules should be based on “notice and consent” when AFR is used to verify someone’s identity. Problematically, actors in both countries enrol images without the data subject’s active consent: neither the law enforcement agencies in the UK nor the Chinese private actors obtained the plaintiffs’ consent before deploying AFR. In this vein, the two countries share much in common. Whether used by governments or private entities, AFR appears to be developing faster than the law and the government’s ability to ensure its responsible use [63]. The proper provision and regulation of biometrics is key to ensuring that the criminal justice system functions effectively; otherwise, AFR could result in miscarriages of justice [20]. The regulatory lacuna surrounding the use of AFR has called into question the legal basis of the trials [20]. Neither country has a specific law to protect citizens’ biometrics, which highlights the lack of a safeguards system. Legislators must keep pace so that human rights are properly protected. Given the few regulations surrounding law enforcement’s use of AFR, legislation should require that public agencies rigorously review biometric technologies for privacy concerns [64].

6 Public enforcement authorities vis-à-vis private actors in deploying AFR: a theoretical analysis of social licence

AFR is being used in public spaces, not only by law enforcement agencies but also, increasingly, by the private sector. The surveillance leads to chilling effects on social interactions [65], and the lack of privacy protection has a negative influence on society. AFR redefines the architecture of the social world, which makes it necessary to ensure respect for privacy in an evolving socio-technical system [42]. A fundamental issue is whether AFR is socially preferable, even though its utilisation increases public safety and benefits law enforcement.

6.1 Social licence and ethical commitment

AFR innovations need to be imbued with public values before being integrated into public life [66]. There is some justification for AFR to be used by law enforcement agencies in public spaces, and the public would accept the technology in circumstances where there are adequate safeguards in place as well as clear public interests. On the societal acceptability of, and public attitudes to, AFR, a Pew survey found that 56% of Americans trust law enforcement agencies to use AFR responsibly, and a 59% majority of U.S. adults think it acceptable for them to use AFR to assess potential security threats in public spaces [67]. Some level of public surveillance does not pose a challenge, given that the functionality falls squarely within citizens’ reasonable expectation of privacy.Footnote 44 As such, surveillance of this kind does not run afoul of international privacy rules [68]. In contrast, people are less trusting of private actors.

Private companies are spearheading the rollout of the controversial technology, which raises concerns about the commercialisation of private data. The use of AFR is experiencing a commercial race to the bottom, with tech companies forced to choose between social responsibility and market success [69]. Private entities should get access to facial verification only if they can demonstrate that its use is “strictly necessary and proportionate” and has a clear legal basis; otherwise, their use of AFR would be unlikely to withstand a legal challenge. Given that protecting privacy is a widely shared social preference [70], rigorous assessments of AFR use should be undertaken in the context of public–private partnership (PPP).

6.2 Private actors’ profit maximisation, criticism and assessment under public–private partnership (PPP)

Non-law-enforcement AFR uses also raise controversial questions [26], given their rapid development for commercial applications. This rapid expansion raises unprecedented concerns about the nature of privacy and surveillance. Private actors’ direct access to increasingly large quantities of data may result in the amplification of harms, as they are capable of tremendous sophistication in analysis and decision-making [5]. AFR use exposes consumers to the risk of their private information being shared with unintended recipients through data breaches, with profound consequences for privacy and data protection rights.

6.2.1 Public and private partnership (PPP) in assessing implications for privacy

AFR should be deployed only after an adequate evaluation of its purpose, benefits and risks [71]. When using the technology for security and surveillance purposes, public actors typically rely on private companies to procure and deploy AFR. The former need to obtain all necessary information from the latter, and PPPs that combine government and business data sets plausibly help improve system performance [5]. However, the sharing of surveillance material not only increases the potential for privacy harms but also blurs the public–private boundary, given the differentiated rationales behind each side’s access [72]. It is important to design tailored impact assessment methods that evaluate all affected rights in a comprehensive manner, which is conducive to ensuring a fundamental rights-compliant application of AFR [9]. A fully fledged analysis and assessment of this kind, as sketched below, needs to be put in place from the outset of AFR use.
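One way to operationalise such an assessment is to make deployment conditional on a documented review of each affected right, alongside explicit necessity and proportionality justifications. The following sketch is purely illustrative; the field names are assumptions rather than any official impact assessment template.

```python
from dataclasses import dataclass, field

@dataclass
class RightsImpactAssessment:
    # Illustrative structure; a real assessment would follow an official template
    purpose: str
    rights_reviewed: dict = field(default_factory=dict)  # right -> mitigation notes
    necessity_justified: bool = False
    proportionality_justified: bool = False

    def deployment_permitted(self) -> bool:
        # AFR goes live only after a comprehensive, documented review
        required = {"privacy", "data protection", "non-discrimination"}
        return (self.necessity_justified
                and self.proportionality_justified
                and required <= self.rights_reviewed.keys())
```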

6.2.2 Profit maximisation vis-à-vis privacy protection

The forces of capitalism continue to drive toward greater profit despite the social implications of AFR. Businesses harness AI capabilities to improve analytic processing [39]. AFR-driven data analysis can provide valuable consumer behavioural insights, supporting evidence-based decision-making [73], and gives data users a competitive advantage in predicting consumers’ actions [75]. Once data is retained, it can readily be repurposed for profit [74]. The market for facial recognition is growing, with large investments of up to $1.6 billion in start-ups from China [75]. Zuboff describes this process as “surveillance capitalism”, in which data extraction greatly diminishes the information costs of corporate actors, redistributing privacy rights away from consumers and towards corporations [76]. Companies attempt to exploit the perks of AFR for commercial purposes, such as Alibaba trying to make “smile to pay” happen [77]. The asymmetric power over information between private AFR users and data subjects increases the potential for abuse: out of a profit-maximising motive, companies can effectively manipulate customers based on their facial expressions. There is little justification and low public approval for AFR systems operated by private actors; only around 9% of UK residents approved of specific uses of AFR with appropriate safeguards [15].

6.2.3 Nurturing resilience in response to the emerging challenges

Companies are supposed to abide by the principles of legality, legitimacy and necessity. When collecting and processing personal information, they should clearly indicate the purpose, method and scope, and obtain the consent of the data subjects in advance [78]. In June 2019, Microsoft deleted a massive online data set containing more than 10 million images of 100,000 individuals that had been used to train other companies’ AFR systems [79]; ultimately, the company lacked a legal basis to process the facial data under Article 9 of the GDPR. Facebook paid US$5 billion to settle Federal Trade Commission (FTC) charges that the company violated a 2012 FTC order by deceiving users about their ability to control the privacy of their personal information [80]. The unprecedented penalty should have the strongest possible deterrent effect, changing Facebook’s privacy culture and decreasing the likelihood of continued violations.

The above cases demonstrate how the value of privacy has been upheld both through public enforcement and through Microsoft’s self-remedial measure. This should facilitate a “built-in” transformation: fundamental rights considerations need to be built into technical specifications, deployments and even contracts [9]. For instance, the EU Public Procurement Directive (2014/24/EU) strengthened EU Member States’ commitment to socially responsible public procurement when purchasing a product or a service. Likewise, the EU could apply a similar approach when procuring the technology or commissioning innovative AFR-oriented research. A clear framework is needed to regulate how AFR can be used in both the public and private spheres, and AFR-specific legislation is of great significance to promoting human rights in the context of this emerging technology. It is imperative to put an effective set of laws and guidelines into place as the use of AFR proliferates. From the above remarks, it is apparent that the legislative and judicial branches have been adapting the law to AFR to ensure that the proper balance is maintained between security and privacy [81]. However, the legal landscape is far from settled: the current framework does not keep pace with emerging technologies, and law is slow to protect personal privacy and individual liberty in the face of rapidly accelerating technologies of social control [26].

7 A paradoxical game in surveillance and data collection

The scope of surveillance capacities continues to grow [82]. Not all systems focus on database matching; some assess aggregate demographic trends or conduct broader sentiment analysis via facial recognition crowd scanning [39]. AFR can identify a person in a crowd in real time as well as track their movements, detect emotions and predict behaviour. A long-standing debate remains unaddressed as to whether the West can still claim superiority in this scenario, which makes it important to ascertain the divergences between the two approaches.

7.1 A paradoxical debate in ethics and morality

AFR invokes substantial criticism in terms of the ethics and legality of its application. Questions about the role of law and ethics in governing AFR are more relevant than ever [83]. Turning the human face into another object for measurement and categorisation by companies touches the right to human dignity [84]. Given users’ strong moral and ethical obligation to ensure effective protection of privacy, the benefits of AFR must be weighed against the possible adverse effects on subjects’ privacy [85]. The implementation of AFR should ensure that its risks are not disproportionately borne by, nor its benefits disproportionately flowing to, a particular group [86]. In social psychology, moral realism casts a shadow that inhibits trade-offs and the achievement of compromise [87]. It is hardly justified to convert the question into a debate on whether economic well-being matters more or less than human rights such as privacy [88]; in making that delicate conversion, it is necessary to differentiate the dimensions along which the two are evaluated.

AFR raises a host of ethical and legal questions about privacy and surveillance. A classic question is whether the benefits outweigh the intrusion into people’s privacy, or more specifically, whether they are worth sacrificing privacy and civil liberties for. The proposition per se is questionable, given that it rests on an incomplete hypothesis: some fundamental variables, such as diverse values and traditions, must not be left out. It is often difficult to harmonise diverse values when society develops public policy or weighs the costs and benefits of choices [89]. It has even been argued that AFR should be banned outright; as Axon’s independent ethics board stated, “face recognition technology is not currently reliable enough to ethically justify its use” [90]. A consensus can hardly be established on ethical behaviour, particularly between competing sets of ethical and policy priorities, and different countries are developing different regulatory regimes. Ethical principles can be used to inform deployments and frame policymaking. The absence of a binding code and national guidelines gives rise to inconsistent approaches, and this absence of framing has led to a widespread culture of disregard of the law and put privacy in danger [91]. As such, values, ethics and human rights should be embedded in those entities that profit from marketing surveillance capabilities [92]. However, building ethical AFR is an enormously complex undertaking.

7.2 Moral superiority vis-à-vis moral inferiority in the context of AFR

A key question is whether there are emerging threats of AFR-driven authoritarianism against democratic values, or whether there is no substantive difference in the deployment of AFR between the two [93]. As discussed above, AFR is being used in both the UK and China roughly in the same way, playing nearly the same roles in both public and private sectors. The classic debate is whether the UK or the West could claim moral supremacy over China in surveillance scenarios.

7.2.1 The driving force of digital capitalism and security

Data is the fuel that drives the AI engine [5], and opening access to data will help generate insights that transform the economy [94]. How a jurisdiction approaches surveillance largely determines whether its industry, or even the state itself, gains a competitive advantage in the current fierce digital competition [95]. The digital rights model bears directly on corporate losses and gains: limitations on data processing and resale curb corporate profits [96]. The tougher the privacy laws in the West, the less data there is, yet data constitutes the indispensable raw material of AI. As West and Allen noted:

Almost all the data are proprietary in nature and not shared very broadly with the research community, and this limits innovation and system design [5].

By taking a restrictive stance on issues of data collection and analysis, the West, particularly the EU, is putting its manufacturers and software designers at a significant disadvantage to the rest of the world [5]. Entrepreneurship and innovation are harmed, given that individuals will avoid “experimenting with new, controversial, or deviant ideas” [97]. It makes more sense to think about the broad objectives desired in AI and enact policies that advance them.

The Chinese perception of privacy differs from that of the West: Chinese people’s privacy rights are less robust than those of their Western counterparts, and China’s cultural attitudes towards data and privacy norms are strikingly looser than those in the West [98]. In China, companies already have “considerable resources and access to voices, faces and other biometric data in vast quantities, which would help them develop their technologies” [99]. As a world-leading AI-powered surveillance state, China has embraced AFR, using it to implement a national surveillance system. The AFR-based surveillance capabilities have been on full display, helping to advance China’s architecture for social control [93]. Surveillance is becoming pervasive, and algorithms score Chinese citizens on their behaviour. China has led the way in developing and deploying AFR, and has set up the world’s most sophisticated surveillance state. AFR could be used to prosecute minor crimes such as jaywalking or littering, and even allow the creation of a full “social credit” system of government surveillance [100]. The issue has been heightened by the growing use of the technology in China as part of a compulsory National Social Ratings and Surveillance Scheme [101]. As such, China maintains a permissive environment for companies so as not to hinder the development of AI.

China has been making a sustained effort towards leadership and primacy [102]. As part of its push to advance its high-tech strategies, the country has been pushing forward its Next Generation Artificial Intelligence Initiative [103]. The State Council, China’s highest administrative organ, launched the Next Generation Artificial Intelligence Development Plan in mid-2017, outlining a national ambition to become a leading AI power by 2030 [104]. Problematically, algorithms can identify faces, but they do so in ways that threaten privacy. The call for privacy could become a major challenge to China’s internet titans, and eventually to the cyber-authoritarian aspirations of the Chinese government itself [56]. For the sake of substantial economic, social and strategic benefits, Chinese companies have been exporting AFR-related products to like-minded governments in order to spread influence and promote an alternative governance model [105, 106]. As such, elements of China’s model of surveillance inspire other autocracies [62], and some critics have accused China and the surveillance companies of “exporting authoritarianism” via the technology [107].

As a polarising topic, AFR is sometimes seen as a problematic development in surveillance capitalism [108]. Despite the impact on privacy, shaping standards gives Chinese companies an advantage in breaking into potential new markets [109], representing a smart short-cut for China to leverage both global governance and business in AFR. China accounted for nearly half of the global facial recognition business in 2018 [109]. The market for facial recognition has grown 20% annually over the past 3 years and will be worth $9bn by 2022 [110]. While domestic concerns grow, China’s facial recognition companies are leading the global market for public surveillance systems [111]. When there is a fundamental shift in the underlying balance of power, people’s stances unconsciously fall into the trajectory of the Thucydidean rivalry between a rising China and the currently dominant West [112]. In the guise of ideological confrontation, conflicts will inevitably take place when a rising power threatens to displace a ruling one [113, 114].

7.3 Are there any hypothetical, ideological or substantive differences?

To pursue security, the West may have to collect data in order to safeguard itself from cyber-attacks. France is the first EU country to use a nationwide facial recognition ID app [115]. Murgia and Yang argued that the question of moral equivalence between China’s authoritarianism and Western values is not one of legality, but of morals and ethics [109]. However, in response to the challenges posed by AFR, a perception of China’s moral inferiority is arguably a pseudo-proposition. It is worth referring to the famous Tacitus Trap, which encourages an absolutist moral stance: ‘when a government loses credibility, whether it tells the truth or a lie, to do good or bad, will be considered a lie or to do bad’.Footnote 45 As discussed in Bridges, the development and application of AFR by some police forces encapsulates a number of the problems that have arisen from the lack of a clear legislative framework for the technology [20]. Both the UK and China are using increasingly sophisticated technology in their pervasive surveillance systems; in this scenario, the two countries are morally equivalent. It is simply difficult for the West to give up the moral superiority that underlies so much Western rhetoric about China [116]. One may argue for the UK’s moral superiority because of its robust rights of privacy and free expression. However, Professor Paul Wiles, the UK’s Biometrics Commissioner, notes that “the technology is being rolled out in a ‘chaotic’ fashion in the absence of any clear laws” [117]. Despite the fact that China’s public and private sectors have been aggressively using AFR,Footnote 46 the above argument also applies squarely to Western private entities. Companies based in liberal democracies are actively selling sophisticated surveillance equipment [118]. To provide more in-depth demographics, Intel and Tencent collaborate to “gain new insights about their customers to both elevate the users’ experience and drive business transformation” [119]. Along with Huawei, Google, BAE, NEC, Amazon and Alibaba are all involved in helping Saudi Arabia build AFR as well as other mass surveillance systems [120]. Some commentators have observed:

While debate over the use of facial recognition in the EU and the U.S. is focused on the privacy threat of governments or companies identifying and tracking people, the debate in China is often framed around the threat of leaks to third parties, rather than abuses by the operators themselves [107].

In this regard, moral and ethical issues have been complicated largely by the convergence of Western and Chinese actors in deploying AFR. The data collector must ensure that “appropriate safeguards” and “enforceable data subject rights and effective legal remedies are available”.Footnote 47 To meet these criteria, an assessment should be based on a review of a country’s privacy laws as well as on its record on “the rule of law, respect for human rights and fundamental freedoms” [121, 122]. It is vital to place procedural safeguards on AFR, such as requiring warrants and limiting the duration of surveillance, to alleviate concerns over security and privacy while encouraging innovation [123].

7.3.1 Standard: at stake is who will reshape future rules

China’s individual data rights framework has profound global implications [124]. China increasingly seeks to influence internet governance and the information ecosystem [125], and the Chinese approach to data governance will play an important role in shaping global markets, technology development and policy. Its efforts to pioneer standards are a reflection of how Chinese companies are seeking to supply surveillance technology across the world [126]. The AI Global Surveillance Index (AGSI) identifies at least 64 countries that have been actively incorporating AFR in their surveillance programs [39]. There is a first-mover advantage for whoever writes the new rules for the digital economy, and such an advantage in setting standards and rules can give a powerful edge to companies and businesses [109]. Dominant in the global facial recognition market, Chinese companies have made every submission to the UN for international standards on surveillance technology in the past 3 years [126]. The UN’s International Telecommunication Union (ITU), which establishes common global specifications for technology, has received 20 standards proposals from Chinese companies, including Huawei, since 2016. Many of the submitted standards have already been approved, even as concerns rise about how Chinese companies are gaining access to the personal data of individuals around the world [127].

8 Embed principles and rules into the AFR governance regime

AFR use has been operating in a legal vacuum [128], and the current legal landscape is fragmented: neither a specific legal framework nor overarching principles of data protection have been set at the global level to govern the deployment of AFR. This inadequate legal framework negatively affects the foreseeability and accessibility of AFR policy. To fill the vacuum and develop an effective and cohesive future policy strategy, a proper governance framework is needed, one fit for these emerging technologies and able to balance policing effectiveness against privacy. Where the risk of interference with fundamental rights is higher, the necessity and proportionality test must be stricter [9], and the fundamental rights implications of using AFR vary considerably depending on the purpose, context and scope of the use [9]. It is essential to embed principles that properly protect privacy while making efficient use of AFR. At stake in procedural control is the attachment of adequate checks and balances to the processing of AFR-driven data.

8.1 Enhance global governance by creating rules with teeth

The line between permissible and impermissible levels of surveillance is always blurred, which makes it imperative to protect the public interest through stringent regulation of AFR.

Meanwhile, providers must be accountable for ensuring that they do not facilitate human rights abuses. International law affirms its commitment to protecting privacy as a fundamental right. The Universal Declaration of Human Rights (UDHR) states that “no one shall be subjected to arbitrary interference with his privacy”.Footnote 48 The right to privacy is also recognised in the International Covenant on Civil and Political Rights (ICCPR),Footnote 49 which provides that “[n]o one shall be subjected to arbitrary or unlawful interference with his privacy, family, home or correspondence”.Footnote 50 Although China is not a party to the Covenant, it is worth examining whether such pervasive surveillance implicates the ICCPR. The ICCPR privacy provision may not have enough teeth to offer a shield against Chinese-level facial surveillance.

Nevertheless, there are no uniform standards for data access, data sharing or data protection [5]. The use of AFR is subject to insufficient regulation specifically designed to protect human rights against the new challenges posed by emerging technology [41]. No international rules require law enforcement or companies to notify people that an AFR system is in operation, and few rules govern access to and use of image databases. For instance, it remains unclear whether misuse of police data should be a criminal offence for which people are punished [129]. The absence of a comprehensive global governance framework to oversee AFR deployment has considerable implications for the protection of privacy. These substantial uncertainties and paramount challenges should prompt a global dialogue and the formation of a common practice on this critical issue, averting a potentially fractured global legal landscape [57].

There are clear advantages to having open norm-setting venues that address AFR governance. AFR-specific regulation is necessary to account for the unique risks the technology poses for human rights [42]. Legislators should pass laws to regulate both public enforcement use and private deployment of facial recognition. Given the novelty of the technology and the lack of safeguard measures, global regulations must be created to avoid violations of privacy in the digital age [130]. There is considerable need for a normative framework for AFR to help determine whether a specific deployment is human rights compliant [131]. The algorithms of the law must keep pace with new and emerging technologies.Footnote 51 Global safeguards and norms need to be instituted to shape how public and private actors use AFR [132]. In view of this governance vacuum and institutional void, it is worth exploring how to regulate the controversial use of AFR by reference to some well-established principles.

8.2 The test of necessity and the balance of interests

The use of extensive surveillance powers is sometimes abusive. Investigative AFR presents a unique analytical problem and requires a sophisticated balancing of interests [26]. The use of AFR is permissible only when it is employed in the public interest. The proper balance between privacy and security has long been debated in public discourse as well as in judicial arenas [133]. This might be straightforward in certain scenarios where there is a public interest in being able to identify those engaged in criminal activity [134]. Nevertheless, an objective of general interest, such as crime prevention or public security, is not per se sufficient to justify an interference [135]. Any interference with a right needs to be examined as to whether the legitimate aim could be achieved by other means that interfere less with the guaranteed right [136]. The GDPR introduces a data minimisation principle whereby the collection of personal data must be “limited to what is necessary in relation to the purposes for which they are processed”.Footnote 52
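To make the data minimisation principle concrete, the sketch below models, in Python, how a hypothetical AFR pipeline might retain only what a stated law-enforcement purpose requires: a match record without the raw image or biometric template, deleted once an assumed retention period lapses. Every name, parameter and threshold here is an illustrative assumption, not any real system’s API.

```python
# Minimal illustrative sketch of GDPR-style data minimisation in a
# hypothetical AFR pipeline. All names and the retention period are
# invented assumptions for illustration only.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Dict, List, Optional

@dataclass
class MatchRecord:
    subject_id: str       # watchlist identifier of the matched person
    matched_at: datetime  # when the match occurred
    camera_id: str        # where the match occurred
    # Deliberately omitted: the raw frame and the biometric template.
    # Only the minimum needed for the stated purpose is retained.

RETENTION_PERIOD = timedelta(days=31)  # assumed purpose-bound retention limit

def find_match(template: bytes, watchlist: Dict[bytes, str]) -> Optional[str]:
    """Placeholder matcher: returns a watchlist ID or None."""
    return watchlist.get(template)

def process_frame(template: bytes, camera_id: str,
                  watchlist: Dict[bytes, str],
                  store: List[MatchRecord]) -> None:
    """Record a match, keeping only what is necessary for the purpose."""
    subject_id = find_match(template, watchlist)
    if subject_id is not None:
        store.append(MatchRecord(subject_id, datetime.utcnow(), camera_id))
    # The template goes out of scope here and is discarded: collection stays
    # "limited to what is necessary in relation to the purposes".

def purge_expired(store: List[MatchRecord], now: datetime) -> None:
    """Delete records once the retention period has elapsed."""
    store[:] = [r for r in store if now - r.matched_at < RETENTION_PERIOD]
```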

When determining whether a measure is necessary in a democratic society, an effective and targeted system should strike a balance between values such as public safety, data security and fundamental rights. With these competing goals in view, it is essential to ascertain from precedent whether, and when, security is worth the sacrifice of privacy [137].

Article 9 of the GDPR specifies the circumstances in which the collection of biometric data is necessary “for reasons of substantial public interest”. Pursuant to the GDPR, the processing of biometric data is allowed only where processing is:

Necessary for reasons of substantial public interest, on the basis of Union or Member State law which shall be proportionate to the aim pursued, respect the essence of the right to data protection and provide for suitable and specific measures to safeguard the fundamental rights and the interests of the data subject.Footnote 53

The ECtHR held in S. and Marper v. United Kingdom that states should “strike a right balance” between protecting fundamental rights and developing new technologies.Footnote 54 In Zakharov v. Russia, the ECtHR dealt with the secret interception of mobile phone communications. It interpreted the principle of necessity as follows:

As to the question whether an interference was “necessary in a democratic society” in pursuit of a legitimate aim … when balancing the interest of the respondent State in protecting its national security through secret surveillance measures against the seriousness of the interference with an applicant’s right to respect for his or her private life …. The Court has to determine whether the procedures for implementation of the restrictive measures are such as to keep the “interference” to what is “necessary in a democratic society”.Footnote 55

In a landmark battle against UK mass surveillance, the ECtHR held that “the UK’s regime for authorising bulk interception was incapable of keeping the ‘interference’ to what is ‘necessary in a democratic society’.”Footnote 56 These rulings represent a significant step forward in the protection of privacy. In the UK Supreme Court case Bank Mellat v. Her Majesty’s Treasury, Lord Reed formulated an applicable four-stage test of legality and necessity (rendered as a schematic checklist after the list):

(1) Whether the objective of the measure is sufficiently important to justify the limitation of a protected right

(2) Whether the measure is rationally connected to the objective

(3) Whether a less intrusive measure could have been used without unacceptably compromising the achievement of the objective

(4) Whether, balancing the severity of the measure’s effects on the rights of the persons to whom it applies against the importance of the objective, to the extent that the measure will contribute to its achievement, the former outweighs the latter.Footnote 57
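For illustration only, the four stages can be read as a sequential checklist in which failure at any stage defeats the measure. The sketch below encodes that reading in Python; the data structure, field names and the hypothetical deployment scenario are assumptions added for clarity, not part of the judgment.

```python
# Illustrative sketch: the Bank Mellat four-stage test as a checklist.
# The stage wording follows the judgment; everything else is invented.
from dataclasses import dataclass

@dataclass
class Assessment:
    objective_sufficiently_important: bool  # stage 1
    rationally_connected: bool              # stage 2
    no_less_intrusive_alternative: bool     # stage 3
    benefits_outweigh_rights_impact: bool   # stage 4

def is_proportionate(a: Assessment) -> bool:
    """A measure fails if any stage of the test is not satisfied."""
    return all([
        a.objective_sufficiently_important,
        a.rationally_connected,
        a.no_less_intrusive_alternative,
        a.benefits_outweigh_rights_impact,
    ])

# Hypothetical example: blanket live AFR scanning at a public event.
deployment = Assessment(
    objective_sufficiently_important=True,   # locating violent offenders
    rationally_connected=True,               # watchlist matching serves the aim
    no_less_intrusive_alternative=False,     # targeted checks would suffice
    benefits_outweigh_rights_impact=False,   # scanning every attendee
)
assert not is_proportionate(deployment)      # fails at stages 3 and 4
```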

These parameters make the test considerably easier to operationalise in legal enforcement. The benefits have to be sufficiently great to justify any interference with other rights. Legislation should balance the competing needs of law enforcement with the fundamental protection of individual privacy [26]. The debate about the proper balance between privacy and public safety will continue to play out in the courts [4]. Apart from the key element of necessity, the use of AFR should also meet a proportionality requirement: it can be permissible only if the benefits are proportionate to any loss of liberty and privacy.

8.3 The test of proportionality

Determining the appropriate extent of transparency or surveillance is essential, for either taken to its extreme would be a dystopia [95]. An interference must correspond to a pressing social need and be proportionate to it [138]. Whether it does will depend on the purpose for which AFR is used and on the safeguards in place to protect individuals from negative consequences. The proportionate use of AFR requires that its application be clearly warranted in existing laws [139]. The UK Surveillance Camera Code of Practice 2013 requires any police use of facial recognition or other biometric characteristic recognition systems to be clearly justified and proportionate to the stated purpose.Footnote 58 The UK’s Human Rights Act 1998 requires that any interference with the ECHR Article 8 right to a private life be both necessary and proportionate. The Law Enforcement Directive (LED) lays down similar, albeit somewhat more permissive, conditions,Footnote 59 which influence the regulation of AFR:

The risk to the rights and freedoms of natural persons, of varying likelihood and severity, may result from data processing which could lead to physical, material or non-material damage.Footnote 60

The courts have considered how to apply the principle of proportionality in a variety of circumstances. When developing the rules and regulations that must ensure citizens’ privacy protections, judges undertake proportionality assessments. Several rulings are pertinent to this controversy.

In Murray v. The United Kingdom, the ECtHR found that the taking and retention of a photograph of a suspected terrorist without his or her consent was not disproportionate to the legitimate terrorism-prevention aims of a democratic society.Footnote 61 The principle articulated in Murray speaks to the lengths to which enforcement authorities may go in pursuit of those legitimate aims. The ruling is in line with the ECHR, which “is not intended to bar lawful and proportionate law enforcement activities”.Footnote 62 In Tele2 Sverige AB, the Court of Justice of the European Union (CJEU) found the retention of communications data to be subject both to the requirements of Articles 7 and 8 and to a balancing test.Footnote 63 The CJEU further held: “the obligation to retain communications data must be proportionate, within a democratic society, to the objective of fighting serious crime …”.Footnote 64 These decisions sketch out the parameters within which any regulation of AFR will be evaluated, especially considering that the UK plans to remain a party to the ECHR after Brexit. An informed assessment of the necessity and proportionality of AFR use must recognise that the more intrusive the technology, the stricter the test must be [9]. Checks and balances need to be implemented to ensure that AFR is used socially and lawfully.

8.4 Checks and balances in place for reasonable control

Checks and balances ensure that AFR’s crime-fighting benefits are counter-balanced with due regard to concerns about its impact on privacy and the current limitations of the algorithms it employs [140]. The right to an effective judicial remedy must be taken into account in relation to decisions by users as well as by the supervisory authority. Failing to address the legislative vacuum around new biometrics would put privacy in jeopardy. Harrell’s challenge to this vacuum of legal checks and balances revealed a “surveillance-first, ask-permission-later system” [141]. One could argue that the exploitation of AFR presents a chilling model for fellow autocrats and poses a direct threat to open democratic societies [39]. It is imperative to maintain checks and balances and to set rules restraining those who collect and process biometric data.

8.4.1 Efficient safeguard and remedial measures

Of the utmost importance is to establish strong legal safeguards that guarantee privacy and accountability. Effective institutions need to be put in place to hold actors accountable when they fail to abide by these principles. Potential rights-harming outcomes should be identified, and effective action taken to prevent and mitigate harms [142]. Both Article 32 of the GDPR and Article 29 of the LED require that Member States take necessary measures to prevent personal data from being disclosed to unauthorised parties. Such measures need to be integrated into a safeguards regime to protect the rights of the people concerned.Footnote 65 Another pillar is that data subjects have the right to an effective remedy where their rights are unduly violated, a right well enshrined in the EU Charter of Fundamental Rights.Footnote 66 Access to a remedy is also echoed in the GDPR, which ensures that a victim will have a channel to justice by providing that:

A controller or processor may transfer personal data to a third country or an international organisation only if the controller or processor has provided appropriate safeguards, and on condition that enforceable data subject rights and effective legal remedies for data subjects are available.Footnote 67

These measures need to be grounded in sophisticated privacy impact assessments.Footnote 68 The existing protections are plainly inadequate to guard against abuse. Merely setting up mechanisms of safeguards and accountability is not sufficient; how the institution functions matters. A private right of action may also be needed to strengthen deterrence against violations of privacy. While restrictions rest on the above principles, it is equally essential to guarantee transparency and due process. There would be little sense in these safeguards if the users of AFR turned them into a mere box-ticking exercise.
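As a way of visualising what such an assessment might track, the sketch below lays out a hypothetical checklist-style impact assessment for an AFR deployment, drawing on the elements discussed in this section (legal basis, necessity, proportionality, safeguards, remedy, oversight). The fields and thresholds are invented for illustration; a real impact assessment, such as a DPIA under GDPR Article 35, is a narrative exercise rather than a schema.

```python
# Illustrative sketch only: a checklist-style impact assessment record for
# an AFR deployment. Field names are invented assumptions for illustration.
from dataclasses import dataclass, field
from typing import List

@dataclass
class AFRImpactAssessment:
    purpose: str                     # stated, specific purpose of deployment
    legal_basis: str                 # e.g. a Member State law per Art. 9(2)(g)
    necessity_justified: bool        # no less intrusive means would suffice
    proportionality_justified: bool  # benefits outweigh the interference
    safeguards: List[str] = field(default_factory=list)  # warrants, limits...
    remedy_available: bool = False   # enforceable rights and legal remedies
    independent_oversight: bool = False  # review by an independent authority

    def residual_risks(self) -> List[str]:
        """Return the unresolved items that should block deployment."""
        risks = []
        if not self.necessity_justified:
            risks.append("necessity not demonstrated")
        if not self.proportionality_justified:
            risks.append("interference disproportionate to the aim")
        if not self.safeguards:
            risks.append("no procedural safeguards in place")
        if not self.remedy_available:
            risks.append("no effective remedy for data subjects")
        if not self.independent_oversight:
            risks.append("no independent oversight")
        return risks

# Hypothetical example: an assessment with no safeguards yet recorded.
pia = AFRImpactAssessment("event security", "Member State statute", True, False)
print(pia.residual_risks())
```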

Given the lack of adequate governance of AFR deployment [143], the related oversight is not sufficiently robust to protect against abuse.Footnote 69 Ethical design thus serves as a bulwark against the relentless pursuit of profit and power [144]. Since the government plays the dual role of player and referee in deploying AFR, an independent oversight authority is indispensable for holding controllers and processors accountable. This approach is consistent with Article 8 of the Charter on the protection of personal data, which requires the oversight of data processing by an independent authority.Footnote 70 To prevent fundamental rights violations, oversight authorities must have sufficient powers, resources and expertise [9]. This institutional approach helps to build trustworthiness through system validation by third parties.

8.4.2 Multi-stakeholder initiatives to develop best practices

The use of AFR raises a number of ethical issues and trade-offs, from concerns around privacy to a legitimate interest in public safety, which can only be resolved in public discourse [42]. Purely legal governance solutions are limited, suffering from conceptual ambiguity and a lack of enforcement mechanisms [70]. It is important to go beyond conventional rhetoric to formulate and embed fundamental values, privacy first among them. Of similar importance is ensuring equitable stakeholder representation when developing AFR governance regimes [83]. There should be consensus on how best to balance AFR adoption between privacy and public interests [64]. More proactive approaches help to develop effective ways of raising public awareness of the trade-off between the benefits and risks of its applications. Both government and business focus more on economic growth, which can come at the expense of societal concerns. Initiating a much-needed and vigorous public debate about the proper balance between AFR-related surveillance and privacy rights should be high on the agenda. Given the currently limited role that civil society plays in shaping policies, the public should be afforded opportunities to voice concerns and object effectively through democratic engagement [42]. The above proposals entail stronger procedural control over how law enforcement agencies and private companies may legitimately deploy AFR [145]. The approaches are complementary, and the measures should be characterised by transparency, accountability and the avoidance of abuse.

9 Conclusion

AFR compromises the inviolable essential core of privacy and poses serious threats to fundamental rights. There is a lack of well-defined regulations controlling the collection, use, dissemination and retention of biometric identifiers. In the current state of AFR governance, neither the legislative nor the judicial branch is well equipped to adjust the balance between values such as security and privacy. The existing laws protecting individual biometric data are inadequate to respond to the challenges posed by AFR. The deployment of such ground-breaking technology in the absence of a sufficient legal framework has produced complex ethical and legal repercussions. The use of AFR calls for a new legal and regulatory framework to avoid a dystopian future. The algorithms of the law must keep pace with new and emerging technologies. It will be up to the courts and policymakers to strike the right balance between the need for information and the right to privacy. Enforcement authorities must be able to make efficient use of AFR’s powerful investigatory capabilities, while privacy is taken into reasonable account. The international community needs to develop a viable policy framework that ensures respect for the privacy principles discussed above. More factors, including legal basis, necessity, proportionality and justification, need to be considered in order to address intrusive AFR processing. It is likewise important to build consensus on protecting privacy while still enabling law enforcement to draw on surveillance’s tremendous investigatory and crime-fighting tools.

Furthermore, it remains unclear how non-state actors are collecting and using individuals’ personal data. To ensure respect for human rights in this new socio-technical context, both public and private actors need to be committed to striking the right balance when using AFR. It is crucial to remain critical of the underlying aims of AFR governance solutions as well as of their collateral impacts, especially in terms of legitimising private sector-led practice. There is still a gap to fill in incentivising private entities to behave in a socially responsible way, striking a balance between maximising profits and protecting fundamental rights. Impact assessments are important tools for comprehensively assessing the risks involved in AFR. Given that the processing of personal data constitutes a limitation of privacy, it needs to be subjected to a strict necessity and proportionality test, including a clear legal basis and a legitimate aim pursued. To achieve this goal, efficient mechanisms of checks and balances are indispensable to ensure that the proposed test principles function properly. Due to the lack of qualitative and quantitative analysis based on solid data, it will take time to optimise the roadmap for addressing this global challenge.