1 Introduction

The possibilities and pitfalls offered by the implementation of AI in government decision-making are currently at the forefront of political, legal, and academic interest. The expanding implementation of AI-assisted decision-making in Europe has been evident in state functions ranging from the administration of public benefits,Footnote 1 to the prevention of terrorism and other serious crimes.Footnote 2 Several factors contribute to making full automation challenging in the policing context, and the continued role of human decision-makers exercising judgment and discretion has consequently been highlighted in previous research.Footnote 3 Even so, the stakes involved in police decision-making and the potential consequences for individuals affected by mistakes caused by algorithmic policing tools have called to the forefront the need for functional safeguards for individual rights as well as the rule of law in this context.Footnote 4 In line with this, the use of AI tools for law enforcement purposes has been highlighted in the proposed new EU AI Act (‘AIA’) as high-risk, with the proposal outlining several limits to its use.Footnote 5 However, there are already existing rules and principles relevant to the use of such tools in the data protection framework of the EU, more specifically in Directive (EU) 2016/680 (‘the Law enforcement directive’, or ‘LED’).Footnote 6 Recently, these rules have been subject to significant developments through ECJ case law, which, independently of the final form of the AIA, will place certain limits on the use of AI for policing, particularly limits pertaining to the processing of sensitive personal data under the LED.

To better illustrate these developments and put them into perspective, we can turn to the specific example of automated facial recognition. This technology is of particular interest as it pinpoints a core challenge of law enforcement—the identification of unique individuals—and thus, in the words of Kotsoglou and Oswald, ‘has the potential to revolutionise the identification process, facilitate crime detection, and eliminate misidentification of suspects’.Footnote 7 Facial recognition technology thus represents a clear example of shifting technological affordances,Footnote 8 as technology now affords a type of rapid mass-identification of individuals which previously would have been impossible. This in turn shifts the implications of the wide availability of images and video.Footnote 9 The technology is therefore simultaneously associated with significant risks, due to the potential consequences of a misidentification.Footnote 10 For the individual, some of the most obvious risks relate to false positives: being wrongly identified as a suspect may trigger further, potentially far-reaching and intrusive investigative measures. For law enforcement agencies, false negatives imply a risk of failing to identify a suspect, while false positives divert investigative resources towards unfruitful avenues or, worse, towards suspects who may wrongfully be charged with a crime.Footnote 11 The risk of false positives has been said to be higher for individuals belonging to overpoliced communities, adding a discriminatory potential to the technology.Footnote 12 There is also the risk that, through automation bias, the outcome of a facial recognition system may be favoured in the assessment of evidence, even when faced with contradicting evidence.Footnote 13
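To make the scale of the false positive risk concrete, consider a minimal, purely illustrative sketch of a one-to-many biometric search. The figures and function below are hypothetical assumptions introduced for illustration only; they do not describe any system actually deployed by a law enforcement agency. The point is simply that even a seemingly low false match rate, multiplied across a large reference database, can be expected to flag a non-trivial number of innocent candidates for every probe image, which is precisely the resource-diversion and misidentification risk described above.

```python
# Illustrative sketch only: hypothetical error rates and database size,
# not measurements of any real facial recognition system.

def expected_false_matches(false_match_rate: float, database_size: int) -> float:
    """Expected number of innocent persons flagged by a single 1:N search,
    assuming each comparison against an enrolled photo fails independently."""
    return false_match_rate * database_size

fmr = 0.001        # assumed false match rate per comparison (0.1 %)
db_size = 50_000   # assumed number of enrolled photographs in the reference database

print(expected_false_matches(fmr, db_size))  # -> 50.0 candidate false matches per probe image
```

Under these assumed figures, each probe image would on average surface dozens of candidate matches to innocent individuals, all of whom would then depend on human review, and on safeguards against automation bias, to avoid further investigative measures.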

However, the potential harms of facial recognition technology go beyond false positives. The European Data Protection Board (‘EDPB’) has highlighted that the processing of biometric information implied by such systems constitutes a serious interference with fundamental rights regardless of the outcome of the processing, as do legislative measures allowing their use, under Articles 7 and 8 of the Charter.Footnote 14 The EDPB goes even further, pointing to how biometric data and facial recognition technology may impact the right to human dignity under Article 1 of the Charter, which requires that human beings are not treated as mere objects. This observation is based on how facial recognition technology ‘calculates existential and highly personal characteristics, the facial features, into a machine-readable form with the purpose of using it as a human license plate or ID card, thereby objectifying the face’.Footnote 15 However, the EDPB does not mention the potential implications of this—the right to human dignity is inviolable, so should facial recognition technology be found to interfere with it, it cannot be justified under any circumstances.Footnote 16 Still, the risks involved have so far not been held to generally preclude facial recognition systems from being put to use by law enforcement agencies, as long as their use can be justified as a legitimate interference with the relevant fundamental rights, the relevant data is processed in line with the requirements of the LED, and the use is authorised by law.Footnote 17 However, certain applications of facial recognition technology have been highlighted as particularly invasive by the EDPB. These include remote automated biometric identification of individuals in public spaces, biometric profiling, emotion recognition, and populating facial recognition databases through scraping of social media accounts. While the specifics are at this point still under negotiation, many of these applications are likely to be specifically regulated by the upcoming AIA.Footnote 18 In the shadow of these more contentious applications of biometric identification, however, exists a range of applications that could be considered more mainstream. This includes the forensic matching of suspects appearing in videos or photos from a criminal investigation against existing police databases. Such applications may not raise the spectre of mass surveillance to the same extent as those highlighted by the EDPB, but they still involve the processing of sensitive biometric data of individuals, carry the risk of false positives (and negatives), and employ what is generally perceived as AI technology to reach law enforcement objectives.

One such automated facial recognition system has indeed recently been implemented by the Swedish police authority and will serve as our example of some of the specific issues involved under EU law. This system was introduced following a review in 2019 by the Swedish Authority for Privacy Protection (at the time named ‘Datainspektionen’, henceforth ‘the DPA’). The review came as a result of a request from the Swedish national police authority within the framework of ‘prior consultations’ required under the Swedish transposition of the LED before conducting processing operations of high sensitivity.Footnote 19 The request concerned the use of facial recognition software to forensically match suspects in crime scene surveillance footage or other forensic imagery against the national database of criminal offender photographs (‘Nationella signalementsregistret’). This would, according to the police authority, be more effective than traditional and time-consuming manual analysis by human investigators.Footnote 20 Having considered the request, the DPA eventually found in October 2019 that the use of facial recognition was allowed. It based this conclusion primarily on the judgment of the ECJ in Case C-524/06, Heinz Huber,Footnote 21 finding that the requirement of the processing being ‘necessary’ in the Swedish act transposing the LED should be interpreted as ‘something that is needed to effectively carry out a task’. The DPA consequently found that it was ‘clear that the planned processing using facial recognition technology to identify perpetrators is significantly more effective than individual officers making this selection manually’ and approved the use of the software in this context.Footnote 22 Following this decision, the use of biometric matching was implemented within the Swedish police as a new forensic standard procedure in May 2021. Photos of unknown persons are now processed through facial recognition and matched against existing photos in the newly developed police database ABIS (Automatic Biometric Identification System).Footnote 23

The finding by the Swedish DPA that enabled the implementation of this system raises, however, a few issues of both practical and theoretical importance, and highlights what appears to be a tension in the interpretation of the requirement of necessity under European data protection law, especially in relation to the processing of sensitive personal data in the law enforcement context.

First, there is the question of what level of necessity should be applied in considering the processing of biometric data under the LED. Article 10 of the LED states that such processing should be allowed only when it is ‘strictly necessary’. The Swedish transposition of these rules in the context of forensic investigations states that biometric data can be processed when it is ‘absolutely necessary’ for the purpose of the processing,Footnote 24 which includes forensic matching.Footnote 25 But the necessity level the Swedish DPA applied in its decision was that of ordinary necessity,Footnote 26 as was the requirement as interpreted by the ECJ in Heinz Huber, on which the Swedish DPA relied.Footnote 27 Meanwhile, the ECJ has in numerous cases held that all derogations from the right to data protection must be limited to what is strictly necessary,Footnote 28 and has recently established what seems to be an even higher level of necessity in relation to the processing of biometric data under the LED.Footnote 29

Second, the decision by the Swedish DPA highlights the rather opaque relationship that seems to exist between the necessity requirement and effectiveness. Given that manual analysis of surveillance footage is possible, and indeed represents the way this has been done for years, one could argue that the use of facial recognition software is not so much necessary as it is more efficient. But is improved efficiency enough to satisfy the requirement of necessity in this context?

Third, the analysis by the Swedish DPA contained no further assessment of proportionality stricto sensu, where the interest of the Swedish police in effectively analysing crime-scene footage would be weighed against the damage done to the right to data protection. Indeed, no such requirement seems to follow from the wording of the LED, where Article 10 only mentions that processing of sensitive personal data should be ‘allowed only where strictly necessary’. However, the ECJ has increasingly made use of proportionality balancing in relation to derogations from the right to data protection.Footnote 30 How could these developments impact the legal requirements surrounding the use of biometric data and AI technologies in this context?

Analysing these questions allows for a fourth and final question: what impact will this have on the use of emerging AI technologies as decision support or analysis tools in the law enforcement context, and on the intended function of the LED to protect personal data in this setting? This final question is of particular salience in light of the proposed AIA, as it will determine what role the act will play as a further necessary safeguard in relation to the processing of sensitive categories of data through AI technologies.

In this paper, these questions will be analysed in light of case law from the ECJ interpreting the LED and the GDPR; guidelines from the European Data Protection Board; decisions from national data protection agencies; as well as literature surrounding proportionality as a principle of constitutional and data protection law. The discussion will be focused on semi-automated decision-making, rather than fully automated decisions, as the former is (still) more prevalent in the context of both criminal investigations and police intelligence operations, where few decisions are taken without first having been reviewed by a human at some stage of the decision-making process. The focus is thus on the role of automation and AI as potential decision-support and automated analysis tools, rather than on systems making final decisions without human input.

Before proceeding to the analysis of necessity, we need to take a quick look at the specific data protection context that surrounds processing of personal data within law enforcement agencies.

2 Data Protection in the Law Enforcement Context

2.1 The Law Enforcement Directive: A Very Short Introduction

The LED is part of the overall EU data protection framework along with its more famous cousin, the General Data Protection Regulation (GDPR).Footnote 31 Given the special context of prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, the directive served as a way of ensuring a more specifically tailored legal framework in that setting.Footnote 32 Implicitly this gave member states further autonomy in choosing how to implement the necessary level of data protection. The LED has been described as ‘a major step forward’ in data protection showing ‘that it now will be possible to achieve high privacy and data protection standards while processing personal data for law enforcement purposes in a more flexible manner’.Footnote 33 In comparison to the framework decision it replaced, the directive has been seen as increasing harmonisation as well as data protection standards.Footnote 34 Yet some characteristics of the LED have also been described as challenging those positive outcomes, particularly ‘the lack of specific guidelines on how certain general concepts such as necessity, proportionality and appropriateness are to be implemented and applied by member states to balance privacy with security and other civil rights’.Footnote 35

2.2 The Relative (Ir)relevance of the Rules on Automated Decision-Making

Before delving into the specifics of necessity in relation to the processing of sensitive personal data in the LED, it is worth pointing out the specific rules on automated individual decision-making (including profiling) in Article 11 of the LED. This rule contains what at first glance may appear to be a prohibition against the use of such decision-making, but one which is subject to several exceptions. First, Article 11 only refers to decisions based solely on automated processing, including profiling. The operative terms used in this rule are the same as in Article 22 of the GDPR, where the consensus has been that automated processes fall outside of the field of application of the article when they remain decisional support tools, ‘provided the human decision-maker considers the merits of the result rather than being blindly or automatically steered by the process’.Footnote 36 Second, automated decisions as defined in Article 11 LED are still allowed if authorised by union or member state law which provides appropriate safeguards for the rights and freedoms of the data subject. Third, although a specific mention is made in Article 11(2) LED of how wholly automated decisions should not be made using special categories of personal data referred to in Article 10, this prohibition contains a similar exception if ‘suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests are in place’. While this type of decision-making is not the focus of this contribution, as the decision-support and analysis tools discussed here only support human decision-making, it is worth noting that the protections in Article 11 are circumscribed to the point where automated decisions, even those based on sensitive personal data such as biometrics, are fully possible to implement under the LED, although fundamental rights as interpreted by the ECJ may form an additional layer of protection.Footnote 37

2.3 The Likely Limits of the Proposed EU AI Act

Before moving on to necessity under the EU data protection framework, it is worth briefly mentioning the intended relationship between the LED and the proposed AIA in this context. In relation to the processing of biometric data, the latest draftFootnote 38 establishes new rules limiting the use of ‘real-time’ remote biometric identification of natural persons for the purposes of law enforcement in publicly accessible spaces. The rules do not include a complete ban against the use of AI technologies for this purpose but limit its use through draft Article 5d to specific situations, such as the targeted search for specific potential victims of crime or the prevention of a specific and substantial threat to critical infrastructure, life, health or physical safety of natural persons, or the prevention of a terrorist attack. It would also, according to draft Article 5d(3), be subject to prior judicial authorisation. In this context, the AIA would, according to draft recital 23, become lex specialis, and the use of such real-time remote identification could therefore not be based on Article 10 LED. However, all other processing of biometric data and other personal data involved in the use of AI systems for biometric identification, other than in connection with the use of ‘real-time’ remote biometric identification systems in publicly accessible spaces for the purpose of law enforcement, would, according to draft recital 24, fall under Article 10 LED. In other words, the LED will still form the main legal safeguard, with the AIA providing additional limits in certain high-risk contexts. As such, the forensic matching we are using as our example here would most likely still be primarily governed by the LED rather than the AIA.

3 Necessity and Data Protection

3.1 Necessity, Strict Necessity, or Absolute Necessity?: The LED Meets the ECJ

3.1.1 The Early Discussion Surrounding Necessity in the LED

One of the previously mentioned flexibilities implied by the LED is that, in contrast to the GDPR, the processing of sensitive categoriesFootnote 39 of personal data is not prohibited as a rule; rather, it is restricted through Article 10 of the LED to situations where it is strictly necessary and subject to appropriate safeguards for the rights and freedoms of the data subjects. This differentiates sensitive data from non-sensitive data, for which the less stringent requirement of ‘ordinary’ necessity established in Article 8 of the LED applies. This reflects the legitimate need for law enforcement agencies to process sensitive categories of data in some of their tasks, from the use of fingerprints (i.e. biometric data) to identify suspects at crime scenes to the processing of data on the religious or ethnic background of victims in cases of hate crime. Given the sensitivity of this data, the requirement of strict necessity makes it clear that processing of such data should not be a routine measure.

This two-pronged approach to necessity stemming from the text of the LED has, however, become increasingly muddled in light of the case law of the ECJ in the data protection context. The ECJ has over time established strict necessity as a basic requirement for derogations from and limitations of the protection of personal data, without limiting this analysis to the protection of special categories of sensitive data.Footnote 40 While the cases establishing this level of necessity have concerned primarily the previous data protection directive and the GDPR, the European Data Protection Supervisor (‘EDPS’) has concluded that this strict necessity applies as an overall requirement within the context of the LED as well.Footnote 41 But if ‘plain’ necessity should be construed strictly, what is then to be made of the specific reference to strict necessity in Article 10 LED? The conclusion of the Article 29 Working Party (the precursor to the current EDPB) was that ‘the term “strictly necessary” in Article 10 must be understood as a call to pay particular attention to the necessity principle in the context of processing special categories of data, as well as to foresee precise and particularly solid justifications for the processing of such data’.Footnote 42 Recently we received some clarity in this matter, as the ECJ interpreted the requirement of necessity under Article 10 LED for the first time in a ruling that partly confirmed these views, but which arguably went even further, establishing what must be regarded as a new threshold of necessity above ‘strictly necessary’.

3.1.2 Absolute Necessity in Criminal Proceedings Against V.S.: Establishing a New Threshold

In Case C-205/21, Criminal proceedings against V.S., delivered in January 2023, the ECJ considered a request for a preliminary ruling from a Bulgarian court on the interpretation of the LED.Footnote 43 With regard to necessity, the ECJ interpreted the question posed by the Bulgarian court as whether Article 10 LED (read in conjunction with the general principles of processing personal data in Article 4 LED and the requirements of lawful processing in Article 8 LED) precluded national legislation which provided for the systematic collection of biometric and genetic data of any person accused of an intentional offence subject to public prosecution, specifically where no obligation was established in national law for the competent authority to determine and demonstrate that the collection was necessary for the specific objectives pursued and that those objectives could not be achieved by collecting only a part of the data concerned.Footnote 44

The ECJ began answering this question by acknowledging the special sensitivity of both the data at issue and the context of their processing. These implied, the court held, significant risks to the Charter rights to respect for privacy and for data protection.Footnote 45

Proceeding to the specific interpretation of ‘strictly necessary’ in Article 10 LED, the ECJ focused on the difference between strict necessity in that article and the requirement of necessity in the directive in general. Here, the court specifically highlighted certain linguistic factors. The court began by taking note of the French-language version of the LED, where the term used in Article 10 was ‘nécessité absolue’, which the court found established ‘strengthened conditions for lawful processing of sensitive data’.Footnote 46 This led to two interesting conclusions:

Thus, first, the use of the adverb ‘only’ before the words ‘where strictly necessary’ underlines that the processing of special categories of data, within the meaning of Article 10 [LED], will be capable of being regarded as necessary solely in a limited number of cases. Second, the fact that the necessity for processing of such data is an ‘absolute’ one [(‘absolue’)] signifies that that necessity is to be assessed with particular rigour.Footnote 47

The court then proceeded to play down the significance of the term ‘strictly necessary’ being used in certain other language versions, as those words also implied a strengthened condition. In this context, the court also took note of how that requirement had been added late in the legislative process, implying an ambition to give greater protection to persons subject to such processing.Footnote 48

It is interesting to note how the court in Criminal proceedings against V.S. has essentially established a brand-new standard of necessity, above strict necessity, thus resolving the terminological paradox of ordinary vs. strict necessity in the LED. In doing so, the court also made it clear that this level of necessity will only be reached in a limited number of cases, thereby signalling the exceptional nature of the processing of this type of sensitive data. The court went on to observe that this level of necessity entails particularly strict checking that the requirement of data minimisation under the LED has been met. This interpretation also confirms one made by the EDPB in its proposed guidelines on the use of facial recognition technology in the area of law enforcement,Footnote 49 which were published for public consultation in mid-2022, prior to the ruling in V.S. Here, the implication of strict necessity under the LED was interpreted as the measure needing to be indispensable. ‘It restricts the margin of appreciation permitted to the law enforcement authority in the necessity test to an absolute minimum’.Footnote 50 The guidelines also tied this to the requirement of objective criteria defining the ‘circumstances and conditions under which processing can be undertaken, thus excluding any processing of a general or systematic nature’.Footnote 51

There is one more significant thing to note in the ECJ ruling in V.S. The court held that given the strict review that is warranted, it is also necessary to consider the nature of the objective pursued, to ensure that the processing is connected to ‘the prevention of criminal offences or threats to public security displaying a certain degree of seriousness, the punishment of such offences or protection against such threats’.Footnote 52 Implicitly, the court here moves beyond necessity and into proportionality balancing, but without making any explicit reference to the latter principle. To understand the significance of this, we need to look at how the ECJ case law on necessity has developed up to the present ruling. As we will see, proportionality has played an increasingly important role in this case law, going from largely absent to explicitly used to limit restrictions of the right to data protection.

3.2 The Emergence of Strict Necessity and Proportionality in the ECJ Case Law on Data Protection

3.2.1 From Strict Necessity as an Interpretive Requirement to an Overall Requirement

In most of the cases leading up to Criminal proceedings against V.S. in which the ECJ has expounded on the necessity requirement, it has done so in relation to national or Union legislative measures derogating from the right to privacy and data protection. In the first case where the court mentions this application of strict necessity, Case C-73/07, Satakunnan, it did so in explaining the specific derogations allowed by specific chapters of the Data Protection Directive (DPD) as they were to be applied in the balancing against freedom of expression with regard to the publication of tax records by a newspaper.Footnote 53

In order to take account of the importance of the right to freedom of expression in every democratic society, it is necessary, first, to interpret notions relating to that freedom, such as journalism, broadly. Secondly, and in order to achieve a balance between the two fundamental rights, the protection of the fundamental right to privacy requires that the derogations and limitations in relation to the protection of data provided for in the chapters of the directive referred to above must apply only in so far as is strictly necessary.Footnote 54

This case can arguably be said to establish mainly that the specific derogations allowed by the DPD should be interpreted narrowly to ensure that they are limited to what is strictly necessary. The ruling in Satakunnan was however later referred to in Joined Cases C-92/09 and C-93/09, Volker und Markus Schecke, which concerned the validity of regulations requiring the publication of the beneficiaries of agricultural funds.Footnote 55 This time the ECJ did not limit its statement to the DPD, but rather held that derogations and limitations in relation to the protection of personal data must apply only in so far as is strictly necessary.Footnote 56 While widening the scope of the statement from the specific interpretation of the DPD to a more general one, the ECJ also moved its analysis to the legislative level, i.e. to an analysis of a requirement to publish information established by the EU in a regulation, which was found not to be strictly necessary to achieve the public interest aim pursued.Footnote 57

In the subsequent case C-473/12, IPI, the ECJ referred to the requirement of strict necessity as ‘settled case law’ and related it to ‘the protection of the fundamental right to privacy’.Footnote 58 Again, this case related to a question on the national legislative framework of data protection, specifically the possible exceptions established by that framework in relation to the activities of private detectives.Footnote 59

Through these three cases, we have moved from the specific interpretation of the DPD to more general considerations about the fundamental rights to privacy and data protection, but in doing so, the court has also moved towards applying the requirement in the context of legislative measures rather than specific processing operations within a specific legislative framework. This continued through Joined Cases C-293/12 and C-594/12, the Digital Rights Ireland judgment, which also related to the legislative level, this time the EU Data Retention Directive.Footnote 60

In 2014 the ECJ produced one of the few exceptions to the tendency to discuss strict necessity on the legislative level, Case C-212/13, Ryneš.Footnote 61 This concerned a specific processing operation, a home security camera installed by a private person which recorded the entrance to his home as well as a public footpath and the entrance to the house opposite, and the question of whether this processing could fall under the exception in DPD Article 3(2) for ‘purely personal or household activity’.Footnote 62 The ECJ repeated its statement on the need for strict necessity, again in relation to the Charter right to privacy, but ended up with a more narrowly tailored implication of this strict necessity—namely that the exception in DPD Article 3(2) ‘must be narrowly construed’.Footnote 63

In C-362/14, Schrems (I), the ECJ returned to the legislative level and applied strict necessity to the data transfer arrangement between the EU and the United States, finding that generalised storage of, transfer of, and access to all the personal data of all the persons whose data had been transferred to the United States could not be strictly necessary for the objective pursued.Footnote 64 Again, as the court moved to analysing a legal framework surrounding the protection of personal data, the strict necessity criterion seems to be used more actively and with more expansive consequences.

These cases indicate a tendency by the court to look towards the fundamental rights roots of data protection to resolve the cases before it, rather than being caught up in the more specific articles of the data protection rules in question.

3.2.2 Necessity, Proportionality, and Automation

An interesting development in the ECJ approach to necessity came the following year, in the 2016 Tele2 judgment (C-203/15 and C-698/15).Footnote 65 Here, the court again considered the legislative framework surrounding the protection of personal data, this time the Swedish and British national rules on data retention for the purposes of investigating and preventing serious crime. Having until this point treated necessity as a distinct requirement, the ECJ now found it to be explicitly derived from the principle of proportionality:

Due regard to the principle of proportionality also derives from the Court’s settled case-law to the effect that the protection of the fundamental right to respect for private life at EU level requires that derogations from and limitations on the protection of personal data should apply only in so far as is strictly necessary.Footnote 66

It is reasonable to assume that the ECJ here talked about proportionality as a wider analytical framework, within which necessity forms a distinct step before proportionality stricto sensu.Footnote 67 The ECJ went on to make observations surrounding the general and indiscriminate retention of all traffic and location data of all subscribers and registered users relating to all means of electronic communications.Footnote 68 In doing so, the ECJ did not stray from a narrower definition of necessity as a means to sort out derogations that unnecessarily impact the right to data protection, as the court did not begin to discuss balancing against opposing interests until later in the same judgment.Footnote 69 Still, the explicit reference to proportionality had not been made before, and once mentioned in the Tele2 ruling, it returned in later cases.

In Opinion 1/15 on the Draft agreement between Canada and the European Union for the Transfer of Passenger Name Record data from the European Union to Canada, the court reiterated that reference to the principle of proportionality, adding that it requires, ‘in accordance with settled case-law of the Court, that derogations from and limitations on the protection of personal data should apply only in so far as is strictly necessary’.Footnote 70 Again, the statement by the ECJ was made in the context of analysing legislative-level frameworks and safeguards of data protection, to ensure that interference is limited to what is strictly necessary. Interestingly, the court here added that ‘[t]he need for such safeguards is all the greater where personal data is subject to automated processing […] particularly where the protection of the particular category of personal data that is sensitive data is at stake’.Footnote 71 This statement is directly relevant for our understanding of necessity in relation to the automated processing of biometric data implied by facial recognition technologies.

Similar concerns regarding automated processing and sensitive categories of data were later repeated in La Quadrature du Net,Footnote 72 and in Ligue des Droits Humains.Footnote 73 In these cases, the court was more explicit about proportionality balancing forming an additional step after the assessment of strict necessity, adding some needed clarity in terms of its view on the process of proportionality analysis. Again, the court in these cases analysed strict necessity in relation to legislative frameworks derogating from the right to data protection, this time surrounding the access of law enforcement agencies to communications data in La Quadrature du Net, and to passenger name records from airlines in Ligue des Droits Humains.Footnote 74

3.2.3 Some Preliminary Observations on Absolute Necessity and Proportionality

Summing up the case law discussed so far, it seems quite clear that the ECJ views strict necessity as an overall requirement for derogations in relation to the rights to data protection and privacy. However, as this analysis also shows, the application of that requirement seems most consistent whenever the court has been asked to consider the overall legislative framework establishing such derogations. In these cases, the court has also moved towards explicitly engaging in proportionality balancing as an additional step after analysing the strict necessity of the legislative framework allowing for restrictions of the right to data protection. In the few cases where the court expounded on the specific processing of data within such frameworks prior to Criminal proceedings against V.S., the court similarly made mention of the strict necessity requirement, but applied it somewhat differently, mainly as a dictum to interpret narrowly the specific rules allowing for such derogation. For quite some time, this suggested the possibility that the two-pronged necessity requirement of the LED, separating ‘ordinary’ from ‘strict’ necessity, could have held in the face of ECJ scrutiny through a distinction between, on the one hand, legislative measures that derogate from the right to data protection, which the ECJ requires to meet the threshold of strict necessity, and, on the other, specific measures taken within the scope of such legislation, such as the LED and its national transposition, where the necessity requirement would still depend on the category of data. In the latter context the requirement would primarily have been a call for a narrower interpretation in light of the rights of the Charter. We now know that the ECJ chose a different approach in Criminal proceedings against V.S., elevating ‘strict necessity’ in the context of Article 10 LED to what amounts to ‘absolute necessity’. Still, while the ECJ ruling answered some questions, it created a few new ones as well.

A first question arises from the fact that the ruling in V.S. did not contain any of the usual references to earlier case law establishing the requirement of strict necessity in relation to restrictions of the right to data protection. This is noteworthy, as references to, for example, Satakunnan, Volker und Markus Schecke and Eifert, and Digital Rights Ireland have become something of a formula when the court applies necessity in the data protection context. Also, the court never explicitly mentions the overall requirement of strict necessity repeated in those cases, choosing instead to differentiate those articles in the LED referencing only ‘necessity’ from the ‘strict necessity’ in Article 10, which requires strengthened conditions of scrutiny.Footnote 75 This may be explained by how these often-mentioned cases relate to the DPD and the GDPR rather than the LED, and by the need to separate the different contexts of processing. Still, in doing so, the ECJ missed an opportunity to clearly establish that (or at least clarify whether) the requirement of strict necessity does in fact carry across to the LED context outside of the scope of Article 10. However, one could argue that by establishing that ‘strict necessity’ in Article 10 should, in fact, be construed as ‘absolute necessity’, the ECJ implicitly suggests the need for something stricter than ‘strict necessity’, a need that reasonably arises due to the overall requirement of strict necessity in the data protection context.

A second question arises from how the ECJ avoided explicit mention of proportionality balancing, instead opting to incorporate the weight of the purpose underpinning the processing of sensitive data under the heading of necessity. This constitutes a break from the trend established in ECJ case law ever since the Tele2 ruling. A possible explanation is that, since the court was acting not on the legislative level but within the framework of the LED, it maintained its more cautious approach by incorporating some concerns that traditionally have fallen under proportionality stricto sensu into the requirements under Article 10, rather than adding proportionality as a separate concern based on the application of Charter rights extraneous to the LED. While such an application of proportionality based on fundamental rights would carry additional weight, it is also something the court has generally reserved for cases where it assesses general legislative frameworks derogating from the right to data protection, rather than interpreting specific rules in secondary EU law. Also, by incorporating these considerations into the requirements of Article 10, they are more likely to assert themselves in the application of that article by law enforcement agencies, which may be less inclined to consider the need for an overall proportionality assessment under fundamental rights instruments.

3.3 Implications of Necessity

3.3.1 Necessity and Least Restrictive Means

Having established the standard of necessity required, we may now move to consider the actual implications of necessity, strict necessity, and absolute necessity in the data protection context and in relation to the use of AI as decision-support tools.

To do this, we need to further consider what the necessity test implies. As suggested by the name of the test, it requires that a limitation of a right can in fact be proven to be necessary to reach the aim. The intention is to sort out unnecessarily intrusive measures, without having to resort to more complicated and controversial balancing exercises as part of proportionality in the strict sense.Footnote 76 Implicit in this necessity analysis is also the least restrictive means test, which includes an analysis of whether alternative means exist which still contribute effectively to the intended aim but which would restrict the right to a lesser extent.Footnote 77 If we can find such alternative means, this is a clear indication that necessity is not met.

One example of where the ECJ has found a violation of necessity through the least restrictive means test concerned the public disclosure of penalty points for road traffic offences in Latvia. Asked to consider whether this system complied with the right to data protection, the ECJ compared the Latvian legislation to alternatives in other member states using less privacy sensitive preventive measures such as public awareness campaigns or driving tests to reach the same goal. Such measures did not carry the same risk of social disapproval and stigmatisation of the data subject, yet according to the ECJ there was no indication that the Latvian legislature had considered them. As such, the court found that the Latvian system was not strictly necessary.Footnote 78

In other cases, such as those relating to data retention and access to communications data by law enforcement agencies, the analysis of necessity is analytically somewhat intermingled with more overarching proportionality concerns, but certain parts of the ECJ jurisprudence in this context are clear expressions of necessity and questions of least restrictive means. For example, the retention of all communications data, for all users of communications networks, goes beyond what is necessary for the prevention of serious crimes or the protection of national security.Footnote 79 As such, some type of criterion must be implemented that establishes at least a possible link between the affected individuals and the crimes the measure is intended to prevent—otherwise the measure will impact a wider set of individuals than is necessary. The associated proportionality aspect of this equation is that the significant impact of these measures on the rights in question can only be justified by the fight against serious crimes or the protection of national security.Footnote 80

It is not entirely clear how this requirement would translate into the facial recognition context. The Swedish police authority has stressed that searching through video footage in preliminary investigations will necessitate a scan of all faces that appear in the footage in order to identify an individual of interest to the investigation.Footnote 81 In other words, the scan is conducted precisely to identify the possible link between an individual and the crime. To some extent this scan will, in the case of preliminary investigations, likely relate to a specific location relevant to the investigation of a crime. This could imply that there is a limit to the affected individuals and that at least a possible link exists, through the location, between the person and a possible crime. Geographical limits have been highlighted by the ECJ as a possible measure to make communications data retention conform to the necessity requirement of the Charter.Footnote 82 However, in situations where the crime being investigated through the use of facial recognition technology has been committed in a public space with larger crowds present, such as in the case of riots occurring in the context of a political protest, this limitation is rendered less functional. In such situations, the processed images may also reveal the political opinions of participants in the demonstration, which adds further concerns to the existing processing of biometric data.Footnote 83

It should be said that the way in which the ECJ has expressed the least restrictive means test adds a certain note of confusion to the relationship between ordinary and strict necessity. In Proceedings brought by B, the ECJ applied a test established in the discussion of ‘regular’ necessity in recital 39 of the GDPR to explain the implications for the strict necessity requirement emanating from the Charter. This requirement, the ECJ held:

is not met where the objective of general interest pursued can reasonably be achieved just as effectively by other means less restrictive of the fundamental rights of data subjects, in particular the rights to respect for private life and to the protection of personal data guaranteed in Articles 7 and 8 of the Charter, since derogations and limitations in relation to the principle of protection of such data must apply only in so far as is strictly necessary […]Footnote 84

This may indicate that, in terms of the least restrictive means test, the threshold is similar under the two levels of necessity. In other words, the standard of ‘just as effectively’ for comparisons to other potential measures that could fulfil part of the legislative aim is the same. If so, the level of strictness or scrutiny that distinguishes necessity from strict necessity would apply primarily to the care with which courts and public authorities are expected to perform their review of necessity and the safeguards surrounding the measure. Essentially, it would limit the margin of appreciation permitted to the law enforcement authority in the necessity test to a minimum.

3.3.2 Effectiveness or Efficiency: Two Sides of the Same Coin?

The question of whether the standard of ‘just as effectively’ within the test of the least restrictive means can present a functional safeguard against unnecessary restrictions of the right to data protection will, ultimately, depend on the closer definition of ‘effectively’. A particular concern in this context is that moving from necessity to effectiveness may invite another subtle shift, whereby effectiveness is confused with efficiency.

We can find an example of this conceptual confusion in the decision by the Swedish DPA, which highlighted operational efficiency—as in cost and time efficiency—in its opinion on the Swedish police’s use of facial recognition, stating that the use of the system would be ‘more effective than traditional and time-consuming manual analysis by human investigators’.Footnote 85 In other words, the question was not whether the measure was essential to achieving the public interest, but rather whether it was a more efficient way of reaching the same result.Footnote 86

In contrast, the EDPS has, in its toolkit for assessing the necessity of measures limiting the fundamental right to the protection of personal data, stressed that convenience or cost-effectiveness is not sufficient to reach the threshold of necessity.Footnote 87 The EDPS states that necessity requires a measure to be ‘genuinely effective, i.e. essential to achieve the objective of general interest pursued’.Footnote 88 Not only that, the toolkit holds that ‘[i]f the proposed measure includes the processing of sensitive data, a higher threshold should be applied in the assessment of effectiveness’.Footnote 89 Questions of operational efficiency, such as the saving of resources, are issues the toolkit instead highlights as part of the proportionality analysis, as it is a question that ‘requires the balancing with other competing interests of public interest’.Footnote 90

This analysis by the EDPS also finds support in the theoretical literature on necessity as part of an overall proportionality assessment. Aharon Barak, for instance, holds that questions of costs are dealt with under the proportionality stricto sensu step of the analysis.

Whenever the new means, whose limitation of the constitutional right is of a lesser extent, require additional expense, we can no longer conclude that the means originally chosen are not necessary. […] The issue, therefore, is whether the state’s choice of avoiding the additional expense in order to prevent the further limitation of a human right is constitutional. The necessity test cannot assist us in attempting to resolve this issue; indeed this discussion should be conducted within the framework of proportionality stricto sensu, which is based on balancing.Footnote 91

As previously mentioned, there has been a move by the ECJ towards more explicitly acknowledging the need for proportionality balancing in the data protection context, which could accommodate these matters of efficiency as well. However, it could be argued that the ECJ has recently adopted an even stricter approach to necessity and operational efficiency.

In Case C-184/20 OT, the ECJ was asked to consider the publication of personal data of persons in charge of establishments receiving public funds, in a publicly accessible database on the Lithuanian Chief Ethics Commission’s website. The publication of this data was intended to allow for discovery of conflicting interests and combat corruption in the public sector. The disclosure forms published contained information about the declarant’s spouses, cohabitees, or partners, as well as information about presents received and transactions between partners, which may reveal sensitive personal characteristics, including sexual orientation.Footnote 92 In its preliminary ruling, the ECJ focused extensively on the necessity in terms of whether the measure was the least restrictive means to reach the aim. This, the court held, was a question that had to be assessed ‘in the light of all the matters of fact and law specific to the Member State concerned’.Footnote 93

One part of this discussion is particularly salient in this context. On the question of why the information in the database had to be available to the public rather than used only by anti-corruption authorities, the Lithuanian government had argued that the state did not have sufficient human resources to check effectively all the declarations that were submitted to it.Footnote 94 Essentially, the Chief Ethics Commission counted on a form of crowdsourcing to allow for a more efficient use of human resources. The ECJ’s response to this approach was blunt and to the point.

However, it must be pointed out that a lack of resources allocated to the public authorities cannot in any event constitute a legitimate ground justifying interference with the fundamental rights guaranteed by the Charter.Footnote 95

On its face, this statement by the ECJ could essentially exclude concerns about the efficient use of human resources or cost-saving measures from underpinning a claim of necessity. If so, this could carry far-reaching consequences for many different applications of technological methods of analysis, as many are implemented as cost-saving measures, or to reduce dependencies on limited human resources.

A more cautious reading may, however, be warranted. In the context of policing, as well as in other sectors, many cost-saving measures can plausibly be reframed as effectiveness measures. As an example, the police may rightfully argue that the use of facial recognition is the only effective way to quickly identify potential perpetrators in crime-scene footage. However, implicit in this argument is the fact that the police will always be resource-constrained in relation to their many tasks, with an associated need to prioritise resources.Footnote 96 The argument can thus be reframed: the only way to quickly identify potential perpetrators within existing resource parameters may be to use facial recognition technology. This reframing does not, however, answer the question of whether this identification is accurate, or forensically valid,Footnote 97 which ultimately will determine whether the method is in fact effective.
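The distinction can be illustrated with a small, purely hypothetical comparison; the figures below are invented for illustration and do not come from any evaluated system. An automated tool may be dramatically more efficient than manual review in terms of analyst hours, and yet be less effective if it finds fewer of the actual perpetrators or flags more innocent individuals.

```python
# Hypothetical comparison of manual review and automated matching of the same footage.
# All figures are invented for illustration; none come from an evaluated system.
GROUND_TRUTH_PERPETRATORS = 10  # assumed number of actual perpetrators in the footage

methods = {
    # name: (analyst_hours, true_matches_found, innocent_people_flagged)
    "manual review": (120, 9, 1),
    "automated matching": (2, 7, 25),
}

for name, (hours, true_pos, false_pos) in methods.items():
    recall = true_pos / GROUND_TRUTH_PERPETRATORS   # effectiveness: share of perpetrators found
    precision = true_pos / (true_pos + false_pos)   # effectiveness: share of flagged persons who are perpetrators
    print(f"{name}: {hours} analyst hours; recall {recall:.0%}, precision {precision:.0%}")
```

Under these assumed figures, the automated route is sixty times more efficient in analyst hours, yet both less complete and far less precise; whether it is ‘effective’ in the sense required by the necessity test is a separate question from whether it saves resources.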

Furthermore, in the context of automating certain tasks, comparisons to manual processing may also invite arguments relating to tensions that may exist between different principles and values within the LED as such. One example can be found in the Swedish Police Authority’s opinion on the previously mentioned proposed EDPB guidelines on the use of facial recognition technology in the area of law enforcement.Footnote 98 First, the Swedish Police Authority opined that the implication of the interpretation of ‘strictly necessary’ in the guidelines would be that the scope to use facial recognition technology would be limited to an absolute minimum. This would, the authority pointed out, ‘conflict with efforts of lawmakers and authorities to use the possibilities offered by technology to more effectively prevent and investigate crime and in doing so protect people’s fundamental rights and freedoms’.Footnote 99 The Authority also pointed to secondary effects of manual processing:

In the Swedish Police Authority’s view, it should be taken into consideration that the alternative in certain cases to using software for image analysis to assist in search and analysis, for example, may instead involve a large number of officers going through huge quantities of images manually. Even if it is possible in theory (in some cases), this would mean needing to actually access a large number of irrelevant images and would require the images to be saved for a considerably longer period of time, which is difficult to reconcile with the general principles in data protection regulations.Footnote 100

This argument is interesting, as the use of facial recognition also requires access (by automated means) to a large number of irrelevant images, as well as biometric processing of all photos included in a database intended for comparison. Still, these arguments illustrate how issues of cost-efficiency can be restated as matters of effectiveness in relation to more overarching aims of crime investigation, such as the positive obligations of the state to safeguard fundamental rights, including the protection of life, through effective measures to detect, prevent, and investigate crime.

Almost every part of policing, regardless of member state, is affected by the need to prioritise limited resources. The drive to implement technological systems will inevitably take place against this backdrop. As such, the impact of the statement by the ECJ in OT is still uncertain, but it is clear in the sense that arguments of cost-efficiency cannot singlehandedly justify limitations of fundamental rights under the umbrella of necessity. However, systems that can be argued to also improve effectiveness (reaching the aim faster, or with better accuracy) while still having positive effects on cost-efficiency are likely to be accepted under this necessity analysis and will need to be analysed further, through balancing stricto sensu.

4 Conclusions: Necessity and AI-Supported Decision-Making Under the Law Enforcement Directive

The recent case law from the ECJ has added much needed clarification on necessity under the LED (see Fig. 1). While there are reasons to believe that EU data protection law can provide meaningful limits to the use of AI in the law enforcement setting, the conflation of effectiveness and efficiency is likely to remain a risk. This is due to the way the requirement of necessity has been expressed in the LED and the GDPR, not only in the context of sensitive data, with no mention of proportionality, although such a requirement is implicit in the limitation of fundamental rights. As we have seen, there are increasingly explicit mentions of proportionality balancing in ECJ rulings on necessity under the data protection frameworks, as well as in the current guidelines from the EDPB. This is in line with how proportionality theory stresses that necessity is one (important) threshold requirement before moving on to proportionality balancing. Still, the absence of proportionality in the relevant sections of the LED and the GDPR may, as we can see in our example from the Swedish DPA, lead to proportionality balancing becoming lost in practice. This highlights the need for DPAs and courts to continuously acknowledge that ‘human rights cost money’,Footnote 101 and to guard against strict necessity devolving into simple cost-effectiveness.

Fig. 1 Development of necessity in ECJ case law

When the ECJ baked certain aspects of proportionality balancing into the analysis of necessity under Article 10 LED in Criminal proceedings against V.S., it may have countered that risk to some extent. It is a clear signal that the LED is not immune to proportionality considerations. It is also likely that the new threshold of ‘absolute necessity’ established in V.S. implies the need for a much stricter scrutiny of necessity, particularly in connection with the emphasis on data minimisation. However, the emphasis the ECJ placed on how processing of sensitive data should be a very clear exception rather than commonplace is likely to conflict with the ambitions of many member states in the law enforcement context.Footnote 102 Through the ruling in OT, the ECJ has also highlighted how a lack of resources cannot in itself be enough to warrant restrictions of fundamental rights. Taken together, the rulings in V.S. and OT have planted seeds that hold the potential to grow into effective legal limits on the implementation of AI systems that impact the fundamental right to data protection, particularly in contexts where sensitive categories of data are used. However, whether those seeds will grow depends on the extent to which DPAs and courts engage in critically examining the purported benefits of AI-supported decision-making.

Looking at the development of the principle of necessity in ECJ case law, it seems clear that it has, over time, become a focal point of the court’s reasoning in data protection cases. Simultaneously, the primary source of interpretation has shifted towards its fundamental rights foundations rather than its expression in secondary law. This is significant, as it mirrors how the ECJ has over time emphasised data protection rights in the face of large-scale data processing and surveillance measures established in secondary law (such as in Digital Rights Ireland and Tele2) or international agreements (such as in Schrems and in Opinion 1/15). While the ECJ has not been entirely consistent, it is still clear that the interpretation of the requirements of the LED will need to take place against this broader backdrop of fundamental rights.

A lot of attention has lately been placed on the prospects of the upcoming AIA. However, the developments outlined here show that data protection concerns are likely to remain highly relevant in relation to AI use by law enforcement even as the AIA arrives on the scene. The data protection framework brings several strengths compared to the AIA. First, unlike the AIA, the data protection framework is largely technology-agnostic and is not dependent on establishing the presence of a specific AI technology. Second, data protection can come into play at the earlier stage of data gathering, as a form of processing, rather than only upon the deployment of an AI system. Third, it will remain relevant across the different categories of risk proposed in the AIA. While the AIA is likely to express that some uses of AI technologies, such as live ‘remote’ facial recognition in public spaces, should be forbidden (at least as a main rule),Footnote 103 it should be noted that those same uses are likely to run afoul of current data protection principles as well. It is in any case fair to say that the strength of the current data protection framework has not yet been fully tested in relation to emerging AI technologies, but it has the potential to act as a meaningful safeguard against the risks of such systems.

In another sense, however, the developments outlined in this contribution might be able to tell us something of what to expect from future interpretation of the AIA. As many of the articles of the act are built upon foundations found in fundamental rights, such as the respect for privacy and human dignity, we should not be surprised if those foundations slowly begin to emerge and influence the interpretation of the act as well. Perhaps it is through this process, rather than in any specific statutory addition to the current data protection framework, that the real potential of the AIA will be located.