
1 Introduction

In the digital context, users increasingly encounter data protection topics, especially when asked to provide consent. At the same time, companies invest heavily in handling data in a legally compliant manner and in obtaining permission to process data. Entire industries have built up around “managing” user consent. However, a significant part of the added value of such solutions is the promise of a high consent rate: that is, designs that are lawful but still nudge users to disclose as much data as possible. This does not have to happen through deceptive design [18, 37, 62] (see also the chapter “The Hows and Whys of Dark Patterns: Categorizations and Privacy”) but can also work by achieving transparency in terms of privacy and security safeguards (low risks) and presentation of added value (large benefit) [24, 51]. Nevertheless, it is usually done in the interest of the data processor, which in turn can be to the detriment of a free decision by the customer aka data subject. The data-driven economy fosters exactly such unbalanced relations between the entities that gather and process personal information and the individuals who are often unaware of the extent and the significance of the processing [20]. By and large, three developments aggravate this imbalance: First, the growing business incentive for companies to collect and make use of (especially personal) data makes actors more likely to engage in ever more excessive data collection practices. Second, the increasing complexity and opaqueness of the algorithms used make it more difficult to explain data processing, especially to non-tech-savvy people. Third, the complexity of organizational and market structures likewise makes it more difficult to communicate what data are used, in which ways, and to whom they are transferred or by whom they are obtained.

In this area of tension, the HCI sub-community called “Usable Privacy” researches, among other things, how to optimize the user-friendliness of privacy management in terms of both awareness [9, 11, 32, 45, 74, 82] and control [3, 14, 15, 23, 31, 34, 58]. Oftentimes, such research quite naturally intersects with data protection law: A classic example is the extensive study of privacy notices, which, in essence, illustrates a problem also present in other applications of law: the content is written by lawyers for lawyers and is hardly understandable for laypersons [64, 65]. Privacy communication tends to be too long and legalistic, and as such not graspable for consumers. In this context, various research contributions apply methods of information visualization, for example, to make texts clearer and easier to consume [49, 55, 56, 61, 78, 81].

The HCI community has been researching solutions to this problem in a variety of ways for a long time. So far, however, the existing concepts and structures of law are largely taken as “given,” and user-friendly interfaces are designed around them. Lawyers’ formulations and contents themselves usually remain untouched—at most, there is a discussion on prioritizing information across different layers or extending it with icons or graphics. There are two flaws in how data protection law is researched in HCI, and the two kinds of flawed studies sadly make up the majority of work: The first kind takes legal requirements as undisputed. Such studies are, for example, very popular in the domain of privacy policies, where the legal text receives a decent polishing with design measures, but the actual content remains unchanged and even unchallenged. The second kind, on the contrary, researches alternatives from a user perspective but ends up neglecting factual legal necessities altogether or does not have the ambition to include them in its reasoning. Neither kind of study can or should claim to engage in actual research in the domain of data protection law—let alone interdisciplinary research. A deeper examination of legal concepts is typically left out, such that the actual potential of truly interdisciplinary work is often not exploited. Yet HCI, more than any other research field, has appropriate methods to inform jurisprudence and policymaking in the context of privacy in the digital space. On the side of HCI, however, a strategic and thorough engagement with legal concepts is likewise largely missing as of today.

In this chapter, we therefore present the extensive impulses coming from the legal sciences themselves that motivate a more thorough engagement of HCI and legal sciences in a multi-stakeholder environment—an environment HCI so often claims to be sensitive to. To this end, we turn to the example of data protection legislation and discuss the legislative intentions surrounding the European General Data Protection Regulation (GDPR), which not only forms the basis for data protection and use in all of the European Union but has moreover become a blueprint for many international privacy legislations. The GDPR’s requirement of “effectiveness” of technical and organizational protection measures (Art. 25 GDPR) is at the center of interest since, according to the dominant legal interpretation, these measures include not only “data protection by design” but also design measures, as these are ultimately reflected in technical measures and thus open the door for collaboration with HCI.

Afterward, we discuss to what extent empirical research—and especially HCI-related research—has already engaged with law. We especially carve out the differences from the “Legal Design” approach and argue why HCI is more suitable for answering complex legal research questions.

We finally present practical examples to demonstrate three different levels of such cooperation between HCI and legal scholars, namely implementation, evaluation, and identification. We argue that especially the last point requires thorough collaboration and engagement with legal concepts discussed among legal scholars and has so far barely been conducted. However, we further argue that such engagement is necessary for HCI to unfold its capabilities of identifying both problems and solutions in multi-stakeholder environments such as data protection regulation (and arguably any regulation).

2 The Call for Effective Measures: A Door Opener for Empirical Sciences

But why should lawyers even care about HCI and its research methods (for an introduction to HCI research methods, please refer to the chapter “Achieving Usable Security and Privacy Through Human-Centered Design”)? First off, there is the value that iterative, user-centered design promises to all: Insights about how the research artifact (here: data protection legislation) unfolds in real life and thus input to design a better version in the next iteration. But there are more formal reasons for lawmakers to include HCI into reasoning, too:

With the GDPR, new rules for the processing of personal data have applied throughout the EU since May 25, 2018. Recognizing the general problem of uncertainty in a complex world, the GDPR lays out certain processing principles (Art. 5 GDPR [69]) such as transparency, purpose limitation, and data minimization, as well as a set of specific rights for data subjects to intervene, correct, or delete data, which the controller has to implement in the technical and organizational design of the processing operations. The regulations also relate to the way in which data processors must inform data subjects about the processing (for example, Art. 5 sect. 1 lit. a alt. 3 and Art. 12 et seq.) and on this basis can obtain an “informed consent” from the data subject (in particular Art. 5 sect. 1 lit. a alt. 2, Art. 6 sect. 1 lit. a, and Art. 7) [69].

Interestingly, the GDPR also states requirements that should sound familiar to researchers in Usable Privacy and beyond: For example, Article 12 (1) sent. 1 GDPR stipulates that the controller must take appropriate measures to provide the data subject with all information “in a concise, transparent, intelligible and easily accessible form, using clear and plain language” [69]. These requirements pick up on the general principles of fairness and transparency, which are a traditional playing field for research into Usable Privacy.

Additionally, said requirements must be implemented in the technical and organizational design of the processing operations according to the approach of “Data Protection by Design and by Default” under Art. 25 GDPR [69], resembling the widely known Privacy by Design approach. The same article provides further guidance on how to determine the appropriate measures, namely based on the risks to the rights and freedoms of natural persons arising from the data processing. Their evaluation entails appropriate technical and organizational measures to be taken, which are designed to implement the data protection principles (such as lawfulness and transparency) in order to protect data subjects effectively [29]. In their guidelines on Art. 25 Data Protection by Design and by Default, the European Data Protection Board (EDPB) also recommends defining key performance indicators, using qualitative methods, and seeking feedback [26]. Similarly, the Article 29 Working Party states:

It is not only mandatory to disclose certain information about data practices, but even the comprehensibility and presentation of that information assume a central role to demonstrate compliance and its quality should even be empirically evaluated. [7]

The lawyers reading this may excuse us here, but “even” in the legal literature, it is the prevailing opinion that the “appropriate measures” as defined in Article 12 (1) and the “technical and organizational measures” as defined in Article 25 (1) also cover mechanisms and methods of user experience design [29, 40, 42]. The connection is simply too easy to draw: Although Usable Privacy focuses on understanding demands in, and designing usable solutions for, user interactions in the digital sphere, its outcomes very much constitute technical measures, as designs are finally implemented in code. Especially in the digital domain, the measures implementing the principles of the GDPR, such as transparency, purpose limitation, or intervention rights, depend to a large extent on their usability and utility [13]. Thus, effectiveness hinges essentially on design issues and the evaluation of the user perspective.

Here, legal science faces a major problem: How should such effectiveness be evaluated? The concept of effectiveness challenges the legal sciences as to how this proof can be provided with their existing set of methods. Lawyers—and this is not meant as criticism—typically are not qualified to conduct appropriate empirical evaluations and lack the methodological expertise to fulfill said requirements.

HCI, on the other hand, has a strong empirical tradition of measuring “usability” [44] and therefore a natural disposition to scientifically measure the effectiveness of data protection implementations and the user’s satisfaction with the interaction with a system [35]. The HCI community has conducted intense research on the needs for control in privacy management [3, 23, 34], transparency [4, 23], system intelligibility and accountability [1, 12, 47, 68], and privacy awareness [33, 46]. Moreover, such recourse to experts for assessments and requirements is traditionally not alien to the legal sciences, especially in IT security. For example, the regulation of smart metering as critical infrastructure has frequently turned to IT experts to draft security requirements. The cooperation of the normative–deductive legal approach with the empirical–inductive approach of HCI holds great potential for the development of data protection—on the level of legal texts, their interpretation in court, and their practical use with users.

3 Going Beyond Designing Law: The Case for the Full Toolbox of HCI Research

While Usable Privacy research does share a connection to data protection law in some ways, a close relative of the field is legal design. Its self-declared goal is to make legal processes more accessible to laypersons and to communicate them appropriately:

Legal design is a way of assessing and creating legal services, with a focus on how usable, useful, and engaging these services are. (Footnote 1)

Typically, legal design focuses on the state of the legal system and seeks to apply design thinking methods to communicate it better. So far, however, the ambition of legal design—much like that of HCI—has largely limited itself to taking legal framework conditions for granted. Principles of information visualization and UX design are applied to legal content to make it clearer and easier to consume [49, 55, 56, 61, 78, 81]. To put it bluntly, however, privacy notices now look prettier, include a table of contents, are interactive and easier to consume, but remain basically unreadable for laypersons because they are still fundamentally constructed as documents from and for lawyers to fence out possible disagreement.

Similarly, several labs (Footnote 2) digitize legal documents and processes to effectively make them accessible to a much wider audience. These developments are important, yet they do not go beyond a certain level of engagement with the legal sources themselves. They are about problem solving in applying the law, but much less about the problems of legal standards (and their implementation) in the first place. It might be the close relation to product development that, by asking how legal design can find solutions for clients, hinders actual legal innovation. After all, building solutions quite naturally places great emphasis on practicability and legal certainty. As a result, the core of legal design focuses on what design (and its competencies) can do for the legal system [25] and remains rather uncritical of the existing interpretations of the law itself and its underlying concepts.

While this focus on designing the law into artifacts that are usable and accessible is absolutely to be applauded, it leaves out much of the capabilities of (empirical) design research. What is largely missing in legal design—if not structurally, then de facto—is the innovation component of law itself: the claim to methodically improve law or its interpretation. Finding that privacy notices are illegible is important. Designing them to structure text, provide links, and finally apply principles of information visualization is a major step in the right direction. However, it is also within HCI’s capabilities—if not even: responsibility—to explore potential for improvement in multi-stakeholder environments such as the legal system naturally is. Constructively addressing the issue would mean identifying ways to improve the communication of the legal requirements themselves. From this perspective, enriching privacy notices with interactivity and layered approaches only scratches the surface of the issue of their lacking readability.

What is needed is scientific data protection law research—or call it a research stream for legal design. There are concepts and principles in data protection law that are abstract and, especially under the imperative of effectiveness, call for empirical evaluation, such as the requirements of “transparency” and “fairness” or the principle of “purpose limitation” in the GDPR. Even if those are not new, they still need to be filled with meaning in a comparable and sound way, respecting all stakeholders, and based on scientific reasoning. Arguably, the toolbox of HCI is extremely well-suited to investigate how law unfolds in practice. Both its quantitative and qualitative methods are needed to explore existing problems and then to design and test alternative solutions reliably. This also calls for the HCI community to strongly engage with the legal framework conditions, especially to identify the aspects of the law that need negotiation of interests and interpretation. Such commitment, however, calls for a deeper inspection of normative requirements as well. These sometimes may be quite apparent, such as in the case of Article 12 GDPR, which is even titled “Transparent information, communication […]”—arguably triggering every HCI researcher right away. In other cases, the normative requirements analysis takes more effort, as for example with the principle of purpose limitation. While not a new principle in data protection, its purpose is to strike a balance between informing data subjects about the limits of data processing and practically necessary leeway for processors [8]. With evolving technological means, this balance arguably has shifted, and the implementation of the principle needs evaluation in terms of its effectiveness to inform data subjects. Such components open a design space and need to be filled with meaning, some more obviously, others less so.
Data protection principles such as transparency and purpose limitation in particular are overarching themes in the GDPR, which desperately need scientific interpretation and evaluation for data controllers as well as lawyers to rely on. But there also are more structured aspects, such as requirements for data subject rights and their implementation, and the assessment of communication requirements and access. In essence, while law benefits from incorporating the empirical research of HCI as an example of evidence-based regulation, HCI can benefit from a deeper engagement with legal provisions to make its research more applicable and legally certain, thus increasing the connectivity of its research to law.

4 Levels of Engagement: How HCI and Law Can Make Data Protection More Effective

Research in Human–Computer Interaction typically targets improving interactive systems for human interaction. Next to this very down-to-earth perspective stemming from usability engineering and user experience design, a noticeable subgroup of researchers also follows a value-driven, normative agenda. For the example case of data protection, such an agenda manifests inter alia in bringing together law and HCI following the joint goal of providing more user-friendly, safer data protection.

What is more, HCI, like arguably no other discipline, has the methods to inform regulation in both planning and enforcement—not only of data protection. As a result, interdisciplinary collaboration also has two more levels beyond designing Usable Privacy experiences (see Table 1). Against the backdrop of the urge for effectiveness, HCI can serve as an instance of constant benchmarking of data protection law itself. Such evaluation should in the end not only help users but likewise reliably inform processors of their duties and possibilities, and finally enable regulators to let the GDPR match its aim to both protect data subjects and enable the free flow of data. While there may have been intuitive feelings about this gap between regulatory goal and real world, scientific studies provide well-grounded and reliable insights.

Table 1 Overview of different levels of conceptual engagement of HCI research with data protection law.

Finally, on a third level, HCI can also provide innovation to regulation and its interpretation: Once a non-effective mechanism is identified, HCI can follow up with its multi-stakeholder design process methods to craft new tradeoffs reflecting normative requirements as well as the interests of data subjects, data controllers, and other stakeholders. Doing so, however, requires an actual in-depth engagement with legal provisions. In the following, we briefly outline two of the few design spaces in which HCI already engages with data protection law more or less extensively. To this end, we turn to the cases of cookie banners and data subject rights to highlight and demonstrate our argument of levels of engagement. These three levels of engagement are loosely connected to a user-centered design lifecycle, and we prototypically name them: implementation, evaluation, and identification.

4.1 Case 1: Cookie Banners

Cookies and other tracking mechanisms are important building blocks of today’s Web, especially for commercial websites conducting (re-)targeting via ads or other personalization. The GDPR requires that cookies which do not serve the sole purpose of making the website work from a technical perspective may only be used after the user has consented. Yet, due to their relevance for maximizing commercial success and the value ascribed to profiles, entire corporate units and research departments are concerned with the optimization of consent management. However, this optimization always takes place from a corporate perspective.

In recent years, several milestone legal decisions have declared various design practices unfair and thus illegal. Especially well-known is the Planet49 ruling by the European Court of Justice, making pre-ticked boxes illegal for consent [52]. Since then, when designing for consent, care must be taken to ensure a genuine opt-in, i.e., that users must actually act actively in order to agree. In addition, multiple consents for different purposes must not be handled with a single submit button but require the user’s separate active action, for example by checking boxes for the various data processing operations. The topic remains on the regulatory to-do list, as the highly discussed ePrivacy Regulation [85] also foresees adaptations to consent for cookies.
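To make the above requirements concrete, the following is a minimal sketch of a consent-state model reflecting the Planet49 reasoning: every non-essential purpose defaults to “not consented” (a genuine opt-in with no pre-ticked boxes), and consent is granted per purpose rather than in bulk. The names (`ConsentState`, the purpose labels) are purely illustrative and not taken from any cited study or standard.

```typescript
// Illustrative consent model: no pre-ticked boxes, per-purpose opt-in.
type Purpose = "analytics" | "advertising" | "personalization";

class ConsentState {
  private granted = new Map<Purpose, boolean>();

  constructor(purposes: Purpose[]) {
    // No pre-ticked boxes: everything starts as "not consented".
    purposes.forEach((p) => this.granted.set(p, false));
  }

  // Consent is given per purpose, never via one bundled submit action.
  grant(purpose: Purpose): void {
    this.granted.set(purpose, true);
  }

  revoke(purpose: Purpose): void {
    this.granted.set(purpose, false);
  }

  mayUse(purpose: Purpose): boolean {
    return this.granted.get(purpose) === true;
  }
}

const consent = new ConsentState(["analytics", "advertising", "personalization"]);
consent.grant("analytics"); // the user actively ticked exactly one box
console.log(consent.mayUse("analytics"));   // true
console.log(consent.mayUse("advertising")); // false: silence is not consent
```

The key design decision mirrored here is that absence of an active user action never yields `true`: inactivity, defaults, and bundled submissions cannot produce consent.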

There is also a long list of HCI research diving into Web tracking (e.g., [2, 9, 14, 25, 43]) and even more so into cookie banners [10, 27, 36, 38, 60, 63, 79, 83]. With regard to legal provisions, studies in this domain typically look into dark design patterns [36, 38, 79, 83] (see also the chapter “The Hows and Whys of Dark Patterns: Categorizations and Privacy”) or evaluate compliance [22, 63, 79] in terms of transparency and controllability. The bottom line of these in part large-scale studies is that cookie banners often strategically make use of unfair practices to undermine the users’ free choice to accept or deny cookies.

4.2 Case 2: Data Subject Rights

In both data protection law and research of Usable Privacy, awareness and control over the collection and use of personal data are understood to be cornerstones of digital sovereignty. For example, the European General Data Protection Regulation (GDPR) provides data subjects with the right to access data collected by organizations but remains unclear on the concrete process design.

HCI research has quickly picked up on the design space provided by GDPR. One of the many ambiguities that spark researchers’ interest surrounds the articles 12–18 GDPR. These articles formulate requirements on designing data subject rights, which, as so often in the artifact-bound and context-specific world of HCI, need to be filled with meaning. The design of data subject rights is crucial when it comes to the ability of customers to exercise their right and fulfill regulatory aims such as “transparency.”

According to Article 12 of the GDPR, the controller shall provide information about actions taken regarding the subject access request (SAR) without undue delay and within one month of receipt of the request. What is more, Article 15 of the GDPR requires that the process of claiming data will result in information being provided “in a concise, transparent, intelligible and easily accessible form” [69]. Regarding the design of the process of exercising data subject rights such as the right to access, Art. 12 (2) GDPR [69] highlights that the respective data “controller shall facilitate the exercise.” Lawmakers thus generally see controllers as responsible for helping their users exercise their right to access.

Currently, however, there is still much to be done to reduce uncertainty about which measures will be judged as sufficiently compliant with concepts such as “understandable,” “transparent,” and “accessible” in court—all the more so as these terms partly overlap, depending on the content [89].

4.3 Implementation: What Can Design Do for Law?

This is where, by and large, most HCI research and especially legal design have their focus. However, the level of interdisciplinarity needed to conduct such studies implementing law is rather low—especially where provisions of the GDPR directly resemble concepts known in HCI, albeit sometimes complex ones, such as transparency, understandability, and controllability. HCI then typically applies its own methods to qualitatively understand phenomena in the context of interacting with cookie banners and tries to design for improvements on those specific aspects. On the downside, these studies have a strong focus on singular aspects and typically work on a qualitative level, thus struggling to be picked up by lawmakers who look for strong evidence on where and how their normative intentions work or fail.

Cookie Banners: Optimizing UX Design

Putting HCI concepts and methods into practice, usability engineering contributes to designing digital technologies such that they become usable at work, and even joyful and desirable for everyday life. With the rise of the digital economy, understanding users (aka customers) has been increasingly professionalized for commercial exploitation, too. On the one hand, knowing customers in terms of, e.g., their practices, psychology, demands, and behavior has broadly improved the usability and UX of consumer technology. On the other hand, these advances have also led to the creation of user journeys that nudge users into pressing the “buy” button as seamlessly—if not: quickly—as possible, or into disclosing as much personal information about themselves as possible for commercial use. Given the increase of informational power asymmetries through such “usable, useful, and joyful” technologies, it stands to reason that the same methods can be used with similar success for other, more ethical value-oriented objectives, such as those enshrined in data protection law. Of course, this description of the excesses of the commercial success of HCI methods is not meant to disparage the merits of HCI research in general. Regarding the legal debate, recent cookie banner design has gained a lot of attention in terms of what not to do [10, 27, 36, 38, 60, 63, 79, 83]. Few studies, however, provide best practices and bright patterns. Habib et al. compare several design options to find that fully blocking consent interfaces with in-line cookie options, accompanied by a persistent button to later change consent decisions, work best for fulfilling the GDPR’s goals [39]. Utz et al. [88] and Kulyk et al. [59] provide field studies on privacy notices. These studies, however, are small scale. While important for HCI to sensitize, such studies can rarely make an impact on regulation, since they do not show a prevailing scheme of legislation being (in-)effective. Rather few studies, such as Graß et al. [36], provide insights for lawmakers and data processors alike on best practices. In their work, Graß et al. evaluate bright design patterns on cookie banners found in the wild.

Data Subject Rights: Information Visualization for Interactive Dashboards

Relatively few studies so far have explicitly targeted the implementation of the provisions and data subject rights of the GDPR. Individual data subject rights are currently being researched, such as the right to data portability [21, 93]. Closely related to the right to access data, transparency-enhancing tools have been proposed—mostly following a dashboard approach. For example, similar to the Usable Privacy dashboard by Raschke et al. [75] mentioned above, Olausson developed a dashboard specifically targeting nurses’ work [67]. Tolsdorf et al. [86] qualitatively compared ten dashboard implementations with regard to their levels of compliance. Still, dashboard implementations are scarce in practice and are often only adopted by big players on the market. Looking at manual subject access requests (SARs), Alizadeh et al. interviewed customers of a German loyalty card system who were asked to make use of their right to access [5]. The scope of the study, however, is limited to a single organization, focusing on how data are provided and on the potential to help users with their privacy practices. With a similar perspective on supporting sense-making and data literacy, Pins et al. [71] designed and tested a prototype that visualizes the interaction with voice assistants based on data from SARs to Amazon Alexa and Google Assistant. These studies, however, pose singular islands of knowledge, providing insights often from a qualitative stance, which lack the representative power to inform lawmakers.

4.4 Evaluation: How Well Is Law Currently Working?

The evaluation of current implementations of law is an important prerequisite for being able to innovate law. While almost all HCI research studies do have evaluative parts, their focus lies on testing their very own implementations. The scientific identification and verification of ineffective law, however, needs different kinds of evaluations: rather large-scale assessments of the current state of affairs. These may cover both the practical implementation of law and the rather abstract concepts motivating methods and implementations. The evaluative perspective often remains rather “destructive,” showing how things do or do not work out as intended by law. Still, a core benefit for law and business lies in the provision of worst practices, which can then be made public, avoided, or even fined if taken as a guideline for non-compliance.

Cookie Banners: The “Notice and Choice” Mechanism

Noticeably, evaluative studies on the issue of cookie banners are quite popular in HCI. A large part of the community is keenly looking at the emergence of “dark patterns” in interaction design [36, 38, 79, 83]. Often, Web scraping technologies are used to infer data sharing practices, the information provided, and the designs applied. Such studies often explicitly target the evaluation of existing solutions. For example, Degeling et al. specifically measured the impact of the GDPR on cookie use and banners [22]. Matte et al. evaluated the compliance of the IAB’s consent design [63]. Leenes and Kosta report a case study on how Dutch regulation failed to meet its goals in practice [60]. Regarding data sharing practices, Okoyomon et al. examined 68,051 apps and found that 10% of them shared personal identifiers with third-party services without declaring such conduct in their privacy policy. What is more, only 22% of these apps explicitly named third parties, leading the authors to conclude that it is impossible for users to know where their data are being used [66].
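To illustrate the kind of automated check such scraping-based studies rely on, here is a deliberately simplified sketch (not taken from any of the cited works, which use headless browsers and full DOM inspection rather than regular expressions): scanning a banner’s HTML for consent checkboxes that are pre-ticked, the practice the Planet49 ruling rendered unlawful. The function name and the sample markup are hypothetical.

```typescript
// Toy compliance check: flag checkbox inputs that carry a "checked" attribute.
// Real large-scale studies inspect the live DOM via headless browsers instead.
function findPretickedConsentBoxes(html: string): string[] {
  const flagged: string[] = [];
  const checkboxRe = /<input\b[^>]*type=["']checkbox["'][^>]*>/gi;
  for (const tag of html.match(checkboxRe) ?? []) {
    if (/\bchecked\b/i.test(tag)) {
      flagged.push(tag);
    }
  }
  return flagged;
}

const banner = `
  <form id="consent">
    <input type="checkbox" name="necessary" checked disabled>
    <input type="checkbox" name="advertising" checked>
    <input type="checkbox" name="analytics">
  </form>`;
console.log(findPretickedConsentBoxes(banner).length); // 2 pre-ticked boxes flagged
```

Even such a crude heuristic conveys the methodological point: once the legal requirement (“no pre-ticked boxes”) is operationalized as a machine-checkable predicate, it can be evaluated at the scale of tens of thousands of websites.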

On a more conceptual level, the key mechanism in cookie banners (and beyond) to obtain a lawful basis to collect and process data is “notice and choice.” HCI and related research have long shown that this mechanism has its limits. Cranor et al. state that notice and choice mechanisms are necessary to understand where and under what conditions personal data flow, yet they also conclude that the mechanism is insufficient to properly protect privacy [19]. Cate attests that it is a “poor mechanism for communicating with individuals about privacy” [16], and Warner and Sloan go as far as to say that there “is no acceptable way to rescue Notice and Choice” [92]. Still, data protection regulation such as the GDPR and the ePrivacy Directive relies on consent for the processing of data and the use of tracking technologies. More concretely, the case of the infamous cookie banners shows, at least to some extent, how HCI research can detect and prove the strategic exploitation of legal loopholes to the detriment of the user—beyond an individually intuitive feeling of judges. For example, a whole range of studies shows how unfair practices are used to undermine existing law through nudging (see the chapter “Privacy Nudges and Informed Consent? Challenges for Privacy Nudge Design”), dark pattern design (see the chapter “The Hows and Whys of Dark Patterns: Categorizations and Privacy”), or even outright disregard of user choice [10, 22, 36, 48, 63, 79].

To sum up, there is a lot of work in HCI on the evaluation of legal provisions such as the controllability and transparency of cookie banners. While this is in general a very positive sign of HCI research turning to concrete legal issues, the broad adoption of consent as a research topic is arguably also supported by another factor: it is rather easy for the HCI community to evaluate cookie banners, as the legal parameters to test against (mainly control and transparency) fall in line with research fields that have long been established in Usable Privacy research. So, whereas the field of application is new, the exact research interests pre-existed in HCI research. Moreover, cookie banners were a new phenomenon at the time, to which virtually every Internet user was exposed. While this is not a bad thing per se, it shows that HCI did not have to engage deeply with legal issues to identify a potentially misguided and ineffective regulation. Instead, the issue was brought to the researchers' personal and professional attention, and was unavoidable.

Data Subject Rights: Assessing the Usability of the Implementation of GDPR Rights

Usability studies show that process design is crucial for users' ability to complete an interaction [6, 43]. However, little is known about the factors facilitating or hindering data subject rights requests in terms of process design. Insights into factors such as user needs and capabilities are highly useful not only from a research perspective, but also for organizations seeking to use data protection as a competitive advantage by optimizing their customer experience [50, 90, 96]. While there is some research on the usability of dashboard solutions for addressing data subject rights (especially the right to access), the market adoption of such generally desirable solutions has been low.

Most Usable Privacy studies into the provisions of the GDPR adopt a consumer perspective on the right to access and/or deletion. However, such studies largely ignore the challenge of getting the data in the first place. Instead, studies focus either on the compatibility of the data provided as a result of a subject access request (SAR) with user demands in terms of supporting privacy practices and data literacy [5, 71], or they take the provision of data for granted by building dashboards on top of the data [75]. Urban et al. [87] contacted 39 companies to check several SAR parameters such as response time, reaction to questions, and the information disclosed in the context of online advertisement. Similarly, Kröger et al. [57] conducted a longitudinal study of app vendors covering several SAR items such as response time, data provided, and security mechanisms. However, these studies do not apply a processual lens, nor do they evaluate the phases they identify in a user-centered way; instead, they merely check against legal provisions.

For companies and organizations in general, the still relatively new GDPR framework raises uncertainty about the requirements for compliance. In the case of the right to access data, organizations need to implement a process for users to claim their data, but it is still unclear how such a SAR process should be designed: how authentication should work, and how data should be requested, provided, presented, and explained to customers in a compliant and customer-friendly way.

To the best of our knowledge, there is only one large-scale study on the usability of implemented SAR processes, namely Pins et al. [72]. In what we consider a best practice for evaluating legal processes through a user lens, Pins et al. first defined a five-phase user experience journey for the right to access: finding, authentication, request, access, and data use. Based on this model, they then had 59 participants exercise their right to access and evaluate the usability of each phase. Drawing on 422 data sets spanning 139 organizations, they inform both law and Usable Privacy research on the current state of affairs with a robust empirical body. Their paper is one of the first large-scale, structured approaches to evaluating the design factors that drive or hinder users when exercising data subject rights. This information is relevant both for research on Usable Privacy and for assessing the controller's role in facilitating the user in this process, as demanded by the GDPR. Both businesses and data protection agencies can now draw upon this work to understand best and worst practices for compliance with the aforementioned abstract concepts, such as transparency and processor facilitation. We argue that for the effectiveness and evolution of GDPR implementation to match its regulatory goals, such studies are of great importance, and HCI is the premier field to provide such insights.

4.5 Identification: Challenging Existing Legal Interpretations and Concepts

The usable implementation of regulation is an important endeavor, and it holds potential to carve out wholly new ways of interacting. Likewise, the evaluation of interactions and their legal provisions can inform law on how regulations manifest in practice, paving the way for improvements.

On a third level, there is the structured identification and framing of design spaces in data protection in the first place. In design science, defining a design space means mapping the dimensions of a research artifact as an approach to guide practitioners in designing new solutions [70]. Whether done formally or informally as part of gaining contextual understanding in the early phases of the design process, the identification of a design space is a core competency of HCI: outlining a design space conceptually provides researchers with information on the room to maneuver by showing the options available, and often also yields tools such as taxonomies and a vocabulary to compare, categorize, and communicate different implementation styles. HCI frequently uses such design space definitions. In the realm of Usable Privacy, for example, Schaub et al. [81] devised a design space for effective privacy notices. Similarly, Feng et al. [30] developed a design space for privacy choices. Doing so takes a deep understanding and critical assessment of the matter at hand.

However, when working on issues of data protection law, HCI sometimes seems to forget about this aspect of its work. Taking this task seriously would mean either exploring the field autonomously or bringing in the necessary competencies of data protection law scholars. Serious interdisciplinary research must also delve into and question the formulations and requirements of the legal system in order to map the full design space and thus tap the full potential of multi-stakeholder design processes.

Cookie Banners: The Future of Consent

Next to values and attitudes, HCI can also measure actual behavior in data protection decision-making. For example, HCI has highlighted what is known as "consent fatigue" [17, 73, 77]: users are confronted with consent requests so often, e.g., in Cookie Banners, that the central point of making an "informed" decision, which constitutes the very heart of giving consent, is in question. This is where regulation can and should arrive at better solutions that are sustainable in the long run because they fulfill the regulatory requirements for a free yet informed consent.

Agent systems represent one possible approach. They offer the possibility of articulating privacy needs without requiring continuous management from page to page and without being limited to mere information provision. In this respect, the creation of an agent is not necessarily in conflict with a current action goal (visiting a web page) but could be part of the setup process of a browser and remain continuously available for reconfiguration.

Much depends here, however, on the design and the possibilities for negotiation. If neither users nor operators offer any leeway, even such an approach will come to naught, as it could again amount to blocking an offer when a user does not agree to the terms. For even if there is a ban on tying, many Web offerings de facto refinance themselves through personalized advertising [28].

A bridge in this regard could be offering non-personalized advertising as "compensation," or monetary compensation as mediation. In pursuing this alternative, however, entirely new challenges arise [22]. Not only would the question of appropriate pricing in relation to the data disclosed be rekindled. More than that, such a decision could also lead to a division of society, in that privacy on the Web would become a luxury good that has to be bought for money.

It is up to HCI to identify potential new ways of balancing the legal requirement to provide consent individually each time against the limited capability and willingness of users to make informed decisions about a secondary goal such as privacy.

Data Subject Rights: The Implementation of the Principle of Purpose Limitation

The principle of purpose limitation serves, among other things, to inform the data subject about the limits of data processing and is thus key to many transparency mechanisms of the GDPR. Purpose specifications should be unambiguous and help data subjects identify uses of data that they "might find unexpected, inappropriate or otherwise objectionable" [8]. However, generally accepted patterns have crept in that do not effectively develop their intended protective effect because they are highly generic and to some extent even arbitrary [80]: "analytics," "user experience enhancement," or "profiling" are frequently used phrases that do not delineate clear boundaries for data subjects. In legal practice, the question of how a purpose must be specified so that data subjects can recognize usage they might find "unexpected, inappropriate or otherwise objectionable" [8] remains unresolved.

Accordingly, an essential question that arises to increase the effectiveness of data protection law is: How can purposes be formulated so that they can fulfill their actual purpose? An answer to this question is of utmost importance because most other processing principles and legal provisions depend on how the purpose is specified.

While work on usable privacy policies in HCI has found that privacy policies lack readability and understandability, it does not seek to reformulate purposes to make them more meaningful, but rather takes the wording used as given (e.g., [64, 65, 76, 84]). The very fact that purpose specifications form a design space resource in which data subjects are likely to have justified interests should spark HCI's interest.

In this regard, there are first steps that HCI could build upon. For example, there is growing interest in the HCI community in using privacy risks as perceived by users to inform about the potential implications of data disclosure. Several studies have started looking into perceived privacy risks as a design resource (especially in the realm of embedded and networked devices such as those of the so-called Internet of Things) [33, 41, 46, 49, 51, 53, 54, 94, 95]. However, these efforts rather investigate potential design resources for increasing user awareness of privacy implications when using services. They largely neglect, or are unaware of, the fact that such an approach aligns well with the overall risk-based approach of the GDPR and could help data subjects gain an understanding of what their data would be used for by processors. In particular, these studies do not take into account the principle of purpose limitation, or its first component of purpose specification as expressed in privacy policies.

A notable exception is the work of Jakobi et al., who seek to bridge the gap between both risk approaches, despite their (also conceptual) differences, for the connected car context [49]. On a broader level, in an example of interdisciplinary collaboration between HCI and law research, von Grafenstein et al. [91] first identify this aspect of the legal design space and outline three possible alternative formulations stemming from a set of cross-technology focus groups on perceived privacy risks.

They suggest that their categories of what they call "unfavorable data uses," aka "privacy risks," could serve as a reference scheme for data controllers when specifying their processing purposes in future practice. Such an approach could ensure the indication of information relevant and useful to data subjects and thus effectively manage privacy expectations. With these perceived privacy risks at least indirectly referencing risks to the fundamental rights of the data subjects, those rights can serve as an immediate scale for further adjusting protection measures, one that is well known to legal scholars, too.

The research results from von Grafenstein et al. are, so far, based on qualitative methods and thus lack representativeness. Still, qualitative methods can also provide a certain proof of effectiveness [26]. Moreover, HCI's mixed-methods repertoire provides ways to strengthen these empirical results, for example via triangulation with quantitative methods. This new concept for the formulation of data processing purposes holds potential for more meaningful communication of the ins and outs of data processing to users. Further steps are now to be taken in a classic user-centered design process, which will cover both levels of collaboration previously mentioned: the development, evaluation, and implementation of potential solutions, and the comparison of existing and future options.

5 The Road Ahead

The requirement to implement the legal norms into the processing design in an effective manner (Art. 25 GDPR) constitutes a recent shift toward including empirical evidence in legal reasoning that is not yet fully understood. By explicitly declaring the effectiveness of the protection measures to be the legally required result, the legislator raises the question of which methods can be used to test and assure such effectiveness. Extending the legal conformity assessment to the real effects of the required measures opens this assessment to (non-legal) methodologies that are specialized in assessing such empirical facts. This does not mean that lawyers must directly incorporate these methodologies and findings into the legal interpretation of Art. 25 GDPR. Instead, they are usually considered a (sometimes more, sometimes less) important factor in the interpretation of the norm. However, this factor can become rather dominant in legal practice because the interpreter of the law (e.g., a data protection authority or a court) cannot easily ignore the methodically assured findings of the other discipline, since these describe the factual situation on which the interpretation of the law is based.

In fact, this interdisciplinary opening in Article 25 fits into a larger development in the regulation discourse. Under the label of evidence-based policymaking, for example, not only the increased rationalization of the law by reference to non-legal disciplines has been debated for quite some time, but also the possible pitfalls of this approach, such as the increased complexity of considering the effects of regulatory instruments in legal reasoning.

Since law and its enforcement must also scale and remain effective in the digital realm, technology such as automated usability evaluation may play an important part in future compliance assessments. Automated evaluation may, to some extent, provide legal certainty for data controllers and likewise support data protection authorities. It falls to HCI to provide tools that meet all stakeholders' needs: data controllers want to assess their future tools and products early and continuously, with low effort, in terms of fulfilling data protection requirements; users want to know which companies champion data protection; and data protection agencies want to be able to consult and oversee processors to maintain a high level of data protection in practice.
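What such an automated first-pass evaluation might look like can be sketched as follows. This is a hypothetical illustration under two assumptions: a banner's properties have already been extracted (e.g., by a crawler), and the rules encode common regulator expectations (a reject option as accessible as accept, no pre-ticked boxes). The `BannerSnapshot` structure and the 50% prominence threshold are invented for the example; this is a heuristic screening aid, not a legal compliance test.

```python
from dataclasses import dataclass, field

@dataclass
class BannerSnapshot:
    """Properties of a consent banner as captured by a (hypothetical) crawler."""
    has_reject_on_first_layer: bool
    accept_button_area: int            # rendered button size in px^2
    reject_button_area: int
    preticked_purposes: list[str] = field(default_factory=list)

def audit(banner: BannerSnapshot) -> list[str]:
    """Return human-readable findings that warrant closer (legal) review."""
    findings = []
    if not banner.has_reject_on_first_layer:
        findings.append("no reject option on first layer")
    elif banner.reject_button_area < 0.5 * banner.accept_button_area:
        findings.append("reject option visually de-emphasized")
    if banner.preticked_purposes:
        findings.append("pre-ticked consent boxes: " + ", ".join(banner.preticked_purposes))
    return findings

# Illustrative run: a banner with a small reject button and one pre-ticked purpose.
snapshot = BannerSnapshot(has_reject_on_first_layer=True,
                          accept_button_area=10000,
                          reject_button_area=3000,
                          preticked_purposes=["analytics"])
print(audit(snapshot))
```

Run over many sites, such heuristics could give controllers early feedback during development and help authorities triage which implementations deserve manual inspection.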

In this chapter, we showed how legal and HCI research can benefit from each other's competencies, and how HCI research has so far (not) seriously engaged with data protection regulation on a broader scale. We argue that both fields can adapt concepts and methods to make interdisciplinary work even more effective in reaching each field's very "own" objectives. Beyond our specific example, the critical task of mapping the design space will be important to allow for transfer to other data protection principles and rules, especially those whose effectiveness depends on their usability. While HCI has a long history and strong methodology to contribute here, its engagement with legal concepts needs to be strengthened so that the room to maneuver can be critically assessed and law made more effective jointly.