In the app stores of Google and Apple, we can find many popular apps that track their users and trample on data protection. Most people do not question this at all and use these apps heedlessly. In the face of the COVID-19 pandemic, we developed some of the most privacy-friendly and best-scrutinized apps, and people questioned them widely, which is a good thing. In the resulting public discussion, it turned out to be difficult to explain a privacy-by-design solution to the public. Clearly, it is hard to understand how tracing of individual contacts and anonymity (or pseudonymity) can be possible at the same time.

One particularly perfidious characteristic of SARS-CoV-2 is that an infected person can already infect others while still feeling perfectly healthy. Therefore, if Alice has been infected by Bob, she must be warned immediately and stay at home as soon as Bob learns that he has been infected. Then the people Alice would otherwise have met later are spared. Contact tracing is a measure to achieve such warnings of contact persons.

Almost immediately after we realized that the virus had reached Europe in early 2020, various projects started in different countries to develop an app that could supplement the usual contact tracing carried out by health authorities. Out of these developments, a broad and qualified international discussion soon emerged about different concepts for implementing app-based contact tracing. Surprisingly, it turned out that these concepts, whether optimized for effective warning, for privacy, or for ease of implementation,Footnote 1 converge to the same result.Footnote 2 In other words, contact tracing apps are a rare case where there is an optimal solution with respect to all three criteria.Footnote 3 This approach is the Bluetooth-based creation of an anonymous contact diary kept locally on the smartphone, later called the “decentralized approach,” which was particularly well elaborated and documented under the designation Decentralized Privacy-Preserving Proximity Tracing (DP-3T).Footnote 4 After intensive public discussion,Footnote 5 which unfortunately is usually anything but a given, this factually best solution prevailed in practice in most countries. It contrasts with a “centralized approach,” in which pseudonymous records of contacts, at least those of infected users, are transmitted to a server. In that architecture, it cannot be ruled out that a malicious server operator or attacker could infer a social graph or even an interaction graph of the participating persons (Troncoso et al. 2020, pp. 43 f). The example shows how important fundamental architectural decisions are for the privacy of users (Hötzendorfer 2020). There are many more details of applied privacy by design in DP-3T and similar concepts; see Troncoso et al. (2020) for the details.
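The core of the decentralized idea, which resolves the apparent contradiction between contact tracing and anonymity, fits in a few lines of code. The following is a deliberately simplified illustration, not the actual DP-3T specification: the key sizes, the rotation schedule, and the derivation functions (`next_daily_key`, `ephemeral_ids`) are placeholder assumptions chosen for readability.

```python
import hashlib
import hmac
import os

def next_daily_key(key: bytes) -> bytes:
    # Simplified daily key rotation: each day's secret is derived from the
    # previous day's via a hash chain, so old keys need not be stored.
    return hashlib.sha256(key).digest()

def ephemeral_ids(daily_key: bytes, n: int = 4) -> list:
    # Short-lived pseudonyms broadcast over Bluetooth. Without the daily key,
    # an observer cannot link them to a person or to each other.
    return [hmac.new(daily_key, f"ephid-{i}".encode(), hashlib.sha256).digest()[:16]
            for i in range(n)]

# Bob's phone holds a secret daily key and broadcasts ephemeral IDs.
bob_key_day0 = os.urandom(32)
bob_key_day1 = next_daily_key(bob_key_day0)
bob_ephids = ephemeral_ids(bob_key_day0)

# Alice's phone keeps a local, anonymous "contact diary" of pseudonyms it overheard.
alice_diary = {bob_ephids[2]}

# Bob tests positive and publishes only his daily keys for the infectious period;
# no contact records and no identities ever leave the phones.
published_keys = [bob_key_day0, bob_key_day1]

# Matching happens locally on Alice's phone: she recomputes the ephemeral IDs
# from the published keys and compares them against her own diary.
exposed = any(eph in alice_diary
              for key in published_keys
              for eph in ephemeral_ids(key))
print("Alice exposed:", exposed)
```

The privacy property follows from the architecture: the server only ever relays keys of voluntarily self-reporting infected users, while the sensitive question "whom did I meet?" is answered exclusively on each user's own device.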

Soon after the first apps were developed, it became clear that without intervention at the operating system level, Bluetooth could not be used in the required way, especially when the app should run in the background. Then something historic happened: Google and Apple teamed up to implement a privacy-by-design concept developed by European researchers (among others). They implemented a concept for Bluetooth-based contact tracing very similar to DP-3T in Android and iOS, respectively.Footnote 6 Of course, it is a problem in itself that these companies are so dominant and powerful that the world is practically unable to implement such a system without their goodwill (Veale 2020).

However, Google and Apple may not have chosen the decentralized approach out of noble privacy considerations. In light of the potential information power that comes with the centralized approach, one might hypothesize that they did not want to decide which governments are trustworthy enough to be given these powers and which are not. In any case, they implemented the decentralized approach, and so the most privacy-friendly solution prevailed in practice.

As many people may not know, the GDPR, which ushered in a new era of data protection when it came into force in 2018, did not significantly change substantive data protection law in Europe. Rather, its fundamental impact results from the penalties it imposes for conduct that, for the most part, was already unlawful before, and from the momentum and focused public discussion it created all over Europe and beyond. However, the GDPR did introduce a new principle: data protection by design. This fundamental principle requires us to build privacy into the design of systems from the start. Only slowly is an understanding developing of what this requirement means in practice and how it can be systematically fulfilled. DP-3T and related concepts, and their implementation by Apple and Google, can be seen as one of the first widespread real privacy-by-design solutions, in the sense that it demonstrated that with a privacy-first attitude, the key functional requirements of a piece of software can be fulfilled without compromise.

However, it made us realize that we are not there yet. The step we took here, from having a sound body of data protection law which should theoretically protect users installing an app, to having the app implemented as a data-protection-by-design solution and making that transparent in every detail, was not enough to gain the trust of the users.

This is particularly noteworthy since the quality and depth of the public technological discussion was remarkable. For example, in Austria, not only the Data Protection Authority but also the broader data protection community was involved in the development of the app very early on, and the nationally and internationally recognized NGO NOYB and the information security research center SBA Research carried out a technical and legal review and published a report containing a list of recommendations, which were immediately implemented.Footnote 7 The important realization that we must actively participate in shaping technology if we are to exercise political control over it seemed to have suddenly taken effect in civil society and the scientific community. The European Data Protection Supervisor stated: “The public discussion about specific privacy features for a new application, which was only in the early phases of development, was a completely new phenomenon.”Footnote 8

At least from an Austrian and German perspective, it seems that although independent, renowned privacy experts and activists gave a good verdict on the app and published their reasons in detail, they were not able to change public opinion significantly. Of course, it is almost impossible to check whether the superior concept was implemented correctly in every detail, and bugs can never be ruled out. Also, there are still some factual arguments against contact tracing apps, such as doubts about the suitability of Bluetooth.Footnote 9 However, many people seem to continue to refuse the app primarily out of a privacy-related gut feeling, even though the privacy of the app had already been thoroughly tested by independent experts and the results were openly available in detail. Therefore, the question arises: how can we, as technology experts, replace gut-based opinions in the population with fact-based, science-driven opinions in the face of the following insight? Many mechanisms and measures that make a technology more secure or less privacy-intrusive are complicated. Take encryption as the most obvious example. In many cases, the exact same mechanism that makes us recommend a piece of software makes it difficult to explain to the general public why we can do so.

Unfortunately, this effect cannot simply be explained by differences in the level of education and expertise in society, or even by the increasing hostility toward science. Doubts about the app based on gut feelings, without substantive arguments, were also spread publicly by scholars from other disciplines as well as by privacy advocates who had not carried out a thorough analysis of the app themselves. This is not to argue, in a blunt and dangerous way, that anyone who has doubts about the app simply has no clue about it; of course, skepticism about technological developments in general, and about specific systems in particular, is always appropriate. But if a large body of detailed privacy analysis based on scientific methods is publicly available, it cannot be ignored in the public discussion. If intellectuals and experts in non-technical disciplines write or speak publicly about the privacy aspects of a technological system such as a contact tracing app, they should be aware of the existence of such widely available scientific evidence and base their discussion on it, in the same way that they base their discussion on the broad consensus among epidemiologists.

This is not to say that the lack of widespread use of contact tracing apps can be attributed only to factually unfounded privacy concerns. Another problem of contact tracing apps is obviously that their use is a rather passive experience. The app not doing anything recognizable during normal use could lead users to believe that it is not working.Footnote 10 But in any case, experts were not able to convince a broad public that the decentralized approach and the apps based on it are in fact harmless in terms of privacy and data protection. The European Data Protection Supervisor concludes: “From all reactions, it appears that the biggest inhibitor to wide uptake and use of tracing apps is the lack of trust in their confidentiality.”Footnote 11

Has the world become so complicated that a broad majority cannot take qualified (democratic) decisions in a growing number of domains? One way forward is to strengthen trust in experts and science. But to be honest, we have to realize that this alone cannot fully succeed when it comes to explaining privacy-by-design solutions to a broader public. This might be related to the fact that privacy by design is, in a way, an attempt to control technology with more technology. And that at least makes things more complicated, and hence more complicated to explain.

Clearly, there are domains where digital solutions are conceptually completely inappropriate, and the paper-based solution fulfils the essential requirements appropriately, e.g., a secret ballot.Footnote 12 But the domain of contact tracing apps is a good example where only technology enables a suitable solution, i.e., tracing and anonymity at the same time, which is conceptually impossible to achieve with any paper-based approach.

In many other domains, we might not be able to find such elegant privacy-by-design solutions that fulfil all functional requirements as DP-3T does here. As I am writing these lines, exactly one year after the Austrian Stopp Corona App was released, another app-based “solution” in the context of the pandemic is around the corner: the “green” app-based pass for demonstrating that the holder has been tested, vaccinated, or is immune due to a past infection. Unfortunately, here the perfect privacy-by-design concept for implementing such a system does not (yet) suggest itself. At the same time, this is a much more critical domain than contact tracing, because people will be under much more factual pressure to use such a system if they want to participate in public life again. However, wherever the “green pass” is discussed, it seems much clearer than it was a year ago that such a solution must meet the highest standards of privacy and data protection.

To conclude, I think this is the positive legacy of the contact tracing apps in the context of Digital Humanism: we can expect that applied privacy by design will become more common. Also, the public debate about privacy and data protection has been elevated to a new level. Humankind needs to find ways to actively shape technological progress for the greater good, and therefore civil society and the scientific community must involve themselves, as happened here. However, we also learned that this is not enough: as technological development makes the world more difficult to understand every day, we need to find ways to explain “good” technology to the people, including intellectuals and experts in other fields, while maintaining a sound and productive skepticism toward technological developments that influence our lives.