1 Introduction

Privacy and security are becoming increasingly important due to developments in the area of Big Data and the growing social importance of online services. Since users' perception of lacking privacy or security can be reflected in behavioral changes (or not, as the privacy paradox shows; see also the chapter “From the Privacy Calculus to Crossing the Rubicon: An Introduction to Theoretical Models of User Privacy Behavior”), part of the research is concerned with contextual factors that can moderate the relationship between perception and behavior. A closer look is especially worthwhile in areas where data leaks or security breaches could lead to serious consequences. In these critical environments, whether in personally sensitive contexts, in times of crisis, or in the area of critical infrastructure, it becomes apparent that the frequently cited privacy paradox does not hold universally.

Helen Nissenbaum’s [37] theory of Privacy as Contextual Integrity describes the idea that the adequacy of privacy is linked to specific contexts that have to be considered from a governance perspective. According to this view, norms of appropriateness and information flows, among others, form the context in which the realm of acceptable disclosure can be conceptualized. The parameters that determine the acceptability of information disclosures include the information type, the involved actors, and the transmission principles. From this point of view, it becomes understandable that attitudes and behaviors toward privacy protection are not static but strongly situation-dependent. In this chapter, these aspects are highlighted from the users' perspective by presenting examples of relevant research on these topics. Since in safety-critical environments security behavior is often the more salient aspect in research, the first question to be addressed is to what extent security and privacy behavior are interrelated and how these terms can best be conceptualized. In addition, we specifically look at the issue of context dependency: what kind of data is perceived as private, and under which circumstances and with whom are users willing to share this data?

Subsequently, we look at how privacy is perceived in safety-critical environments. For this purpose, we first present a study that examines to what extent privacy is a relevant aspect in asylum seekers' smartphone and Internet use during their flight and which strategies emerge to protect digital privacy during the journey. Afterward, we present a study on privacy in agriculture, a sector of critical infrastructure, which sheds light on how privacy concerns affect the adoption of digital technologies.

2 On the Relationship Between Cyber Privacy and Security Behavior

How Are Cyber Privacy and Security Behavior Interrelated?

In today’s digital world, appropriate privacy and security behavior is more imperative than ever. Yet the precise interrelationship between privacy and security behavior is still unclear, as it is rarely addressed in the relevant literature. To date, there is no general consensus on the precise nature or characteristics of the relationship between privacy and security. Against the background of effectively improving both privacy and security behavior in society, however, it is essential to gain an accurate understanding of how the two are interrelated in different contexts, where conceptual similarities or differences exist, and whether privacy and security behavior are similarly influenced by certain factors. Only on the basis of such an understanding can adequate interventions and software be developed that support users in enhancing their privacy and security.

In general, privacy refers to the prevention of exposure of sensitive information about individuals or groups. This includes, among other things, the nondisclosure of behavior, communications, and descriptive personal data [41]. The general notion of the term privacy today is still quite close to Westin’s early definition, which described privacy as “the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others” [51]. However, maintaining privacy in a rapidly changing digital environment is much more difficult today. This may be one of the reasons why there is still no general consensus on the exact scope of the concept of privacy.

IT security, on the other hand, refers to the protection of computer systems from theft of and damage to hardware, software, and information, as well as from the disruption of the services they are supposed to provide [34]. A good conceptualization of this protection is provided by the so-called CIA triad: secure IT systems should maintain confidentiality, integrity, and availability [40]. Confidentiality refers to the prevention of unauthorized viewing, integrity to the prevention of unauthorized modification, and availability to the preservation of access [40]. These definitions suggest that security may partially, but not completely, encompass the privacy domain. There is a particular overlap in the factor of confidentiality, since unauthorized viewing constitutes both a security breach (unauthorized access) and a possible privacy breach (exposure of sensitive information about individuals). Integrity and availability, on the other hand, are characteristics that can be distinguished from privacy more clearly.
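To make the distinction tangible, the following minimal Python sketch (our illustration, not drawn from [40]) shows an integrity check with a keyed hash. The check detects unauthorized modification, but it says nothing about confidentiality (the record itself travels in the clear) or availability, which underlines that the three properties of the triad are separable:

```python
import hashlib
import hmac

SECRET_KEY = b"shared-secret"  # hypothetical key, for illustration only

def integrity_tag(message: bytes) -> str:
    """Integrity: a keyed hash (HMAC) makes unauthorized modification detectable."""
    return hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()

def verify(message: bytes, tag: str) -> bool:
    # compare_digest avoids timing side channels when comparing tags
    return hmac.compare_digest(integrity_tag(message), tag)

record = b"patient_id=42;diagnosis=..."
tag = integrity_tag(record)
assert verify(record, tag)             # unmodified record passes
assert not verify(record + b"!", tag)  # any modification is detected
```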

In order to illuminate the relationship between privacy and security, we examined privacy and security behavior in connection with certain socio-demographic factors (gender, age, educational background, political ideology) in a previous study. Within a representative survey, people in Germany (N = 1219) were asked to report on their privacy and security behavior regarding the private use of digital devices [5]. The evaluation of the survey indicates only a low correlation between self-reported privacy and security behavior. Furthermore, the two concepts are influenced differently by certain socio-demographic factors. For example, age and education have a significant impact on security behavior in that older and more educated people report more security behavior. Such correlations, however, could not be established for privacy behavior. Moreover, no relation was found between political ideology and either privacy or security behavior.
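The analysis pattern behind such findings can be sketched in a few lines of Python. The data below are synthetic and the column names hypothetical; the sketch only reproduces the kind of computation involved (a rank correlation between the two behavior scores, a regression of security behavior on socio-demographics), not the actual study data from [5]:

```python
import numpy as np
import pandas as pd
from scipy.stats import spearmanr
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 1219  # sample size as in the survey

# Synthetic stand-in for the survey data (illustrative column names).
df = pd.DataFrame({
    "privacy_score":  rng.normal(3.0, 1.0, n),   # self-reported privacy behavior
    "security_score": rng.normal(3.2, 1.0, n),   # self-reported security behavior
    "age":            rng.integers(18, 75, n),
    "education":      rng.integers(1, 6, n),     # ordinal education level
})

# A low correlation between the two behavior scores shows up as a small rho.
rho, p = spearmanr(df["privacy_score"], df["security_score"])
print(f"privacy-security correlation: rho={rho:.2f} (p={p:.3f})")

# Regressing security behavior on socio-demographic predictors.
model = smf.ols("security_score ~ age + education", data=df).fit()
print(model.params)
```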

On this basis, the often-presumed inherent connection between privacy and security behavior must be called into question. With regard to the overall research landscape in this subject area, our study stands in stark contrast to the results of many other studies, which speak of privacy and security generally and interchangeably [23, 28, 42]. In such studies, there is a risk of inferring false connections, e.g., of wrongly attributing privacy improvements to measures that are in fact only suitable for enhancing security. An appropriate distinction between the two concepts is therefore potentially relevant both for educating individuals about privacy and security behavior in their private lives and for software developers aiming at the proper protection of either privacy or security.

The results of this study contribute to the existing body of literature, in particular with regard to the investigation of privacy and security behaviors as opposed to people’s corresponding attitudes toward these concepts. Our findings are consistent with previous research indicating that personality traits influence attitudes toward privacy and security differently and that the correlation between the respective privacy and security attitudes is marginal [14, 15].

Since a general consensus about the relation between privacy and security is lacking, different studies on the subject implicitly or explicitly suggest hierarchical relationships in which security is either part of privacy [46], privacy is part of security [6, 7, 29], the two are separate domains that gradually overlap [38, 39], or both are different but related dimensions of the same construct [12, 19] (see Fig. 1). In the empirical study described earlier, the observable correlation between privacy and security was low but not totally absent; however, it could not be attributed to certain demographic factors. It therefore remains unclear at which level privacy and security are connected and where exactly the common ground between the two concepts might lie. In order to contrast both concepts comprehensively, we draw on the technology threat avoidance theory (TTAT). TTAT examines the cognitive processes of threat appraisal, including perceived susceptibility and severity, and coping appraisal, which significantly influence subsequent behavior regarding IT threats [32]. TTAT does not differentiate between IT threats associated with privacy and those associated with security. However, we suspect that perceived severity could be a dimension along which different influences on privacy and security behavior can be observed. In other words, the observation of high privacy behavior gives a direct indication of a high level of perceived severity, as only a person who is concerned about the collection of personal data would show corresponding behavior. With regard to security, on the other hand, the consequences of insufficient behavior are much more immediately noticeable, e.g., in the case of computer viruses, than the more abstract risks of inadequate privacy protection. For this reason, a person’s security behavior might be high while their privacy behavior is low. Building on this, TTAT suggests that a common factor for both might lie in the complete avoidance of certain threats associated with technology. While this would only explain a low correlation, differences in privacy and security behavior may arise from a person’s underlying beliefs regarding particular aspects of this common factor, such as accurately assessing an IT threat by evaluating its perceived severity. In this context, it is plausible that age and educational background influence privacy and security behavior, which was confirmed by our empirical study.
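Read this way, the TTAT-based argument can be summarized in schematic form (our notation, not taken verbatim from [32]): perceived threat arises from susceptibility and severity appraisals, and avoidance motivation combines perceived threat with coping appraisal.

```latex
\begin{align}
  T_d &= f\!\left(P^{\mathrm{sus}}_d,\; P^{\mathrm{sev}}_d\right),
        \qquad d \in \{\text{privacy},\, \text{security}\} \\
  M^{\mathrm{avoid}}_d &= g\!\left(T_d,\; C_d\right)
\end{align}
```

Here $T_d$ denotes perceived threat in domain $d$, $P^{\mathrm{sus}}_d$ and $P^{\mathrm{sev}}_d$ perceived susceptibility and severity, $C_d$ coping appraisal, and $M^{\mathrm{avoid}}_d$ avoidance motivation. With $f$ increasing in both arguments, a low $P^{\mathrm{sev}}_{\text{privacy}}$ (abstract, delayed consequences) alongside a high $P^{\mathrm{sev}}_{\text{security}}$ (immediate consequences such as a virus infection) yields diverging avoidance motivations even under the same functional form, which is consistent with the observed low correlation.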

Fig. 1

Conceptualizations of the relationship between privacy and security proposed in the literature (as depicted in [5], inspired by Hurlburt [26])

Based on these findings, and given the marginal correlation between the two concepts and the ambiguous influence of certain demographic factors, the relation cannot yet be clearly determined. While it is not obvious whether privacy and security represent two hierarchical, merely overlapping, or otherwise interrelated approaches, against the background of TTAT we propose that the link between privacy and security is best illustrated as different but related dimensions of the same construct. According to TTAT, this link is probably most visible in the behavior of avoiding IT threats. In summary, the aspects outlined demonstrate the urgent need for a fine-grained differentiation of privacy and security in order to effectively influence corresponding behavior in the future.

3 Awareness on Data Sharing Functionalities and Acceptance of Private Data Sharing

Which kind of data is perceived as private data? When and with whom are people willing to share their private data?

Today, ever greater amounts of data are being produced, stored, and shared. While in some cases people share their data intentionally, e.g., by using social media or messengers, much more data is shared by smartphones if certain data sharing functionalities, such as GPS, Bluetooth, or Wi-Fi, are activated. This raises several questions, including whether people are generally aware of data sharing in public spaces, whether they actively switch certain data sharing functionalities on and off, and which kinds of data people want to share with whom. By conducting a representative online survey (N = 980) and face-to-face interviews (N = 58) with smartphone users in Germany, we investigated self-reports as well as actual data sharing practices (see [25]). The results provide insights into the circumstances under which private data are shared voluntarily, conditionally, or involuntarily.

Many study participants classified all examined data types (name, address, date of birth, bank details, identity card number, personal files, personal location data, personal communication data) as private, whereby bank data in particular were considered private by nearly all participants. Related research suggests that women are generally more cautious with regard to their private data than men [17, 35, 44, 45]. Moreover, people who care about data protection might also have preferences for certain software. Hence, prior to the study, we presumed a correlation between the classification of data as private and both the installed operating system and the gender of the study participants. Based on the data collected, only the latter could be confirmed: female respondents were more likely than male participants to consider data as private, for all data types.

With regard to socio-cultural factors, other studies have shown that countries differ in terms of the use of social media during emergencies [43], privacy concerns [4], trust in social network sites [30], and openness toward using COVID-19-related apps [48] (see also the chapter “Privacy Research on the Pulse of Time: COVID-19 Contact-Tracing Apps”). Compared to respondents in other European countries [43], the participants in our German sample showed lower use of social media. Compared to study participants in the USA [30], Germans seem to have a lower level of trust in social media. However, Germans show higher acceptance of COVID-19-related apps than US participants, but lower acceptance than Chinese respondents [48].

Regarding the willingness to share private data, we found that it depends on the type of data, the data recipients, and the specific circumstances (see Fig. 2). This is consistent with Helen Nissenbaum’s theory of contextual integrity [36], which states that the flow of private data depends on the specific attributes (i.e., data types), actors (i.e., data receivers), and transmission principles. This indicates that attitudes and behavior toward data sharing are highly volatile and difficult to generalize across all types of data and actors involved. With regard to our findings, many study participants stated that they would want to decide on data sharing with services (46%) or acquaintances (41%) based on the specific situational context. Prior to the study, we assumed that some people would frequently share their private data and keep Wi-Fi, GPS, and Bluetooth activated simply for convenience. While this was true for the more than 10% of respondents who would always share their data, more than 16% of study participants said they would never agree to share their private data, in each case regardless of the recipient. Google and similar services were the recipients with whom the largest share of participants did not want to share their data, which could be due to a lack of incentives or a perceived deficit of transparency and trust. Our results are in line with other studies showing that users care about both the recipient and transparency about the data shared [1, 10].
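Nissenbaum's parameters can be read almost literally as a data structure: a flow of private data is judged against contextual norms over the data type, the actors involved, and the transmission principle. The following minimal Python sketch (our illustration; all names are hypothetical) makes this reading concrete:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    data_type: str   # attribute, e.g., "location"
    sender: str      # actor sending the data
    receiver: str    # actor receiving the data
    principle: str   # transmission principle, e.g., "with_consent"

# Contextual norms: the flows a user deems appropriate.
NORMS = {
    Flow("location", "user", "emergency_responder", "emergency_only"),
    Flow("name", "user", "chosen_service", "with_consent"),
}

def preserves_contextual_integrity(flow: Flow) -> bool:
    """A flow preserves contextual integrity iff it matches an accepted norm."""
    return flow in NORMS

# Same data type, different receiver: only the first flow is acceptable.
print(preserves_contextual_integrity(
    Flow("location", "user", "emergency_responder", "emergency_only")))  # True
print(preserves_contextual_integrity(
    Flow("location", "user", "ad_network", "with_consent")))             # False
```

The point of the sketch is that acceptability is a property of the whole tuple, not of the data type alone, mirroring the finding that willingness to share varies with recipient and circumstance.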

Fig. 2
(Horizontal stacked bar graph; top values: 46% of respondents chose explicit allowance per situation for services they choose, 41% for people they choose, and 33% for state bodies such as governments and municipal offices.)

Answers to “Who shall be able to view my private data in which cases?” Online survey from [25]

According to the study participants’ self-assessment, most would be more likely to share personal data if they themselves were at risk than if other people were at risk. Regarding the latter aspect, 39% of the respondents stated that their data sharing behavior would not change if someone they knew were in danger; however, more than 50% of the people asked would be willing to disclose more data in this case if the data were shared with government agencies or emergency responders. In accordance with previous studies (e.g., [48, 49]), this highlights that users want to retain control over their data and are generally more willing to disclose private data if they know who the recipients of these data will be and for what purpose they will be used.

Generally, most people are not inclined to always share their data, contrary to what some studies assume [21, 24]. In fact, our results indicate that between 16% and 31% of users (varying with the recipient) would never be willing to share private data. Since some people see no value at all in data sharing [3], a lack of transparency or control over personal data might lead to limited or no use of certain applications [49]. Especially with regard to the disclosure of sensitive medical data, it has been shown that this would only be considered acceptable if the societal benefits were extremely high [50].

With regard to the activation status of data sharing functionalities, we also compared respondents’ statements with the actual settings on their smartphones. For the settings of Wi-Fi and GPS, we found only minor discrepancies (4%). However, about 17% of study participants were not aware that their Bluetooth function was activated. Building on this, we want to underline the importance of raising users’ awareness of the potential benefits and risks involved in activating these functionalities and of how personal data can be adequately protected.
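The underlying comparison is a simple paired one between self-report and observed device state. A sketch with made-up example values (not the study data from [25]):

```python
import pandas as pd

# Hypothetical paired observations: self-reported vs. actually observed status.
df = pd.DataFrame({
    "reported_bluetooth_on": [False, False, True, False, False, True],
    "actual_bluetooth_on":   [False, True,  True, False, True,  True],
})

# Share of participants whose self-report disagrees with the device setting.
discrepancy = (df["reported_bluetooth_on"] != df["actual_bluetooth_on"]).mean()
print(f"Bluetooth self-report/actual discrepancy: {discrepancy:.0%}")
```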

4 Critical Environment I: Digital Privacy Perceptions of Asylum Seekers in Germany

How relevant is digital privacy for asylum seekers? How is privacy related knowledge acquired during the flight? What strategies emerge among asylum seekers to protect their digital privacy during the journey?

After examining how privacy and security behavior are related and what data is typically considered private, we now shift the focus to privacy considerations in safety-critical environments. One example of such a safety-critical environment for individual users is the context of flight and displacement.

In light of continuing and increasing migration movements, within which numerous refugees try to reach Europe to seek asylum, digital infrastructures such as smartphones have gained importance, since many asylum seekers frequently use them to obtain information or to communicate with friends, family, or acquaintances [13, 47]. To be able to reach their target countries, asylum seekers rely heavily on access to mobile Internet and online services. While the use of such technologies offers many possibilities, little is known about the risks perceived by asylum seekers. We addressed this research gap by examining how asylum seekers use mobile information technologies during their flight to Europe, focusing particularly on the privacy concerns faced by these users (see [47]). By conducting 14 qualitative interviews with asylum seekers in Germany, we investigated digital privacy perceptions and the corresponding privacy protection behavior. We found that most asylum seekers are aware of several risks of using digital technologies during their journey, such as surveillance and the related possible persecution by states or other actors. Against this background, several strategies to deal with these risks can be observed, manifesting mainly in avoidance behavior but also in mitigation strategies (see Table 1). Since such behavior is caused by a perceived lack of privacy and of trust in certain applications and online services, the design of assistance apps and collaboration platforms should be specifically targeted at these needs.

Table 1 Identified strategies of asylum seekers to protect their digital privacy, from Steinbrink et al. [47]. These strategies are characterized by specific protection behaviors that could be identified in the interviews

Most of the results derived from the interviews were in accordance with research in the field. With regard to the possession of phones, 11 out of 14 study participants stated that they had owned a smartphone at least for some time, while the remaining three owned a simple mobile phone. These results are in line with the insights reported by Emmer et al. [16]. The interview respondents reported several incidents during their flight in which their smartphones were confiscated by smugglers or police officers. According to the respondents, they sometimes disposed of their smartphones themselves in order to prevent anyone from inspecting them.

When asked for what purposes they mostly needed their smartphones, the respondents named (1) GPS applications and (2) communication with relatives, friends, other asylum seekers, or smugglers as the most frequently used applications, which is in line with previous research [13, 16]. Online and offline maps in particular were of great importance for the asylum seekers, since access to map applications was essential for their autonomy. These results are confirmed by the findings of Zijlstra and van Liempt [52], who state that map applications contribute to refugees’ mobility and ability to cross borders, as they become less reliant on smugglers. The crucial significance of smartphones, e.g., for planning and orientation, was also confirmed by several other studies, including [2, 13, 22].

The results of our study indicate that the challenges and threats associated with border controls may affect the way smartphones are used. Differences in user behavior partly depend on a person's origin and reason for fleeing. This is partly supported by the findings of Gillespie et al. [22], who observe a shift in refugees’ digital practices in that refugees require online (in)visibility to protect themselves from detection, arrest, or deportation. We find that it is primarily asylum seekers who have themselves experienced the negative impact of government surveillance and persecution in their countries of origin who have acquired a profound awareness of the importance of digital privacy. One interview respondent stated that his digital privacy was directly related to his family’s well-being, as one must always fear surveillance and arrest due to critical opinions or actions. This finding is supported by the International Rescue Committee [9] and Latonero and Kift [31]. Coles-Kemp and Jensen [8] found that asylum seekers trying to adapt to a new country primarily want to use the advantages of digital services and are less concerned with their data privacy. However, we discovered a direct link between the precarity of a situation and drastic user behavior, which can extend to abandoning the smartphone or digital services altogether. This especially applies if data privacy is directly linked to severe threats, such as the risk of detention or possibly life-threatening consequences. Among our interviewees, we observed that awareness of data privacy often stemmed from a specific experience or event. Beyond that, we suspect preconceptions and technological literacy to be important drivers behind smartphone use. Taking these aspects into account during the development of digital tools or the conceptualization of digital help offers could support asylum seekers in using ICT securely.

5 Critical Environment II: The Role of Privacy in Digitalization—Analyzing Perspectives of German Farmers

How relevant is digital privacy for small and medium enterprises? How does privacy affect the adoption of digital technology in agriculture?

While advances in technology bring many conveniences and benefits, they can disrupt entire sectors of society and reshape the fundamental ways in which we interact and work together. Changes in domains of critical infrastructure, such as the food supply, therefore need to be considered especially carefully. Against this background, and in light of current privacy research, we examined the effects of advancing digitalization in agriculture by conducting 52 qualitative interviews with farmers in Germany (see [33]).

The introduction of digital tools and services in agriculture poses great challenges, particularly for small- and medium-sized enterprises (SMEs) [11]. On the one hand, businesses have to follow the requirements of consumers and retailers, who demand transparency and information about products and the respective supply chains [27]. On the other hand, sharing this kind of data with multiple actors along the supply chain also involves a privacy risk that should not be neglected [18, 20]. Especially in light of the fierce competition in agriculture, in which small farms have to assert themselves against large players driving technological innovation, data becomes an important resource [20, 33]. Small farms therefore have to be careful about whom they give insight into their operational data, and how much. While withholding too much data could further weaken the market position of small farms, with serious consequences for the whole domain, rejecting digital tools outright would have the same effect. Here, we found that privacy behavior and the adoption of digital tools are mutually interrelated [33]. In the following, we outline the associated challenges.

With regard to the protection of privacy, digital data have to be adequately managed in order to achieve an appropriate balance between data dissemination and data protection [20, 33]. For this purpose, adequate infrastructure has to be provided that takes these requirements into account from the outset. The introduction of digital processes not only affects individual work steps but also permanently changes the entire profession of a farmer [11, 33]. The smaller the business, the more difficult and expensive it is to introduce automation and digitalization processes; this concerns, e.g., the size of the farmland or the increased bureaucratic workload [20, 33].

The traditional farming job mainly consisted of manual work, including field or barn work and the maintenance of farming equipment and machinery, and only marginally included the planning of work steps and corresponding agreements with other actors. Today, however, given the large number of stakeholders and specialists in modern agriculture, the latter aspect takes up a large part of the work. In the interviews, some respondents even indicated that digitalization has not eased their workload but has only increased office work. Furthermore, farmers can no longer manage the work of their business themselves but are heavily dependent on third parties for maintenance or data management, whereby sensitive data is potentially exposed to several actors. A further challenge concerns the comprehensive and time-consuming processes of data collection for automated machines. Here, one interviewee expressed his displeasure that this work step brings only little benefit to his own business compared to the extensive disclosure of sensitive data. The aspect of increased dependencies is also visible in customer retention by manufacturers of digital tools: as farmers are often contractually bound to purchase all their digital tools from a single provider, a so-called vendor lock-in effect arises. Yet another challenge of modern agriculture concerns dependencies not only on the weather, as in the past, but also on reliable network connections, mobile phone coverage, and satellite availability.

Given these challenges, it is not surprising that farms implement digital processes to very different extents. Our findings underline that both the likelihood and the extent of digitalization vary across different subsectors of the domain (see Fig. 3) and depend strongly on the anticipated benefit and on the necessity of introduction due to competitive pressure or legal requirements. With regard to the impact on privacy behavior, we found that farmers with more experience with the technology also have fewer reservations about it, which is reflected in their data management behavior. At the same time, reliance on certain technologies may mean that farmers have no choice but to share their data. Interestingly, both factors seem to lower the inhibition threshold regarding privacy concerns, which again highlights the importance of privacy as a decisive factor in the implementation of digital tools. What became clear through this study is that the advantageous adoption of digital processes by agricultural businesses requires not only financial and physical resources but also, to a large extent, time flexibility as well as the willingness to share sensitive operational data.

Fig. 3
(Area graph of the benefits of digitalization for the specific subsector versus dependency on digitalization: wine shows a low, grain a medium, and cattle a high likelihood of adopting new technologies.)

Different agricultural subsectors and how they are affected by digitalization, influencing attitudes toward privacy and the likelihood of adopting new technologies as depicted in [33]

6 Conclusion

In this chapter, we explored how privacy perceptions and behaviors are affected in safety-critical environments. We first investigated the relationship between security and privacy behaviors; the results of our study suggest that privacy and security behavior are distinct, but possibly two different dimensions of the same construct. We then considered how end users in Germany perceive the sensitivity of data. Our study showed that users are more willing to provide data if they themselves are at risk than if others are at risk, provided it contributes to their safety. However, this willingness depends on the type of data and on the party receiving the data (e.g., the government or emergency responders). Furthermore, the results pointed to the importance of transparency with regard to the purposes for which the data are used. Subsequently, we presented two examples of privacy perceptions and behaviors within safety-critical environments and examined the extent to which digital tools pose safety risks: the smartphone use of asylum seekers during their flight, and the adoption of digital tools within agriculture, which potentially affects the food supply chain. Our results suggest that users’ perception of these risks interferes with the use of ICT and the potential advantages associated with it. These results are somewhat at odds with the privacy paradox, which states that the value of privacy is often not reflected in behavior. This highlights the importance of considering the context of privacy and the corresponding user behavior, especially in safety-critical environments. Consequently, the results presented can be well embedded in Helen Nissenbaum’s theory [37] of Privacy as Contextual Integrity. This theory and the empirical findings highlighted here suggest that privacy cannot be viewed as a static value; rather, conformity to individual norms about what constitutes appropriate disclosure is variable and can alter how privacy is valued. Hence, it is crucial for future research to take contextual factors into account when considering privacy protection behaviors.