1 Introduction

Autonomous driving has already been partially implemented in many countries and will enhance daily mobility in cities in the near future (Grush and Niles 2018). Applications range from commonly used advanced driver assistance systems, such as adaptive cruise control, through prototypes of autonomous buses in public transport, to test tracks on which the possibilities of the innovative technology are explored in real-life environments (Haas et al. 2020; Portouli et al. 2017; Reid 2019). The feasibility of such applications relies on intelligent transportation systems to coordinate traffic (pedestrians, vehicles, road infrastructure, etc.) (Alam et al. 2016). Based on information and communication technologies, mesh networks provide the opportunity for multiple connected entities to exchange data and interact (Arena and Pau 2019). To classify levels of automation, different systems have been established, such as the standard of the National Highway Traffic Safety Administration (NHTSA), the SAE standard, or the standard of the German Federal Highway Research Institute (BASt). Opportunities for urban mobility include avoiding traffic jams and reducing accidents, thus improving road safety, travel efficiency, and environmental impacts (Jiménez et al. 2016). Scientific attention is therefore considerable and integrates different perspectives: besides technical vehicle development (Alam et al. 2016), it includes computer science (Dartmann et al. 2019) and social science (Brell et al. 2019d; 2019c).

It is difficult for the public to weigh the real benefits autonomous vehicle technology can bring against its potential negative consequences. How people evaluate the positive and negative consequences of autonomous mobility might depend on various conditions: these include the prevailing knowledge about autonomous driving, the utility of autonomous driving in different usage contexts, the alternatives to using autonomous vehicles, the availability of autonomous vehicles, and the public understanding of personal, technical, and data-law aspects associated with the novel mobility. For the overall success of autonomous driving and its seamless implementation in societies, public perceptions are a major cornerstone and should be incorporated early in the implementation process to foster a participatory procedure in line with public opinions (Hess 2018; Biegelbauer and Hansen 2011).

In order to integrate the user perspective not only into technological development but also into public communication and information strategies at an early stage, it is central to understand which advantages and disadvantages citizens associate with autonomous driving. This is particularly important for assessing the extent to which citizens' fears can be attributed more to the novelty of the technology (due to lacking user experience), to a perceived loss of control to artificial intelligence, or to data and privacy-related issues.

In this paper, we provide deeper insights into the user perspective on autonomous driving. Special regard is given to data distribution as a key to effective vehicular communication and reliable autonomous mobility. Previous studies have shown that the handling of personal information and privacy-related issues is relevant to acceptance in this context (Brell et al. 2019a; Garidis et al. 2020; Walter and Abendroth 1011). Building on these findings, our study draws further conclusions, e.g., regarding the storage and sharing of sensitive user data and perceived data risks and benefits, in order to derive reasoned recommendations for practice.

2 Acceptance of autonomous driving

People's acceptance of new technologies, their willingness to use them, and their readiness to deal with the consequences of innovation are central to societies and their well-being, but they are also a question of policy and governance in innovation management. In the following, we detail the theoretical base of technology acceptance and the importance of integrating acceptance perspectives into technology development at an early stage (see Section 2.1). Also, the perception of privacy (see Section 2.2) and trust (see Section 2.3) is specifically addressed, as both concepts seem to play a cardinal role in the public perception of autonomous driving and the willingness to use autonomous mobility.

2.1 Technology acceptance and innovation management

The question of whether, and if so under which conditions, people accept technological innovations has received attention in research and development since the 1980s. The historically most influential model, the Technology Acceptance Model (TAM) and its successors, focused on information and communication technologies in the office context. According to the TAM, the intention to use such technologies is predominantly influenced by two key factors: the perceived ease of use and the perceived usefulness of the technology. The TAM was subsequently extended in more differentiated versions (Venkatesh and Davis 2000; Venkatesh and Bala 2008; Venkatesh et al. 2003; Venkatesh et al. 2012): for example, demographic variables and other user characteristics as well as conditions of use (e.g., the voluntary nature of use) were added as predictive factors for technology acceptance. Another approach to theoretically model persons' willingness to accept and use technical devices is provided by technology diffusion theories (Rogers 1995). Accordingly, users show diverse adoption reactions to innovations, from “early adopters”, i.e., persons who are particularly willing to adopt an innovation, to “laggards”, i.e., users who resist adoption as long as possible (Rogers 1995).

In this context, risk perceptions have been shown to impact the societal acceptance of large-scale technologies (Burger 2012; Huijts et al. 2012; Gupta et al. 2012). Public concerns or even protests arise when the public responds to unknown but imputed risks of a novel technology, even though the technology might also deliver benefits to society (Gunter and Harris 1998; Horst 2005). Perceptions of risk refer to persons' subjective evaluations of the probability of harm through technology and of the possible consequences of negative events (Sjöberg et al. 2004). Risk perceptions are shaped by different cultural and social values as well as by individual knowledge and personal attitudes (Zaunbrecher et al. 2018; Arning et al. 2018; Linzenich et al. 2019). Recent research indicates that people weigh the perceived risks against the perceived benefits when deciding whether to adopt a technology (Linzenich et al. 2016).

Recently, a large empirical study with more than 1700 adults in the USA (Ward et al. 2017) examined risk and benefit perceptions in the context of automated vehicles. The findings corroborated that trust, risk, and benefit perceptions are related to the acceptance of automated vehicles. Demographic factors, such as generation, age, and gender, also influenced knowledge, reported trust, acceptance of, and willingness to use automated vehicles (Ward et al. 2017; Hulse et al. 2018; Hohenberger et al. 2016).

In addition, risk assessments of autonomous driving technology were found to be influenced by prior experience with technology in general and with driver assistance systems in particular (Brell et al. 2019d; 2019c).

With increasing knowledge of and experience with automated driving systems, risk perceptions towards autonomous driving decreased and acceptance of the technology increased (Brell et al. 2019d; 2019c; Ward et al. 2017). Apparently, familiarity with advanced speed regulation systems increased trust in the reliability and safety of the system and reduced discomfort with its autonomous (uncontrollable) nature. Independently of experience, however, the most critical factors for broad acceptance are users' attitudes towards invasions of privacy and their distrust in transparent data handling (Brell et al. 2019d; 2019c).

2.2 Perceptions of data distribution, data handling, and privacy

The enormous advantages of intelligent vehicle technology, with respect to safety, the control of traffic jams and congestion, the conservation of fossil fuels, and the reduction of noise levels in cities, can only be exploited if data on the vehicles and their routes is shared with infrastructure and other road users. This allows the planning and management of networked, individual, adaptive, and overall efficient traffic routes within and across cities. On the one hand, the utilization of data is associated with these enormous social and societal benefits; on the other hand, it also has significant disadvantages in the context of data protection and privacy (Gantz and Reinsel 2012; Dritsas et al. 2006).

The protection of privacy and the careful handling of data represent the most sensitive part of the roll-out process and the critical point with respect to public perception and acceptance of autonomous mobility. The development of an appropriate privacy policy for citizens, their willingness to tolerate broad data collection, and their tolerance of (technical) surveillance (Tene and Polonetsky 2012; Karabey 2012) are vital in this context. Characteristically, trade-offs need to be negotiated on different levels and in different situations: for example, the trade-off between preserving personal privacy on the one hand and providing open infrastructures and open data on the other is of importance. Also, questions regarding responsibility for data handling and use are of vital importance, as is the critical issue of data ownership, which needs to be carefully settled in order to provide adaptive and individually tailored services (Ziefle et al. 2016; Ziefle et al. 2019).

In line with the increasing digitization of societies and the area-wide use of electronic devices, “information privacy” refers to the control over the disclosure of personal information to third parties (Finn et al. 2013; Smith et al. 2011). It is important to note that the perception of privacy risks is not identical with the factual technical risks. Rather, users follow an affective or experience-based understanding of data or information sensitivity (Schomakers et al. 2019a; Ziefle et al. 2016). This potential mismatch between the perceived sensitivity of data and its technical sensitivity provides a rich base for misconceptions. As a consequence, careless user behaviors on the one hand and exaggerated concerns on the other may arise. When it comes to the question of whether consumers want to share their data, the immediate benefits of novel services might outweigh the concerns about what could happen with the data. This is referred to as the privacy calculus (Dinev and Hart 2006).

Not only low levels of technical knowledge but also over-trust in one's control over data might be responsible for observed privacy behaviors (Schomakers et al. 2019b; 2020). Recent research showed that the majority of users are quite sensitive in the context of data exchange and privacy issues, especially when the data is used by third parties without public transparency (Lidynia et al. 2017; Valdez and Ziefle 2019). Another critical issue for users concerns how long data may be stored and which authority is responsible for the storage. The longer the data storage and the more data is stored on servers beyond the control of the users (e.g., central servers of companies or the traffic management), the lower the willingness to share data, independently of the type of data (Schmidt et al. 2015a). Concerns are also higher the more personal the information is and the higher the probability of being identifiable (Valdez and Ziefle 2019; Ziefle et al. 2016). However, there is also empirical evidence that people are differently susceptible to those concerns (Schmidt et al. 2015a; Schomakers et al. 2018; Schomakers et al. 2019b). All in all, however, there is a widely prevailing public distrust which seems to have two different sides: one is an unspecific distrust in authorities with regard to a careful, protective, and diligent handling of data; the other is a deep-seated concern about invasions of privacy.

2.3 Trust—the hidden player

Whenever humans interact with automated systems, trust is a key to successful interaction, but it is also sensitive to uncertainties that may lead to users' distrust and the rejection of technology (Hoff and Bashir 2015; Parasuraman and Riley 1997). However, the impact of human (dis)trust on acceptance decisions may not always be immediately apparent; it may be hidden behind other narratives and experiences and carried and influenced by other parameters (Siegrist 2019). For instance, trust has been identified as driving the perceived reliability of and reliance on automation, which are decisive for evaluation and use behavior (Dzindolet et al. 2003; Lee and See 2004). Trust may also be expressed in terms of individual expectations or concerns determining the trade-off between perceived risks and opportunities, e.g., with regard to data exchange in autonomous driving (Schmidt et al. 2015a). To understand the dynamics of trust and acceptance in human-automation interaction, relevant predictors need to be accurately identified and carefully considered, both in isolation and in interaction.

According to Janssen et al. (2019), the use of automated systems increasingly involves time-sensitive or safety-critical settings, embodied and situated systems (i.e., subsets of automated systems), and non-professional users, all of which applies to self-driving cars. Autonomous driving offers not only great usage potential but also perceived risks from the user perspective (Kaur and Rampersad 2018; Schmidt et al. 2015b). Therefore, a research focus on trust in automation is required. Kaur and Rampersad (2018) investigated key factors influencing the adoption of driverless cars and identified performance expectations and perceived reliability as relevant determinants, pointing out the relevance of empirical studies and the inclusion of user needs in technical development with special emphasis on trust issues. Further research has shown the influence of trust on the acceptance of autonomous mobility (Choi and Ji 2015), revealing trust as a predictor of the interest in using and the willingness to purchase a self-driving car (Ward et al. 2017). Consequently, studies on supporting trust in automation in this usage context are numerous (e.g., Häuslschmid et al. 2017; Koo et al. 2015; Waytz et al. 2014).

Common to many studies is that trust is directly addressed and measured through self-reporting (using, e.g., Likert scales), for example with regard to immediate responses in experimental settings (Sheng et al. 2019) or scenario-based evaluations (Brell et al. 2019b). As there are indications that trust in the performance of a particular automated system is influenced not only by explicit but also by implicit attitudes (e.g., towards automation in general) of which users are often not aware (Merritt et al. 2013), it is of great interest to what extent trust as a hidden player plays a role in the evaluation of autonomous mobility.

Therefore, in this survey, we explored ways in which trust is (indirectly) expressed and perceived in relation to other acceptance-relevant factors, such as data security and privacy.

3 Empirical research design

The research aim was to better understand the perspective of future users on autonomous driving in terms of acceptance and the probability of use rejection. We set a specific focus on the perception of data security and privacy when using autonomous vehicles.

Yet, as autonomous driving is only beginning to enter real-life experience, capturing users' mental models of autonomous driving and their perceptions and understanding of using this technology at this point in time offers valuable input for scientific evaluation in human-automation interaction, technical development, and public communication strategies.

3.1 Research scenario and questions addressed

The survey was conducted in Germany in 2019 in German. For the classification of the automation level, we referred to the standard of the German Federal Highway Research Institute (BASt). The standard defines the driving tasks of the driver according to the automation level, from “driver only” (no automation, level 0) to “fully automated” (level 4) (BASt 2018). Following level 4 of the BASt standard, our research scenario referred to driving features that are capable of driving the vehicle by themselves (i.e., performing driving tasks autonomously), allowing the human driver to pursue other activities while driving. An introduction to the topic, including the scenario and aim of research, was presented to the participants in advance.

We addressed the following research questions with focus on data distribution and privacy:

  • RQ1: What types of expectations, fears, and risks do potential users face in autonomous driving?

  • RQ2: What barriers and benefits do they consider?

  • RQ3: What influences the user’s perception and evaluation of data use?

  • RQ4: How are these factors related to the intention to use autonomous vehicles?

3.2 Mixed methods approach

As users' perceptions and technology acceptance may vary depending on the sample, context, and approach (i.e., there is an interaction between method and research object (Wilkowska et al. 2015)), mixed methods represent a reliable choice, also with regard to complex research questions (Lund 2012). Hence, we followed a multi-tiered process to develop and validate relevant assumptions using qualitative and quantitative methods. This way, methodological advantages were combined to compensate for potential shortcomings and to achieve in-depth insights.

Figure 1 shows our empirical research design. The overall approach was exploratory and structure-discovering. First, we conducted focus groups and guided interviews to gain a deep understanding of the multi-faceted and diverse user perspective on autonomous driving, with special regard to the handling of personal data and travel information (see Section 4). Key findings (i.e., novel or frequently mentioned attitudes, needs, and demands) were then operationalized, transferred into survey items, quantitatively assessed, and related to each other in two consecutive online questionnaires focusing on risk perceptions, benefits and barriers of use, and user requirements towards data exchange, in order to gain valid conclusions in this context, also regarding the intention to use autonomous driving (see Section 5).

Fig. 1 Empirical research design

From the social science perspective, the iterative, consecutive implementation of qualitative and quantitative methods has already proven its value and effectiveness, but has often been limited to a two-step approach (e.g., Brell et al. 2019c). The novel combination of already validated methods in an empirical four-step approach, in which the applied qualitative and quantitative surveys carefully build on and complement each other, is therefore key to our research approach.

3.3 Data acquisition

In order to capture an unbiased view on the topic, ad hoc participants were addressed. We contacted volunteers for the interviews and focus group discussions within our personal environment, taking diverse social settings and characteristics into account. Online links to the questionnaires were distributed through social networks (e.g., on Facebook in personal feeds and groups), instant messaging, and email. Participants covered a broad age range and came from all parts of Germany. They took part voluntarily, were not compensated for their efforts, and participated out of interest in the topic. Before the participants started the survey, the interviews, and the focus groups, they were informed that it was central for us to understand their own opinions and perspectives on autonomous driving as well as the opinions prevailing in the public. We stressed that there are no “wrong” answers and encouraged them to spontaneously and honestly report their personal views. Participants were also informed that their participation was completely voluntary. In line with ethical research standards, we confirmed a high level of privacy protection in handling the participants' data and assured them that none of their answers could be traced back to them personally.

In order to ensure an overall understanding of the material provided, three independent, randomly selected pre-testers (28–36 years of age, no technical experts) checked the materials used for the empirical studies. We asked them to carefully review the information texts as well as the questionnaire items regarding (a) understandability (complicated or ambiguous wording, grammar, and orthographic issues), (b) length and perceived burden when filling in the questionnaires, and (c) bias and objectivity in introducing the topic to participants (presenting the topic in a neutral manner).

Information and content presented below to illustrate the methods and results were translated from German.

4 Understanding the users’ narratives on autonomous mobility

The use of qualitative methods in empirical research is key to addressing individual perspectives of particular stakeholders on a specific topic and thus provides insights into the many facets of social reality. Group discussions and interviews have proven reliable for exploring yet unknown aspects of subjective perception, knowledge, experience, and attitude. To develop a broad and deep understanding of the users' perspective on autonomous driving, we used an integrative qualitative research approach including focus groups (N = 14) and guided interviews (N = 7) (see Fig. 1).

The following sections describe our qualitative research approach (development and implementation) (see Section 4.1), the participants (see Section 4.2), the obtained results (see Section 4.3), and lessons learned for follow-up research (see Section 4.4).

4.1 Qualitative research approach

In focus groups, the participants exchanged personal ideas and attitudes regarding autonomous driving in a joint and lively discussion under guidance. Main topics addressed user expectations (e.g., What would be different when driving in an autonomous car?), perceived risks (e.g., Who should take the responsibility for driving?), and data privacy (e.g., Are you willing to share passenger information?) (see Section 4.3.1).

Face-to-face interviews provided deeper insights into individual perceptions of sensitive issues related to data and information distribution in autonomous driving. Here, the focus was on collecting and sharing data (e.g., Which data may (not) be stored? Who should (not) have access to your data?) as well as data security (e.g., What steps should be taken to ensure data protection in autonomous driving?) (see Section 4.3.2).

All participants provided socio-demographics (age, gender, education) and data on their mobility behavior (driver’s license, experience with driving assistance systems).

On average, the sessions lasted between 60 and 90 min. The dialogues were audio-recorded, transcribed verbatim, and analyzed by qualitative content analysis (Mayring 2015), which is particularly useful for processing large quantities of material (Mayring and Fenzl 2019). First, analysis units (coding, context, and evaluation units) are determined to systematically reduce the text material (i.e., the transcripts) to essential meanings, which are then categorized; the aim is to develop a category system that includes all relevant aspects of analysis (i.e., categories) (Mayring 2015). In the present survey, the definition of categories was primarily inductive, i.e., based on the text material, but was deductively supplemented by theoretical considerations (based on the survey guidelines).

4.2 Participants

In total, 21 participants took part in the qualitative survey, thereof n = 11 men (52.4%) and n = 10 women (47.6%). Age ranged between 16 and 67 years (M = 40.6, SD = 16.3). Education was comparatively high with n = 9 (42.9%) university graduates, n = 7 (33.3%) high school graduates, and n = 5 (23.8%) participants holding a secondary school certificate (cf. Statistisches Bundesamt (Destatis) 2020).

The majority of participants had a driver's license (n = 20, 95.2%). In addition, the participants indicated regular (daily to weekly) car use, whereas previous experience with driving assistance systems (e.g., automatic parking, lane keeping assistant, and adaptive cruise control) varied.

4.3 Results

First, user expectations and risk perceptions are described with special regard to data privacy (see Section 4.3.1). Then, data-related factors relevant to mobility acceptance are outlined (see Section 4.3.2).

4.3.1 Expectations and risk perceptions

In general, the participants showed a high interest and openness towards autonomous driving. Individual perceptions and evaluations were influenced by trade-offs: The participants considered diverse expectations in detail to carefully balance between perceived disadvantages and advantages of use.

Expected advantages

In particular, enhanced comfort through assigning driving tasks to the autonomous vehicle, the possibility of pursuing other activities while driving, and time savings were perceived positively. In addition, increased safety, e.g., with regard to faster reaction times in critical traffic situations, was appreciated.

Feared disadvantages

The participants also expressed concerns about losing their driving experience when using autonomous vehicles, often associated with a negative feeling of technology dependency. Liability risks were frequently discussed, revealing uncertainties about who would be legally responsible for driving, particularly in the event of damage (e.g., the human on board or the manufacturer).

Data privacy

To clarify liability and investigate accidents (including the question of fault), the participants showed a high, dedicated willingness to provide and share relevant data, for example by using a black box for journey recording. Apart from that, the distribution of personal and travel information was viewed critically due to perceived privacy restrictions. Fears regarding data theft and misuse (e.g., through hacker attacks) became apparent.

4.3.2 Data use in autonomous driving

In the following, we report the user-centered evaluation of data and information distribution in autonomous driving, in which data collection, data sharing, and data security were considered key criteria for acceptance.

Data collection

The participants expressed considerable information needs concerning the purpose and duration of storing personal data and showed high control requirements, particularly regarding the amount of data. In addition, they strongly demanded the right to decide which data is collected. Concerning the type of data, some of the participants would only provide information about their destination and route, while others could also imagine having vital signs measured in the vehicle for preventive health purposes (e.g., to help communicate with rescue services in an emergency).

Data sharing

Details about the data addressee were required as a condition for information exchange. Whereas sharing data with the vehicle and road infrastructure (e.g., traffic lights) was considered necessary and therefore accepted, distributing information to the manufacturer or public authorities was rather rejected out of concerns about data misuse. There were tendencies for the acceptance of data distribution to vary with the mobility service, being greater for car-sharing (e.g., for user identification) than for private vehicles.

Data security

Concerns about data protection were repeatedly reported as an acceptance barrier. To ensure data privacy and increase the willingness to use autonomous vehicles, the participants suggested regular external security checks, for example also with regard to necessary software updates.

4.4 Lessons learned for follow-up research

Autonomous driving is seen as a highly useful and appreciated technology envisioned for the future. Individual expectations and concerns are expressed in terms of usage benefits and barriers with special emphasis on perceived challenges in data exchange, especially in the context of potential data misuse and hacking. The following lessons learned served as a basis for follow-up studies to quantify and validate the obtained research findings (see Section 5):

  • Expectations positively relate to improved user experience and road safety.

  • Individual risk perceptions and potential drawbacks are manifold, mainly described in terms of data security and privacy-related issues.

  • The willingness to share data strongly depends on the individually perceived usefulness and necessity (e.g., smooth and safe travel).

  • Perceived data challenges relate to the handling of personal information: Transparency and the possibility to decide on the distribution of personal data seem to be a key to acceptance.

5 Measuring user attitudes and data requirements

Quantitative methods in empirical research serve to measure knowledge, opinions, and attitudes towards selected indicators in large samples. To validate the previously obtained research findings (see Section 4), we conducted a consecutive quantitative survey including two online questionnaires (see Fig. 1). The aim was to increase our understanding of underlying concepts and relationships regarding data-related acceptance factors in autonomous driving.

The following sections describe our quantitative research approach (development and implementation) (see Section 5.1), the participants (see Section 5.2), and the obtained results (see Section 5.3).

5.1 Quantitative research approach

We requested personal information on the participants' socio-demographics (age, gender, education, income) and mobility behavior (driver's license, experience with driving assistance systems) to identify sample characteristics. Instructions relevant for answering all questions were presented in easy-to-understand text descriptions.

In order to validate the items of the questionnaires, we calculated Cronbach's alpha (α), revealing scale consistencies of α > .7, which can be interpreted as good reliability (Field 2009). Answers to the scales were given voluntarily.
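For illustration, the following minimal sketch shows how such a reliability coefficient can be computed from raw item responses; the function and the example responses are hypothetical and do not reproduce the original analysis.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) matrix of Likert responses."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the respondents' sum scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses to a 4-item risk scale (6-point Likert, 5 respondents)
risk_items = np.array([
    [5, 4, 5, 4],
    [2, 3, 2, 2],
    [4, 4, 5, 5],
    [3, 2, 3, 3],
    [6, 5, 6, 5],
])
alpha = cronbach_alpha(risk_items)
print(f"Cronbach's alpha = {alpha:.3f}")  # values > .7 are read as good reliability (Field 2009)
```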

The first questionnaire (N = 183) addressed attitudes towards using autonomous vehicles (see Section 5.3.1). We measured perceived risks (4 items, α = .864), usage benefits (7 items, α = .882), and barriers (7 items, α = .723) identified as central before (see Section 4.3.1) on 6-point Likert scales (min = 1 full disagreement, max = 6 full agreement). Special focus was on comfort and safety as well as cyber-security. The participants were also asked whether they could imagine using an autonomous vehicle (yes/no/undecided) (Davis et al. 1989). Table 4 lists the items used in this study.

The second questionnaire (N = 100) deepened insights into users’ data requirements in autonomous driving (see Section 5.3.2). Based on pre-study results (see Section 4.3.2), we addressed preferences for data storage location (5 items, multiple choice), attitudes towards the use of health data (5 items, α = .895), and data privacy and security (5 items, α = .740). Special focus was on the in-vehicular collection of vital signs and data protection strategies. For comparative values, we also asked about general attitudes towards data use, such as sharing personal information in everyday life (5 items, α = .781).

Also, the intention to use autonomous vehicles was evaluated (3 items, α = .837) (Davis et al. 1989).

Likert items were assessed on 6-point scales (min = 1 full disagreement, max = 6 full agreement). Table 5 lists the items used in this study.

5.2 Participants

In total, 283 people participated in the quantitative survey, thereof N = 183 in study I and N = 100 in study II. Sample characteristics are compared in Table 1.

Table 1 Sample characteristics

On average, the participants were older in study I (age range 20–90) than in study II (age range 19–68). The gender and education distributions were similar: Overall, more men than women took part, and educational levels were comparatively high given the predominant proportion of university graduates (cf. Statistisches Bundesamt (Destatis) 2020). The monthly net household income was higher in study I, which may be explained by the sample's higher average age and related life situation.

The overall proportion of driving license holders was high. Regarding the use of driver assistance systems in cars (e.g., lane keeping assistant, automatic parking, cruise control), the majority was experienced, especially in study II.

5.3 Results

First, attitudes towards using autonomous driving are described (see Section 5.3.1, N = 183). Then, insights in users’ data requirements are provided (see Section 5.3.2, N = 100).

For data analysis, we used descriptive and inferential statistics. The level of significance (α) was set at 5%.

5.3.1 Attitudes towards using autonomous driving

In general, attitudes towards using autonomous driving were rather positive. Nearly half of the participants (47%, n = 86) could imagine using a fully automated car, followed by 39.9% (n = 73) who were undecided and 13.1% (n = 24) who refused. Figure 2 shows the evaluations of perceived usage benefits and barriers (min = 1, max = 6).

Fig. 2 Evaluation of perceived usage benefits and barriers of fully automated driving (mean values and standard errors, min = 1; max = 6)

Considering perceived benefits, fewer traffic jams and improved traffic flow (M = 4.9; SD = 1.4), more comfort by letting the vehicle take over driving tasks (M = 4.8; SD = 1.5), lower accident risks (M = 4.6; SD = 1.5), and time savings (M = 4.5; SD = 1.5) were considered usage advantages. These also included lower fuel consumption (M = 4.4; SD = 1.6) and the expectation of improved insurance conditions (lower risk category) (M = 3.9; SD = 1.7). Privileges, such as free parking or using the bus lane, were not expected to be a benefit of use (M = 3.2; SD = 1.8).

Considering perceived barriers, legal issues such as liability in the event of damage were seen as particularly challenging (M = 5.1; SD = 1.3), followed by technical risks (e.g., errors) (M = 4.7; SD = 1.5) and insufficient data security (M = 4.7; SD = 1.5). In addition, not only ethical issues, e.g., responsibility for decisions in accident situations (M = 4.4; SD = 1.7), but also economic challenges, such as payments for infrastructure and road transport investments (M = 4.2; SD = 1.5), were identified as potential barriers to use. In contrast, incapacitation (i.e., limited self-determination) (M = 3.6; SD = 1.7) and health risks (e.g., through electrosmog) (M = 2.7; SD = 1.7) were considered less critical.

The evaluation of perceived risks (min = 1, max = 6) revealed feelings of distrust towards the innovative technology and indicated high information and education needs of the public. The participants agreed that many issues concerning autonomous driving still need to be technically and legally clarified in public (M = 4.8; SD = 1.3) and expressed concerns regarding its technical reliability (M = 4.0; SD = 1.7). In addition, they worried about cyber-criminals who could gain control of the vehicle (M = 4.0; SD = 1.7). Accordingly, hacker attacks were perceived as a deterrent to use (M = 3.8; SD = 1.7).

Correlation analyses showed that the intention to use autonomous vehicles was related to perceived risks (r = .456, p < .001) and usage barriers (r = .406, p < .001): The stronger the agreement with perceived risks and usage barriers, the more likely participants were to decide against using autonomous vehicles. In detail, relations with technical unreliability (r = .471, p < .001) and health risks (r = .423, p < .001) were found (see Table 2). Usage benefits correlated weakly with the use intention (r = −.238, p < .01).

Table 2 Spearman-Rho correlation coefficients for perceived risks, usage barriers, and the intention to use autonomous driving (“*” corresponds to p < .05, “**” corresponds to p < .01, “***” corresponds to p < .001)

Considering user factors, gender correlated with use intention (r = .264, p < .001), perceived risks (r = .288, p < .001), and usage barriers (r = .318, p < .001), indicating that men were more willing to drive in an autonomous vehicle and had lower concerns than women, particularly with regard to economic (r = .200, p < .01), legal (r = .200, p < .01), and health (r = .322, p < .01) issues. Age showed no significant correlations.
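To illustrate the type of analysis behind Table 2, the following sketch computes a Spearman rank correlation and checks it against the 5% significance level set above; the response values, the coding of use intention, and the variable names are purely hypothetical and serve only as an example.

```python
from scipy.stats import spearmanr

# Hypothetical per-participant scores: mean agreement with perceived risks
# (6-point Likert) and use intention coded so that higher values = rejection
perceived_risk = [2.0, 3.5, 4.0, 5.5, 1.5, 4.5, 3.0, 5.0]
use_intention = [1, 2, 2, 3, 1, 3, 2, 3]

rho, p = spearmanr(perceived_risk, use_intention)
print(f"Spearman's rho = {rho:.3f}, p = {p:.4f}")
print("significant at the 5% level" if p < .05 else "not significant")
```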

To better understand these relationships, we considered usage requirements with focus on data security and privacy in more detail in order to provide validated indications concerning future users’ willingness to drive in an autonomous vehicle (see Section 5.3.2).

5.3.2 Data use(r): distribution, safety, and privacy needs

The willingness to use autonomous vehicles was high (min = 1, max = 6): The participants indicated that they would like to experience autonomous vehicles (M = 4.7; SD = 1.4) and could imagine using them regularly in the future (M = 4.1; SD = 1.4). Less agreement was reached on the idea that there should be only autonomous vehicle transport in the future (M = 3.0; SD = 1.5).

To better understand user requirements for data distribution in mobility contexts, we took a general look at personal opinions on using and sharing data in daily life (see Fig. 3): Most participants generally cared about what happens with their data (M = 4.8; SD = 1.2) and which data is being stored (M = 4.7; SD = 1.2), indicating high control needs. It was therefore not surprising that sharing personal information was viewed rather critically (M = 2.8; SD = 1.3), especially data distribution to third parties (M = 2.2; SD = 1.3). Concerns about user profiles were also indicated (M = 3.8; SD = 1.2).

Fig. 3 Evaluation of attitudes towards data use in general (mean values and standard errors, min = 1; max = 6)

Correlation analysis revealed that attitudes towards data use in general were related to attitudes towards data privacy and security in autonomous driving (r = .492, p < .001): The stronger the general agreement on data privacy and control, the stronger it was also with regard to autonomous driving. Here (see Fig. 4), the participants considered regular security checks by an independent company important, both for the vehicle software (M = 5.3; SD = 0.9) and for the service provider (M = 5.4; SD = 0.9). The participants were also less likely to assume well-developed data protection concepts on the part of manufacturers and service providers (M = 2.8; SD = 1.3), indicating distrust towards individual stakeholders. In addition, privacy concerns about data access by third parties (M = 4.9; SD = 1.1) and data hacking (M = 4.6; SD = 1.2) were expressed.

Fig. 4 Evaluation of attitudes towards data privacy and security in autonomous driving (mean values and standard errors, min = 1; max = 6)

Opinions varied particularly concerning the collection and distribution of sensitive information (i.e., health data) (see Fig. 5). The participants tended to reject the recording of vital signs (e.g., eye blinks or pulse) for safety reasons (M = 3.4; SD = 1.6) as well as driving adjustments according to vital signs (M = 3.1; SD = 1.6). However, the recording of health data for emergency situations was approved (M = 3.7; SD = 1.7), as was distributing vital signs to rescue services (M = 4.5; SD = 1.5). The participants also agreed that autonomous vehicles should be aware of disabilities (e.g., blindness) to adapt user interfaces to individual needs (M = 4.1; SD = 1.7).

Fig. 5 Evaluation of attitudes towards health data use (mean values and standard errors, min = 1; max = 6)

Considering preferences for the storage location of personal data, the majority (60%, n = 60) chose the country of residence, followed by 21% (n = 21) who agreed for their data to be stored in the country of travel. A few participants (6%, n = 6) selected any country in the EU as data storage location, and 13% (n = 13) stated that the location did not matter.

Correlation analyses revealed that the intention to use autonomous driving was greater the lower the perceived risks regarding data privacy and security were (r = −.311, p < .01) and the more open participants were to sharing sensitive health data (r = .370, p < .001). Perceived risks regarding data privacy and security in terms of data access by unknowns (r = −.326, p < .01) and hacker attacks (r = −.320, p < .01) were negatively related to the use intention, whereas particularly the willingness to provide health data for emergencies (r = .350, p < .001) and to share it with rescue services (r = .375, p < .001) showed positive correlations in this context (see Table 3).

Table 3 Pearson correlation coefficients for perceived risks on data privacy and security, attitudes towards the use of health data, and the intention to use autonomous driving (“*” corresponds to p < .05, “**” corresponds to p < .01, “***” corresponds to p < .001)

Considering user factors, age was related to the intention to use autonomous vehicles (r = −.316, p < .001), which was greater the younger the participants were. In addition, age correlated with attitudes towards health data use (r = −.299, p < .01), indicating that younger participants tended to be more open about recording and using health data. Age was also related to perceived risks regarding data privacy and security (r = .214, p < .05), with older participants being more concerned about safety deficits in autonomous driving, particularly with regard to unauthorized data access (r = .240, p < .05). Moreover, older participants were more skeptical that vehicle manufacturers or service providers would do enough to protect the vehicles from external attacks (r = −.239, p < .05). Gender showed no significant correlations.

6 Discussion of results

Regarding the mixed methods approach of this survey, the iterative use of qualitative and quantitative methods allowed an intense exploration of user perspectives, perceived expectations, and challenges of data distribution in autonomous driving. As user perceptions, feelings, and requirements towards innovative technology are very individual, research requires high sensitivity, especially with regard to privacy and trust. This was realized by focus groups and interviews in which relevant factors were identified, individually addressed, and consolidated for appropriate measurement. Subsequent quantification provided validated results on user evaluations and showed significant correlations, particularly between perceived (data protection) risks, usage barriers, and the willingness to use autonomous mobility. Also, significant correlations for age and gender were found, which, however, varied depending on the study: This may be due not only to the different sample sizes and characteristics but also to the items used, and thus needs to be re-considered in future work.

The results allow innovation management to properly address user requirements and compensate for potential usage barriers early in the technical development. Key findings may be used to develop transparent information and communication strategies for municipalities and cities. The communication concepts may serve two different goals: One is to increase public knowledge and awareness of future urban mobility in order to empower citizens to make informed decisions on whether, to what extent, and under which (data) usage conditions they would support and use autonomous vehicles. The other is to inform technical designers and communication professionals about the public's viewpoint and to develop an understanding that public concerns need to be taken seriously and met with care.

In the following, we discuss specific aspects which are essential for understanding the broad acceptance of autonomous vehicles as well as the handling of and perspectives on data collection and distribution in future mobility.

6.1 Perceived risks and (dis)advantages

Risk perceptions and expectations were discussed in terms of perceived disadvantages and advantages of use, with data privacy as a strongly considered challenge in information distribution. The measurements of usage benefits and barriers confirmed the pre-study results and previous research (Ward et al. 2017; Schmidt et al. 2015b): Increases in comfort, road safety, and travel efficiency (time savings through the ability to do other things while driving) as well as reduced traffic load and environmental pollution represented salient advantages of use. Next to legal and technical risks, data security was perceived as a barrier to use. Fears of cyber-criminality in terms of data hacking and misuse were frequently mentioned.

The evaluation of risk perceptions indicated trust issues. The participants expressed doubts not only about the vehicle technology but also towards individual stakeholders, such as manufacturers and service providers responsible for data protection. Since interpersonal trust has been identified as relevant for trust in automation (Hoff and Bashir 2015), we suggest communication concepts to establish contacts between future users, responsible companies, organizations, and policy makers for greater exchange, mutual understanding, and trusting relationships. It is also advisable to promote first-hand user experience of the innovative technology in demonstrations or trials, as experience may positively affect trust perceptions (Gold et al. 2015) and the evaluation of technology (Brell et al. 2019c). The same applies to users' understanding of technology (Koo et al. 2015): As the participants agreed that there are still many unresolved issues in autonomous mobility, our results demonstrate the urgency of early, user-centered information and education initiatives to increase the visibility of technical progress and improve technology know-how, in particular of inexperienced users, to foster trust.

6.2 Data use evaluation

Issues related to data collection, data sharing, and data security were relevant to the evaluation of data use in autonomous driving. The willingness to provide data for specific purposes was high if deemed necessary, such as for accident investigation and emergency prevention. In contrast to Schmidt et al. (2015a) and Valdez and Ziefle (2019), health data use met with a positive reaction. It seemed as if monitoring was perceived as reasonable to compensate for perceived barriers to use (e.g., liability and health risks). Follow-up research should focus on usage situations and conditions in which the distribution of data is preferred and accepted, also with regard to diverse user groups (older people, children, etc.).

Data control needs and information requests became apparent. Especially in situations involving unknown people, the participants perceived privacy restrictions. Again, the fear of unauthorized intruders gaining access to passenger and vehicle data was predominant. To reduce data concerns, we suggest third-party inspections to ensure not only data protection but also vehicle safety and stakeholder reliability, as related uncertainties were repeatedly reported as a barrier to use. Certifications that visualize security standards and clarify regulations may also improve feelings of privacy and trust. It is up to subsequent work to explore how this may attract the interest of service providers and users, which information needs to be addressed, and how it could be visually designed. Until then, comprehensible information guidelines on data handling are just as necessary as the involvement of users in deciding which data are collected and shared.

6.3 Usage intention

Despite perceived risks regarding data distribution, the reported willingness to use a fully automated vehicle was high, confirming previous findings (e.g., Panagiotopoulos and Dimitrakopoulos 2018; König and Neumayr 2017). Presumably, the expected advantages may increase users' interest and curiosity to experience the new technology. However, mainly concerns about usage (especially hacker attacks) were negatively related to the intention to use.

As a preliminary conclusion, the removal of risks and barriers (e.g., in terms of reliable data protection strategies) may be more decisive for the decision to adopt or reject autonomous mobility than incentives. However, this assumption needs to be addressed in follow-up studies in which participants have to decide which potential barrier or benefit weighs more heavily for them in which usage situation (e.g., conjoint analysis). Such decision simulations would allow us to understand the trade-offs between the pro-use and contra-use motivations and to identify so-called no-go situations in which the public would not be willing to use autonomous vehicles under any circumstances.

In this context, it should be noted that the methodology used includes evaluations that are not based on real experience with automated vehicles. Rather, participants envision whether they would be willing to use automated mobility and, if so, under which circumstances. Of course, one could critically argue that the reliability of laypeople's evaluations is low due to the missing experience with automated driving. However, from a social science point of view, even evaluations of laypersons without hands-on experience might be an especially valuable source of information for all institutions and persons involved in the development and implementation process of automated driving: technical planners, persons responsible in communal policy, communication professionals, the teaching and education sector, and industry. Public perceptions represent the current status of technical knowledge (which can be increased by appropriate information designs) and the prevailing affect heuristics (Slovic et al. 2005; Keller et al. 2006) in terms of trust and the emotional evaluation of technology innovations in general and automated vehicles in particular. Public perceptions can be used early in the evaluation process to steer technical decisions, to develop information and communication strategies, and to inform and consult policy and governance (Offermann-van Heek et al. 2020).

7 Limitations and future works

As the study aim was to explore data risk perceptions and the expectations of future users on a broad basis, we have not yet considered and compared the needs of diverse user groups. Since other studies on the perception and acceptance of autonomous mobility indicated the importance of user diversity in this context (Brell et al. 2019a; Brell et al. 2019c), the consideration of individual user perspectives in relation to this survey's key findings has to be taken up in subsequent studies. Effects of user factors to be addressed could concern not only, for example, gender, age and technology generation, health status, education, and technology know-how (especially as the participants in this survey were comparatively highly educated and often experienced in using advanced driver assistance systems) but also preferred user roles when driving (e.g., driver vs. passenger) and the resulting requirements.

Another limitation regards our empirical methodology. We combined qualitative and quantitative procedures to capture both argumentation narratives and the quantification of user perspectives and their expectations towards the benefits and challenges of autonomous driving. Still, we need to consider that our research methods provide only “anecdotal” evidence of acceptance, as the limited sample sizes do not allow a deeper applied insight into the interaction of human users with autonomous vehicles. Future studies could address this limitation in two ways: one is to cross-validate the acceptance of users who already have some experience with automated vehicles. Thereby, it could be determined whether the envisioned expectations towards the benefits and challenges of autonomous driving are modulated by increasing experience in the handling of autonomous vehicles. The second way of cross-validating is to replicate the studies country-wide in order to understand the acceptance patterns in a representative sample.

Besides, country-comparative studies should be conducted, as perceptions of urban mobility and the implementation of future mobility concepts may vary depending on culture, social shifts, and trends (Fraedrich and Lenz 2014; Theoto and Kaminski 2019).

Finally, in this study, we predominantly focused on security-related aspects, i.e., expectations and risk perceptions with special regard to data privacy and data-related factors relevant to mobility acceptance. We did not include other positive effects of autonomous driving, such as its environmental benefits. Future studies should also address the environmental benefits of autonomous driving (Liu et al. 2019a; Nègre and Delhomme 2017) in order to obtain a comprehensive picture of the public perception of autonomous mobility.

Two more essential aspects that need to be addressed in future work are the role of experience and individuals' knowledge for acceptance as well as the role of the information given to the public in the roll-out process. What we know so far is that users' experience with automated vehicle functions as well as drivers' knowledge about vehicle automation influence the public acceptance of automated cars: experienced persons (relying on theoretical and/or practical hands-on knowledge) tend to be more open to vehicle innovations in general and automated driving in particular (Brell et al. 2019c; Ward et al. 2017). The explanation why experienced persons show higher acceptance levels, however, might lie in different reasons: on the one hand, users might factually know more (about benefits and risks), which allows them to evaluate autonomous driving realistically; on the other hand, users might feel better informed about potential factual and perceived risks, which, as a consequence, increases their trust towards automated vehicle technology (Zaunbrecher et al. 2018; Petersen et al. 2018; Distler et al. 2018). Thus, both cognitive and affective factors influence public acceptance (Zaunbrecher et al. 2018; Liu et al. 2019b; Graf and Sonnberger 2020). At this point, however, the question is how a transparent and diligent information policy could help future users to realistically evaluate not only the enormous potential of autonomous vehicle technology but also the risks and uncertainties that come with it. Future research should therefore examine different information formats, media, and contents that increase the ability of future users to deal adequately with vehicle innovations and allow them to make informed and diligent decisions, thereby forming a solid and sustainable public understanding and acceptance.