Challenges of Older Drivers’ Adoption of Advanced Driver Assistance Systems and Autonomous Vehicles

Part of the Lecture Notes in Computer Science book series (LNISA, volume 9755)

Abstract

The personal vehicle is increasingly the preferred mode of travel for aging adults. There are greater numbers of older drivers on the roads driving more miles than ever before, and it is important to be aware of declines that might affect them. Existing technology adoption frameworks are reviewed and relevant issues surrounding older adults’ adoption of advanced driver assistance systems and/or autonomous vehicles are discussed. A secondary analysis is performed on recently collected Floridian survey data that over-sampled older adults (age 55+ yr). Exploratory factor scores are calculated based on survey responses and the predictive effects of age, gender, annual household income, ease of new technology use, and providing information relating to the technologies are examined. Results are discussed in terms of how best to increase older adults’ familiarity with and trust of these transportation technologies in order to help ensure their adoption and safe usage.

Keywords

  • Older adults
  • Technology adoption
  • Advanced driver assistance systems
  • Autonomous vehicles

1 Disclaimer

The opinions, findings, and conclusions expressed in this publication are those of the authors and not necessarily those of the State of Florida Department of Transportation or the U. S. Department of Transportation.

2 Introduction

2.1 Population Aging and the Aging Driver

Older drivers are driving more miles on average than ever before: drivers aged 75–79 years drove 60 % more miles than their predecessors (2008 cohort versus 1995–96 cohort), and drivers 81 and older drove 51 % more [1]. There are currently increased levels of licensure and driving by older adults (OAs) [2], and with Baby Boomers aging, it is projected that one in five Americans will be over the age of 65 in 2030 [3]. With greater numbers of older drivers driving more miles than ever before, it is important to be aware of age-related declines in sensory, cognitive, and psychomotor abilities that might affect older drivers on the road. It has been shown both experimentally and through literature review that OAs take 1.7–2.0 times longer than their younger counterparts for elementary information processing operations [4]. For more in-depth reviews of how age-related declines affect the driving task, please refer to the following sources [5–10].

Older drivers tend to keep away from driving situations they perceive as difficult, and drive less overall (e.g., [11]), though as stated earlier, they are beginning to drive more miles, later in life than previous cohorts [1]. Some older drivers choose to self-regulate their driving to avoid situations they find stressful (e.g., heavy traffic), or situations in which these deficits are more pronounced (e.g., driving at night, or in poor weather conditions), and might even cease driving while they are still relatively safe drivers (for a complete literature review and framework of self-regulation in driving by OAs, see [12]). In-vehicle telematics systems (e.g., GPS, collision warning systems) are generally perceived as beneficial among OAs, especially older women, in increasing confidence in driving [13], and might help them avoid the deleterious effects related to driving cessation (e.g., [14]).

3 Older Adults and Technology Adoption

OAs differ from the general population in physical and cognitive capabilities as described earlier, and commonly have less familiarity with new technologies than younger adults [15–17]. It has been shown that OAs are aware of technological benefits, though, and are willing to try new, useful technologies [18]. OAs are attracted to technologies they find useful and that provide clear benefits to their current lifestyle, and they are generally reluctant if they cannot foresee possible advantages (e.g., [19]). Czaja et al. [17] surveyed 1,204 participants and found that general technology use was predicted by age, education, race, fluid and crystallized intelligence, computer self-efficacy, and computer anxiety, with greater technology use generally found in younger, more highly educated individuals. Importantly, the relationship between age and technology use was mediated by cognitive abilities, computer self-efficacy, and computer anxiety. Due to a lack of proper assessment of OAs’ needs, this large demographic group with considerable spending power is underserved by industry [16, 20, 21].

3.1 Models of Technology Adoption

Technology Acceptance Model and Its Offshoots.

The Technology Acceptance Model (TAM; [22]) was developed as an empirical framework to explain user acceptance and adoption of information technology, and has proven quite robust in explaining the adoption patterns of many different types of information systems in different contexts [23–25]. TAM consists of two main factors: perceived usefulness (PU; i.e., the belief that use of a new technology will help or enhance a person’s job performance) and perceived ease of use (PEOU; i.e., the belief that use of a new technology will be relatively free of effort). Other influencing factors have been added to TAM in subsequent work. Venkatesh and Davis’ [25] TAM2 included seven external variables that influence users’ PU and PEOU (i.e., voluntariness, experience, subjective norm, image, job relevance, output quality, and result demonstrability). Another variation of TAM is the Unified Theory of Acceptance and Use of Technology (UTAUT; [26]), which posits three direct determinants of behavioral intent to use a technology: performance expectancy, effort expectancy, and social influence. Trust was added to TAM by researchers in e-commerce [27] and e-government [28] due to the risk and uncertainty inherent in web-based environments. Experience with technology and its effects have also been discussed in the TAM literature, with increasing experience shifting personal judgments away from social information or norms and toward personal preference and attitudes [25, 29]. Experience moderates behavioral intent (BI) to use a technology such that effort expectancy’s influence is stronger in early stages of experience, while that of facilitating conditions grows stronger in later stages [26].

UTAUT Extension to Advanced Driver Assistance Systems. Adell [30] let 38 drivers trial a system that alerted the driver when (1) the car was too close to the vehicle ahead, (2) positive relative speed suggested an impending collision, (3) speed was too high for the road geometry, or (4) the car was exceeding the speed limit. The study defined driver support system acceptance as “the degree to which an individual intends to use a system and, when available, to incorporate the system in his/her driving” (p. 482). Findings showed support for UTAUT in the area of driver support systems, but with low explanatory power. These results showed the importance of social influence for behavioral intent, but not of effort expectancy. This led the author to stress that UTAUT constructs be measured in the context of driver support systems, with special attention given to performance expectancy. The author further suggested that more extensive studies with more targeted experimental designs be conducted with larger sample sizes.
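The four triggering conditions above can be sketched as simple predicates over the vehicle state. This is a minimal illustration only; all thresholds and the `curve_safe_speed` helper are hypothetical assumptions, not values from Adell [30]:

```python
# Illustrative sketch of the four alert conditions of a driver support
# system like the one trialled by Adell [30]. Every threshold here is a
# made-up placeholder, not a value from the study.

def curve_safe_speed(curve_radius_m: float) -> float:
    """Hypothetical safe speed (m/s) for a curve, capped at ~120 km/h,
    assuming roughly 0.3 g of comfortable lateral acceleration."""
    return min(33.0, (0.3 * 9.81 * curve_radius_m) ** 0.5)

def alerts(gap_s, closing_speed_ms, speed_ms, curve_radius_m, limit_ms):
    """Return which of the four warning conditions are currently active."""
    return {
        "headway_too_short": gap_s < 1.5,            # (1) too close to lead car
        "collision_course": closing_speed_ms > 3.0,  # (2) closing too fast
        "too_fast_for_geometry":
            speed_ms > curve_safe_speed(curve_radius_m),  # (3) road geometry
        "over_speed_limit": speed_ms > limit_ms,     # (4) above posted limit
    }

# A tailgating, speeding approach to a tight curve triggers all four alerts.
active = alerts(gap_s=1.0, closing_speed_ms=5.0, speed_ms=25.0,
                curve_radius_m=100.0, limit_ms=22.2)
```

In a real system each predicate would of course be filtered and hysteresis-controlled to avoid nuisance alarms; the point here is only the decomposition into four independent conditions.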

TAM Extension to Autonomous Vehicles. Choi and Ji [31] developed a TAM-based model explaining early intent to use an autonomous vehicle (AV), which demonstrated that PU and trust were necessary precursors to the intention to use an AV, with a very weak effect of PEOU on behavioral intent. They discerned three constructs that positively influenced an individual’s trust in AV: (1) System Transparency (i.e., the degree to which users can predict and understand the AV’s operation), (2) Technical Competence (i.e., the user’s perception of the AV’s performance), and (3) Situation Management (i.e., the user’s belief that he or she can recover control whenever desired), which in turn alleviated their level of perceived risk. They further proposed that users with an external locus of control might take a more passive role with an automated system [32], making it easier for them to rely on an automated driving system [33].

UTAUT Extension to Automated Road Transport Systems. Madigan et al. [34] investigated user acceptance of the automated road transport systems (ARTS) involved in the European CityMobil2 project through the UTAUT model. ARTS are highly automated vehicles that run at low speeds on dedicated routes (i.e., a predetermined, demonstrated path, not based on active mapping), meant to complement and feed into the main public transport network in areas of low or dispersed demand [35]. Prior survey work [36] found that travelers’ decision to use motorized public transportation hinged on the quality of the weather, illumination, on-board comfort, and the distance travelled on foot, with a preference for cybernetic transport systems (i.e., ARTS, portrayed by system descriptions and in operation around the surveyed city) that increased with age. Madigan et al. [34] assessed user acceptance of ARTS vehicles being used in two European cities (La Rochelle in France and Lausanne in Switzerland) as part of the CityMobil2 project using the UTAUT framework, and found that performance expectancy, effort expectancy, and social influence were predictors of behavioral intentions to use ARTS, with performance expectancy having the strongest impact. The authors further suggested that future studies should gauge on-board comfort, as hedonic motivation has been shown to be an important determinant of behavioral intention in consumer-based contexts [37]. The authors also suggested that actual interaction with ARTS, rather than the system descriptions used by Delle Site et al. [36], led to age not having a significant impact on UTAUT variables.

Automation Acceptance Model.

The Automation Acceptance Model (AAM; [38]) is an augmentation of TAM that stresses the importance of trust and task-technology compatibility in order to account for automation use’s dynamic and multi-level nature. AAM additionally emphasizes that actual system use (i.e., experience with the automated system) feeds back into the user’s perceptions of compatibility, trust, PU, PEOU, and behavioral intention to use automation. Trust formation through experience and task-technology compatibility are expanded upon below and then discussed in the context of using Advanced Driver Assistance Systems (ADAS) or AV to enhance OAs’ mobility.

Trust in the Technology. Trust is social in nature and is largely based on our interpersonal relationships with other humans. A framework discerning the similarities and differences between human-human and human-automation trust (for decision support systems, specifically) can be found in Madhavan and Wiegmann [39]. Broadly, this framework states that humans naturally tend to react socially to seemingly intelligent machines, deferring to them as advisors because decision support systems rigorously follow well-designed schemata to make their decisions. This deference is fragile, though, and prone to break down due to the system’s rigidity, as it lacks human adaptability. Attitudes relating to trust in automation play an important role in user reliance and acceptance [40, 41], and indeed, users show more reliance on automation they trust [42–44].

After reviewing the trust literature, Lee and See [40] described the basis of trust as comprising three dimensions: Purpose (i.e., the degree to which the automation is being used as intended by the designer), Process (i.e., how the automation functions in the situation to fulfill the user’s needs), and Performance (i.e., past or present operation of the automation, including reliability, predictability, and ability). These three dimensions are judged by the user on the system’s surface features (i.e., aesthetics, feel, information structure; [45–47]) and depth features (i.e., the automation’s performance, observability, controllability; [40]). These dimensions should lead to properly calibrated levels of trust being placed in the automation based on its capabilities. Over-trust can lead to misuse (i.e., using the automation in situations where it is not appropriate) and complacency (discussed later in the Task-Technology Compatibility section), while under-trust can lead to disuse (i.e., not using the automation when it is capable of helping; [48]). Lee and See [40] stress that automation should be designed to be technically capable of performing a prescribed task (i.e., trustworthy), but also to be operationally unintimidating and easily understood (i.e., trustable). Hoff and Bashir’s [49] dissection of factors that influence trust formation in automation use resulted in a three-layered model that accounts for unique characteristics belonging to the user (dispositional trust), the situation (situational trust), and the dynamic effects of experience (learned trust), and aligns well with AAM.

Task-Technology Compatibility. It has been shown that adding task-technology fit model constructs to TAM improves predictions of use [50]. Compatibility consists of the degree of fit between the human, the technology, the task to be performed, and the situation [51]. This involves an older driver and their potentially age-compromised faculties, the particular device or devices that might complement them, and the level of assistance called for in the situation (e.g., providing warnings to draw the driver’s attention to unheeded hazards, or some form of automated take-over if the driver does not respond to these hazards in time).

In the automation literature, high levels of automation (LOAs) have been shown to lead to complacency, degraded situational awareness, de-skilling, and mode confusion (e.g., [48]), and conversely, low LOAs can lead to poor performance when the system’s demands exceed the operator’s capacity [52]. With this in mind, the appropriate LOA and type of ADAS to maximize task-technology compatibility for OAs can be discussed. It has been suggested that fully automating a process should be limited to situations where the user fails to respond, or cannot respond fast enough [53]. An example of when automated takeover of this nature might occur would be when an older driver (1) does not notice a sudden stop by the vehicle they are following, or (2) does not or cannot react fast enough to this sudden stop. In the first case, a forward collision warning system might call attention to an unheeded deceleration by the car in front of them and allow the older driver to react appropriately, and hence might not require further intervention or takeover of the driving task. But in the second case, an older driver’s reaction time to this surprise deceleration of the car in front of them might not be sufficient, and an automatic braking system might help them avoid a rear-end collision. Interestingly, it has been shown that both methods can be effective, but drivers are actually more accepting of collision warnings than automatic braking which overrides their control, even if it performs better than they could [54].
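The warn-first, brake-only-if-necessary principle described above can be sketched as a single decision rule per control cycle. This is a toy illustration of the escalation logic, not any production system; both time-to-collision thresholds are assumed values:

```python
# Minimal sketch of the escalation logic discussed above: a forward collision
# warning fires first, and automatic braking takes over only if the driver
# has not responded before a last-moment threshold. Thresholds are assumed.

WARN_TTC_S = 3.0    # issue a warning below this time-to-collision (assumed)
BRAKE_TTC_S = 1.2   # take over with automatic braking below this (assumed)

def assist_action(ttc_s: float, driver_braking: bool) -> str:
    """Decide the system's action for one control cycle."""
    if ttc_s <= BRAKE_TTC_S and not driver_braking:
        return "auto_brake"   # driver did not, or could not, react in time
    if ttc_s <= WARN_TTC_S:
        return "warn"         # draw attention but leave control with driver
    return "none"
```

An attentive driver who brakes after the warning never reaches the takeover branch (`assist_action(1.0, True)` still returns `"warn"`), which matches the finding that drivers prefer warnings over control-overriding automatic braking [54].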

3.2 Important Considerations for OA Technology Adoption

Facilitators and Determinants of OA Technology Adoption.

Lee and Coughlin [55] conducted a review of the technology acceptance literature and posited ten facilitators and determinants of OAs’ technology adoption, which show some overlap with TAM and UTAUT (several items are novel contributions of [55] that are particular to OAs):

  • Value: perception of usefulness and potential benefit (analogous to PU from TAM)

  • Usability: perception that technology is easy to use, user-friendly (analogous to PEOU from TAM)

  • Affordability: perception of potential cost savings

  • Accessibility: knowledge of the technology’s existence and its availability in the market place

  • Technical support: Availability of quality professional assistance throughout use (analogous to facilitating conditions from UTAUT)

  • Social support: support from family, friends, and peers (analogous to social influence from UTAUT)

  • Emotion: perception of emotional and psychological benefits

  • Independence: perception of social visibility, or how a technology makes them look to others (analogous to image from UTAUT)

  • Experience: relevance with their prior experiences and interactions

  • Confidence: empowerment without anxiety or intimidation

OA Adoption of ADAS and/or AV. In the case of using ADAS or AV to help OAs maintain their mobility, Lee and Coughlin’s [55] accessibility, independence, and confidence are of particular interest. Accessibility is important in that an OA needs to know that ADAS or AV technology exists and is capable of helping them safely maintain their personal mobility before they can be persuaded to adopt it, and it must be within their price range. Many of these technologies are offered on luxury brand vehicles, which might be too expensive for an older driver thinking of giving up their keys. Many older adults regard the personal vehicle as vital to their well-being and independence (e.g., [56]), and these technologies may help them maintain that sense of independence so long as they avoid stigmatization, which has been shown to drive OAs away from adoption [57, 58]. Finally, in terms of confidence, these technologies have the best chance of being used properly if their adoption is discretionary rather than mandatory (i.e., forced in order to maintain licensure), as under mandatory or forced use an individual might delay, obstruct, underutilize, or sabotage a system [59].

As fully autonomous vehicles will not be available in the short term, and their time of arrival is debated by experts in the area, Ghazizadeh and colleagues’ [38] emphasis on task-technology compatibility is nicely informed by Eby and colleagues’ [60] review of in-vehicle technologies and their potential to help older drivers extend the amount of time they are able to drive safely. Eby et al. [60] emphasize the importance of training and/or education on the operation of the particular system to maximize these systems’ effectiveness and safe usage.

4 Methods

As part of an FDOT contract aimed at assessing attitudes towards AV among older Floridians, Duncan et al. [61] collected survey data. This survey included questions on familiarity with, general opinion of, and willingness to use particular ADAS systems that had not yet been examined, as the report was commissioned for, and largely dealt with, AV. Hence, secondary analyses were conducted on the data set regarding OAs’ attitudes towards and acceptance of not only autonomous vehicles, but ADAS as well. Data were collected by mailing out surveys using voter registration lists, over-sampling older adults (age 55+) in order to gain a better idea of this age group’s knowledge and preferences related to AV and ADAS. Respondents either completed and mailed back their surveys, or completed the survey online. In total, 5,000 surveys were mailed out in two waves, and 459 total responses were received, for a response rate of 9.18 %, which is consistent with other mail-out survey response rates. Before being mailed, half of the survey packets were randomly selected to include an additional informational insert describing AV and different ADAS systems. This led to 188 survey respondents receiving this extra information (271 did not receive the informational insert). This was done to examine the effects of a basic level of AV and ADAS education on respondents’ attitudes toward these technologies. After controlling for age and income level, these groups did not differ on willingness to use AV (F(1, 392) = 2.96, p = .09), and were combined for the following analyses.
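The sample figures reported above are internally consistent, as a quick arithmetic check shows:

```python
# Quick consistency check of the survey figures reported in the Methods.
mailed = 5000                     # surveys sent out in two waves
responses = 459                   # total responses received
with_insert, without_insert = 188, 271  # informational-insert split

# The two insert groups account for every response.
assert with_insert + without_insert == responses

# 459 / 5000 reproduces the reported 9.18 % response rate.
response_rate = 100 * responses / mailed
print(f"Response rate: {response_rate:.2f} %")  # 9.18 %
```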

5 Results

5.1 Exploratory Factor Analysis of AV Survey

A principal components exploratory factor analysis with varimax rotation was conducted on 51 survey items concerning familiarity with, general opinion of, and willingness to use AV and select ADAS systems (Cruise Control, Lane Departure Warning, Blind Spot Monitor, Active Lane Centering, Automatic Braking, Adaptive Cruise Control, and Self-Parking Systems); comfort with riding in an AV oneself or having a loved one do so; AV-related concerns and benefits (e.g., concern with AV driving as well as human drivers, the benefit of less traffic congestion with AV); questions on the pricing of AV; and interest in different ownership models (privately-owned, shared-ownership, autonomous public transit, AV for hire). Principal components analysis was used to identify and compute composite scores for the factors underlying survey respondents’ attitudes towards AV and ADAS. This resulted in 10 unique factors that explained 70.4 % of the variance.
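As a toy illustration of what “percent variance explained” means in a principal components analysis: for two standardized items with correlation r, the 2×2 correlation matrix has eigenvalues 1 + r and 1 − r, so the first component explains (1 + r)/2 of the total variance. This two-item sketch is for intuition only and is not the actual 51-item analysis:

```python
# Two-item illustration of "percent variance explained" in principal
# components analysis. For standardized items with correlation r, the
# correlation matrix [[1, r], [r, 1]] has eigenvalues 1 + r and 1 - r.

def variance_explained(r: float) -> tuple[float, float]:
    """Proportion of total variance carried by each principal component."""
    ev1, ev2 = 1 + r, 1 - r      # eigenvalues of the correlation matrix
    total = ev1 + ev2            # equals the number of items (here, 2)
    return ev1 / total, ev2 / total

# With r = 0.6, the first component captures 80 % of the variance.
first, second = variance_explained(0.6)
print(f"PC1 explains {first:.0%} of the variance")
```

The survey analysis generalizes this idea to 51 items, retaining the 10 components whose combined eigenvalues account for 70.4 % of the total variance.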

5.2 Regression Analyses of Factors

Multiple linear regressions were calculated to assess potential predictors (age, gender, annual income, ease of new technology use, and receipt of the informational insert) of each factor from the exploratory factor analysis. Age was measured in years; gender was self-reported as 0 = female, 1 = male; annual income was self-reported by selecting one of six levels: under $25k, $25k–49,999, $50k–74,999, $75k–99,999, $100k–150k, or more than $150k; ease of new technology use was self-reported on a 5-point Likert scale ranging from 1 = ‘strongly disagree new technology is easy to use’ to 5 = ‘strongly agree new technology is easy to use’; and received info was dummy coded as 0 = did not receive the informational insert and 1 = received the informational insert. See Table 1 for the results of these regressions. Significant predictors are elaborated on below.

Table 1. Regression results. Non-significant predictors withheld. ^ = p < .10, * = p < .05, ** = p < .01, *** = p < .001

Factor 1: General AV Attitudes and Willingness. Table 1 shows that those reporting higher levels of ease of new technology use, and to a lesser extent those who received an information sheet, were generally more accepting of automated driving technology (both AV and self-parking systems) for themselves and others, were more interested in different models of AV ownership, and looked forward more to potential benefits like fewer crashes, reduced crash severity, and more enjoyable travel.

Factor 2: Benefits of AV. Those reporting higher levels of ease of new technology use were generally more prone to think the introduction of AV would lead to benefits such as fewer crashes with reduced severity, less traffic congestion, shorter travel time, better fuel economy, more enjoyable travel, as well as enhanced mobility for those unable to drive and improved pedestrian safety.

Factor 3: Concerns of AV. Those reporting higher levels of ease of new technology use, who received an information sheet, and to a lesser extent those reporting higher income levels, reported less concern about issues involving AV deployment such as safety consequences, legal liability in a crash, vehicle security, data privacy, sharing the road, safely interacting with AV, as well as learning to use AV.

Factor 4: ADAS Familiarity. Greater comfort using new technology, greater household income, and being male were associated with greater familiarity with ADAS systems.

Factor 6: Willingness to Use ADAS. Higher age, greater comfort using new technology, and higher reported income were associated with greater willingness to use ADAS, while, curiously, those who had received an informational sheet expressed lower willingness to use ADAS.

Factor 8: Cruise Control Familiarity, Opinion, and Willingness to Use. Greater comfort with new technology use and being male were associated with higher levels of familiarity, opinion, and willingness to use cruise control.

Factor 9: Different AV Owner/Ridership Models. Men were more willing to use different AV owner/ridership models such as shared-ownership, AV public transportation, and/or AV for hire.

6 Discussion

It is clear that TAM’s PU and PEOU maintain their importance in the case of OA adoption of AV and/or ADAS. Adell [30], despite being under-powered and evaluating a mix of systems combined as one, provides a good first step and informs future studies of ADAS acceptance about the importance of social influence, effort expectancy, and tailoring measures to the driving task. Madigan and colleagues’ [34] study on acceptance of ARTS highlighted comfort’s role in the consumer acceptance domain, as well as the necessity of some form of actual interaction with the technology. Choi and Ji [31] showed strong effects of PU on personal AV adoption, but weaker effects of PEOU. This weaker effect of PEOU might hold for their younger sample, but might not remain true for an older one. Here, within an older sample, self-reported ease of new technology use was a significant predictor of factors relating to increased general comfort with and willingness to use AV (F1), greater expectation of AV-related benefits (F2), reduced AV concerns (F3), increased familiarity with ADAS (F4), and greater willingness to use ADAS (F6), and was marginally associated with more passive modes of AV use (F9).

The importance of trust in automation adoption is highlighted by AAM [38], and Choi and Ji [31] show that this extends to AV. Choi and Ji [31] stress that a user’s trust in an AV is predicated on its predictability and the easy comprehension of its function (system transparency), an acceptable level of performance as perceived by the user (technical competence), and, finally, the perception that they can intervene if they find it necessary (situation management). An appreciation of trust’s dynamic formation during the adoption of automation, and of its many levels and sublevels during use, as elaborated by Hoff and Bashir [49], provides a framework for accurately assessing trust throughout these processes.

It is apparent based on the literature that trust is a major component of adopting automation, and that this trust is dynamic in nature (e.g., [38]). Many technology adoption studies assessing ADAS systems (e.g., [30]) are cross-sectional in nature, and hence unable to account for how trust in a system may change with repeated use and/or successful incorporation of the technology into the individual’s lifestyle. Furthermore, as Hoff and Bashir’s [49] layers of trust suggest, cross-sectional studies dealing with trust might at best be assessing only the individual’s dispositional trust, or an under-informed trust that may grow with repeated successful usage, if given the chance. Future studies should account for this by incorporating more dynamic measurements of trust into a longitudinal design.

The exploratory factors calculated from the Floridian AV survey need to be replicated by other similar studies, but make sense in light of AV and ADAS adoption by aging drivers. Interestingly, receiving the informational sheet significantly lowered AV concerns (F3) and was marginally associated with more positive attitudes towards AV (F1), but was also associated with lower willingness to use ADAS (F6). Of particular interest is that age positively predicted willingness to use ADAS (F6), which provides support for Eby and colleagues’ [60] “optimistic yes” to the question of whether ADAS could help older drivers drive more safely, later in life.

One area of interest for future research is investigating the effects of different levels of training on an older driver’s acceptance of a particular ADAS. As was seen in the secondary analysis of the survey data, the informational sheet did not affect respondents’ willingness to use AV, but it did impact more global views of AV, as shown by the first factor. Eby et al. [60] stressed that some ADAS (e.g., adaptive cruise control) need to be paired with driver training in order to be recommended for OAs’ use. Training can take a variety of forms, from written manuals or instructions to informational videos or hands-on tutorials. Assessing which of these is quickest and most effective in accurately calibrating OAs’ trust in a given system would be valuable for advancing the field.

References

  1. Insurance Institute for Highway Safety: Decline in crash risk spurs better outlook for older drivers. Status report, vol. 50, no. 2 (2015). http://www.iihs.org/iihs/sr/statusreport/article/50/2/2

  2. Sivak, M., Schoettle, B.: Recent changes in the age composition of drivers in 15 countries. Report No. UMTRI-2011-43, University of Michigan Transportation Research Institute, Ann Arbor, MI (2011)

    Google Scholar 

  3. Colby, S.L., Ortman, J.M.: Projections of the size and composition of the U.S. population: 2014 to 2060. Current Population reports, pp. P25–1143. U.S. Census Bureau, Washington, DC (2015)

    Google Scholar 

  4. Jastrzembski, T.S., Charness, N.: The model human processor and the older adult: parameter estimation and validation within a mobile phone task. J. Exp. Psychol.-Appl. 13, 224–248 (2007)

    CrossRef  Google Scholar 

  5. Marottoli, R.A., Richardson, E.D., Stowe, M.H., Miller, E.G., Brass, L.M., Cooney Jr., L.M., Tinetti, M.E.: Development of a test battery to identify older drivers at risk for self-reported adverse driving events. J. Am. Geriatr. Soi. 46, 562–568 (1998)

    CrossRef  Google Scholar 

  6. Charlton, J., Koppel, S., O’Hare, M., Andrea, D., Smith, G., Khodr, B., Langford, J., Odell, M., Fildes, B.: Influence of chronic illness on crash involvement of motor vehicle drivers. Report No. 213. Monash University Accident Research Centre, Victoria, Australia (2004)

    Google Scholar 

  7. Anstey, K.J., Wood, J., Lord, S., Walker, J.G.: Cognitive, sensory and physical factors enabling driving safety in older adults. Clin. Psychol. Rev. 25(1), 45–65 (2005)

    CrossRef  Google Scholar 

  8. Dobbs, B.M.: Medical conditions and driving: a review of the scientific literature (1960–2000). Report No. DOT HSW 809 690. US Department of Transportation, Washington, DC (2005)

    Google Scholar 

  9. Eby, D., Molnar, L., Kartje, P.: Maintaining Safe Mobility in an Aging Society. Taylor and Francis, London (2009)

    Google Scholar 

  10. Boot, W.R., Stothart, C., Charness, N.: Improving the safety of aging road users: a mini-review. Gerontology 60(1), 90–96 (2014)

    CrossRef  Google Scholar 

  11. West, C.G., Gildengorin, G., Haegerstrom-Portnoy, G., Lott, L.A., Schneck, M.E., Brabyn, J.A.: Vision and driving self-restriction in older adults. J. Am. Geriatr. Soi. 51(10), 1348–1355 (2003)

    CrossRef  Google Scholar 

  12. Molnar, L.J., Eby, D.W., Zhang, L., Zanier, N., St. Louis, R.M., Kostyniuk, L.P.: Self-regulation of driving by older adults: a synthesis of the literature and framework for future research. AAA Foundation for Traffic Safety Website (2015). https://www.aaafoundation.org/self-regulation-driving-older-adults-longroad-study-0

  13. Hutchinson, T.E., Massachusetts Institute of Technology. Department of Civil and Environmental Engineering: Driving confidence and in-vehicle telematics: a study of technology adoption patterns of the 50+ driving population, 98, [6] (2004). http://dspace.mit.edu/handle/1721.1/29389

  14. Chihuri, S., Mielenz, T.J., DiMaggio, C.J., Betz, M.E., DiGuiseppi, C., Jones, V.C., Li, G.: Driving Cessation and Health Outcomes in Older Adults. AAA Foundation for Traffic Safety, Washington, DC (2015)

    Google Scholar 

  15. Brown, S., Venkatesh, V.: Model of adoption of technology in households: a baseline model test and extension incorporating household life cycle. MIS Q. 29(3), 399–426 (2005). doi:10.2307/25148690

    Google Scholar 

  16. Carrigan, M., Szmigin, I.: In pursuit of youth: what’s wrong with the older market (1999). doi:10.1108/02634509910285637

    Google Scholar 

  17. Czaja, S.J., Charness, N., Fisk, A.D., Hertzog, C., Nair, S.N., Rogers, W.A., Sharit, J.: Factors predicting the use of technology: findings from the Center for Research and Education on Aging and Technology Enhancement (CREATE). Psychol. Aging 21, 333–352 (2006). doi:10.1037/0882-7974.21.2.33

  18. Demiris, G., Rantz, M., Aud, M., Marek, K., Tyrer, H., Skubic, M., Hussam, A.: Older adults’ attitudes towards and perceptions of “smart home” technologies: a pilot study. Med. Inform. Internet Med. 29(2), 87–94 (2004). doi:10.1080/14639230410001684387

  19. Melenhorst, A.-S., Rogers, W.A., Caylor, E.C.: The use of communication technologies by older adults: exploring the benefits from the user’s perspective. In: Proceedings of the Human Factors and Ergonomics Society 45th Annual Meeting (2001). doi:10.1177/154193120104500305

  20. Hopkins, C.D., Roster, C.A., Wood, C.M.: Making the transition to retirement: appraisals, post-transition lifestyle, and changes in consumption patterns. J. Consum. Mark. 23(2), 87–99 (2006). doi:10.1108/07363760610655023

  21. Niemelä-Nyrhinen, J.: Baby boom consumers and technology: shooting down stereotypes. J. Consum. Mark. 24(5), 305–312 (2007). doi:10.1108/07363760710773120

  22. Davis, F.D.: Perceived usefulness, perceived ease of use and user acceptance of information technology. MIS Q. 13, 319–339 (1989)

  23. Davis, F.D., Venkatesh, V.: A critical assessment of potential measurement biases in the technology acceptance model: three experiments. Int. J. Hum.-Comput. Stud. 45(1), 19–45 (1996)

  24. Davis, F.D., Bagozzi, R.P., Warshaw, P.R.: User acceptance of computer technology: a comparison of two theoretical models. Manag. Sci. 35(8), 982–1003 (1989)

  25. Venkatesh, V., Davis, F.D.: A theoretical extension of the technology acceptance model: four longitudinal field studies. Manag. Sci. 46(2), 186–204 (2000)

  26. Venkatesh, V., Morris, M.G., Davis, G.B., Davis, F.D.: User acceptance of information technology: toward a unified view. MIS Q. 27(3), 425–478 (2003)

  27. Pavlou, P.A.: Consumer acceptance of electronic commerce: integrating trust and risk with the technology acceptance model. Int. J. Electron. Commer. 7(3), 101–134 (2003)

  28. Carter, L., Bélanger, F.: The utilization of e-government services: citizen trust, innovation and acceptance factors. Inf. Syst. J. 15(1), 5–25 (2005)

  29. Karahanna, E., Straub, D.W., Chervany, N.L.: Information technology adoption across time: a cross-sectional comparison of pre-adoption and post-adoption beliefs. MIS Q. 23(2), 183–213 (1999)

  30. Adell, E.: Acceptance of driver support systems. In: Proceedings of the European Conference on Human Centred Design for Intelligent Transport Systems, Berlin, Germany, pp. 475–486 (2010)

  31. Choi, J.K., Ji, Y.G.: Investigating the importance of trust on adopting an autonomous vehicle. Int. J. Hum.-Comput. Interact. 31(10), 692–702 (2015)

  32. Stanton, N.A., Young, M.S.: Driver behaviour with adaptive cruise control. Ergonomics 48, 1294–1313 (2005)

  33. Rudin-Brown, C.M., Noy, Y.I.: Investigation of behavioral adaptation to lane departure warnings. Transp. Res. Rec. 1803, 30–37 (2002)

  34. Madigan, R., Louw, T., Dziennus, M., Graindorge, T., Ortega, E., Graindorge, M., Merat, N.: Acceptance of automated road transport systems (ARTS): an adaptation of the UTAUT model. In: Proceedings of 6th Transport Research Arena, Warsaw, Poland (2015)

  35. Alessandrini, A., Campagna, A., Delle Site, P., Filippi, F., Persia, L.: Automated vehicles and the rethinking of mobility and cities. Transp. Res. Procedia 5, 145–160 (2015)

  36. Delle Site, P., Filippi, G., Giustiniani, G.: Users’ preferences towards innovative and conventional public transport. Procedia-Soc. Behav. Sci. 20, 906–915 (2011)

  37. Venkatesh, V., Thong, J.Y.L., Xu, X.: Consumer acceptance and use of information technology: extending the unified theory of acceptance and use of technology. MIS Q. 36(1), 157–178 (2012)

  38. Ghazizadeh, M., Lee, J.D., Boyle, L.N.: Extending the technology acceptance model to assess automation. Cogn. Technol. Work 14(1), 39–49 (2012)

  39. Madhavan, P., Wiegmann, D.A.: Similarities and differences between human–human and human–automation trust: an integrative review. Theor. Issues Ergon. Sci. 8(4), 277–301 (2007)

  40. Lee, J.D., See, K.A.: Trust in automation: designing for appropriate reliance. Hum. Factors 46(1), 50–80 (2004)

  41. Muir, B.M.: Trust between humans and machines, and the design of decision aids. Int. J. Man-Mach. Stud. 27(5–6), 527–539 (1987)

  42. Lee, J.D., Moray, N.: Trust, control strategies and allocation of function in human-machine systems. Ergonomics 35(10), 1243–1270 (1992)

  43. Lee, J.D., Moray, N.: Trust, self-confidence, and operators’ adaptation to automation. Int. J. Hum.-Comput. Stud. 40(1), 153–184 (1994)

  44. Parasuraman, R., Sheridan, T.B., Wickens, C.D.: Situation awareness, mental workload, and trust in automation: viable, empirically supported cognitive engineering constructs. J. Cogn. Eng. Decis. Mak. 2(2), 140–160 (2008)

  45. Kim, J., Moon, J.Y.: Designing towards emotional usability in customer interfaces—trustworthiness of cyber-banking system interfaces. Interact. Comput. 10(1), 1–29 (1998)

  46. Fogg, B.J., Marshall, J., Laraki, O., Osipovich, A., Varma, C., Fang, N., Paul, J., Rangnekar, A., Shon, J., Swani, P.: What makes web sites credible? A report on a large quantitative study. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 295–299 (2001)

  47. Karvonen, K., Parkkinen, J.: Signs of trust: a semiotic study of trust formation in the web. In: Smith, M.J., Salvendy, G., Harris, D., Koubek, R.J. (eds.) First International Conference on Universal Access in Human-Computer Interaction, Erlbaum, Mahwah, vol. 1, pp. 1076–1080 (2001)

  48. Parasuraman, R., Riley, V.: Humans and automation: use, misuse, disuse, abuse. Hum. Factors 39(2), 230–253 (1997)

  49. Hoff, K.A., Bashir, M.: Trust in automation: integrating empirical evidence on factors that influence trust. Hum. Factors 57(3), 407–434 (2015)

  50. Dishaw, M.T., Strong, D.M.: Extending the technology acceptance model with task-technology fit constructs. Inf. Manag. 36(1), 9–21 (1999)

  51. Karahanna, E., Agarwal, R., Angst, C.M.: Reconceptualizing compatibility beliefs in technology acceptance research. MIS Q. 30(4), 781–804 (2006)

  52. Parasuraman, R., Sheridan, T.B., Wickens, C.D.: A model for types and levels of human interaction with automation. IEEE Trans. Syst. Man Cybern. Part A: Syst. Hum. 30(3), 286–297 (2000)

  53. Moray, N., Inagaki, T., Itoh, M.: Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. J. Exp. Psychol.-Appl. 6(1), 44 (2000)

  54. Inagaki, T., Itoh, M., Nagai, Y.: Support by warning or by action: which is appropriate under mismatches between driver intent and traffic conditions? IEICE Trans. Fundam. Electron. Commun. Comput. Sci. 90(11), 2540 (2007)

  55. Lee, C., Coughlin, J.F.: PERSPECTIVE: older adults’ adoption of technology: an integrated approach to identifying determinants and barriers. J. Prod. Innov. Manag. (2014). doi:10.1111/jpim.12176

  56. Hassan, H., King, M., Watt, K.: The perspectives of older drivers on the impact of feedback on their driving behaviours: a qualitative study. Transp. Res. Part F: Traffic Psychol. Behav. 28, 25–39 (2015)

  57. Demiris, G., Rantz, M., Aud, M., Marek, K., Tyrer, H., Skubic, M., Hussam, A.: Older adults’ attitudes towards and perceptions of “smart home” technologies: a pilot study. Med. Inform. Internet Med. 29(2), 87–94 (2004). doi:10.1080/14639230410001684387

  58. Kang, H.G., Mahoney, D.F., Hoenig, H., Hirth, V.A., Bonato, P., Hajjar, I., Lipsitz, L.A.: In situ monitoring of health in older adults: technologies and issues. J. Am. Geriatr. Soc. (2010). doi:10.1111/j.1532-5415.2010.02959.x

  59. Leonard-Barton, D.: Implementation characteristics of organizational innovations: limits and opportunities for management strategies. Commun. Res. 15(5), 603–631 (1988)

  60. Eby, D.W., Molnar, L.J., Zhang, L., St. Louis, R.M., Zanier, N., Kostyniuk, L.P.: Keeping older adults driving safely: a research synthesis of advanced in-vehicle technologies. AAA Foundation for Traffic Safety Website (2015). https://www.aaafoundation.org/keeping-older-adults-driving-safely-research-synthesis-advanced-vehicle-technologies-longroad-study

  61. Duncan, M., Charness, N., Chapin, T., Horner, M., Stevens, L., Richard, A., Souders, D.J., Crute, J., Riemondy, A., Morgan, D.: Enhanced mobility for aging populations using automated vehicles. Florida Department of Transportation Website (2015). http://www.dot.state.fl.us/research-center/Completed_Proj/Summary_PL/FDOT-BDV30-977-11-rpt.pdf

Acknowledgements

This research was funded in part by the Florida Department of Transportation, Contract BDV30-977-11, “Enhanced mobility for aging populations using automated vehicles” (http://www.dot.state.fl.us/research-center/Completed_Proj/Summary_PL/FDOT-BDV30-977-11-rpt.pdf).

Author information

Correspondence to Dustin Souders.

Copyright information

© 2016 Springer International Publishing Switzerland

Cite this paper

Souders, D., Charness, N. (2016). Challenges of Older Drivers’ Adoption of Advanced Driver Assistance Systems and Autonomous Vehicles. In: Zhou, J., Salvendy, G. (eds.) Human Aspects of IT for the Aged Population. Healthy and Active Aging. ITAP 2016. Lecture Notes in Computer Science, vol. 9755. Springer, Cham. https://doi.org/10.1007/978-3-319-39949-2_41

  • DOI: https://doi.org/10.1007/978-3-319-39949-2_41

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-39948-5

  • Online ISBN: 978-3-319-39949-2
