Abstract
Joint human–automation performance often depends on the factors that influence an operator’s tendency to rely on and comply with automation. Whereas cognitive engineering (CE) researchers have studied automation acceptance in terms of task–technology compatibility and human–technology coagency, information systems (IS) researchers have evaluated user acceptance of technology using the Technology Acceptance Model (TAM). The parallels between the two views suggest that the user acceptance perspective from the IS community can complement the human–automation interaction perspective from the CE community. TAM defines the constructs that govern acceptance and provides a framework for evaluating a broad range of factors influencing technology acceptance and reliance. IS researchers have applied TAM extensively across domains, and it can likewise be applied to assess the effect of trust and other factors on automation acceptance. Extensions to the TAM framework use the constructs of task–technology compatibility and past experience to broaden its account of the role of human–automation interaction in automation adoption. We propose the Automation Acceptance Model (AAM), which draws on both the IS and CE perspectives and accounts for the dynamic, multi-level nature of automation use, highlighting the influence of use on attitudes as a complement to the more common view that attitudes influence use.
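The feedback loop the abstract highlights—attitude drives use, and use in turn reshapes attitude—can be illustrated with a minimal simulation. This sketch is not from the paper: the update rule, parameter names, and values are all illustrative assumptions, chosen only to show how repeated reliance on automation lets its observed performance feed back into the operator's attitude.

```python
import random

def simulate_acceptance(automation_reliability=0.9,
                        initial_attitude=0.5,
                        learning_rate=0.2,
                        steps=50,
                        seed=0):
    """Return the attitude trajectory over repeated interactions.

    Hypothetical model: attitude (0..1) sets the probability of relying
    on the automation; each reliance episode exposes the automation's
    actual performance, which nudges attitude toward that outcome.
    """
    rng = random.Random(seed)
    attitude = initial_attitude
    history = [attitude]
    for _ in range(steps):
        relies = rng.random() < attitude  # attitude -> use
        if relies:
            # Use exposes actual performance, which feeds back
            # into attitude (use -> attitude).
            outcome = 1.0 if rng.random() < automation_reliability else 0.0
            attitude += learning_rate * (outcome - attitude)
        history.append(attitude)
    return history

traj = simulate_acceptance()
print(f"attitude after 50 interactions: {traj[-1]:.2f}")
```

Under these assumptions, attitude drifts toward the automation's true reliability only to the extent that the operator actually uses it—a toy version of the dynamic, experience-driven adoption process the AAM is meant to capture.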
Acknowledgments
The authors would like to thank the members of the Cognitive Systems Laboratory (CSL) at the University of Wisconsin–Madison, for their helpful comments on earlier versions of this manuscript.
Cite this article
Ghazizadeh, M., Lee, J.D. & Boyle, L.N. Extending the Technology Acceptance Model to assess automation. Cogn Tech Work 14, 39–49 (2012). https://doi.org/10.1007/s10111-011-0194-3