
Should I Add Recommendations to My Warning System? The RCRAFT Framework Can Answer This and Other Questions About Supporting the Assessment of Automation Designs

  • Conference paper
Human-Computer Interaction – INTERACT 2021 (INTERACT 2021)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 12935)


Abstract

Automation is widespread in interactive applications, promising multiple benefits to users, including enhanced comfort, safety, security and entertainment. Implicitly or explicitly, automation is now a critical design option for interactive application designers. Unfortunately, despite its long use (especially in safety-critical systems), assessing the benefits and drawbacks of design alternatives that include automation remains a craft activity, unsupported by conceptual frameworks or tools. To address this problem, we present the RCRAFT framework. The framework considers five attributes of automation: Resources, Control Transitions, Responsibility, Authority, and System Functions and User Tasks. We show how these attributes support the assessment of designs involving automation. Furthermore, adding the RCRAFT concepts to task models makes it possible to evaluate automation properties such as transparency, congruence and controllability, in addition to usability. We demonstrate the utility of our approach in a case study comparing the design of an existing Flight Warning System currently deployed in the Airbus A350 against a redesigned version that provides recommendations to pilots handling unusual situations. We show that the RCRAFT framework helps highlight the implications of different design alternatives by making explicit the impact of proposed changes on both users’ work and the required properties of the automated components.
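
As a purely illustrative aid (not part of the paper), the sketch below shows one way the five RCRAFT attributes could be attached to task-model elements when comparing two design options. It is a minimal Python sketch under stated assumptions: the names `RcraftElement` and `Agent`, the fields, and the example entries are hypothetical, and the congruence check is just one simple analysis an assessor might run, not the authors' method or tooling.

```python
# Hypothetical sketch: annotating user tasks / system functions with RCRAFT
# attributes (Resources, Control Transitions, Responsibility, Authority,
# Functions and Tasks) and flagging elements where authority and
# responsibility diverge. Names and example data are illustrative only.
from dataclasses import dataclass, field
from enum import Enum
from typing import List


class Agent(Enum):
    PILOT = "pilot"    # human operator
    SYSTEM = "system"  # automated component (e.g. the warning system)


@dataclass
class RcraftElement:
    """One user task or system function, annotated with RCRAFT attributes."""
    name: str
    performed_by: Agent                                  # Functions and Tasks: who does the work
    resources: List[str] = field(default_factory=list)   # information/objects the element needs
    control_transition: bool = False                     # does control hand over at this point?
    responsibility: Agent = Agent.PILOT                  # who answers for the outcome
    authority: Agent = Agent.PILOT                       # who is allowed to decide or act


def congruence_issues(design: List[RcraftElement]) -> List[str]:
    """Return the elements where authority and responsibility are split between agents."""
    return [e.name for e in design if e.authority != e.responsibility]


# Two illustrative design options for handling an abnormal situation.
baseline = [
    RcraftElement("display warning", Agent.SYSTEM, resources=["sensor data"],
                  responsibility=Agent.SYSTEM, authority=Agent.SYSTEM),
    RcraftElement("choose procedure", Agent.PILOT, resources=["warning message", "checklist"],
                  responsibility=Agent.PILOT, authority=Agent.PILOT),
]

with_recommendations = baseline + [
    RcraftElement("recommend procedure", Agent.SYSTEM,
                  resources=["warning message", "flight context"],
                  control_transition=True,
                  responsibility=Agent.PILOT,   # the pilot remains accountable...
                  authority=Agent.SYSTEM),      # ...but the system now decides what to suggest
]

print("baseline:", congruence_issues(baseline))                           # []
print("with recommendations:", congruence_issues(with_recommendations))   # ['recommend procedure']
```

Running the sketch flags the added "recommend procedure" element, where authority and responsibility no longer coincide; surfacing that kind of implication of a design change is the sort of assessment the framework is intended to support.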



Acknowledgement

The authors are deeply indebted to the shepherd from the INTERACT 2021 program committee, who suggested significant improvements during the preparation of the final version of this paper.

Author information


Corresponding author

Correspondence to Philippe Palanque.


Copyright information

© 2021 IFIP International Federation for Information Processing

About this paper


Cite this paper

Bouzekri, E., Martinie, C., Palanque, P., Atwood, K., Gris, C. (2021). Should I Add Recommendations to My Warning System? The RCRAFT Framework Can Answer This and Other Questions About Supporting the Assessment of Automation Designs. In: Ardito, C., et al. (eds.) Human-Computer Interaction – INTERACT 2021. INTERACT 2021. Lecture Notes in Computer Science, vol. 12935. Springer, Cham. https://doi.org/10.1007/978-3-030-85610-6_24


  • DOI: https://doi.org/10.1007/978-3-030-85610-6_24

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-85609-0

  • Online ISBN: 978-3-030-85610-6

  • eBook Packages: Computer Science (R0)
