Cybernics, pp 235–265

Human–Machine Coagency for Collaborative Control

  • Toshiyuki Inagaki


This chapter discusses issues central to the design of human–machine coagency, in which humans and smart machines collaborate and cooperate sensibly in a situation-adaptive manner. The first issue is authority and responsibility: it is argued that the machine may be given authority to improve safety and to mitigate possible damage to the human–machine system, even within a framework of human-centered automation. The second issue is the human operator’s overtrust in, and overreliance on, automation: the possibilities and types of overtrust and overreliance may vary with the characteristics of the automated system. The discussion also covers the importance of designing the human–machine interface and human–machine interactions.


Human supervisory control · Function allocation · Human-centered automation · Authority and responsibility · Overtrust and overreliance · Levels of automation



Copyright information

© Springer Japan 2014

Authors and Affiliations

  1. Graduate School of Systems and Information Engineering, University of Tsukuba, Tsukuba, Japan
