An Interactive Strategic Mission Management System for Intuitive Human-Robot Cooperation

  • Elsa Andrea Kirchner
  • Hagen Langer
  • Michael Beetz
Chapter
Part of the Intelligent Systems, Control and Automation: Science and Engineering book series (ISCA, volume 96)

Abstract

To enable cooperative task planning and coordination between human operators and robot teams, new types of interfaces are needed. We present an interactive strategic mission management system (ISMMS) for underwater exploration carried out by mixed teams of robots and human investigators. The main goals of the ISMMS are to enable robots to “explain” their intentions, problems, and situation to humans quickly and intuitively; to allow smooth blending between autonomous behavior and human control; to provide smart interfaces for mandatory external control; and to enable adaptive task sharing. The system is optimized for intuitive usage and interaction, as measured by behavioral and physiological human data.
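
The blending between autonomous behavior and human control mentioned above can be pictured as a simple arbitration rule that weights the operator's command against the autonomous planner's command according to an estimate of operator workload derived from behavioral and physiological data. The following minimal Python sketch illustrates this idea only; it is not the ISMMS implementation, and all names (OperatorState, blend_command, WORKLOAD_THRESHOLD) are hypothetical.

```python
from dataclasses import dataclass

# Assumed normalized workload level above which the autonomous planner takes over entirely.
WORKLOAD_THRESHOLD = 0.7


@dataclass
class OperatorState:
    workload: float  # 0.0 (idle) .. 1.0 (overloaded), e.g. estimated from EEG/behavioral features
    command: float   # operator's requested setpoint (e.g. a velocity command)


def blend_command(operator: OperatorState, autonomous_command: float) -> float:
    """Return a setpoint that shifts weight toward the autonomous planner
    as the estimated operator workload rises."""
    # The human weight decreases linearly with workload and is clamped to
    # zero once the workload threshold is exceeded.
    human_weight = max(0.0, 1.0 - operator.workload / WORKLOAD_THRESHOLD)
    return human_weight * operator.command + (1.0 - human_weight) * autonomous_command


if __name__ == "__main__":
    relaxed = OperatorState(workload=0.2, command=1.0)
    stressed = OperatorState(workload=0.9, command=1.0)
    print(blend_command(relaxed, autonomous_command=0.0))   # mostly operator control
    print(blend_command(stressed, autonomous_command=0.0))  # fully autonomous
```

In a full mission management system, the workload estimate would come from the behavioral and physiological measures referred to in the abstract, and the blended setpoint would feed into the robots' task-level controllers.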

Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • Elsa Andrea Kirchner (1)
  • Hagen Langer (2)
  • Michael Beetz (2)
  1. RIC and Robotics Lab, DFKI GmbH and University of Bremen, Bremen, Germany
  2. Institute for Artificial Intelligence, University of Bremen, Bremen, Germany