A Neurophysiological Examination of Multi-robot Control During NASA’s Extreme Environment Mission Operations Project

  • John G. Blitch
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 499)

Abstract

Previous research has explored the use of an external or “3rd person” view in the context of augmented reality, video gaming, and robot control. Few studies, however, involve the use of a mobile robot to provide that viewpoint, and fewer still do so in dynamic, unstructured, high-stress environments. This study examined the cognitive state of robot operators performing complex search and rescue tasks in a simulated crisis scenario. A solo robot control paradigm was compared with a dual condition in which an alternate (surrogate) perspective was provided via voice commands to a second robot employed as a highly autonomous teammate. Subjective and neurophysiological measurements indicate that an increased level of situational awareness was achieved in the dual condition, along with a reduction in workload and decision-oriented task engagement. These results are discussed in terms of their potential to mitigate cognitive overload in complex and unstructured task environments.
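
Although the abstract only summarizes the statistical treatment, a minimal sketch of the kind of paired, within-subject comparison implied here (solo versus dual condition on a workload measure such as a NASA-TLX composite or an EEG-derived index) might look like the following. The file name, column names, and choice of a paired t-test are illustrative assumptions, not the study's actual analysis pipeline.

```python
# Hypothetical sketch (not the study's actual code): paired comparison of
# operator workload between the solo and dual robot-control conditions.
import pandas as pd
from scipy import stats

# One row per operator, with a mean workload score per condition
# (e.g., a NASA-TLX composite or an EEG-derived workload index).
df = pd.read_csv("workload_by_condition.csv")  # assumed columns: operator, solo, dual

# Paired t-test: did the dual (surrogate-view) condition reduce workload?
t_stat, p_value = stats.ttest_rel(df["solo"], df["dual"])

# Simple paired-samples effect size (Cohen's d on the difference scores).
diff = df["solo"] - df["dual"]
cohens_d = diff.mean() / diff.std(ddof=1)

print(f"t({len(df) - 1}) = {t_stat:.2f}, p = {p_value:.3f}, d = {cohens_d:.2f}")
```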

Keywords

Human robot interaction · Cognitive state · Situational awareness · Workload · Decision making · Robot assisted rescue

Notes

Acknowledgments

This work was sponsored by the Warfighter Interface Division of the 711th Human Performance Wing at the Air Force Research Laboratory. The author would like to extend an especially warm and profound expression of gratitude to Bill Todd and Jason Poffenberger from NASA/JSC for their outstanding support, as well as Ethan Blackford, Jeff Bolles, and James Christensen for their tremendous prowess in handling complex data collection and participant management issues under daunting conditions and an extremely tight schedule.


Copyright information

© Springer International Publishing Switzerland 2017

Authors and Affiliations

  1. AFRL 711th HPW/RHC, Wright-Patterson AFB, Dayton, USA
