Displaying Information to Support Transparency for Autonomous Platforms

  • Anthony R. Selkowitz
  • Cintya A. Larios
  • Shan G. Lakhmani
  • Jessie Y.C. Chen
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 499)

Abstract

The purpose of this paper is to summarize display design techniques that are best suited for presenting information that supports transparency of communication in autonomous systems interfaces. The techniques include Ecological Interface Design, integrated displays, and pre-attentive cuing. Example displays from two recent experiments investigating how transparency affects operator trust, situation awareness, and workload are provided throughout the paper as applications of these techniques. Specifically, these interfaces used the Situation awareness-based Agent Transparency (SAT) model as a method of organizing the information displayed for an autonomous robot, the Autonomous Squad Member (ASM). Overall, these methods were useful in creating usable interfaces for the ASM display.

Keywords

Agent transparency · Human-robot interaction · Ecological interface design · Cognitive engineering

Acknowledgements

This research was supported by the U.S. Department of Defense’s Autonomy Research Pilot Initiative. The authors wish to thank MaryAnne Fields, Daniel Barber, Erica Valiente, Andrew Watson, Kelvin Oie, and Susan Hill for their contributions to this project.

Copyright information

© Springer International Publishing Switzerland 2017

Authors and Affiliations

  • Anthony R. Selkowitz (1, Email author)
  • Cintya A. Larios (2)
  • Shan G. Lakhmani (2)
  • Jessie Y.C. Chen (1)

  1. U.S. Army Research Laboratory, Adelphi, USA
  2. Institute for Simulation and Training, Orlando, USA