Context Sensitive Tactile Displays for Bidirectional HRI Communications

  • Bruce Mortimer
  • Linda Elliott
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 499)


Tactile displays have shown high potential to support critical communications by relaying information, queries, and direction cues from the robot to the Soldier in a hands-free manner. Similarly, Soldiers can communicate to the robot through speech, gesture, or visual display, and can respond to robot messages by pressing a button to acknowledge or to request that a message be repeated. A series of studies has demonstrated the ability of Soldiers to interpret tactile direction and information cues during waypoint navigation in rough terrain, in both day and night operations. The design of tactile display systems must result in reliable message detection, and we have proposed a framework to ensure the salience of tactile cues. In this presentation we summarize research efforts that explore factors affecting the efficacy of tactile cues for bidirectional Soldier-robot communications, and we propose methods for changing tactile salience based on symbology and context.


Human factors · Tactile · Adaptive · Salience · Communication



Acknowledgements go to Drs. Michael Barnes and Susan Hill of the US Army Research Laboratory’s Human Research and Engineering Directorate for support of efforts described in this report, as part of ongoing multiyear investigations of human-robot interaction.


  1. Chiasson, J., McGrath, B., Rupert, A.: Enhanced situation awareness in sea, air, and land environment. In: Proceedings of the NATO RTO Human Factors & Medicine Panel Symposium on "Spatial Disorientation in Military Vehicles: Causes, Consequences and Cures," RTO-MP-086, pp. 1–10. La Coruña, Spain (2002)
  2. Elliott, L., Redden, E.: Reducing workload: a multisensory approach. In: Savage-Knepshield, P. (ed.) Designing Soldier Systems: Current Issues in Human Factors. Ashgate (2013)
  3. Prewett, M., Elliott, L., Walvoord, A., Coovert, M.: A meta-analysis of vibrotactile and visual information displays for improving task performance. IEEE Trans. Syst. Man Cybern. Part C: Appl. Rev. 42(1), 123–132 (2012)
  4. Self, B., van Erp, J., Eriksson, L., Elliott, L.R.: Human factors issues of tactile displays for military environments. In: van Erp, J., Self, B.P. (eds.) Tactile Displays for Orientation, Navigation, and Communication in Air, Sea, and Land Environments. NATO RTO-TR-HFM-122 (2008)
  5. van Erp, J.B.F., Self, B.P.: Tactile displays for orientation, navigation, and communication in air, sea, and land environments. Final RTO Technical Report RTO-TR-HFM-122. North Atlantic Treaty Organization Research and Technology Organization (2008)
  6. Geldard, F.A.: Adventures in tactile literacy. Am. Psychol. 12, 115–124 (1957)
  7. Spirkovska, L.: Summary of Tactile User Interfaces Techniques and Systems. NASA Ames Research Center, Report NASA/TM-2005-213451 (2005)
  8. Redden, E.: Virtual Environment Study of Mission-Based Critical Informational Requirements. ARL-TR-2636. US Army Research Laboratory, Aberdeen Proving Ground, MD (2002)
  9. Gilson, R., Redden, E., Elliott, L. (eds.): Remote Tactile Displays for Future Soldiers. ARL-SR-0152. US Army Research Laboratory, Aberdeen Proving Ground, MD (2007)
  10. Pomranky-Hartnett, G., Elliott, L., Mortimer, B., Mort, G., Pettitt, R., Zets, G.: Soldier-based assessment of a dual-row tactor display during simultaneous navigational and robot-monitoring tasks. ARL-TR-7397. US Army Research Laboratory, Aberdeen Proving Ground, MD (2015)
  11. Elliott, L., Mortimer, B., Cholewiak, R., Mort, G., Zets, G., Pomranky-Hartnett, G., Pettitt, R., Wooldridge, R.: Salience of tactile cues: an examination of tactor actuator and tactile cue characteristics. ARL-TR-7392. US Army Research Laboratory, Aberdeen Proving Ground, MD (2015)
  12. Borji, A., Sihite, D., Itti, L.: Quantitative analysis of human-model agreement in visual saliency modeling: a comparative study. IEEE Trans. Image Process. 22(1), 55–69 (2013)
  13. Frintrop, S., Rome, E., Christensen, H.: Computational models of visual selective attention and their cognitive foundations: a survey. ACM Trans. Appl. Percept. 7(1), 1–46 (2010)
  14. Mortimer, B., Zets, G., Mort, G., Shovan, C.: Implementing effective tactile symbology for orientation and navigation. In: Proceedings of the 14th International Conference on Human-Computer Interaction (HCI), Orlando, FL (2011)
  15. Cholewiak, R., Wollowitz, M.: The design of vibrotactile transducers. In: Summers, I.R. (ed.) Tactile Aids for the Hearing Impaired. Whurr Publishers, London (1992)
  16. Jones, L., Sarter, N.: Tactile displays: guidance for their design and application. Hum. Factors 50, 90–111 (2008)
  17. Mortimer, B., Zets, G., Cholewiak, R.: Vibrotactile transduction and transducers. J. Acoust. Soc. Am. 121, 2970 (2007)
  18. Cholewiak, R., Collins, A.: Sensory and physiological bases of touch. In: Heller, M., Schiff, W. (eds.) The Psychology of Touch, pp. 23–60. Lawrence Erlbaum Associates, Hillsdale, NJ (1991)
  19. Hogan, R.: Personality and personality measurement. In: Dunnette, M.D., Hough, L.M. (eds.) Handbook of Industrial Psychology. Consulting Psychologists Press, Palo Alto, CA (1991)
  20. Hancock, P.A., Lawson, B.D., Cholewiak, R., Elliott, L., van Erp, J.B.F., Mortimer, B.J.P., Rupert, A.H., Redden, E., Schmeisser, E.: Tactile cueing to augment multisensory human-machine interaction. Ergonomics in Design, pp. 4–9 (2015)
  21. Itti, L., Koch, C.: Computational modelling of visual attention. Nat. Rev. Neurosci. 2(3), 194–203 (2001)
  22. Navalpakkam, V., Itti, L.: Modeling the influence of task on attention. Vision Res. 45(2), 205–231 (2005)
  23. White, T., Krausman, A.: Effects of inter-stimulus interval and intensity on the perceived urgency of tactile patterns. Appl. Ergon. 48, 121–129 (2015)
  24. Brown, L., Brewster, S., Purchase, H.: Tactile crescendos and sforzandos: applying musical techniques to tactile icon design. In: CHI '06 Extended Abstracts on Human Factors in Computing Systems, pp. 610–615 (2006)
  25. Redden, E.S., Carstens, C.B., Turner, D.D., Elliott, L.R.: Localization of tactile signals as a function of tactor operating characteristics. ARL-TR-3971. US Army Research Laboratory, Aberdeen Proving Ground, MD (2006)
  26. Gescheider, G., Bolanowski, S., Hall, K., Hoffman, K., Verrillo, R.: The effects of aging on information-processing channels in the sense of touch: I. Absolute sensitivity. Somatosens. Motor Res. 11(4) (1994)
  27. van Erp, J.B.F.: Tactile navigation display. In: Brewster, S., Murray-Smith, R. (eds.) Haptic HCI 2000. LNCS, vol. 2058, pp. 165–173 (2001)
  28. van Erp, J.: Tactile Displays for Navigation and Orientation: Perception and Behavior. Mostert & van Onderen, Leiden, The Netherlands (2007)

Copyright information

© Springer International Publishing Switzerland 2017

Authors and Affiliations

  1. Engineering Acoustics, Inc., Casselberry, USA
  2. Army Research Laboratory, Human Research and Engineering Directorate, Fort Benning, Georgia, USA
