Design and Evaluation of a Nonverbal Communication Platform between Assistive Robots and their Users

  • Anthony L. Threatt
  • Keith Evan Green
  • Johnell O. Brooks
  • Jessica Merino
  • Ian D. Walker
  • Paul Yanik
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8028)


Inevitably, assistive robotics will become integral to the everyday lives of a human population that is increasingly mobile, older, urban-centric and networked. How will we communicate with such robots, and how will they communicate with us? We make the case for a relatively “artificial” mode of nonverbal human-robot communication [NVC] to avoid unnecessary distraction for people, busily conducting their lives via human-human, natural communication. We propose that this NVC be conveyed by familiar lights and sounds, and elaborate here early experiments with our NVC platform in a rehabilitation hospital. Our NVC platform was perceived by medical staff as a desirable and expedient communication mode for human-robot interaction [HRI] in clinical settings, suggesting great promise for our mode of human-robot communication for this and other applications and environments involving intimate HRI.


Keywords: assistive robotics · nonverbal communication · human factors · human-centered design





Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Anthony L. Threatt ¹
  • Keith Evan Green ¹ ²
  • Johnell O. Brooks ³
  • Jessica Merino ²
  • Ian D. Walker ²
  • Paul Yanik ²

  1. School of Architecture, Clemson University, Clemson, USA
  2. Department of Electrical and Computer Engineering, Clemson University, Clemson, USA
  3. Department of Automotive Engineering, Clemson University, Clemson, USA
