Abstract
Inevitably, assistive robots will become integral to the everyday lives of a human population that is increasingly mobile, older, urban-centric, and networked. How will we communicate with such robots, and how will they communicate with us? We make the case for a relatively “artificial” mode of nonverbal human-robot communication [NVC] that avoids unnecessarily distracting people as they busily conduct their lives through natural, human-human communication. We propose that this NVC be conveyed by familiar lights and sounds, and describe here early experiments with our NVC platform in a rehabilitation hospital. Medical staff perceived our NVC platform as a desirable and expedient communication mode for human-robot interaction [HRI] in clinical settings, suggesting great promise for this mode of communication in this and other applications and environments involving intimate HRI.
Copyright information
© 2013 Springer-Verlag Berlin Heidelberg
Cite this paper
Threatt, A.L., Green, K.E., Brooks, J.O., Merino, J., Walker, I.D., Yanik, P. (2013). Design and Evaluation of a Nonverbal Communication Platform between Assistive Robots and their Users. In: Streitz, N., Stephanidis, C. (eds) Distributed, Ambient, and Pervasive Interactions. DAPI 2013. Lecture Notes in Computer Science, vol 8028. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-39351-8_55
Print ISBN: 978-3-642-39350-1
Online ISBN: 978-3-642-39351-8