Autonomous Robots, Volume 41, Issue 5, pp 1189–1201

Autonomous human–robot proxemics: socially aware navigation based on interaction potential

  • Ross Mead
  • Maja J. Matarić

Abstract

To enable situated human–robot interaction (HRI), an autonomous robot must both understand and control proxemics—the social use of space—to employ natural communication mechanisms analogous to those used by humans. This work presents a computational framework of proxemics based on data-driven probabilistic models of how social signals (speech and gesture) are produced (by a human) and perceived (by a robot). The framework and models were implemented as autonomous proxemic behavior systems for sociable robots, including: (1) a sampling-based method for robot proxemic goal state estimation with respect to human–robot distance and orientation parameters, (2) a reactive proxemic controller for goal state realization, and (3) a cost-based trajectory planner for maximizing automated robot speech and gesture recognition rates along a path to the goal state. Evaluation results indicate that the goal state estimation and realization significantly improve upon past work in human–robot proxemics with respect to “interaction potential”—predicted automated speech and gesture recognition rates as the robot enters into and engages in face-to-face social encounters with a human user—demonstrating their efficacy in supporting richer robot perception and autonomy in HRI.
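The sampling-based goal state estimation described above can be sketched in a few lines: draw candidate human-relative poses (distance, orientation), score each with a perceptual model predicting automated speech/gesture recognition success, and keep the highest-scoring pose. The Gaussian model and all parameter values below (`r_opt`, `sigma_r`, `sigma_theta`, the sampling bounds) are illustrative placeholders, not the paper's fitted, data-driven models.

```python
import math
import random

def interaction_potential(r, theta, r_opt=1.5, sigma_r=0.5, sigma_theta=0.6):
    """Hypothetical perceptual model: predicted probability that automated
    speech/gesture recognition succeeds at human-robot distance r (meters)
    and relative orientation theta (radians). Modeled here as a product of
    two unnormalized Gaussians peaking at r_opt and at face-to-face (0 rad)."""
    p_dist = math.exp(-((r - r_opt) ** 2) / (2 * sigma_r ** 2))
    p_orient = math.exp(-(theta ** 2) / (2 * sigma_theta ** 2))
    return p_dist * p_orient

def estimate_goal_state(n_samples=2000, seed=0):
    """Sampling-based goal state estimation: sample candidate (r, theta)
    poses uniformly and return the one maximizing interaction potential."""
    rng = random.Random(seed)
    best, best_score = None, -1.0
    for _ in range(n_samples):
        r = rng.uniform(0.5, 4.0)               # candidate distances
        theta = rng.uniform(-math.pi, math.pi)  # candidate orientations
        score = interaction_potential(r, theta)
        if score > best_score:
            best, best_score = (r, theta), score
    return best, best_score

goal, score = estimate_goal_state()
```

The same scoring function could also serve the paper's cost-based trajectory planner: assigning each cell along a candidate path a cost inversely related to `interaction_potential` would favor approach trajectories that keep predicted recognition rates high, though the actual planner integrates with a navigation costmap rather than this toy model.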


Keywords: Human–robot interaction · Proxemics · Social signals · Interaction potential · Goal state estimation · Path-planning



This work was supported in part by an NSF Graduate Research Fellowship, the NSF National Robotics Initiative Grant IIS-1208500, and NSF Grants CNS-0709296 and CNS-1513275.

Supplementary material

Supplementary material 1 (mp4 81408 KB)



Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  1. Los Angeles, USA
  2. Los Angeles, USA
