Recognizing Emotional Body Language Displayed by a Human-like Social Robot
- Cite this article as: McColl, D. & Nejat, G. Int J of Soc Robotics (2014) 6: 261. doi:10.1007/s12369-013-0226-7
Natural social human–robot interactions (HRIs) require that robots be able to perceive and identify complex human social behaviors and, in turn, display their own behaviors using similar communication modes. Recent research has shown that body language plays an important role in conveying changes in human emotions during human–human interactions. Our work focuses on extending this concept to robotic affective communication during social HRI. Namely, in this paper, we explore the design of emotional body language for our human-like social robot, Brian 2.0. We develop the robot’s emotional body language from a variety of body postures and movements identified in human emotion research. To date, only a handful of researchers have focused on the use of robotic body language to display emotions, with a significant emphasis on displaying emotions through dance. Such emotional dance can be effective for small robots with large workspaces; however, it is less appropriate for life-sized robots such as Brian 2.0 that engage in one-on-one interpersonal social interactions with a person. Experiments are presented to evaluate the feasibility of the robot’s emotional body language based on human recognition rates. Furthermore, a unique comparison study investigates how human body language features displayed by the robot are perceived relative to the same features displayed by a human actor.