Designing and Assessing Expressive Open-Source Faces for the Baxter Robot
Facial expressions of both humans and robots are known to communicate important social cues to human observers. Nevertheless, faces for use on the flat panel display screens of physical multi-degree-of-freedom robots have not been exhaustively studied. While surveying owners of the Rethink Robotics Baxter Research Robot to establish their interest, we designed a set of 49 Baxter faces, including seven colors (red, orange, yellow, green, blue, purple, and gray) and seven expressions (afraid, angry, disgusted, happy, neutral, sad, and surprised). Online study participants (N = 568) drawn equally from two countries (US and India) then rated photographs of a physical Baxter robot displaying randomized subsets of the faces. Face color, facial expression, and onlooker country of origin all significantly affected the perceived pleasantness and energeticness of the robot, as well as the onlooker’s feelings of safety and pleasedness, with facial expression causing the largest effects. The designed faces are available to researchers online.
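The 49-face set described above is a full factorial crossing of the seven colors and seven expressions. A minimal sketch of that enumeration (the list names are illustrative, not the filenames of the released face assets):

```python
from itertools import product

# Seven face colors and seven expressions, as listed in the abstract.
COLORS = ["red", "orange", "yellow", "green", "blue", "purple", "gray"]
EXPRESSIONS = ["afraid", "angry", "disgusted", "happy", "neutral", "sad", "surprised"]

# Full factorial crossing yields the 49 designed Baxter faces.
faces = [(color, expression) for color, expression in product(COLORS, EXPRESSIONS)]
assert len(faces) == 49  # 7 colors x 7 expressions
```

A factorial design like this lets each factor (color, expression) be analyzed independently for its effect on perceived pleasantness and energeticness.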
Keywords: Social robotics · Expressive robot faces · Baxter Research Robot
The first author was supported by a National Science Foundation (NSF) Graduate Research Fellowship under Grant No. DGE-0822 and an NSF Integrative Graduate Education and Research Traineeship under Grant No. 0966142. We thank Chris Callison-Burch and Eileen Huang for their advice.