Designing and Assessing Expressive Open-Source Faces for the Baxter Robot

  • Naomi T. Fitter
  • Katherine J. Kuchenbecker
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9979)


Facial expressions of both humans and robots are known to communicate important social cues to human observers. Nevertheless, faces for use on the flat panel display screens of physical multi-degree-of-freedom robots have not been exhaustively studied. While surveying owners of the Rethink Robotics Baxter Research Robot to establish their interest, we designed a set of 49 Baxter faces, including seven colors (red, orange, yellow, green, blue, purple, and gray) and seven expressions (afraid, angry, disgusted, happy, neutral, sad, and surprised). Online study participants (N = 568) drawn equally from two countries (US and India) then rated photographs of a physical Baxter robot displaying randomized subsets of the faces. Face color, facial expression, and onlooker country of origin all significantly affected the perceived pleasantness and energeticness of the robot, as well as the onlooker’s feelings of safety and pleasedness, with facial expression causing the largest effects. The designed faces are available to researchers online.
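The 49-face set described above is a full factorial crossing of the seven colors with the seven expressions. A minimal sketch of that crossing (list contents taken from the abstract; variable names are my own) is:

```python
from itertools import product

# The seven face colors and seven expressions named in the abstract.
COLORS = ["red", "orange", "yellow", "green", "blue", "purple", "gray"]
EXPRESSIONS = ["afraid", "angry", "disgusted", "happy", "neutral", "sad", "surprised"]

# Fully crossing the two factors yields the 49 Baxter faces.
FACES = [(color, expression) for color, expression in product(COLORS, EXPRESSIONS)]

assert len(FACES) == 7 * 7  # 49 faces in total
```

Each study participant then rated a randomized subset of these 49 color-expression combinations.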


Keywords: Social robotics · Expressive robot faces · Baxter Research Robot



The first author was supported by a National Science Foundation (NSF) Graduate Research Fellowship under Grant No. DGE-0822 and an NSF Integrative Graduate Education and Research Traineeship under Grant No. 0966142. We thank Chris Callison-Burch and Eileen Huang for their advice.



Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  1. Haptics Group, GRASP Laboratory, Mechanical Engineering and Applied Mechanics, University of Pennsylvania, Philadelphia, USA
