Designing an Expressive Head for a Help Requesting Socially Assistive Robot

  • Tim van der Grinten
  • Steffen Müller
  • Martin Westhoven
  • Sascha Wischniewski
  • Andrea Scheidig
  • Horst-Michael Gross
Conference paper
Part of the Springer Proceedings in Advanced Robotics book series (SPAR, volume 12)


In this paper, we present the development of an expressive robot head for our socially assistive mobile robot HERA, which, among other things, serves as an autonomous delivery system in public buildings. One aspect of that task is contacting and interacting with uninvolved bystanders in order to get help when doors need to be opened or an elevator has to be used. We designed and tested a robot head comprising a pan-tilt unit, 3D-printed shells, animated eyes displayed on two LCD screens, and three arrays of RGB LEDs for communicating internal robot states and attracting potential helpers' interest. An online study was performed to compare variations of eye expression and LED lighting, with data collected from 139 participants. Statistical analysis showed significant differences in identification performance for our intended eye expressions, perceived politeness, help intentions, and hedonic user experience.


Keywords: Mechatronic design · Expressive robot head · Social robots
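To illustrate the kind of state signaling via RGB LEDs that the abstract describes, the following is a minimal, hypothetical sketch: the state names, colors, and the pulsing animation are illustrative assumptions for this document, not the authors' actual design.

```python
import math

# Hypothetical mapping of internal robot states to RGB-LED colors.
# All state names and color choices are assumptions for illustration.
STATE_COLORS = {
    "idle": (0, 0, 255),          # calm blue while driving autonomously
    "needs_help": (255, 160, 0),  # warm orange to attract potential helpers
    "thanking": (0, 255, 0),      # green after help was received
    "error": (255, 0, 0),         # red for fault states
}

def led_frame(state: str, t: float, period: float = 2.0) -> tuple:
    """Return the RGB color for time t, with brightness pulsing sinusoidally.

    Brightness oscillates between 30% and 100% to create a "breathing"
    effect, which is one common way to make a static status color
    more attention-grabbing.
    """
    r, g, b = STATE_COLORS.get(state, (255, 255, 255))
    level = 0.65 + 0.35 * math.sin(2 * math.pi * t / period)
    return tuple(int(c * level) for c in (r, g, b))
```

A driver loop would sample `led_frame` at the display frame rate and write the result to each LED array; the attention-seeking "needs_help" pattern could use a shorter period than the calm "idle" one.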


Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • Tim van der Grinten (1)
  • Steffen Müller (1)
  • Martin Westhoven (2)
  • Sascha Wischniewski (2)
  • Andrea Scheidig (1)
  • Horst-Michael Gross (1)
  1. Neuroinformatics and Cognitive Robotics Lab, Ilmenau University of Technology, Ilmenau, Germany
  2. Unit 2.3 Human Factors, Ergonomics, Division 2 Products and Work Systems, German Federal Institute for Occupational Safety and Health, Dortmund, Germany