
Why Do Robots Need a Head? The Role of Social Interfaces on Service Robots

Published in: International Journal of Social Robotics

Abstract

Developing effective ways for robots to communicate with humans presents many significant design challenges and requires detailed consideration of a wide range of factors. To facilitate communication between people and machines, it is common for robots to possess head-like features capable of providing social feedback through facial expressions, attention, gaze, etc. This paper explores the multifaceted roles that robotic head-like interfaces play in human–robot interaction. The research makes two main contributions. First, the paper outlines the motivations for using social interfaces on service robots and reviews key design insights from past studies in the field. Second, a taxonomy for classifying robot heads is proposed. This taxonomy has broad appeal for designers, as it gives structure to a large, disorganised design space.


Fig. 1 Sequence of images taken from [70]

Fig. 2

Figs. 3–7 Images of robots taken from dataset given in [117]


Notes

  1. The logic behind this hypothesis may not hold for features that once evolved to serve a purpose but have since become redundant. For example, humans possess auricular muscles around the ear, which originally evolved to help detect predators and prey; these muscles no longer serve a purpose, since humans have lost the ability to control them [61].

  2. The exact number of basic emotions remains a topic of debate among researchers—see [3] for a summary of the most prominent theories on the subject.

References

  1. International Organization for Standardization (2012) Robots and robotic devices–vocabulary. ISO 8373:2012


  2. International Organization for Standardization (2014) Robots and robotic devices–safety requirements for personal care robots. ISO 13482:2014


  3. Basic Emotions (2016). http://changingminds.org/explanations/emotions/basic%20emotions.htm. Accessed 30 Sept 2010

  4. Adamides G, Christou G, Katsanos C, Xenos M, Hadzilacos T (2015) Usability guidelines for the design of robot teleoperation: a taxonomy. IEEE Trans Hum Mach Syst 45(2):256–262


  5. Álvarez M, Galán R, Matía F, Rodríguez-Losada D, Jiménez A (2010) An emotional model for a guide robot. IEEE Trans Syst Man Cybern Part A Syst Hum 40(5):982–992


  6. Aly A, Griffiths S, Stramandinoli F (2017) Metrics and benchmarks in human-robot interaction: recent advances in cognitive robotics. Cognit Syst Res 43:313–323


  7. Azenkot S, Feng C, Cakmak M (2016) Enabling building service robots to guide blind people: a participatory design approach. In: The eleventh ACM/IEEE international conference on human robot interaction. IEEE Press, pp 3–10

  8. Azuma J, Ebner M (2008) A stylistic analysis of graphic emoticons: can they be candidates for a universal visual language of the future. In: Proceedings of world conference on educational multimedia, hypermedia and telecommunications, pp 972–979

  9. Bartneck C, Lyons MJ (2009) Facial expression analysis, modeling and synthesis: overcoming the limitations of artificial intelligence with the art of the soluble. In: Handbook of research on synthetic emotions and sociable robotics: new applications in affective computing and artificial intelligence. IGI Global, pp 34–55

  10. Bartneck C, Yogeeswaran K, Ser QM, Woodward G, Sparrow R, Wang S, Eyssel F (2018) Robots and racism. In: Proceedings of the 2018 ACM/IEEE international conference on human–robot interaction. ACM, pp 196–204

  11. Bates J et al (1994) The role of emotion in believable agents. Commun ACM 37(7):122–125


  12. Beck A, Hiolle A, Canamero L (2013) Using Perlin noise to generate emotional expressions in a robot. In: CogSci

  13. Berns K, Braun T (2005) Design concept of a human-like robot head. In: 5th IEEE-RAS international conference on humanoid robots. IEEE, pp 32–37

  14. Bradley MM, Miccoli L, Escrig MA, Lang PJ (2008) The pupil as a measure of emotional arousal and autonomic activation. Psychophysiology 45(4):602–607


  15. Breazeal C (2003) Toward sociable robots. Robot Auton Syst 42(3–4):167–175


  16. Breazeal C (2004) Social interactions in HRI: the robot view. IEEE Trans Syst Man Cybern Part C (Appl Rev) 34(2):181–186


  17. Breazeal C, Brooks A, Gray J, Hoffman G, Kidd C, Lee H, Lieberman J, Lockerd A, Chilongo D (2004) Tutelage and collaboration for humanoid robots. Int J Humanoid Robot 1(02):315–348


  18. Breazeal C, Edsinger A, Fitzpatrick P, Scassellati B (2001) Active vision for sociable robots. IEEE Trans Syst Man Cybern Part A Syst Hum 31(5):443–453


  19. Breazeal C, Scassellati B (1999) A context-dependent attention system for a social robot. In: Proceedings of the 16th international joint conference on artificial intelligence. IJCAI’99, vol 2. Morgan Kaufmann Publishers Inc., San Francisco, pp 1146–1151


  20. van Breemen A, Yan X, Meerbeek B (2005) iCat: an animated user-interface robot with personality. In: Proceedings of the fourth international joint conference on autonomous agents and multiagent systems. ACM, pp 143–144

  21. Brooks RA (1991) Intelligence without representation. Artif Intell 47(1–3):139–159


  22. Brooks RA et al (1991) Intelligence without reason. In: IJCAI’91, pp 569–595


  23. Bruce A, Nourbakhsh I, Simmons, R (2002) The role of expressiveness and attention in human–robot interaction. In: Proceedings of IEEE international conference on robotics and automation (Cat. No. 02CH37292), vol 4, pp 4138–4142. https://doi.org/10.1109/ROBOT.2002.1014396

  24. Bruce V (1996) The role of the face in communication: implications for videophone design. Interact Comput 8(2):166–176. https://doi.org/10.1016/0953-5438(96)01026-0


  25. Buchanan R (1992) Wicked problems in design thinking. Des Issues 8(2):5–21


  26. Burkhardt F, Campbell N (2015) Emotional speech synthesis. In: Calvo RA, D’Mello S, Gratch J, Kappas A (eds) The oxford handbook of affective computing, chap 20. Oxford University Press. https://doi.org/10.1093/oxfordhb/9780199942237.013.038

  27. Calvo MG, Fernández-Martín A, Nummenmaa L (2014) Facial expression recognition in peripheral versus central vision: role of the eyes and the mouth. Psychol Res 78(2):180–195


  28. Cañamero L, Fredslund J (2001) I show you how I like you - can you read it in my face? [Robotics]. IEEE Trans Syst Man Cybern Part A Syst Hum 31(5):454–459


  29. Cheetham M, Suter P, Jäncke L (2011) The human likeness dimension of the “uncanny valley hypothesis”: behavioral and functional MRI findings. Front Hum Neurosci 5:126


  30. Chou CP, Hannaford B (1996) Measurement and modeling of McKibben pneumatic artificial muscles. IEEE Trans Robot Autom 12(1):90–102


  31. Collins EC, Prescott TJ, Mitchinson B (2015) Saying it with light: a pilot study of affective communication using the MiRo robot. In: Conference on biomimetic and biohybrid systems. Springer, pp 243–255

  32. Coradeschi S, Kristoffersson A, Loutfi A, Von Rump S, Cesta A, Cortellessa G, Gonzalez J (2011) Towards a methodology for longitudinal evaluation of social robotic telepresence for elderly. In: Human robot interaction

  33. Curtis A, Shim J, Gargas E, Srinivasan A, Howard AM (2011) Dance dance Pleo: developing a low-cost learning robotic dance therapy aid. In: Proceedings of the 10th international conference on interaction design and children. ACM, pp 149–152

  34. Darwin C (1998) The expression of the emotions in man and animals. Oxford University Press, Oxford


  35. Dautenhahn K (1998) The art of designing socially intelligent agents: science, fiction, and the human in the loop. Appl Artif Intell 12(7–8):573–617. https://doi.org/10.1080/088395198117550


  36. Dautenhahn K (1999) Socially intelligent agents and the primate social brain-towards a science of social minds. Adapt Behav 7(3–4):3–4


  37. Dautenhahn K, Woods S, Kaouri C, Walters ML, Koay KL, Werry I (2005) What is a robot companion - friend, assistant or butler? In: IEEE/RSJ international conference on intelligent robots and systems, pp 1192–1197. https://doi.org/10.1109/IROS.2005.1545189

  38. De Gelder B (2009) Why bodies? twelve reasons for including bodily expressions in affective neuroscience. Philos Trans R Soc Lond B Biol Sci 364(1535):3475–3484


  39. De Graaf MM, Allouch SB (2013) Exploring influencing variables for the acceptance of social robots. Robot Auton Syst 61(12):1476–1486


  40. De Santis A, Siciliano B, De Luca A, Bicchi A (2008) An atlas of physical human-robot interaction. Mech Mach Theory 43(3):253–270


  41. Dennett DC (1971) Intentional systems. J Philos 68(4):87–106


  42. DiSalvo C, Louw M, Holstius D, Nourbakhsh I, Akin A (2012) Toward a public rhetoric through participatory design: critical engagements and creative expression in the neighborhood networks project. Des Issues 28(3):48–61


  43. DiSalvo CF, Gemperle F, Forlizzi J, Kiesler S (2002) All robots are not created equal: the design and perception of humanoid robot heads. In: Proceedings of the 4th conference on designing interactive systems: processes, practices, methods, and techniques. ACM, pp 321–326

  44. Duchenne GB (1876) Mécanisme de la physionomie humaine: où. Analyse électro-physiologique de l’expression des passions. J.-B. Baillière, Paris


  45. Duffy BR (2003) Anthropomorphism and the social robot. Robot Auton Syst 42(3–4):177–190. https://doi.org/10.1016/S0921-8890(02)00374-3


  46. Ekman P (1993) Facial expression and emotion. Am Psychol 48(4):384


  47. Ekman P (1977) Facial action coding system

  48. Elprama SA, Jewell CI, Jacobs A, El Makrini I, Vanderborght B (2017) Attitudes of factory workers towards industrial and collaborative robots. In: Proceedings of the companion of the 2017 ACM/IEEE international conference on human–robot interaction. ACM, pp 113–114

  49. Erden MS, Tapus A (2010) Postural expressions of emotions in a humanoid robot for assistive applications. In: Poster paper in workshop on learning for human–robot interaction modeling under the conference of robotics science and systems-RSS, pp 27–30

  50. Feil-Seifer D, Matarić MJ (2011) Socially assistive robotics. IEEE Robot Autom Mag 18(1):24–31


  51. Fink J (2012) Anthropomorphism and human likeness in the design of robots and human–robot interaction. Springer, Berlin, pp 199–208. https://doi.org/10.1007/978-3-642-34103-8_20

  52. Fong T, Nourbakhsh I, Dautenhahn K (2003) A survey of socially interactive robots. Robot Auton Syst 42(3–4):143–166. https://doi.org/10.1016/S0921-8890(02)00372-X


  53. Forlizzi J, DiSalvo C, Gemperle F (2004) Assistive robotics and an ecology of elders living independently in their homes. Hum Comput Interact 19(1):25–59


  54. Förster F, Weiss A, Tscheligi M (2011) Anthropomorphic design for an interactive urban robot: the right design approach. In: Proceedings of the 6th international conference on human–robot interaction. ACM, pp 137–138

  55. Frauenberger C, Makhaeva J, Spiel K (2017) Blending methods: developing participatory design sessions for autistic children. In: Proceedings of the 2017 conference on interaction design and children. ACM, pp 39–49

  56. Fujita M (2004) On activating human communications with pet-type robot AIBO. Proc IEEE 92(11):1804–1813


  57. Grabiner JV (1986) Computers and the nature of man: a historian’s perspective on controversies about artificial intelligence. Bull Am Math Soc. https://doi.org/10.1090/S0273-0979-1986-15461-3


  58. Graf B, Reiser U, Hägele M, Mauz K, Klein P (2009) Robotic home assistant care-o-bot® 3-product vision and innovation platform. In: IEEE workshop on advanced robotics and its social impacts. IEEE, pp 139–144

  59. Grandstrand O (1999) Basic emotions. In: Dalgleish T, Power M (eds) The Oxford handbook of innovation, chap 3. Wiley, Hoboken, pp 45–60


  60. Green A, Huttenrauch H, Norman M, Oestreicher L, Severinson Eklundh K (2000) User centered design for intelligent service robots. In: Proceedings of 9th IEEE international workshop on robot and human interactive communication. ROMAN 2000, pp 161–166. https://doi.org/10.1109/ROMAN.2000.892488

  61. Hackley SA (2015) Evidence for a vestigial pinna-orienting system in humans. Psychophysiology 52(10):1263–1270


  62. Halterman MW (2005) Emotions. In: Neuroscience, 3rd edn. Sinauer Associates, pp 687–711

  63. Hanson D, Olney A, Prilliman S, Mathews E, Zielke M, Hammons D, Fernandez R, Stephanou H (2005) Upending the uncanny valley. AAAI 5:1728–1729


  64. Häring M, Bee N, André E (2011) Creation and evaluation of emotion expression with body movement, sound and eye color for humanoid robots. In: ROMAN. IEEE, pp 204–209

  65. Hess EH, Polt JM (1960) Pupil size as related to interest value of visual stimuli. Science 132(3423):349–350


  66. Hinds PJ, Roberts TL, Jones H (2004) Whose job is it anyway? A study of human-robot interaction in a collaborative task. Hum Comput Interact 19(1):151–181. https://doi.org/10.1207/s15327051hci1901&2_7


  67. Hirsch L, Björsell A, Laaksoharju M, Obaid M (2017) Investigating design implications towards a social robot as a memory trainer. In: Proceedings of the 5th international conference on human agent interaction. ACM, pp 5–10

  68. Hjortsjö CH (1969) Man’s face and mimic language. Studentlitteratur, Lund


  69. Hoffman G (2011) On stage: robots as performers. In: RSS 2011 workshop on human–robot interaction: perspectives and contributions to robotics from the human sciences, vol 1, Los Angeles

  70. Hornung A, Phillips M, Jones EG, Bennewitz M, Likhachev M, Chitta S (2012) Navigation in three-dimensional cluttered environments for mobile manipulation. In: IEEE international conference on robotics and automation (ICRA). IEEE, pp 423–429

  71. Humphrey N (1976) The colour currency of nature. Colour for architecture, pp 95–98

  72. Huttenrauch H, Green A, Norman M, Oestreicher L, Eklundh KS (2004) Involving users in the design of a mobile office robot. IEEE Trans Syst Man Cybern Part C (Appl Rev) 34(2):113–124. https://doi.org/10.1109/TSMCC.2004.826281


  73. International Federation of Robotics (IFR): service robots (2014). http://www.ifr.org/service-robots/. Accessed 6 June 2019

  74. Joosse M, Lohse M, Evers V (2015) Crowdsourcing culture in HRI: lessons learned from quantitative and qualitative data collections. In: 3rd International workshop on culture aware robotics at ICSR, vol 15

  75. Joosse M, Lohse M, Pérez JG, Evers V (2013) What you do is who you are: the role of task context in perceived social robot personality. In: IEEE international conference on robotics and automation (ICRA). IEEE, pp 2134–2139

  76. Kang E, Jackson E, Schulte W (2010) An approach for effective design space exploration. In: Monterey workshop. Springer, pp 33–54

  77. Kidd CD, Taggart W, Turkle S (2006) A sociable robot to encourage social interaction among the elderly. In: Proceedings of IEEE international conference on robotics and automation, ICRA 2006. IEEE, pp 3972–3976

  78. Kiesler S (2005) Fostering common ground in human–robot interaction. In: IEEE international workshop on robot and human interactive communication, ROMAN. pp 729–734. https://doi.org/10.1109/ROMAN.2005.1513866

  79. Kiesler S, Goetz J (2002) Mental models and cooperation with robotic assistants. In: Proceedings of conference on human factors in computing systems. ACM Press, pp 576–577

  80. Kim ES, Paul R, Shic F, Scassellati B (2012) Bridging the research gap: making HRI useful to individuals with autism. J Hum Robot Interact 1(1):26–54


  81. Kim M, Oh K, Choi J, Jung J, Kim Y (2011) User-centered HRI: HRI research methodology for designers. In: Mixed reality and human–robot interaction. Springer, pp 13–33

  82. Kishi T, Futaki H, Trovato G, Endo N, Destephe M, Cosentino S, Hashimoto K, Takanishi A (2014) Development of a comic mark based expressive robotic head adapted to Japanese cultural background. In: IEEE/RSJ international conference on intelligent robots and systems, pp 2608–2613. https://doi.org/10.1109/IROS.2014.6942918

  83. Kishi T, Otani T, Endo N, Kryczka P, Hashimoto K, Nakata K, Takanishi A (2012) Development of expressive robotic head for bipedal humanoid robot. In: IEEE/RSJ international conference on intelligent robots and systems, pp 4584–4589. https://doi.org/10.1109/IROS.2012.6386050

  84. Klamer T, Allouch SB (2010) Zoomorphic robots used by elderly people at home. In: Proceedings of 27th international conference on human factors in computing systems

  85. Kristoffersson A, Coradeschi S, Loutfi A (2013) A review of mobile robotic telepresence. Adv Hum Comput Interact 2013:3


  86. Kühnlenz K, Sosnowski S, Buss M (2010) Impact of animal-like features on emotion expression of robot head EDDIE. Adv Robot 24(8–9):1239–1255


  87. Lee HR, Šabanović S, Chang WL, Nagata S, Piatt J, Bennett C, Hakken D (2017) Steps toward participatory design of social robots: mutual learning with older adults with depression. In: Proceedings of the 2017 ACM/IEEE international conference on human–robot interaction. ACM, pp 244–253

  88. Lee MK, Forlizzi J, Kiesler S, Rybski P, Antanitis J, Savetsila S (2012) Personalization in HRI: a longitudinal field experiment. In: Proceedings of the seventh annual ACM/IEEE international conference on human–robot interaction. ACM, pp 319–326

  89. Leite I, Martinho C, Paiva A (2013) Social robots for long-term interaction: a survey. Int J Soc Robot 5(2):291–308


  90. Li D, Rau PP, Li Y (2010) A cross-cultural study: Effect of robot appearance and task. Int J Soc Robot 2(2):175–186


  91. Linnæus C (1758) Systema naturæ per regna tria naturæ, secundum classes, ordines, genera, species, cum characteribus, differentiis, synonymis, locis. Tomus I. Editio decima, reformata, pp [1–4], 1–824. Holmiæ. (Salvius). http://www.animalbase.uni-goettingen.de/zooweb/servlet/AnimalBase/home/reference?id=4

  92. Lohan KS, Pitsch K, Rohlfing KJ, Fischer K, Saunders J, Lehmann H, Nehaniv C, Wrede B (2011) Contingency allows the robot to spot the tutor and to learn from interaction. In: IEEE international conference on development and learning (ICDL), vol 2. IEEE, pp 1–8

  93. Van der Loos HM, Reinkensmeyer DJ, Guglielmelli E (2016) Rehabilitation and health care robotics. In: Springer handbook of robotics. Springer, pp 1685–1728

  94. Lütkebohle I, Hegel F, Schulz S, Hackel M, Wrede B, Wachsmuth S, Sagerer G (2010) The Bielefeld anthropomorphic robot head “Flobi”. In: IEEE international conference on robotics and automation (ICRA). IEEE, pp 3384–3391

  95. Ma LH, Gilland E, Bass AH, Baker R (2010) Ancestry of motor innervation to pectoral fin and forelimb. Nat Commun 1:49


  96. Malmir M, Forster D, Youngstrom K, Morrison L, Movellan J (2013) Home alone: social robots for digital ethnography of toddler behavior. In: Proceedings of the IEEE international conference on computer vision workshops, pp 762–768

  97. Mathur MB, Reichling DB (2016) Navigating a social world with robot partners: a quantitative cartography of the uncanny valley. Cognition 146:22–32


  98. Matsui Y, Kanoh M, Kato S, Nakamura T, Itoh H (2010) A model for generating facial expressions using virtual emotion based on simple recurrent network. JACIII 14(5):453–463


  99. McGinn C, Cullinan MF, Culleton M, Kelly K (2017) A human-oriented framework for developing assistive service robots. Disabil Rehabil Assist Technol, pp 1–12

  100. McGinn C, Torre I (2019) Can you tell the robot by the voice? An exploratory study on the role of voice in the perception of robots. In: 14th ACM/IEEE international conference on human–robot interaction (HRI). IEEE, pp 211–221

  101. Mehrabian A (1971) Silent messages. Wadsworth, Belmont, CA


  102. Miwa H, Okuchi T, Takanobu H, Takanishi A (2002) Development of a new human-like head robot WE-4. In: IEEE/RSJ international conference on intelligent robots and systems, vol 3. IEEE, pp 2443–2448

  103. Morasso P, Bizzi E, Dichgans J (1973) Adjustment of saccade characteristics during head movements. Exp Brain Res 16(5):492–500


  104. Mori M, MacDorman KF, Kageki N (2012) The uncanny valley [from the field]. IEEE Robot Autom Mag 19(2):98–100. https://doi.org/10.1109/MRA.2012.2192811


  105. Murphy R, Schreckenghost D (2013) Survey of metrics for human–robot interaction. In: Proceedings of the 8th ACM/IEEE international conference on human–robot interaction. IEEE Press, pp 197–198

  106. Murphy RR (2004) Human–robot interaction in rescue robotics. IEEE Trans Syst Man Cybern Part C (Appl Rev) 34(2):138–153


  107. Murray JC, Cañamero L, Hiolle A (2009) Towards a model of emotion expression in an interactive robot head. In: The 18th IEEE international symposium on robot and human interactive communication. ROMAN 2009. IEEE, pp 627–632

  108. Nakata T, Sato T, Mori T, Mizoguchi H (1998) Expression of emotion and intention by robot body movement. In: Proceedings of the 5th international conference on autonomous systems

  109. Niculescu A, van Dijk B, Nijholt A, Li H, See SL (2013) Making social robots more attractive: the effects of voice pitch, humor and empathy. Int J Soc Robot 5(2):171–191


  110. Nielsen J (1993) Iterative user-interface design. Computer 26(11):32–41


  111. Ou LC, Luo MR, Woodcock A, Wright A (2004) A study of colour emotion and colour preference. Part I: Colour emotions for single colours. Color Res Appl 29(3):232–240

  112. Paauwe RA, Keyson DV, Hoorn JF, Konijn EA (2015) Minimal requirements of realism in social robots: designing for patients with acquired brain injury. In: Proceedings of the 33rd annual ACM conference extended abstracts on human factors in computing systems. ACM, pp 2139–2144

  113. Park JJ, Haddadin S, Song JB, Albu-Schäffer A (2011) Designing optimally safe robot surface properties for minimizing the stress characteristics of human–robot collisions. In: IEEE international conference on robotics and automation (ICRA). IEEE, pp 5413–5420

  114. Partala T, Surakka V (2003) Pupil size variation as an indication of affective processing. Int J Hum Comput Stud 59(1):185–198


  115. Peng H, Zhou C, Hu H, Chao F, Li J (2015) Robotic dance in social robotics-a taxonomy. IEEE Trans Hum Mach Syst 45(3):281–293


  116. Pfeifer R, Bongard J (2006) How the body shapes the way we think: a new view of intelligence. MIT Press, Cambridge


  117. Phillips E, Zhao X, Ullman D, Malle BF (2018) What is human-like? Decomposing robots’ human-like appearance using the anthropomorphic robot (abot) database. In: Proceedings of the 2018 ACM/IEEE international conference on human–robot interaction. ACM, pp 105–113

  118. Plutchik R (1980) A general psychoevolutionary theory of emotion. Theor Emot 1(3–31):4


  119. Powers A, Kramer AD, Lim S, Kuo J, Lee SL, Kiesler S (2005) Eliciting information from people with a gendered humanoid robot. In: IEEE international workshop on robot and human interactive communication. ROMAN 2005. IEEE, pp 158–163

  120. Pratt GA, Williamson MM (1995) Series elastic actuators. In: Proceedings. 1995 IEEE/RSJ international conference on intelligent robots and systems 95. ’Human robot interaction and cooperative robots’, vol 1. IEEE, pp 399–406

  121. Rosenthal-von der Pütten AM, Krämer NC, Hoffmann L, Sobieraj S, Eimler SC (2013) An experimental study on emotional reactions towards a robot. Int J Soc Robot 5(1):17–34


  122. Reece JB, Urry LA, Cain ML, Wasserman SA, Minorsky PV, Jackson RB et al (2011) Campbell biology. Pearson, Boston


  123. Rolls BJ, Rowe EA, Rolls ET (1982) How sensory properties of foods affect human feeding behavior. Physiol Behav 29(3):409–417


  124. Rose R, Scheutz M, Schermerhorn P (2010) Towards a conceptual and methodological framework for determining robot believability. Interact Stud 11(2):314–335


  125. Ruesch J, Lopes M, Bernardino A, Hornstein J, Santos-Victor J, Pfeifer R (2008) Multimodal saliency-based bottom-up attention a framework for the humanoid robot iCub. In: IEEE international conference on robotics and automation. ICRA 2008. IEEE, pp 962–967

  126. Salter T, Michaud F, Larouche H (2010) How wild is wild? A taxonomy to characterize the ‘wildness’ of child-robot interaction. Int J Soc Robot 2(4):405–415


  127. Saygin AP, Chaminade T, Ishiguro H, Driver J, Frith C (2011) The thing that should not be: predictive coding and the uncanny valley in perceiving human and humanoid robot actions. Soc Cognit Affect Neurosci 7(4):413–422


  128. Scassellati BM (2001) Foundations for a theory of mind for a humanoid robot. Ph.D. thesis, Massachusetts Institute of Technology

  129. Scherer KR, Oshinsky JS (1977) Cue utilization in emotion attribution from auditory stimuli. Motiv Emot 1(4):331–346


  130. Schulte J, Rosenberg C, Thrun S (1999) Spontaneous, short-term interaction with mobile robots. In: Proceedings of 1999 IEEE international conference on robotics and automation (Cat. No.99CH36288C), vol 1, pp 658–663. https://doi.org/10.1109/ROBOT.1999.770050

  131. Shayganfar M, Rich C, Sidner CL (2012) A design methodology for expressing emotion on robot faces. In: IEEE/RSJ international conference on intelligent robots and systems (IROS). IEEE, pp 4577–4583

  132. Shibata T, Tashima T, Tanie K (1999) Emergence of emotional behavior through physical interaction between human and robot. In: Proceedings of 1999 IEEE international conference on robotics and automation, vol 4. IEEE, pp 2868–2873

  133. Shim J, Arkin RC (2013) A taxonomy of robot deception and its benefits in HRI. In: IEEE international conference on systems, man, and cybernetics (SMC). IEEE, pp 2328–2335

  134. Sloan RJS, Cook M, Robinson B (2009) Considerations for believable emotional facial expression animation. In: Second international conference in visualisation. VIZ’09. IEEE, pp 61–66

  135. Spence C, Levitan CA, Shankar MU, Zampini M (2010) Does food color influence taste and flavor perception in humans? Chemosens Percept 3(1):68–84


  136. Stebbins G (1886) Delsarte system of dramatic expression. ES Werner

  137. Steinert S (2014) The five robots-a taxonomy for roboethics. Int J Soc Robot 6(2):249–260


  138. Steinfeld A (2004) Interface lessons for fully and semi-autonomous mobile robots. In: Proceedings of IEEE international conference on robotics and automation. ICRA’04, vol 3. IEEE, pp 2752–2757

  139. Straub I, Nishio S, Ishiguro H (2010) Incorporated identity in interaction with a teleoperated android robot: a case study. In: ROMAN. IEEE, pp 119–124

  140. Sugano S, Ogata T (1996) Emergence of mind in robots for human interface-research methodology and robot model. In: Proceedings of IEEE international conference on robotics and automation, vol 2. IEEE, pp 1191–1198

  141. Syrdal DS, Dautenhahn K, Woods S, Walters ML, Koay KL (2006) ‘Doing the right thing wrong’: personality and tolerance to uncomfortable robot approaches. In: The 15th IEEE international symposium on robot and human interactive communication. ROMAN 2006, pp 183–188. https://doi.org/10.1109/ROMAN.2006.314415

  142. Takayama L, Pantofaru C (2009) Influences on proxemic behaviors in human–robot interaction. In: IEEE/RSJ international conference on intelligent robots and systems. IROS 2009. IEEE, pp 5495–5502

  143. Tapus A, Matarić MJ, Scassellati B (2007) The grand challenges in socially assistive robotics. IEEE Robot Autom Mag 14(1):1–7


  144. Tay B, Jung Y, Park T (2014) When stereotypes meet robots: the double-edge sword of robot gender and personality in human-robot interaction. Comput Hum Behav 38:75–84


  145. Terada K, Yamauchi A, Ito A (2012) Artificial emotion expression for a robot by dynamic color change. In: ROMAN. IEEE, pp 314–321

  146. The Building Regulations (2010) Part K: protection from falling, collision and impact. https://www.gov.uk/government/publications/protection-from-falling-collision-and-impact-approved-document-k. Accessed 6 June 2019

  147. Thomas F, Johnston O, Thomas F (1995) The illusion of life: Disney animation. Hyperion, New York


  148. Villani L, De Schutter J (2016) Force control. In: Springer handbook of robotics. Springer, pp 195–220

  149. Walters ML, Syrdal DS, Dautenhahn K, te Boekhorst R, Koay KL (2008) Avoiding the uncanny valley: robot appearance, personality and consistency of behavior in an attention-seeking home scenario for a robot companion. Auton Robot 24(2):159–178. https://doi.org/10.1007/s10514-007-9058-3


  150. Wittig S, Rätsch M, Kloos U (2015) Parameterized facial animation for socially interactive robots. In: Mensch and computer, pp 355–358

  151. Woods S, Dautenhahn K, Kaouri C, Boekhorst RT, Koay KL, Walters ML (2007) Are robots like people? Relationships between participant and robot personality traits in human-robot interaction studies. Interact Stud 8(2):281–305. https://doi.org/10.1075/is.8.2.06woo


  152. Yaffe P (2011) The 7% rule: fact, fiction, or misunderstanding. Ubiquity 2011:1


  153. Yamazaki A, Yamazaki K, Kuno Y, Burdelski M, Kawashima M, Kuzuoka H (2008) Precision timing in human–robot interaction: coordination of head movement and utterance. In: Proceedings of the SIGCHI conference on human factors in computing systems. ACM, pp 131–140

  154. Yanco HA, Drury J (2004) Classifying human–robot interaction: an updated taxonomy. In: IEEE international conference on systems, man and cybernetics, vol 3. IEEE, pp 2841–2846

  155. Yanco HA, Drury JL (2002) A taxonomy for human–robot interaction. In: Proceedings of the AAAI fall symposium on human–robot interaction, pp 111–119

  156. Yoganandan N, Pintar FA, Zhang J, Baisden JL (2009) Physical properties of the human head: mass, center of gravity and moment of inertia. J Biomech 42(9):1177–1192


Download references

Author information


Corresponding author

Correspondence to Conor McGinn.

Ethics declarations

Conflict of interest

The author declares that he has no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

McGinn, C. Why Do Robots Need a Head? The Role of Social Interfaces on Service Robots. Int J of Soc Robotics 12, 281–295 (2020). https://doi.org/10.1007/s12369-019-00564-5

