Abstract
Developing effective ways for robots to communicate with humans presents significant design challenges and requires consideration of a wide range of factors. To facilitate communication between people and machines, robots are commonly given head-like features capable of providing social feedback through facial expressions, attention, gaze, and related cues. This paper explores the multifaceted roles that robotic head-like interfaces play in human–robot interaction. The research makes two main contributions. First, it outlines the motivations for using social interfaces on service robots and reviews key design insights from past studies in the field. Second, it proposes a taxonomy for classifying robot heads. This taxonomy is broadly useful to designers, as it gives structure to a large, disorganised design space.
Notes
The logic behind this hypothesis may not hold for features that once evolved to serve a purpose but have since become redundant. For example, humans possess auricular muscles in the ear, which originally evolved to help detect predators and prey but no longer serve this purpose, since humans have lost the ability to control them [61].
The exact number of basic emotions remains a topic of debate among researchers—see [3] for a summary of the most prominent theories on the subject.
References
International Organization for Standardization (2012) Robots and robotic devices–vocabulary. ISO 8373:2012
International Organization for Standardization (2014) Robots and robotic devices–safety requirements for personal care robots. ISO 13482:2014
Basic Emotions (2016). http://changingminds.org/explanations/emotions/basic%20emotions.htm. Accessed 30 Sept 2010
Adamides G, Christou G, Katsanos C, Xenos M, Hadzilacos T (2015) Usability guidelines for the design of robot teleoperation: a taxonomy. IEEE Trans Hum Mach Syst 45(2):256–262
Álvarez M, Galán R, Matía F, Rodríguez-Losada D, Jiménez A (2010) An emotional model for a guide robot. IEEE Trans Syst Man Cybern Part A Syst Hum 40(5):982–992
Aly A, Griffiths S, Stramandinoli F (2017) Metrics and benchmarks in human-robot interaction: recent advances in cognitive robotics. Cognit Syst Res 43:313–323
Azenkot S, Feng C, Cakmak M (2016) Enabling building service robots to guide blind people: a participatory design approach. In: The eleventh ACM/IEEE international conference on human robot interaction. IEEE Press, pp 3–10
Azuma J, Ebner M (2008) A stylistic analysis of graphic emoticons: can they be candidates for a universal visual language of the future. In: Proceedings of world conference on educational multimedia, hypermedia and telecommunications, pp 972–979
Bartneck C, Lyons MJ (2009) Facial expression analysis, modeling and synthesis: overcoming the limitations of artificial intelligence with the art of the soluble. In: Handbook of research on synthetic emotions and sociable robotics: new applications in affective computing and artificial intelligence. IGI Global, pp 34–55
Bartneck C, Yogeeswaran K, Ser QM, Woodward G, Sparrow R, Wang S, Eyssel F (2018) Robots and racism. In: Proceedings of the 2018 ACM/IEEE international conference on human–robot interaction. ACM, pp 196–204
Bates J et al (1994) The role of emotion in believable agents. Commun ACM 37(7):122–125
Beck A, Hiolle A, Canamero L (2013) Using Perlin noise to generate emotional expressions in a robot. In: CogSci
Berns K, Braun T (2005) Design concept of a human-like robot head. In: 5th IEEE-RAS international conference on humanoid robots. IEEE, pp 32–37
Bradley MM, Miccoli L, Escrig MA, Lang PJ (2008) The pupil as a measure of emotional arousal and autonomic activation. Psychophysiology 45(4):602–607
Breazeal C (2003) Toward sociable robots. Robot Auton Syst 42(3–4):167–175
Breazeal C (2004) Social interactions in HRI: the robot view. IEEE Trans Syst Man Cybern Part C (Appl Rev) 34(2):181–186
Breazeal C, Brooks A, Gray J, Hoffman G, Kidd C, Lee H, Lieberman J, Lockerd A, Chilongo D (2004) Tutelage and collaboration for humanoid robots. Int J Humanoid Robot 1(02):315–348
Breazeal C, Edsinger A, Fitzpatrick P, Scassellati B (2001) Active vision for sociable robots. IEEE Trans Syst Man Cybern Part A Syst Hum 31(5):443–453
Breazeal C, Scassellati B (1999) A context-dependent attention system for a social robot. In: Proceedings of the 16th international joint conference on artificial intelligence. IJCAI’99, vol 2. Morgan Kaufmann Publishers Inc., San Francisco, pp 1146–1151
van Breemen A, Yan X, Meerbeek B (2005) iCat: an animated user-interface robot with personality. In: Proceedings of the fourth international joint conference on autonomous agents and multiagent systems. ACM, pp 143–144
Brooks RA (1991) Intelligence without representation. Artif Intell 47(1–3):139–159
Brooks RA et al (1991) Intelligence without reason. In: IJCAI-91, pp 569–595
Bruce A, Nourbakhsh I, Simmons R (2002) The role of expressiveness and attention in human–robot interaction. In: Proceedings of IEEE international conference on robotics and automation (Cat. No. 02CH37292), vol 4, pp 4138–4142. https://doi.org/10.1109/ROBOT.2002.1014396
Bruce V (1996) The role of the face in communication: implications for videophone design. Interact Comput 8(2):166–176. https://doi.org/10.1016/0953-5438(96)01026-0
Buchanan R (1992) Wicked problems in design thinking. Des Issues 8(2):5–21
Burkhardt F, Campbell N (2015) Emotional speech synthesis. In: Calvo RA, D’Mello S, Gratch J, Kappas A (eds) The oxford handbook of affective computing, chap 20. Oxford University Press. https://doi.org/10.1093/oxfordhb/9780199942237.013.038
Calvo MG, Fernández-Martín A, Nummenmaa L (2014) Facial expression recognition in peripheral versus central vision: role of the eyes and the mouth. Psychol Res 78(2):180–195
Cañamero L, Fredslund J (2001) I show you how I like you - can you read it in my face? [Robotics]. IEEE Trans Syst Man Cybern Part A Syst Hum 31(5):454–459
Cheetham M, Suter P, Jäncke L (2011) The human likeness dimension of the “uncanny valley hypothesis”: behavioral and functional MRI findings. Front Hum Neurosci 5:126
Chou CP, Hannaford B (1996) Measurement and modeling of McKibben pneumatic artificial muscles. IEEE Trans Robot Autom 12(1):90–102
Collins EC, Prescott TJ, Mitchinson B (2015) Saying it with light: a pilot study of affective communication using the MiRo robot. In: Conference on biomimetic and biohybrid systems. Springer, pp 243–255
Coradeschi S, Kristoffersson A, Loutfi A, Von Rump S, Cesta A, Cortellessa G, Gonzalez J (2011) Towards a methodology for longitudinal evaluation of social robotic telepresence for elderly. In: Human robot interaction
Curtis A, Shim J, Gargas E, Srinivasan A, Howard AM (2011) Dance dance pleo: developing a low-cost learning robotic dance therapy aid. In: Proceedings of the 10th international conference on interaction design and children. ACM, pp 149–152
Darwin C (1998) The expression of the emotions in man and animals. Oxford University Press, Oxford
Dautenhahn K (1998) The art of designing socially intelligent agents: science, fiction, and the human in the loop. Appl Artif Intell 12(7–8):573–617. https://doi.org/10.1080/088395198117550
Dautenhahn K (1999) Socially intelligent agents and the primate social brain: towards a science of social minds. Adapt Behav 7(3–4):3–4
Dautenhahn K, Woods S, Kaouri C, Walters ML, Koay KL, Werry I (2005) What is a robot companion - friend, assistant or butler? In: IEEE/RSJ international conference on intelligent robots and systems, pp 1192–1197. https://doi.org/10.1109/IROS.2005.1545189
De Gelder B (2009) Why bodies? twelve reasons for including bodily expressions in affective neuroscience. Philos Trans R Soc Lond B Biol Sci 364(1535):3475–3484
De Graaf MM, Allouch SB (2013) Exploring influencing variables for the acceptance of social robots. Robot Auton Syst 61(12):1476–1486
De Santis A, Siciliano B, De Luca A, Bicchi A (2008) An atlas of physical human-robot interaction. Mech Mach Theory 43(3):253–270
Dennett DC (1971) Intentional systems. J Philos 68(4):87–106
DiSalvo C, Louw M, Holstius D, Nourbakhsh I, Akin A (2012) Toward a public rhetoric through participatory design: critical engagements and creative expression in the neighborhood networks project. Des Issues 28(3):48–61
DiSalvo CF, Gemperle F, Forlizzi J, Kiesler S (2002) All robots are not created equal: the design and perception of humanoid robot heads. In: Proceedings of the 4th conference on designing interactive systems: processes, practices, methods, and techniques. ACM, pp 321–326
Duchenne GB (1876) Mécanisme de la physionomie humaine, ou analyse électro-physiologique de l’expression des passions. J.-B. Baillière, Paris
Duffy BR (2003) Anthropomorphism and the social robot. Robot Auton Syst 42(3–4):177–190. https://doi.org/10.1016/S0921-8890(02)00374-3
Ekman P (1993) Facial expression and emotion. Am Psychol 48(4):384
Ekman P (1977) Facial action coding system
Elprama SA, Jewell CI, Jacobs A, El Makrini I, Vanderborght B (2017) Attitudes of factory workers towards industrial and collaborative robots. In: Proceedings of the companion of the 2017 ACM/IEEE international conference on human–robot interaction. ACM, pp 113–114
Erden MS, Tapus A (2010) Postural expressions of emotions in a humanoid robot for assistive applications. In: Poster paper in workshop on learning for human–robot interaction modeling under the conference of robotics science and systems-RSS, pp 27–30
Feil-Seifer D, Matarić MJ (2011) Socially assistive robotics. IEEE Robot Autom Mag 18(1):24–31
Fink J (2012) Anthropomorphism and human likeness in the design of robots and human–robot interaction. Springer, Berlin, pp 199–208. https://doi.org/10.1007/978-3-642-34103-8_20
Fong T, Nourbakhsh I, Dautenhahn K (2003) A survey of socially interactive robots. Robot Auton Syst 42(3–4):143–166. https://doi.org/10.1016/S0921-8890(02)00372-X
Forlizzi J, DiSalvo C, Gemperle F (2004) Assistive robotics and an ecology of elders living independently in their homes. Hum Comput Interact 19(1):25–59
Förster F, Weiss A, Tscheligi M (2011) Anthropomorphic design for an interactive urban robot: the right design approach. In: Proceedings of the 6th international conference on human–robot interaction. ACM, pp 137–138
Frauenberger C, Makhaeva J, Spiel K (2017) Blending methods: developing participatory design sessions for autistic children. In: Proceedings of the 2017 conference on interaction design and children. ACM, pp 39–49
Fujita M (2004) On activating human communications with pet-type robot AIBO. Proc IEEE 92(11):1804–1813
Grabiner JV (1986) Computers and the nature of man: a historian’s perspective on controversies about artificial intelligence. Bull Am Math Soc. https://doi.org/10.1090/S0273-0979-1986-15461-3
Graf B, Reiser U, Hägele M, Mauz K, Klein P (2009) Robotic home assistant Care-O-bot® 3: product vision and innovation platform. In: IEEE workshop on advanced robotics and its social impacts. IEEE, pp 139–144
Ekman P (1999) Basic emotions. In: Dalgleish T, Power M (eds) Handbook of cognition and emotion, chap 3. Wiley, Hoboken, pp 45–60
Green A, Huttenrauch H, Norman M, Oestreicher L, Severinson Eklundh K (2000) User centered design for intelligent service robots. In: Proceedings of 9th IEEE international workshop on robot and human interactive communication. ROMAN 2000, pp 161–166. https://doi.org/10.1109/ROMAN.2000.892488
Hackley SA (2015) Evidence for a vestigial pinna-orienting system in humans. Psychophysiology 52(10):1263–1270
Halterman MW (2005) Emotions. In: Neuroscience, 3rd edn. Sinauer Associates, pp 687–711
Hanson D, Olney A, Prilliman S, Mathews E, Zielke M, Hammons D, Fernandez R, Stephanou H (2005) Upending the uncanny valley. AAAI 5:1728–1729
Häring M, Bee N, André E (2011) Creation and evaluation of emotion expression with body movement, sound and eye color for humanoid robots. In: ROMAN. IEEE, pp 204–209
Hess EH, Polt JM (1960) Pupil size as related to interest value of visual stimuli. Science 132(3423):349–350
Hinds PJ, Roberts TL, Jones H (2004) Whose job is it anyway? A study of human-robot interaction in a collaborative task. Hum Comput Interact 19(1):151–181. https://doi.org/10.1207/s15327051hci1901&2_7
Hirsch L, Björsell A, Laaksoharju M, Obaid M (2017) Investigating design implications towards a social robot as a memory trainer. In: Proceedings of the 5th international conference on human agent interaction. ACM, pp 5–10
Hjortsjö CH (1969) Man’s face and mimic language. Studentlitteratur, Lund
Hoffman G (2011) On stage: robots as performers. In: RSS 2011 workshop on human–robot interaction: perspectives and contributions to robotics from the human sciences, vol 1, Los Angeles
Hornung A, Phillips M, Jones EG, Bennewitz M, Likhachev M, Chitta S (2012) Navigation in three-dimensional cluttered environments for mobile manipulation. In: IEEE international conference on robotics and automation (ICRA). IEEE, pp 423–429
Humphrey N (1976) The colour currency of nature. Colour for architecture, pp 95–98
Huttenrauch H, Green A, Norman M, Oestreicher L, Eklundh KS (2004) Involving users in the design of a mobile office robot. IEEE Trans Syst Man Cybern Part C (Appl Rev) 34(2):113–124. https://doi.org/10.1109/TSMCC.2004.826281
International Federation of Robotics (IFR): service robots (2014). http://www.ifr.org/service-robots/. Accessed 6 June 2019
Joosse M, Lohse M, Evers V (2015) Crowdsourcing culture in HRI: lessons learned from quantitative and qualitative data collections. In: 3rd International workshop on culture aware robotics at ICSR, vol 15
Joosse M, Lohse M, Pérez JG, Evers V (2013) What you do is who you are: the role of task context in perceived social robot personality. In: IEEE international conference on robotics and automation (ICRA). IEEE, pp 2134–2139
Kang E, Jackson E, Schulte W (2010) An approach for effective design space exploration. In: Monterey workshop. Springer, pp 33–54
Kidd CD, Taggart W, Turkle S (2006) A sociable robot to encourage social interaction among the elderly. In: Proceedings of IEEE international conference on robotics and automation, ICRA 2006. IEEE, pp 3972–3976
Kiesler S (2005) Fostering common ground in human–robot interaction. In: IEEE international workshop on robot and human interactive communication, ROMAN. pp 729–734. https://doi.org/10.1109/ROMAN.2005.1513866
Kiesler S, Goetz J (2002) Mental models and cooperation with robotic assistants. In: Proceedings of conference on human factors in computing systems. ACM Press, pp 576–577
Kim ES, Paul R, Shic F, Scassellati B (2012) Bridging the research gap: making HRI useful to individuals with autism. J Hum Robot Interact 1(1):26–54
Kim M, Oh K, Choi J, Jung J, Kim Y (2011) User-centered HRI: HRI research methodology for designers. In: Mixed reality and human–robot interaction. Springer, pp 13–33
Kishi T, Futaki H, Trovato G, Endo N, Destephe M, Cosentino S, Hashimoto K, Takanishi A (2014) Development of a comic mark based expressive robotic head adapted to Japanese cultural background. In: IEEE/RSJ international conference on intelligent robots and systems, pp 2608–2613. https://doi.org/10.1109/IROS.2014.6942918
Kishi T, Otani T, Endo N, Kryczka P, Hashimoto K, Nakata K, Takanishi A (2012) Development of expressive robotic head for bipedal humanoid robot. In: IEEE/RSJ international conference on intelligent robots and systems, pp 4584–4589. https://doi.org/10.1109/IROS.2012.6386050
Klamer T, Allouch SB (2010) Zoomorphic robots used by elderly people at home. In: Proceedings of 27th international conference on human factors in computing systems
Kristoffersson A, Coradeschi S, Loutfi A (2013) A review of mobile robotic telepresence. Adv Hum Comput Interact 2013:3
Kühnlenz K, Sosnowski S, Buss M (2010) Impact of animal-like features on emotion expression of robot head EDDIE. Adv Robot 24(8–9):1239–1255
Lee HR, Šabanović S, Chang WL, Nagata S, Piatt J, Bennett C, Hakken D (2017) Steps toward participatory design of social robots: mutual learning with older adults with depression. In: Proceedings of the 2017 ACM/IEEE international conference on human–robot interaction. ACM, pp 244–253
Lee MK, Forlizzi J, Kiesler S, Rybski P, Antanitis J, Savetsila S (2012) Personalization in HRI: a longitudinal field experiment. In: Proceedings of the seventh annual ACM/IEEE international conference on human–robot interaction. ACM, pp 319–326
Leite I, Martinho C, Paiva A (2013) Social robots for long-term interaction: a survey. Int J Soc Robot 5(2):291–308
Li D, Rau PP, Li Y (2010) A cross-cultural study: effect of robot appearance and task. Int J Soc Robot 2(2):175–186
Linnæus C (1758) Systema naturæ per regna tria naturæ, secundum classes, ordines, genera, species, cum characteribus, differentiis, synonymis, locis. Tomus I. Editio decima, reformata, pp [1–4], 1–824. Holmiæ. (Salvius). http://www.animalbase.uni-goettingen.de/zooweb/servlet/AnimalBase/home/reference?id=4
Lohan KS, Pitsch K, Rohlfing KJ, Fischer K, Saunders J, Lehmann H, Nehaniv C, Wrede B (2011) Contingency allows the robot to spot the tutor and to learn from interaction. In: IEEE international conference on development and learning (ICDL), vol 2. IEEE, pp 1–8
Van der Loos HM, Reinkensmeyer DJ, Guglielmelli E (2016) Rehabilitation and health care robotics. In: Springer handbook of robotics. Springer, pp 1685–1728
Lütkebohle I, Hegel F, Schulz S, Hackel M, Wrede B, Wachsmuth S, Sagerer G (2010) The Bielefeld anthropomorphic robot head “Flobi”. In: IEEE international conference on robotics and automation (ICRA). IEEE, pp 3384–3391
Ma LH, Gilland E, Bass AH, Baker R (2010) Ancestry of motor innervation to pectoral fin and forelimb. Nat Commun 1:49
Malmir M, Forster D, Youngstrom K, Morrison L, Movellan J (2013) Home alone: social robots for digital ethnography of toddler behavior. In: Proceedings of the IEEE international conference on computer vision workshops, pp 762–768
Mathur MB, Reichling DB (2016) Navigating a social world with robot partners: a quantitative cartography of the uncanny valley. Cognition 146:22–32
Matsui Y, Kanoh M, Kato S, Nakamura T, Itoh H (2010) A model for generating facial expressions using virtual emotion based on simple recurrent network. JACIII 14(5):453–463
McGinn C, Cullinan MF, Culleton M, Kelly K (2017) A human-oriented framework for developing assistive service robots. Disabil Rehabil Assist Technol, pp 1–12
McGinn C, Torre I (2019) Can you tell the robot by the voice? An exploratory study on the role of voice in the perception of robots. In: 14th ACM/IEEE international conference on human–robot interaction (HRI). IEEE, pp 211–221
Mehrabian A (1971) Silent messages. Wadsworth, Belmont, CA
Miwa H, Okuchi T, Takanobu H, Takanishi A (2002) Development of a new human-like head robot WE-4. In: IEEE/RSJ international conference on intelligent robots and systems, vol 3. IEEE, pp 2443–2448
Morasso P, Bizzi E, Dichgans J (1973) Adjustment of saccade characteristics during head movements. Exp Brain Res 16(5):492–500
Mori M, MacDorman KF, Kageki N (2012) The uncanny valley [from the field]. IEEE Robot Autom Mag 19(2):98–100. https://doi.org/10.1109/MRA.2012.2192811
Murphy R, Schreckenghost D (2013) Survey of metrics for human–robot interaction. In: Proceedings of the 8th ACM/IEEE international conference on human–robot interaction. IEEE Press, pp 197–198
Murphy RR (2004) Human–robot interaction in rescue robotics. IEEE Trans Syst Man Cybern Part C (Appl Rev) 34(2):138–153
Murray JC, Cañamero L, Hiolle A (2009) Towards a model of emotion expression in an interactive robot head. In: The 18th IEEE international symposium on robot and human interactive communication. ROMAN 2009. IEEE, pp 627–632
Nakata T, Sato T, Mori T, Mizoguchi H (1998) Expression of emotion and intention by robot body movement. In: Proceedings of the 5th international conference on autonomous systems
Niculescu A, van Dijk B, Nijholt A, Li H, See SL (2013) Making social robots more attractive: the effects of voice pitch, humor and empathy. Int J Soc Robot 5(2):171–191
Nielsen J (1993) Iterative user-interface design. Computer 26(11):32–41
Ou LC, Luo MR, Woodcock A, Wright A (2004) A study of colour emotion and colour preference. Part I: Colour emotions for single colours. Color Res Appl 29(3):232–240
Paauwe RA, Keyson DV, Hoorn JF, Konijn EA (2015) Minimal requirements of realism in social robots: designing for patients with acquired brain injury. In: Proceedings of the 33rd annual ACM conference extended abstracts on human factors in computing systems. ACM, pp 2139–2144
Park JJ, Haddadin S, Song JB, Albu-Schäffer A (2011) Designing optimally safe robot surface properties for minimizing the stress characteristics of human–robot collisions. In: IEEE international conference on robotics and automation (ICRA). IEEE, pp 5413–5420
Partala T, Surakka V (2003) Pupil size variation as an indication of affective processing. Int J Hum Comput Stud 59(1):185–198
Peng H, Zhou C, Hu H, Chao F, Li J (2015) Robotic dance in social robotics: a taxonomy. IEEE Trans Hum Mach Syst 45(3):281–293
Pfeifer R, Bongard J (2006) How the body shapes the way we think: a new view of intelligence. MIT Press, Cambridge
Phillips E, Zhao X, Ullman D, Malle BF (2018) What is human-like? Decomposing robots’ human-like appearance using the anthropomorphic robot (abot) database. In: Proceedings of the 2018 ACM/IEEE international conference on human–robot interaction. ACM, pp 105–113
Plutchik R (1980) A general psychoevolutionary theory of emotion. Theor Emot 1(3–31):4
Powers A, Kramer AD, Lim S, Kuo J, Lee SL, Kiesler S (2005) Eliciting information from people with a gendered humanoid robot. In: IEEE international workshop on robot and human interactive communication. ROMAN 2005. IEEE, pp 158–163
Pratt GA, Williamson MM (1995) Series elastic actuators. In: Proceedings. 1995 IEEE/RSJ international conference on intelligent robots and systems 95. ’Human robot interaction and cooperative robots’, vol 1. IEEE, pp 399–406
Rosenthal-von der Pütten AM, Krämer NC, Hoffmann L, Sobieraj S, Eimler SC (2013) An experimental study on emotional reactions towards a robot. Int J Soc Robot 5(1):17–34
Reece JB, Urry LA, Cain ML, Wasserman SA, Minorsky PV, Jackson RB et al (2011) Campbell biology. Pearson, Boston
Rolls BJ, Rowe EA, Rolls ET (1982) How sensory properties of foods affect human feeding behavior. Physiol Behav 29(3):409–417
Rose R, Scheutz M, Schermerhorn P (2010) Towards a conceptual and methodological framework for determining robot believability. Interact Stud 11(2):314–335
Ruesch J, Lopes M, Bernardino A, Hornstein J, Santos-Victor J, Pfeifer R (2008) Multimodal saliency-based bottom-up attention: a framework for the humanoid robot iCub. In: IEEE international conference on robotics and automation. ICRA 2008. IEEE, pp 962–967
Salter T, Michaud F, Larouche H (2010) How wild is wild? A taxonomy to characterize the ‘wildness’ of child-robot interaction. Int J Soc Robot 2(4):405–415
Saygin AP, Chaminade T, Ishiguro H, Driver J, Frith C (2011) The thing that should not be: predictive coding and the uncanny valley in perceiving human and humanoid robot actions. Soc Cognit Affect Neurosci 7(4):413–422
Scassellati BM (2001) Foundations for a theory of mind for a humanoid robot. Ph.D. thesis, Massachusetts Institute of Technology
Scherer KR, Oshinsky JS (1977) Cue utilization in emotion attribution from auditory stimuli. Motiv Emot 1(4):331–346
Schulte J, Rosenberg C, Thrun S (1999) Spontaneous, short-term interaction with mobile robots. In: Proceedings of 1999 IEEE international conference on robotics and automation (Cat. No.99CH36288C), vol 1, pp 658–663. https://doi.org/10.1109/ROBOT.1999.770050
Shayganfar M, Rich C, Sidner CL (2012) A design methodology for expressing emotion on robot faces. In: IEEE/RSJ international conference on intelligent robots and systems (IROS). IEEE, pp 4577–4583
Shibata T, Tashima T, Tanie K (1999) Emergence of emotional behavior through physical interaction between human and robot. In: Proceedings of 1999 IEEE international conference on robotics and automation, vol 4. IEEE, pp 2868–2873
Shim J, Arkin RC (2013) A taxonomy of robot deception and its benefits in HRI. In: IEEE international conference on systems, man, and cybernetics (SMC). IEEE, pp 2328–2335
Sloan RJS, Cook M, Robinson B (2009) Considerations for believable emotional facial expression animation. In: Second international conference in visualisation. VIZ’09. IEEE, pp 61–66
Spence C, Levitan CA, Shankar MU, Zampini M (2010) Does food color influence taste and flavor perception in humans? Chemosens Percept 3(1):68–84
Stebbins G (1886) Delsarte system of dramatic expression. ES Werner
Steinert S (2014) The five robots: a taxonomy for roboethics. Int J Soc Robot 6(2):249–260
Steinfeld A (2004) Interface lessons for fully and semi-autonomous mobile robots. In: Proceedings of IEEE international conference on robotics and automation. ICRA’04, vol 3. IEEE, pp 2752–2757
Straub I, Nishio S, Ishiguro H (2010) Incorporated identity in interaction with a teleoperated android robot: a case study. In: ROMAN. IEEE, pp 119–124
Sugano S, Ogata T (1996) Emergence of mind in robots for human interface-research methodology and robot model. In: Proceedings of IEEE international conference on robotics and automation, vol 2. IEEE, pp 1191–1198
Syrdal DS, Dautenhahn K, Woods S, Walters ML, Koay KL (2006) ‘Doing the right thing wrong’: personality and tolerance to uncomfortable robot approaches. In: The 15th IEEE international symposium on robot and human interactive communication. ROMAN 2006, pp 183–188. https://doi.org/10.1109/ROMAN.2006.314415
Takayama L, Pantofaru C (2009) Influences on proxemic behaviors in human–robot interaction. In: IEEE/RSJ international conference on intelligent robots and systems. IROS 2009. IEEE, pp 5495–5502
Tapus A, Matarić MJ, Scassellati B (2007) The grand challenges in socially assistive robotics. IEEE Robot Autom Mag 14(1):1–7
Tay B, Jung Y, Park T (2014) When stereotypes meet robots: the double-edge sword of robot gender and personality in human-robot interaction. Comput Hum Behav 38:75–84
Terada K, Yamauchi A, Ito A (2012) Artificial emotion expression for a robot by dynamic color change. In: ROMAN. IEEE, pp 314–321
The Building Regulations (2010) Approved Document K: protection from falling, collision and impact. https://www.gov.uk/government/publications/protection-from-falling-collision-and-impact-approved-document-k. Accessed 6 June 2019
Thomas F, Johnston O (1995) The illusion of life: Disney animation. Hyperion, New York
Villani L, De Schutter J (2016) Force control. In: Springer handbook of robotics. Springer, pp 195–220
Walters ML, Syrdal DS, Dautenhahn K, te Boekhorst R, Koay KL (2008) Avoiding the uncanny valley: robot appearance, personality and consistency of behavior in an attention-seeking home scenario for a robot companion. Auton Robot 24(2):159–178. https://doi.org/10.1007/s10514-007-9058-3
Wittig S, Rätsch M, Kloos U (2015) Parameterized facial animation for socially interactive robots. In: Mensch and computer, pp 355–358
Woods S, Dautenhahn K, Kaouri C, Boekhorst RT, Koay KL, Walters ML (2007) Are robots like people? Relationships between participant and robot personality traits in human-robot interaction studies. Interact Stud 8(2):281–305. https://doi.org/10.1075/is.8.2.06woo
Yaffe P (2011) The 7% rule: fact, fiction, or misunderstanding. Ubiquity 2011:1
Yamazaki A, Yamazaki K, Kuno Y, Burdelski M, Kawashima M, Kuzuoka H (2008) Precision timing in human–robot interaction: coordination of head movement and utterance. In: Proceedings of the SIGCHI conference on human factors in computing systems. ACM, pp 131–140
Yanco HA, Drury J (2004) Classifying human–robot interaction: an updated taxonomy. In: IEEE international conference on systems, man and cybernetics, vol 3. IEEE, pp 2841–2846
Yanco HA, Drury JL (2002) A taxonomy for human–robot interaction. In: Proceedings of the AAAI fall symposium on human–robot interaction, pp 111–119
Yoganandan N, Pintar FA, Zhang J, Baisden JL (2009) Physical properties of the human head: mass, center of gravity and moment of inertia. J Biomech 42(9):1177–1192
Ethics declarations
Conflict of interest
The author declares that he has no conflict of interest.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
McGinn, C. Why Do Robots Need a Head? The Role of Social Interfaces on Service Robots. Int J of Soc Robotics 12, 281–295 (2020). https://doi.org/10.1007/s12369-019-00564-5