
Quantifying the Human Likeness of a Humanoid Robot

Published in: International Journal of Social Robotics

Abstract

In human-robot interaction research, the human likeness (HL) of robots is frequently used as a single, vague parameter to describe how a robot is perceived by a human. However, such a simplification of HL is often not sufficient given the complexity and multidimensionality of human-robot interaction. Therefore, HL must be seen as a variable influenced by a network of parameter fields. The first goal of this paper is to introduce such a network, which systematically characterizes all relevant aspects of HL. The network is subdivided into ten parameter fields, five describing static aspects of appearance and five describing dynamic aspects of behavior. The second goal of this paper is to propose a methodology to quantify the impact of single or multiple parameters from these fields on perceived HL. Prior to quantification, the minimal perceivable difference, i.e. the threshold of perception, is determined for the parameters of interest in a first experiment. Thereafter, these parameters are modified in whole-number multiples of the threshold of perception to investigate their influence on perceived HL in a second experiment. The methodology is illustrated on the parameters speed and sequencing (onset of joint movements) from the parameter field movement, as well as on the parameter sound. Results revealed that perceived HL is more sensitive to changes in sequencing than to changes in speed. The sound of the motors during movement also reduced perceived HL. The presented methodology should guide further systematic exploration of the proposed network of HL parameters in order to determine and optimize the acceptance of humanoid robots.


References

1. Pfeifer R, Meili M (2011) Soft robots are the future. Tages-Anzeiger, April 30 (in German)

2. Hashimoto T, Kato N, Kobayashi H (2010) Study on educational application of android robot SAYA: field trial and evaluation at elementary school. In: Intelligent robotics and applications. Lecture notes in computer science, vol 6425. Springer, Berlin, pp 505–516

3. Hanson D, Olney A, Prilliman S, Mathews E, Zielke M, Hammons D, Fernandez R, Stephanou H (2005) Upending the uncanny valley. In: Proceedings of the national conference on artificial intelligence, vol 20, p 1728

4. Bartneck C, Kanda T, Ishiguro H, Hagita N (2009) My robotic doppelgaenger—a critical look at the uncanny valley. In: The 18th IEEE international symposium on robot and human interactive communication (ROMAN 2009). IEEE Press, New York, pp 269–276

5. Mori M (1970) The uncanny valley. Energy 7(4):33–35. Translated by KF MacDorman and T Minato

6. MacDorman KF, Green RD, Ho C-C, Koch C (2009) Too real for comfort: uncanny responses to computer generated faces. Comput Hum Behav 25(3):695–710

7. MacDorman KF (2005) Androids as an experimental apparatus: why is there an uncanny valley and can we exploit it? In: CogSci-2005 workshop: toward social mechanisms of android science, pp 106–118

8. Walters ML, Syrdal DS, Dautenhahn K, te Boekhorst R, Koay KL (2008) Avoiding the uncanny valley: robot appearance, personality and consistency of behavior in an attention-seeking home scenario for a robot companion. Auton Robots 24(2):159–178

9. Kanda T, Miyashita T, Osada T, Haikawa Y, Ishiguro H (2008) Analysis of humanoid appearances in human-robot interaction. IEEE Trans Robot 24(3):725–735

10. Kupferberg A, Glasauer S, Huber M, Rickert M, Knoll A, Brandt T (2011) Biological movement increases acceptance of humanoid robots as human partners in motor interaction. AI & Soc 26:339–345

11. Blow M, Dautenhahn K, Appleby A, Nehaniv CL, Lee DC (2006) Perception of robot smiles and dimensions for human-robot interaction design. In: The 15th IEEE international symposium on robot and human interactive communication (ROMAN 2006). IEEE Press, New York, pp 469–474

12. Mitchell WJ, Szerszen KA, Lu AS, Schermerhorn PW, Scheutz M, MacDorman KF (2011) A mismatch in the human realism of face and voice produces an uncanny valley. i-Perception 2(1):10–12

13. Goetz J, Kiesler S, Powers A (2003) Matching robot appearance and behavior to tasks to improve human-robot cooperation. In: Proceedings of the 12th IEEE international workshop on robot and human interactive communication (ROMAN 2003). IEEE Press, New York, pp 55–60

14. Minato T, Shimada M, Itakura S, Lee K, Ishiguro H (2006) Evaluating the human likeness of an android by comparing gaze behaviors elicited by the android and a person. Adv Robot 20(10):1147–1163

15. Guizzo E (2010) Who’s afraid of the uncanny valley? IEEE Spectrum blog, April 2010

16. Fitch W, Huber L, Bugnyar T (2010) Social cognition and the evolution of language: constructing cognitive phylogenies. Neuron 65(6):795–814

17. Fong T, Nourbakhsh I, Dautenhahn K (2003) A survey of socially interactive robots. Robot Auton Syst 42:143–166

18. Coeckelbergh M (2010) Humans, animals, and robots: a phenomenological approach to human-robot relations. Int J Soc Robot 3:197–204

19. Kidd CD (2003) Sociable robots: the role of presence and task in human-robot interaction. PhD thesis

20. Schroeder M (2009) Expressive speech synthesis: past, present, and possible futures. In: Affective information processing. Springer, London, pp 111–126

21. Zen H, Tokuda K, Black AW (2009) Statistical parametric speech synthesis. Speech Commun 51(11):1039–1064

22. Purves D, Augustine GJ, Fitzpatrick D, Hall WC, LaMantia A, McNamara JO, Mooney RD, Platt ML, Simon SA, White LE, Williams SM (2008) Neuroscience, 4th edn. Sinauer Associates, Sunderland

23. Cabibihan JJ (2009) Design of prosthetic skins with humanlike softness. In: 13th international conference on biomedical engineering, vol 23. Springer, Berlin, pp 2023–2026

24. Libera FD, Minato T, Fasel I, Ishiguro H, Menegatti E, Pagello E (2007) Teaching by touching: an intuitive method for development of humanoid robot motions. In: 7th IEEE-RAS international conference on humanoid robots, pp 352–359

25. Voyles RM, Khosla PK (1995) Tactile gestures for human-robot interaction. In: IEEE/RSJ international conference on intelligent robots and systems

26. Johansson G (1973) Visual perception of biological motion and a model for its analysis. Atten Percept Psychophys 14(2):201–211

27. Chaminade T, Franklin DW, Oztop E, Cheng G (2005) Motor interference between humans and humanoid robots: effect of biological and artificial motion. In: Proceedings of the 4th international conference on development and learning, July, pp 96–101

28. Matsui D, Minato T, MacDorman KF, Ishiguro H (2005) Generating natural motion in an android by mapping human motion. In: IEEE/RSJ international conference on intelligent robots and systems, August, pp 3301–3308

29. Kamide H, Yasumoto M, Mae Y, Takubo T, Ohara K, Arai T (2011) Comparative evaluation of virtual and real humanoid with robot-oriented psychology scale. In: IEEE international conference on robotics and automation (ICRA), May, pp 599–604

30. Ono T, Imai M, Ishiguro H (2001) A model of embodied communications with gestures between humans and robots. In: Proceedings of the 23rd annual meeting of the cognitive science society, pp 732–737

31. Sidner CL, Kidd CD, Lee C, Lesh N (2004) Where to look: a study of human-robot engagement. In: Proceedings of the 9th international conference on intelligent user interfaces. ACM, New York, pp 78–84

32. Sidner CL, Lee C, Kidd CD, Lesh N, Rich C (2005) Explorations in engagement for humans and robots. Artif Intell 166(1–2):140–164

33. Kanda T, Ishiguro H, Imai M, Ono T (2004) Development and evaluation of interactive humanoid robots. Proc IEEE 92(11):1839–1850

34. Ritter H, Steil JJ, Sagerer G (2010) Mit Kopf, Körper und Hand: Herausforderungen humanoider Roboter. at-Automatisierungstechnik 58(11):630–638 (in German)

35. Breazeal CL (2004) Designing sociable robots. MIT Press, Cambridge

36. Wang D, Narayanan S (2007) An acoustic measure for word prominence in spontaneous speech. IEEE Trans Audio Speech Lang Process 15(2):690–701

37. Breazeal C, Fitzpatrick P (2000) That certain look: social amplification of animate vision. In: Proceedings of the AAAI fall symposium on society of intelligence agents—the human in the loop

38. Breazeal C (2004) Social interactions in HRI: the robot view. IEEE Trans Syst Man Cybern, Part C, Appl Rev 34(2):181–186

39. Bruce A, Nourbakhsh I, Simmons R (2002) The role of expressiveness and attention in human-robot interaction. In: Proceedings of the IEEE international conference on robotics and automation (ICRA’02), vol 4. IEEE Press, New York, pp 4138–4142

40. Dautenhahn K, Billard A (1999) Bringing up robots or—the psychology of socially intelligent robots: from theory to implementation. In: Proceedings of the third annual conference on autonomous agents. ACM, New York, pp 366–367

41. Asada M, MacDorman KF, Ishiguro H, Kuniyoshi Y (2001) Cognitive developmental robotics as a new paradigm for the design of humanoid robots. Robot Auton Syst 37(2):185–193

42. Scassellati B (2002) Theory of mind for a humanoid robot. Auton Robots 12(1):13–24

43. Borenstein S (2006) Scientists try to make robots more human. USA Today, November 2006

44. Johnston JM, Pennypacker HS (1993) Strategies and tactics of behavioral research. Erlbaum, Hillsdale

45. Mataric MJ (1998) Behavior-based robotics as a tool for synthesis of artificial behavior and analysis of natural behavior. Trends Cogn Sci 2(3):82–86

46. Teodoro PDD (2007) Humanoid robot development of a simulation environment of an entertainment humanoid robot. Instituto Superior Tecnico, Universidade Tecnica de Lisboa, Lisboa, pp 1–15

47. Kakebeeke TH, von Siebenthal K, Largo RH (1997) Differences in movement quality at term among preterm and term infants. Biol Neonate 71:367–378

48. Largo RH, Kakebeeke TH (1994) Fine manipulative abilities in the first years of life. In: Motor development in children, p 33

49. Buechel M (2008) Patient-cooperative control strategy for 3D point-to-point movements applied to ARMin III. Semester thesis, Sensory-Motor Systems Lab, ETH Zurich, September

50. de Vet HCW, Terwee CB, Knol DL, Bouter LM (2006) When to use agreement versus reliability measures. J Clin Epidemiol 59(10):1033–1039

51. Forssberg H (1985) Ontogeny of human locomotor control. I. Infant stepping, supported locomotion and transition to independent locomotion. Exp Brain Res 57:480–493

52. Schaal S (1999) Is imitation learning the route to humanoid robots? Trends Cogn Sci 3(6):233–242

53. Mataric MJ (2002) Sensory-motor primitives as a basis for imitation: linking perception to action and biology to robotics. In: Imitation in animals and artifacts. MIT Press, Cambridge, pp 391–422

54. Drumwright E, Jenkins OC, Mataric MJ (2004) Exemplar-based primitives for humanoid movement classification and control. In: Proceedings of the IEEE international conference on robotics and automation (ICRA’04), vol 1. IEEE Press, New York, pp 140–145

55. Hertzmann A, O’Sullivan C, Perlin K (2009) Realistic human body movement for emotional expressiveness. In: ACM SIGGRAPH 2009 courses (SIGGRAPH ’09). ACM, New York, pp 20:1–20:27

56. Agur AMR, Lee MJ (1999) Grant’s atlas of anatomy, 10th edn. Williams & Wilkins, Baltimore

57. Winter DA (1984) Kinematic and kinetic patterns in human gait: variability and compensating effects. Hum Mov Sci 3(1–2):51–76

58. Plamondon R (1991) On the origin of asymmetric bell-shaped velocity profiles in rapid aimed movements. Tutor Motor Neurosci 62:283–295

59. Yamanaka K, Wada Y, Kawato M (2002) Quantitative examinations for human arm trajectory planning in three-dimensional space. IEICE Trans Inf Syst 85(3):602–603

60. Kakebeeke TH (2010) Motorische Entwicklung I–VI. In: Motor control and learning 2, lecture SS10. ETH Zurich, Zurich (in German)


Acknowledgements

The authors would like to thank all those who participated in the survey; Katharina Vogt for her assistance with the online questionnaire; Frank Bodin and Johannes Hedinger for initiating and advising on the project; and Daniel Kiper for his support of the project.

Author information

Correspondence to Joachim von Zitzewitz.

Additional information

First two authors contributed equally to this work.

Appendix: Movement Parameters

Basic Movements: Basic Movements are single movements which cannot be further separated. Complex human movements are built by combining basic movements. This assumption is broadly supported, for example in movement science [51], and in work using movement primitives as a foundation for imitation learning [52–54].

Associated Movements: The parameter Associated Movements is used for assessing human movements [48]. Association of movements exploits joint synergies or physical effects to increase the efficiency of the overall movement.

Fluency: Fluency is a typical parameter used to assess human movement [47, 48]. It is defined as the smoothness of a movement [47].
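The paper defines fluency only qualitatively. A common quantitative proxy from the motor-control literature (an assumption here, not a metric used by the authors) is integrated squared jerk, where jerk is the third time derivative of position; smoother movements yield lower values. A minimal numpy sketch:

```python
import numpy as np

def integrated_squared_jerk(positions, dt):
    """Integrated squared jerk of a uniformly sampled 1-D trajectory.

    Jerk (the third derivative of position) is approximated with
    third-order finite differences; lower values indicate a smoother,
    i.e. more fluent, movement.
    """
    jerk = np.diff(positions, n=3) / dt**3
    return float(np.sum(jerk**2) * dt)
```

A constant-velocity trajectory has (numerically near-) zero jerk, while the same trajectory corrupted by sensor noise scores far higher, which matches the intuition that jitter reads as non-fluent.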

Stiffness: Humans are able to modify the stiffness of a body part by simultaneously contracting the agonist and antagonist muscles [55].

Range of Motion: Range of Motion concerns the realism of movement constraints. It is a classic movement parameter defined by joint limits and torque limits [49, 55].

Complexity: Most human movements can be achieved involving different numbers of degrees of freedom (DoF). The complexity of a movement increases with the number of DoFs involved; the upper limit is the maximum number of DoFs available for the movement. For example, the maximum complexity achievable by the human hand is 27 DoFs [56].

Spatiotemporal Variability: Spatiotemporal Variability is another typical parameter of human movement [47, 57]. Kakebeeke [47] defined it as the “variation in displacement, speed and rotation.”

Velocity Profile: Velocity is a predominant quantitative parameter of human movement [48]. The velocity profiles of typical human movements are approximately bell-shaped [49, 58, 59]. To resemble human movements, robot movements should exhibit the same velocity profile.
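The bell shape can be approximated with the classic minimum-jerk profile, a standard model from the motor-control literature (an assumption for illustration; the paper does not commit to this particular model). A robot controller could sample it to generate human-like point-to-point speeds:

```python
def minimum_jerk_velocity(t, duration, distance):
    """Velocity at time t of a minimum-jerk point-to-point movement.

    Produces the symmetric, bell-shaped profile typical of human
    reaching: zero velocity at both ends, peak at mid-movement.
    """
    tau = t / duration  # normalized time in [0, 1]
    return distance / duration * (30 * tau**2 - 60 * tau**3 + 30 * tau**4)
```

The peak speed is 1.875 × distance/duration, reached at t = duration/2, and the profile is symmetric about that point.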

Physiological Correctness: An illustrative example of Physiological Correctness is the unfamiliar movements of people who are unable to use certain joints or muscles. The problem of low Physiological Correctness is also known in computer animation [55].

Precision: Humans can achieve very high precision in fine movements, and they can maintain precision even when a movement is fast or performed under high load. How capable is the robot in this respect?

Efficiency: Efficiency is a parameter used in the assessment of human movements [48, 60]. It is defined as the “minimal motor activity for a task” [60].

Appropriateness: Appropriateness describes whether the observed movement seems appropriate from a human’s perspective or not. To give an example, imagine a man strolling through the city on a sunny day. A high value of Appropriateness would have the man comfortably walking down the avenue.

Situatedness: Situatedness relates the movement to the situation and the environment. A high value of Situatedness means that the movement is very well adapted to the current situation [48]. A situated movement is one that is efficient and safe in the given situation [55].

Cite this article

von Zitzewitz, J., Boesch, P.M., Wolf, P. et al. Quantifying the Human Likeness of a Humanoid Robot. Int J of Soc Robotics 5, 263–276 (2013). https://doi.org/10.1007/s12369-012-0177-4
