
International Journal of Social Robotics

Volume 8, Issue 2, pp 287–302

Blurring Human–Machine Distinctions: Anthropomorphic Appearance in Social Robots as a Threat to Human Distinctiveness

  • Francesco Ferrari
  • Maria Paola Paladino
  • Jolanda Jetten
Article

Abstract

The present research aims to gain better insight into the psychological barriers to the introduction of social robots in society at large. Drawing on social psychological research on intergroup distinctiveness, we suggested that concerns about this technology are related to how we define and defend our human identity. We advanced a threat-to-distinctiveness hypothesis: too much perceived similarity between social robots and humans triggers concerns about the negative impact of this technology on humans as a group, and on their identity more generally, because similarity blurs category boundaries and undermines human uniqueness. Focusing on robot appearance, we tested this hypothesis in two studies. In both studies, participants were presented with pictures of three types of robots that differed in anthropomorphic appearance, ranging from no resemblance to humans (mechanical robots), to some bodily resemblance (biped humanoids), to a perfect copy of the human body (androids). Androids raised the highest concerns about potential damage to humans, followed by humanoids and then mechanical robots. In Study 1, we further demonstrated that robot anthropomorphic appearance (and not the attribution of mind and human nature) was responsible for the perceived damage that the robot could cause. In Study 2, we gained clearer insight into the processes underlying this effect by showing that androids were also judged as most threatening to the human–robot distinction, and that this perception accounted for the higher perceived damage to humans. Implications of these findings for social robotics are discussed.
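As an illustration of the mediation logic summarized above (robot appearance → perceived threat to the human–robot distinction → perceived damage to humans), the sketch below runs a bootstrapped indirect-effect test on simulated data. This is not the authors' analysis: the sample size, effect sizes, and variable names are hypothetical, and robot appearance is simplified to an ordered 0–2 score rather than the paper's three experimental conditions.

```python
# Minimal mediation sketch on SIMULATED data (hypothetical names and effects):
# appearance -> threat to human-robot distinction (mediator) -> perceived damage.
import numpy as np

rng = np.random.default_rng(0)
n = 90                                    # hypothetical sample size
appearance = rng.integers(0, 3, n)        # 0=mechanical, 1=humanoid, 2=android
threat = 0.6 * appearance + rng.normal(0, 1, n)                  # mediator
damage = 0.5 * threat + 0.1 * appearance + rng.normal(0, 1, n)   # outcome

def ols(y, X):
    """OLS coefficients of y regressed on X (intercept added)."""
    X = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X, y, rcond=None)[0]

def indirect_effect(x, m, y):
    a = ols(m, x)[1]                          # path a: appearance -> threat
    b = ols(y, np.column_stack([m, x]))[1]    # path b: threat -> damage, controlling appearance
    return a * b

# Percentile bootstrap confidence interval for the indirect (mediated) effect
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    boot.append(indirect_effect(appearance[idx], threat[idx], damage[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {indirect_effect(appearance, threat, damage):.3f}, "
      f"95% bootstrap CI [{lo:.3f}, {hi:.3f}]")
```

If the bootstrap interval excludes zero, the mediator (perceived threat to the human–robot distinction) accounts for part of the effect of appearance on perceived damage, which is the pattern the abstract reports for Study 2.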

Keywords

Social acceptance of social robots · Threat to human distinctiveness · Uncanny valley · Robot anthropomorphic appearance · Androids

Notes

Acknowledgments

The research for this paper was financially supported by a doctoral grant awarded by the University of Trento to F. Ferrari. Portions of the data of Study 1 were analyzed for a different purpose and presented as a proceedings paper at “Evaluating Social Robots”, the 13th International Conference on Intelligent Autonomous Systems, July 18, 2014, Padova, Italy.

Authors' contributions: Francesco Ferrari, Maria Paola Paladino, and Jolanda Jetten developed the study concept. Francesco Ferrari and Maria Paola Paladino designed the studies. Francesco Ferrari prepared the experimental materials and collected and analyzed the data. Francesco Ferrari and Maria Paola Paladino drafted the manuscript. Jolanda Jetten edited and contributed to critical revisions of the manuscript. All authors read and approved the final version for submission.


Copyright information

© Springer Science+Business Media Dordrecht 2016

Authors and Affiliations

  • Francesco Ferrari 1
  • Maria Paola Paladino 1
  • Jolanda Jetten 2

  1. Department of Psychology and Cognitive Science, University of Trento, Rovereto, Italy
  2. School of Psychology, The University of Queensland, St Lucia, Australia
