
Mind the Robot! Variation in Attributions of Mind to a Wide Set of Real and Fictional Robots


The rapid rise of computing power has prompted the desire to develop more social, human-like robots. Quantitatively comparing different computing systems on their ability to simulate human qualities has been a major technical challenge. A recent framework put forth by Gray et al. (Science 315(5812):619, 2007) provides promise as a new means for comparing robots. While the framework has been validated for assessing individual robots with different descriptions or different behaviours, it has not been applied to a wider landscape of robots and machines situated amongst other characters. The present study sought to investigate attributions of mind towards a wide range of real and fictional robots. We asked participants to rate the agency (the ability "to do") and experience (the ability "to feel") of 24 characters made up of humans, robots, inanimate objects, and animals. Although robots were collectively rated lower than humans on agency and experience, there was significant variation among robots, even when fictional robots were omitted. The results of this investigation suggest that building robots that are perceived to feel is a fruitful avenue for future development, as people are more open to perceiving aspects of mind in a wider range of robots than previously established. Our results also indicate that age is a critical factor in people's attributions of mind to robots, suggesting that there may be a generational shift towards greater acceptance of robots' ability to both do and feel.


Figs. 1–4 (see original article)


References

  1. Cai Y (2006) Empathic computing. In: Ambient intelligence in everyday life. Springer, Berlin, pp 67–85

  2. Casler K, Bickel L, Hackett E (2013) Separate but equal? A comparison of participants and data gathered via Amazon's MTurk, social media, and face-to-face behavioral testing. Comput Hum Behav 29(6):2156–2160

  3. Chalmers DJ (1992) Subsymbolic computation and the Chinese room. In: Dinsmore J (ed) The symbolic and connectionist paradigms: closing the gap. Lawrence Erlbaum, Hillsdale, NJ

  4. Chomsky N, Schaff A (1997) Language and cognition. In: Johnson DM, Erneling CE (eds) The future of the cognitive revolution. Oxford University Press, pp 15–31

  5. de Graaf MM, Malle BF (2018) People's judgments of human and robot behaviors: a robust set of behaviors and some discrepancies. In: Companion of the 2018 ACM/IEEE international conference on human–robot interaction, pp 97–98

  6. Dennett DC (1991) Consciousness explained. Little, Brown and Co, Boston

  7. Epley N, Waytz A, Cacioppo JT (2007) On seeing human: a three-factor theory of anthropomorphism. Psychol Rev 114(4):864–886

  8. Eyssel F, Kuchenbrandt D, Bobinger S (2011) Effects of anticipated human-robot interaction and predictability of robot behavior on perceptions of anthropomorphism. In: HRI 2011: proceedings of the 6th ACM/IEEE international conference on human–robot interaction, pp 61–67

  9. Gazzola V, Rizzolatti G, Wicker B, Keysers C (2007) The anthropomorphic brain: the mirror neuron system responds to human and robotic actions. Neuroimage 35(4):1674–1684

  10. Gray H, Gray K, Wegner DM (2007) Dimensions of mind perception. Science 315(5812):619

  11. Gray K, Jenkins AC, Heberlein AS, Wegner DM (2011) Distortions of mind perception in psychopathology. Proc Natl Acad Sci 108(2):477–479

  12. Gray K, Wegner DM (2012) Feeling robots and human zombies: mind perception and the uncanny valley. Cognition 125(1):125–130

  13. Hauser DJ, Schwarz N (2016) Attentive Turkers: MTurk participants perform better on online attention checks than do subject pool participants. Behav Res Methods 48(1):400–407

  14. Heider F, Simmel M (1944) An experimental study of apparent behavior. Am J Psychol 57(2):243–259

  15. Kamide H, Eyssel F, Arai T (2013) Psychological anthropomorphism of robots. In: International conference on social robotics. Springer, Cham, pp 199–208

  16. Malle BF (2019) How many dimensions of mind perception really are there? In: Proceedings of the 41st annual meeting of the cognitive science society, pp 2268–2274

  17. McCorduck P (2004) Machines who think: a personal inquiry into the history and prospects of artificial intelligence. CRC Press, Boca Raton

  18. Miller GA (2003) The cognitive revolution: a historical perspective. Trends Cogn Sci 7(3):141–144

  19. Moore JW (2016) What is the sense of agency and why does it matter? Front Psychol 7:1–9

  20. Phillips E, Zhao X, Ullman D, Malle BF (2018) What is human-like? Decomposing robots' human-like appearance using the anthropomorphic roBOT (ABOT) database. In: Proceedings of the 2018 ACM/IEEE international conference on human–robot interaction, pp 105–113

  21. Picard RW (2000) Affective computing. MIT Press, Cambridge, MA

  22. Saygin AP, Cicekli I, Akman V (2003) Turing test: 50 years later, pp 23–78

  23. Searle JR (1980) Minds and brains without programs. Mindwaves 3:1–19

  24. Stafford RQ, MacDonald BA, Jayawardena C, Wegner DM, Broadbent E (2014) Does the robot have a mind? Mind perception and attitudes towards robots predict use of an eldercare robot. Int J Soc Robot 6(1):17–32

  25. Tharp M, Holtzman NS, Eadeh FR (2017) Mind perception and individual differences: a replication and extension. Basic Appl Soc Psychol 39(1):68–73

  26. Thellman S, Silvervarg A, Ziemke T (2017) Folk-psychological interpretation of human vs. humanoid robot behavior: exploring the intentional stance toward robots. Front Psychol 8:1–14

  27. Waytz A, Cacioppo J, Epley N (2010) Who sees human? The stability and importance of individual differences in anthropomorphism. Perspect Psychol Sci 5(3):219–232

  28. Weizenbaum J (1966) ELIZA: a computer program for the study of natural language communication between man and machine. Commun ACM 9(1):36–45



This work was supported by Canadian grants to Alan Kingstone from Mitacs Inc (IT16021), the Natural Sciences and Engineering Research Council (NSERC, RGPIN-2016-04319), and the Social Sciences and Humanities Research Council (SSHRC, 435-2019-0749). The authors have no financial or proprietary conflicts of interest in any material discussed in the paper. This study was approved by the ethics board of the University of British Columbia (H10-00527).

Author information



Corresponding author

Correspondence to Oliver L. Jacobs.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary file 1 (DOCX 14 kb)



Cite this article

Jacobs, O.L., Gazzaz, K. & Kingstone, A. Mind the Robot! Variation in Attributions of Mind to a Wide Set of Real and Fictional Robots. International Journal of Social Robotics (2021).



Keywords

  • Mind perception
  • Artificial intelligence
  • Humanoid robots
  • Agency
  • Experience