The rapid rise of computing power has prompted the desire to develop more social, human-like robots. Quantitatively comparing different computing systems on their ability to simulate human qualities has been a major technical challenge. A recent framework put forth by Gray et al. (Science 315(5812):619, 2007. https://doi.org/10.1126/science.1134475) shows promise as a new means for comparing robots. While the framework has been validated for assessing individual robots with different descriptions or behaviours, it has not been applied to a wider landscape of robots and machines situated amongst other characters. The present study investigated attributions of mind towards a wide range of real and fictional robots. We asked participants to rate the agency (the ability “to do”) and experience (the ability “to feel”) of 24 characters comprising humans, robots, inanimate objects, and animals. Although robots were collectively rated lower than humans on agency and experience, there was significant variation among robots, even when fictional robots were omitted. These results suggest that building robots that are perceived to feel is a fruitful avenue for future development, as people are more open to perceiving aspects of mind in a wider range of robots than previously established. Our results also indicate that age is a critical factor in people’s attributions of mind to robots, suggesting that there may be a generational shift towards greater acceptance of robots’ ability to both do and feel.
Cai Y (2006) Empathic computing. In: Ambient intelligence in everyday life. Springer, Berlin, pp 67–85
Casler K, Bickel L, Hackett E (2013) Separate but equal? A comparison of participants and data gathered via Amazon’s MTurk, social media, and face-to-face behavioral testing. Comput Hum Behav 29(6):2156–2160. https://doi.org/10.1016/j.chb.2013.05.009
Chalmers DJ (1992) Subsymbolic computation and the Chinese room. In: Dinsmore J (ed) The symbolic and connectionist paradigms: closing the gap. Lawrence Erlbaum, Hillsdale, NJ
Chomsky N, Schaff A (1997) Language and cognition. In: Johnson DM, Erneling CE (eds) The future of the cognitive revolution. Oxford University Press, pp 15–31
de Graaf MM, Malle BF (2018) People's judgments of human and robot behaviors: a robust set of behaviors and some discrepancies. In: Companion of the 2018 ACM/IEEE international conference on human–robot interaction, pp 97–98
Dennett DC (1991) Consciousness explained. Little, Brown and Co, Boston
Epley N, Waytz A, Cacioppo JT (2007) On seeing human: a three-factor theory of anthropomorphism. Psychol Rev 114(4):864–886. https://doi.org/10.1037/0033-295X.114.4.864
Eyssel F, Kuchenbrandt D, Bobinger S (2011) Effects of anticipated human–robot interaction and predictability of robot behavior on perceptions of anthropomorphism. In: HRI 2011: proceedings of the 6th ACM/IEEE international conference on human–robot interaction, pp 61–67. https://doi.org/10.1145/1957656.1957673
Gazzola V, Rizzolatti G, Wicker B, Keysers C (2007) The anthropomorphic brain: the mirror neuron system responds to human and robotic actions. Neuroimage 35(4):1674–1684. https://doi.org/10.1016/j.neuroimage.2007.02.003
Gray H, Gray K, Wegner DM (2007) Dimensions of mind perception. Science 315(5812):619. https://doi.org/10.1126/science.1134475
Gray K, Jenkins AC, Heberlein AS, Wegner DM (2011) Distortions of mind perception in psychopathology. Proc Natl Acad Sci 108(2):477–479
Gray K, Wegner DM (2012) Feeling robots and human zombies: mind perception and the uncanny valley. Cognition 125(1):125–130. https://doi.org/10.1016/j.cognition.2012.06.007
Hauser DJ, Schwarz N (2016) Attentive Turkers: MTurk participants perform better on online attention checks than do subject pool participants. Behav Res Methods 48(1):400–407. https://doi.org/10.3758/s13428-015-0578-z
Heider F, Simmel M (1944) An experimental study of apparent behavior. Am J Psychol 57(2):243–259
Kamide H, Eyssel F, Arai T (2013) Psychological anthropomorphism of robots. In: International conference on social robotics. Springer, Cham, pp 199–208. https://doi.org/10.1007/978-3-319-02675-6_20
Malle BF (2019) How many dimensions of mind perception really are there? In: Proceedings of the 41st annual meeting of the cognitive science society, pp 2268–2274. http://bit.ly/SA_MindCapacities
McCorduck P (2004) Machines who think: a personal inquiry into the history and prospects of artificial intelligence. CRC Press, Boca Raton
Miller GA (2003) The cognitive revolution: a historical perspective. Trends Cogn Sci 7(3):141–144. https://doi.org/10.1016/S1364-6613(03)00029-9
Moore JW (2016) What is the sense of agency and why does it matter? Front Psychol 7:1–9. https://doi.org/10.3389/fpsyg.2016.01272
Phillips E, Zhao X, Ullman D, Malle BF (2018) What is human-like? Decomposing robots' human-like appearance using the anthropomorphic roBOT (ABOT) database. In: Proceedings of the 2018 ACM/IEEE international conference on human–robot interaction, pp 105–113. https://doi.org/10.1145/3171221.3171268
Picard RW (2000) Affective computing. MIT Press, Cambridge, Mass
Saygin AP, Cicekli I, Akman V (2003) Turing test: 50 years later, pp 23–78. https://doi.org/10.1007/978-94-010-0105-2_2
Searle JR (1980) Minds and brains without programs. Mindwaves 3:1–19
Stafford RQ, MacDonald BA, Jayawardena C, Wegner DM, Broadbent E (2014) Does the robot have a mind? Mind perception and attitudes towards robots predict use of an eldercare robot. Int J Soc Robot 6(1):17–32. https://doi.org/10.1007/s12369-013-0186-y
Tharp M, Holtzman NS, Eadeh FR (2017) Mind perception and individual differences: a replication and extension. Basic Appl Soc Psychol 39(1):68–73
Thellman S, Silvervarg A, Ziemke T (2017) Folk-psychological interpretation of human vs. humanoid robot behavior: exploring the intentional stance toward robots. Front Psychol 8:1–14. https://doi.org/10.3389/fpsyg.2017.01962
Waytz A, Cacioppo J, Epley N (2010) Who sees human? The stability and importance of individual differences in anthropomorphism. Perspect Psychol Sci 5(3):219–232. https://doi.org/10.1177/1745691610369336
Weizenbaum J (1966) ELIZA-A computer program for the study of natural language communication between man and machine. Commun ACM 9(1):36–45. https://doi.org/10.1145/365153.365168
This work was supported by Canadian grants to Alan Kingstone from Mitacs Inc (IT16021), the Natural Sciences and Engineering Research Council (NSERC, RGPIN-2016-04319), and the Social Sciences and Humanities Research Council (SSHRC, 435-2019-0749). The authors have no financial or proprietary conflicts of interest in any material discussed in the paper. This study was approved by the ethics board of the University of British Columbia (H10-00527).
Jacobs OL, Gazzaz K, Kingstone A (2021) Mind the robot! Variation in attributions of mind to a wide set of real and fictional robots. Int J Soc Robot. https://doi.org/10.1007/s12369-021-00807-4
- Mind perception
- Artificial intelligence
- Humanoid robots