
Adopting the Intentional Stance Towards Humanoid Robots

  • Jairo Perez-Osorio
  • Agnieszka Wykowska
Chapter
Part of the Springer Tracts in Advanced Robotics book series (STAR, volume 130)

Abstract

In our day-to-day lives, we need to predict and understand others' behavior in order to navigate our social environment efficiently. When making predictions about what others are going to do next, we refer to their mental states, such as beliefs or intentions. At the dawn of a new era, in which robots will be among us in our homes and offices, one needs to ask whether (or when) we also predict and explain robots' behavior with reference to mental states. In other words, do we adopt the intentional stance (Dennett, The Intentional Stance. MIT Press, Cambridge, 1987 [1]) towards artificial agents as well, especially those with a humanoid shape and human-like behavior? What plays a role in adopting the intentional stance towards robots? Does adopting the intentional stance affect our social attunement with artificial agents? In this chapter, we first discuss the general approach that we take to examining these questions: using objective methods of cognitive neuroscience to test social attunement as a function of adopting the intentional stance. We then describe our newly developed method for examining whether participants adopt the intentional stance towards an artificial agent. The chapter concludes with an outlook on questions that still need to be addressed, such as the ethical consequences and societal impact of robots with which we attune socially and towards which we adopt the intentional stance.

Keywords

Intentional stance · Social robotics · Human-robot interaction · Mental states

Notes

Acknowledgements

Work on this chapter, and the research agenda described in the section "Future directions", has been supported by the European Research Council under the European Union's Horizon 2020 research and innovation programme (ERC Starting Grant "InStance: Intentional Stance for Social Attunement", grant agreement No. 715058, awarded to AW).

References

  1. Dennett, D.C.: The Intentional Stance. MIT Press, Cambridge (1987)
  2. Dennett, D.C.: Intentional systems. J. Philos. 68 (1971)
  3. Dennett, D.: Intentional systems theory. In: The Oxford Handbook of Philosophy of Mind (2009). https://doi.org/10.1093/oxfordhb/9780199262618.003.0020
  4. Gray, H.M., Gray, K., Wegner, D.M.: Dimensions of mind perception. Science 315(5812), 619 (2007)
  5. Heider, F., Simmel, M.: An experimental study of apparent behavior. Am. J. Psychol. 57, 243–259 (1944)
  6. Epley, N., Waytz, A., Cacioppo, J.T.: On seeing human: a three-factor theory of anthropomorphism. Psychol. Rev. 114(4), 864–886 (2007). https://doi.org/10.1037/0033-295x.114.4.864
  7. Mullin, M.H., Mitchell, R.W., Thompson, N.S., Miles, H.L.: Anthropomorphism, anecdotes, and animals. In: Current Anthropology (1997)
  8. Waytz, A., Epley, N., Cacioppo, J.T.: Social cognition unbound: insights into anthropomorphism and dehumanization. Curr. Dir. Psychol. Sci. (2010). https://doi.org/10.1177/0963721409359302
  9. Wiese, E., Metta, G., Wykowska, A.: Robots as intentional agents: using neuroscientific methods to make robots appear more social. Front. Psychol. (2017). https://doi.org/10.3389/fpsyg.2017.01663
  10. Castelli, F., Happé, F., Frith, U., Frith, C.: Movement and mind: a functional imaging study of perception and interpretation of complex intentional movement patterns. NeuroImage 12, 314–325 (2000)
  11. Iacoboni, M., et al.: Watching social interactions produces dorsomedial prefrontal and medial parietal BOLD fMRI signal increases compared to a resting baseline. NeuroImage (2004). https://doi.org/10.1016/j.neuroimage.2003.11.013
  12. Wimmer, H., Perner, J.: Beliefs about beliefs: representation and constraining function of wrong beliefs in young children's understanding of deception. Cognition (1983). https://doi.org/10.1016/0010-0277(83)90004-5
  13. Griffin, R., Baron-Cohen, S.: The intentional stance: developmental and neurocognitive perspectives. In: Daniel Dennett (2002)
  14. Apperly, I.: Mindreaders: The Cognitive Basis of 'Theory of Mind'. Psychology Press, New York (2011)
  15. Woodward, A.L.: Infants selectively encode the goal object of an actor's reach. Cognition (1998). https://doi.org/10.1016/s0010-0277(98)00058-4
  16. Senju, A., Csibra, G., Johnson, M.H.: Understanding the referential nature of looking: infants' preference for object-directed gaze. Cognition 108(2), 303–319 (2008). https://doi.org/10.1016/j.cognition.2008.02.009
  17. Stern, D.N.: The Interpersonal World of the Infant: A View from Psychoanalysis and Developmental Psychology. Basic Books, New York (1998)
  18. Gergely, G., Csibra, G.: Teleological reasoning in infancy: the naïve theory of rational action. Trends Cogn. Sci. (2003). https://doi.org/10.1016/s1364-6613(03)00128-1
  19. Ma, L., Lillard, A.S.: Where is the real cheese? Young children's ability to discriminate between real and pretend acts. Child Dev. (2006). https://doi.org/10.1111/j.1467-8624.2006.00972.x
  20. Behne, T., et al.: Unwilling versus unable: infants' understanding of intentional action. Dev. Psychol. (2005). https://doi.org/10.1037/0012-1649.41.2.328
  21. Repacholi, B.M., Gopnik, A.: Early reasoning about desires: evidence from 14- and 18-month-olds. Dev. Psychol. (1997). https://doi.org/10.1037/0012-1649.33.1.12
  22. Tomasello, M., et al.: Understanding and sharing intentions: the origins of cultural cognition. Behav. Brain Sci. (2005). https://doi.org/10.1017/s0140525x05000129
  23. Baldwin, D.A., et al.: Infants parse dynamic action. Child Dev. (2001). https://doi.org/10.1111/1467-8624.00310
  24. Sorce, J.F., et al.: Maternal emotional signaling: its effect on the visual cliff behavior of 1-year-olds. Dev. Psychol. (1985). https://doi.org/10.1037/0012-1649.21.1.195
  25. Feinman, S., Lewis, M.: Social referencing at ten months: a second-order effect on infants' responses to strangers. Child Dev. (1983). https://doi.org/10.1111/j.1467-8624.1983.tb00509.x
  26. Johnson, S., Slaughter, V., Carey, S.: Whose gaze will infants follow? The elicitation of gaze following in 12-month-olds. Dev. Sci. (1998). https://doi.org/10.1111/1467-7687.00036
  27. Carpenter, M., Akhtar, N., Tomasello, M.: Fourteen- through 18-month-old infants differentially imitate intentional and accidental actions. Infant Behav. Dev. (1998). https://doi.org/10.1016/s0163-6383(98)90009-1
  28. Meltzoff, A.N.: Understanding the intentions of others: re-enactment of intended acts by 18-month-old children. Dev. Psychol. (1995). https://doi.org/10.1037/0012-1649.31.5.838
  29. Tomasello, M., Kruger, A.C., Ratner, H.H.: Cultural learning. Behav. Brain Sci. (1993). https://doi.org/10.1017/s0140525x0003123x
  30. Harris, P.: Pretending and planning. In: Baron-Cohen, S., Tager-Flusberg, H., Cohen, D. (eds.) Understanding Other Minds: Perspectives from Autism. Oxford University Press, Oxford (1993)
  31. Wellman, H.M., Cross, D., Watson, J.: Meta-analysis of theory-of-mind development: the truth about false belief. Child Dev. (2001). https://doi.org/10.1111/1467-8624.00304
  32. Bartsch, K., Wellman, H.M.: Children Talk About the Mind. Oxford University Press, New York (1995)
  33. Lillard, A.S.: Wanting to be it: children's understanding of intentions underlying pretense. Child Dev. 69, 981–993 (1998)
  34. Schult, C.A.: Children's understanding of the distinction between intentions and desires. Child Dev. 73, 1727–1747 (2002)
  35. Perner, J.: Understanding the Representational Mind. MIT Press, Cambridge (1991)
  36. Baird, J.A., Moses, L.J.: Do preschoolers appreciate that identical actions may be motivated by different intentions? J. Cogn. Dev. (2001). https://doi.org/10.1207/s15327647jcd0204_4
  37. Johnson, S.C.: Detecting agents. Philos. Trans. Roy. Soc. B Biol. Sci. (2003). https://doi.org/10.1098/rstb.2002.1237
  38. Mar, R.A., Macrae, C.N.: Triggering the intentional stance. In: Empathy and Fairness, pp. 111–120 (2008). https://doi.org/10.1002/9780470030585.ch9
  39. Dennett, D.C.: True believers: the intentional strategy and why it works. In: Mind Design (1997)
  40. Malle, B.F.: Attribution theories: how people make sense of behavior. Theor. Soc. Psychol. 23, 72–95 (2011)
  41. Michael, J.: The intentional stance and cultural learning: a developmental feedback loop. In: Content and Consciousness Revisited (2015). https://doi.org/10.1007/978-3-319-17374-0_9
  42. Searle, J.R.: The Construction of Social Reality. The Free Press, New York (1995)
  43. Scholl, B.J., Tremoulet, P.D.: Perceptual causality and animacy. Trends Cogn. Sci. 4(8), 299–309 (2000). https://doi.org/10.1016/s1364-6613(00)01506-0; Gilbert, M.: Walking together: a paradigmatic social phenomenon. Midwest Stud. Philos. (1990). https://doi.org/10.1111/j.1475-4975.1990.tb00202.x
  44. Frith, C., Frith, U.: How we predict what other people are going to do. Brain Res. 1079(1), 36–46 (2006)
  45. Fletcher, P.C., et al.: Other minds in the brain: a functional imaging study of "theory of mind" in story comprehension. Cognition (1995). https://doi.org/10.1016/0010-0277(95)00692-r
  46. Gallagher, H.L., Happé, F., Brunswick, N., Fletcher, P.C., Frith, U., Frith, C.D.: Reading the mind in cartoons and stories: an fMRI study of "theory of mind" in verbal and nonverbal tasks. Neuropsychologia 38, 11–21 (2000)
  47. Saxe, R., Kanwisher, N.: People thinking about thinking people: the role of the temporo-parietal junction in "theory of mind". Soc. Neurosci. Key Readings (2013). https://doi.org/10.4324/9780203496190
  48. Brunet, E., et al.: A PET investigation of the attribution of intentions with a nonverbal task. NeuroImage (2000). https://doi.org/10.1006/nimg.1999.0525
  49. Vogeley, K., et al.: Mind reading: neural mechanisms of theory of mind and self-perspective. NeuroImage (2001). https://doi.org/10.1006/nimg.2001.0789
  50. Allison, T., Puce, A., McCarthy, G.: Social perception from visual cues: role of the STS region. Trends Cogn. Sci. (2000). https://doi.org/10.1016/s1364-6613(00)01501-1
  51. Pelphrey, K.A., Morris, J.P., McCarthy, G.: Grasping the intentions of others: the perceived intentionality of an action influences activity in the superior temporal sulcus during social perception. J. Cogn. Neurosci. (2004). https://doi.org/10.1162/0898929042947900
  52. Saxe, R., et al.: A region of right posterior superior temporal sulcus responds to observed intentional actions. Neuropsychologia 42(11), 1435–1446 (2004). https://doi.org/10.1016/j.neuropsychologia.2004.04.015
  53. Gallagher, H., Jack, A., Roepstorff, A., Frith, C.: Imaging the intentional stance in a competitive game. NeuroImage 16, 814 (2002)
  54. Krach, S., et al.: Can machines think? Interaction and perspective taking with robots investigated via fMRI. PLoS ONE (2008). https://doi.org/10.1371/journal.pone.0002597
  55. Chaminade, T., Rosset, D., Da Fonseca, D., Nazarian, B., Lutcher, E., Cheng, G., Deruelle, C.: How do we think machines think? An fMRI study of alleged competition with an artificial intelligence. Front. Hum. Neurosci. 6, 103 (2012). https://doi.org/10.3389/fnhum.2012.00103
  56. Gazzola, V., et al.: The anthropomorphic brain: the mirror neuron system responds to human and robotic actions. NeuroImage (2007). https://doi.org/10.1016/j.neuroimage.2007.02.003
  57. Oberman, L.M., et al.: EEG evidence for mirror neuron activity during the observation of human and robot actions: toward an analysis of the human qualities of interactive robots. Neurocomputing (2007). https://doi.org/10.1016/j.neucom.2006.02.024
  58. Thellman, S., Silvervarg, A., Ziemke, T.: Folk-psychological interpretation of human vs. humanoid robot behavior: exploring the intentional stance toward robots. Front. Psychol. 8, 1–14 (2017). https://doi.org/10.3389/fpsyg.2017.01962
  59. Marchesi, S., et al.: Do we adopt the intentional stance towards humanoid robots? Front. Psychol. (2019). https://doi.org/10.3389/fpsyg.2019.00450
  60. Metta, G., et al.: The iCub humanoid robot: an open platform for research in embodied cognition. In: Performance Metrics for Intelligent Systems Workshop (PerMIS 2008) (2008)
  61. Chaminade, T., et al.: Brain response to a humanoid robot in areas implicated in the perception of human emotional gestures. PLoS ONE 5(7), e11577 (2010). https://doi.org/10.1371/journal.pone.0011577
  62. Fink, J.: Anthropomorphism and human likeness in the design of robots and human-robot interaction. In: Lecture Notes in Computer Science, vol. 7621 LNAI, pp. 199–208 (2012). https://doi.org/10.1007/978-3-642-34103-8_20
  63. Fong, T., Nourbakhsh, I., Dautenhahn, K.: A survey of socially interactive robots: concepts, design, and applications. Robot. Auton. Syst. (2003). https://doi.org/10.1016/s0921-8890(02)00372-x
  64. Venkatesh, V., Davis, F.D.: A theoretical extension of the technology acceptance model: four longitudinal field studies. Manag. Sci. (2000). https://doi.org/10.1287/mnsc.46.2.186.11926
  65. Duffy, B.R.: Anthropomorphism and the social robot. Robot. Auton. Syst. 42, 177–190 (2003)
  66. Goetz, J., Kiesler, S.: Cooperation with a robotic assistant. In: CHI'02 Extended Abstracts on Human Factors in Computing Systems (2002). https://doi.org/10.1145/506443.506492
  67. Axelrod, L., Hone, K.: E-motional advantage: performance and satisfaction gains with affective computing. In: Proceedings of ACM CHI 2005 Conference on Human Factors in Computing Systems (2005). https://doi.org/10.1145/1056808.1056874
  68. Riek, L.D., Rabinowitch, T.-C., Chakrabarti, B., Robinson, P.: How anthropomorphism affects empathy toward robots. In: Proceedings of the 4th ACM/IEEE International Conference on Human Robot Interaction, pp. 245–246. ACM, New York (2009)
  69. Hegel, F., et al.: Understanding social robots: a user study on anthropomorphism. In: The 17th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN 2008) (2008). https://doi.org/10.1109/roman.2008.4600728
  70. Bartneck, C., Forlizzi, J.: Shaping human-robot interaction: understanding the social aspects of intelligent robotic products. In: CHI 2004 Extended Abstracts on Human Factors in Computing Systems, pp. 1731–1732. ACM, New York (2004)
  71. Eyssel, F., Hegel, F., Horstmann, G., Wagner, C.: Anthropomorphic inferences from emotional nonverbal cues: a case study. In: 2010 IEEE RO-MAN, pp. 646–651. IEEE (2010)
  72. Gonsior, B., et al.: Improving aspects of empathy and subjective performance for HRI through mirroring facial expressions. In: Proceedings of the IEEE International Workshop on Robot and Human Interactive Communication (2011). https://doi.org/10.1109/roman.2011.6005294
  73. Fussell, S.R., Kiesler, S., Setlock, L.D., Yew, V.: How people anthropomorphize robots. In: Proceedings of the 3rd ACM/IEEE International Conference on Human Robot Interaction, pp. 145–152. ACM, New York (2008)
  74. Willemse, C., Marchesi, S., Wykowska, A.: Robot faces that follow gaze facilitate attentional engagement and increase their likeability. Front. Psychol. (2018). https://doi.org/10.3389/fpsyg.2018.00070
  75. Kompatsiari, K., et al.: The importance of mutual gaze in human-robot interaction. In: Kheddar, A., et al. (eds.) Social Robotics, pp. 443–452. Springer International Publishing, Cham (2017)
  76. Wykowska, A., et al.: Humans are well tuned to detecting agents among non-agents: examining the sensitivity of human perception to behavioral characteristics of intentional systems. Int. J. Soc. Robot. (2015). https://doi.org/10.1007/s12369-015-0299-6
  77. Wiese, E., et al.: I see what you mean: how attentional selection is shaped by ascribing intentions to others. PLoS ONE 7(9), e45391 (2012). https://doi.org/10.1371/journal.pone.0045391
  78. Wykowska, A., Wiese, E., Prosser, A., Müller, H.J.: Beliefs about the minds of others influence how we process sensory information. PLoS ONE 9(4), e94339 (2014)
  79. Cabibihan, J.J., Javed, H., Ang, M., et al.: Why robots? A survey on the roles and benefits of social robots in the therapy of children with autism. Int. J. Soc. Robot. 5, 593 (2013). https://doi.org/10.1007/s12369-013-0202-2
  80. Wykowska, A., et al.: Autistic traits and sensitivity to human-like features of robot behavior. Interact. Stud. (2015). https://doi.org/10.1075/is.16.2.09wyk
  81. Kajopoulos, J., et al.: Robot-assisted training of joint attention skills in children diagnosed with autism. In: Social Robotics: Proceedings of the 7th International Conference on Social Robotics (ICSR 2015), Paris, France (2015). https://doi.org/10.1007/978-3-319-25554-5_30
  82. Dautenhahn, K.: Socially intelligent robots: dimensions of human-robot interaction. Philos. Trans. R. Soc. Lond. B Biol. Sci. 362, 679–704 (2007)
  83. Kompatsiari, K., Pérez-Osorio, J., De Tommaso, D., Metta, G., Wykowska, A.: Neuroscientifically-grounded research for improved human-robot interaction. In: IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, pp. 3403–3408 (2018)
  84. Kompatsiari, K., Ciardo, F., Tikhanoff, V., Metta, G., Wykowska, A.: On the role of eye contact in gaze cueing. Sci. Rep. 8, 17842 (2018). https://doi.org/10.1038/s41598-018-36136-2
  85. Schellen, E., Pérez-Osorio, J., Wykowska, A.: Social cognition in human-robot interaction: putting the 'H' back in 'HRI'. In: Ivaldi, S., Pateraki, M. (eds.) ERCIM News 114, Special Theme: Human-Robot Interaction (2018)
  86. Willemse, C., Wykowska, A.: In natural interaction with embodied robots we prefer it when they follow our gaze: a gaze-contingent mobile eyetracking study. Philos. Trans. Roy. Soc. B 374, 20180036 (2019)
  87. Kompatsiari, K., Ciardo, F., Tikhanoff, V., Metta, G., Wykowska, A.: It's in the eyes: the engaging role of eye contact in HRI. Int. J. Soc. Robot. (2019). https://doi.org/10.1007/s12369-019-00565-4

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Istituto Italiano di Tecnologia (IIT), Genoa, Italy
