
International Journal of Social Robotics, Volume 7, Issue 5, pp 767–781

Humans are Well Tuned to Detecting Agents Among Non-agents: Examining the Sensitivity of Human Perception to Behavioral Characteristics of Intentional Systems

  • Agnieszka Wykowska
  • Jasmin Kajopoulos
  • Miguel Obando-Leitón
  • Sushil Singh Chauhan
  • John-John Cabibihan
  • Gordon Cheng

Abstract

For efficient social interactions, humans have developed means to predict and understand others’ behavior, often with reference to intentions and desires. To infer others’ intentions, however, one must assume that the other is an agent with a mind and mental states. In two experiments, this study examined whether the human perceptual system is sensitive to detecting human agents on the basis of only subtle behavioral cues. Participants observed robots that pointed interchangeably to the left or to the right with one of their two arms. The onset times of the pointing movements were either pre-programmed, human-controlled (Experiment 1), or modeled after human behavior (Experiment 2). The task was to judge whether the observed behavior was controlled by a human or by a computer program, without any information about which parameters of the behavior this judgment should be based on. Results showed that participants were able to detect human behavior above chance in both experiments. In addition, participants were asked to discriminate a letter (F/T) presented on the left or the right side of a screen. The letter was either validly cued by the robot (the robot pointed to the location where the letter appeared) or invalidly cued (the robot pointed to the location opposite to where the letter was presented). In this cueing task, target discrimination was better in the valid than in the invalid condition in Experiment 1, in which a human face was presented centrally on the screen throughout the experiment. This effect was not significant in Experiment 2, in which participants were exposed only to a robotic face. In sum, the present results show that the human perceptual system is sensitive to subtleties of human behavior. Attending to where others attend, however, is modulated not only by adopting the Intentional Stance but also by the way participants interpret the observed stimuli.
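
The cueing logic described in the abstract can be summarized in a short sketch. The snippet below is a minimal, hypothetical illustration (not the authors' analysis code) of how trials might be labeled as valid or invalid and how a validity effect on target discrimination could be computed; the trial structure, variable names, and example data are all assumptions made for illustration.

```python
# Hypothetical sketch of the spatial-cueing logic: a trial is "valid" when the
# robot points to the side where the target letter (F/T) appears, and "invalid"
# otherwise. The validity effect is the difference in discrimination accuracy
# between valid and invalid trials. All trial data below are invented.

from statistics import mean

trials = [
    {"pointed_side": "left",  "target_side": "left",  "correct": True},
    {"pointed_side": "left",  "target_side": "right", "correct": False},
    {"pointed_side": "right", "target_side": "right", "correct": True},
    {"pointed_side": "right", "target_side": "left",  "correct": True},
]

def validity(trial):
    # Valid trial: pointing direction and target location coincide.
    return "valid" if trial["pointed_side"] == trial["target_side"] else "invalid"

accuracy = {
    cond: mean(t["correct"] for t in trials if validity(t) == cond)
    for cond in ("valid", "invalid")
}
validity_effect = accuracy["valid"] - accuracy["invalid"]  # positive -> cueing benefit
print(accuracy, validity_effect)
```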

Keywords

Turing test, Agency, Human–robot interaction, Pointing, Spatial cueing

Acknowledgments

This work was supported by the German Research Foundation (Deutsche Forschungsgemeinschaft, DFG) through a grant awarded to AW (WY-122/1-1) and by a grant within the LMUexcellent scheme awarded to AW.

Copyright information

© Springer Science+Business Media Dordrecht 2015

Authors and Affiliations

  • Agnieszka Wykowska (1, 2)
  • Jasmin Kajopoulos (1, 3)
  • Miguel Obando-Leitón (4)
  • Sushil Singh Chauhan (5)
  • John-John Cabibihan (6)
  • Gordon Cheng (2)

  1. General and Experimental Psychology Unit, Department of Psychology, Ludwig-Maximilians-Universität, Munich, Germany
  2. Institute for Cognitive Systems, Technische Universität München, Munich, Germany
  3. Neuro-cognitive Psychology Master Program, Department of Psychology, Ludwig-Maximilians-Universität, Munich, Germany
  4. Graduate School of Systemic Neurosciences, Ludwig-Maximilians-Universität, Planegg-Martinsried, Germany
  5. Singapore Institute for Neurotechnology (SINAPSE), National University of Singapore, Singapore, Singapore
  6. Department of Mechanical and Industrial Engineering, Qatar University, Doha, Qatar