Direction of Attention Perception for Conversation Initiation in Virtual Environments

  • Christopher Peters
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3661)

Abstract

We consider the role of gaze and direction of attention in providing embodied agents with the capability of visually perceiving the attention of others in a virtual environment. Such a capability is important in social environments, where the directions in which others orient themselves provide the information needed to detect important social cues and serve as a basis for inferring their possible motives, desires and intentions. Our real-time model uses synthetic vision and memory to implement a perceptually-based theory of mind that considers the direction of the eyes, head, body and locomotion of others. These cues contribute to metrics describing the awareness of, and amount of interest in, the self that another is deemed to have. We apply this capability to an automated conversation initiation scenario in which an agent who encounters a potential interaction partner considers not only its own interaction goal, but also its theory of the other's goal. Our aim is to improve the plausibility of animated social interaction; it is inspired by human social behaviour, where one generally wishes to avoid the embarrassment of committing to a conversation with an unwilling participant.
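
The abstract describes, but does not formalise here, how the directions of the eyes, head, body and locomotion are combined into awareness and interest metrics. Purely as an illustration of that idea, the sketch below assumes a clamped linear falloff per cue and a weighted sum, plus a simple average over remembered samples; the names, weights and field-of-view threshold are hypothetical and are not taken from the paper.

```python
# Hypothetical sketch of per-cue attention and memory-based interest metrics.
# Weights, falloff and field-of-view value are illustrative assumptions only.

import math
from dataclasses import dataclass

@dataclass
class DirectionSample:
    """Angular deviation (radians) of each cue from the line toward the observer."""
    eyes: float
    head: float
    body: float
    locomotion: float

def attention_level(s: DirectionSample,
                    w_eyes: float = 0.5, w_head: float = 0.3,
                    w_body: float = 0.15, w_loco: float = 0.05,
                    fov: float = math.pi / 2) -> float:
    """Weighted combination of cues; 1.0 means fully oriented toward the observer."""
    def toward(angle: float) -> float:
        # 1 when the cue points straight at the observer, 0 outside the field of view.
        return max(0.0, 1.0 - abs(angle) / fov)
    return (w_eyes * toward(s.eyes) + w_head * toward(s.head)
            + w_body * toward(s.body) + w_loco * toward(s.locomotion))

def interest_level(memory: list) -> float:
    """Interest as the mean attention level over the samples retained in memory."""
    if not memory:
        return 0.0
    return sum(attention_level(s) for s in memory) / len(memory)

# Example: another agent glances over while walking past.
samples = [DirectionSample(eyes=0.1, head=0.4, body=1.2, locomotion=1.4),
           DirectionSample(eyes=0.8, head=0.9, body=1.2, locomotion=1.4)]
print(interest_level(samples))
```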

Keywords

Virtual Environment · Humanoid Robot · Attention Level · Attention Behaviour · Conversational Agent

Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • Christopher Peters
    IUT de Montreuil, Université Paris 8
