Generating Nonverbal Signals for a Sensitive Artificial Listener

  • Dirk Heylen
  • Anton Nijholt
  • Mannes Poel
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4775)


The Sensitive Artificial Listener project aims to design an embodied agent that not only generates the appropriate nonverbal behaviors accompanying its own speech, but also displays verbal and nonverbal behaviors while its conversational partner is speaking. Beyond the many embodied-agent applications in which natural interaction between agent and human partner requires such behavior, the results of this project are also intended to support research on emotional behavior during conversations. In this paper, we discuss our research and implementation efforts in this project and illustrate them with examples of experiments, research approaches, and interfaces under development.


Keywords: Head Movement · Nonverbal Behavior · Implementation Effort · Meaningful Event · Human Partner





Copyright information

© Springer-Verlag Berlin Heidelberg 2007

Authors and Affiliations

  • Dirk Heylen
  • Anton Nijholt
  • Mannes Poel

Human Media Interaction Group, Department of Computer Science, University of Twente, The Netherlands
