Influence of Agent Behaviour on Human-Virtual Agent Body Interaction

  • Igor Stanković
  • Branislav Popović
  • Florian Focone
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8773)


This paper describes the influence of different types of agent behaviour in a social human-virtual agent gesture interaction experiment. The interaction was presented to participants as a game whose goal was to imitate the agent's slow upper-body movements, and in which both the participant and the agent could propose new subtle movements. As we were interested only in body movements, a simple virtual agent was built and displayed at a local exhibition, and visitors were asked one by one to play the game. During the interaction, the agent's behaviour varied from subject to subject, and we observed the participants' responses. Interesting observations emerged from the experiment: even a small variation in the agent's behaviour and synchronization appears to give participants a significantly different feel of the game.


Keywords: Human-virtual agent interaction · gestural body interaction · behaviour · experiment





Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Igor Stanković (1)
  • Branislav Popović (2)
  • Florian Focone (3)
  1. Lab-STICC, ENIB, UEB, France
  2. Faculty of Technical Sciences, University of Novi Sad, Serbia
  3. LIMSI-CNRS, Orsay, France
