Is There a Biological Basis for Success in Human Companion Interaction?

Results from a Transsituational Study
  • Dietmar Rösner
  • Dilana Hazer-Rau
  • Christin Kohrs
  • Thomas Bauer
  • Stephan Günther
  • Holger Hoffmann
  • Lin Zhang
  • André Brechmann
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9731)

Abstract

We report on a transsituational study in which a representative subsample of twenty subjects from the LAST MINUTE experiment underwent two additional, independent experiments: an fMRI study and a psychophysiological experiment with emotion induction in the VAD (Valence, Arousal, Dominance) space. A major result is that dialog success in the naturalistic human-machine dialogs of LAST MINUTE correlates both with individual differences in brain activation in reaction to delayed system responses in the fMRI study and with the classification rate for arousal in the emotion-induction experiment.

Keywords

Emotion Recognition · Anterior Insula · Skin Conductance Level · International Affective Picture System · fMRI Experiment

Acknowledgements

The presented study was performed in the framework of the Transregional Collaborative Research Centre SFB/TRR 62 "A Companion-Technology for Cognitive Technical Systems" funded by the German Research Foundation (DFG). It was also supported by a doctoral scholarship funded by the China Scholarship Council (CSC) for Lin Zhang and a Margarete von Wrangell (MvW) habilitation scholarship for Dilana Hazer-Rau. The responsibility for the content of this paper remains with the authors.


Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Dietmar Rösner (1)
  • Dilana Hazer-Rau (2)
  • Christin Kohrs (3)
  • Thomas Bauer (1)
  • Stephan Günther (1)
  • Holger Hoffmann (2)
  • Lin Zhang (2)
  • André Brechmann (3)
  1. Institut für Wissens- und Sprachverarbeitung (IWS), Otto-von-Guericke Universität, Magdeburg, Germany
  2. Medical Psychology, Ulm University, Ulm, Germany
  3. Special Lab Non-Invasive Brain Imaging, Leibniz Institute for Neurobiology, Magdeburg, Germany