Abstract
Monitoring emotional user states can help to assess the progress of human–machine communication. When we look at specific databases, however, we face several problems: users behave differently, even within one and the same setting, and some phenomena are sparse, so it is not possible to model and classify them reliably. We exemplify these difficulties on the basis of SympaFly, a database of dialogues between users and a fully automatic spoken-dialogue telephone system for flight reservation and booking, and discuss possible remedies.
Copyright information
© 2004 Springer-Verlag Berlin Heidelberg
Cite this paper
Batliner, A., Hacker, C., Steidl, S., Nöth, E., Haas, J. (2004). From Emotion to Interaction: Lessons from Real Human-Machine-Dialogues. In: André, E., Dybkjær, L., Minker, W., Heisterkamp, P. (eds.) Affective Dialogue Systems. ADS 2004. Lecture Notes in Computer Science, vol. 3068. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-24842-2_1
DOI: https://doi.org/10.1007/978-3-540-24842-2_1
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-22143-2
Online ISBN: 978-3-540-24842-2
eBook Packages: Springer Book Archive