Overview of the Fault and Bad Language Tolerance in Automatic User-Agent Dialogues

  • Diana Pérez-Marín
Part of the Advances in Intelligent and Soft Computing book series (AINSC, volume 156)


Conversational agents are computer systems able to interact with users in natural language. Advances in Natural Language Processing and Genetic Computation have made it possible to gather corpora of user-agent dialogues. This paper studies the level of tolerance that users show when interacting with agents, based on an analysis of the recorded dialogues. Factors identified as triggers of bad language include misunderstanding by the agent, users' high expectations, the urge to challenge the agent, cultural background, and age. Results of several experiments involving these factors are reported.


Keywords: Natural Language Processing, High Expectation, Media Equation, Pedagogical Agent, Conversational Agent





Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  1. Computer Science Faculty, Universidad Rey Juan Carlos, Móstoles, Spain
