
International Journal of Social Robotics, Volume 10, Issue 5, pp 583–593

Multiple-Robot Conversational Patterns for Concealing Incoherent Responses

  • Tsunehiro Arimoto
  • Yuichiro Yoshikawa
  • Hiroshi Ishiguro

Abstract

Conversational robots, which are used in education, therapy, and expositions, are expected to keep a user engaged in conversation. However, these robots sometimes utter comments that are irrelevant to the current topic owing to a failure in recognizing the human user's speech or intention. Such a sudden topic shift is considered to interfere with what we call the sense of conversation, the feeling that one is participating in a conversation. In this paper, to reduce the interference caused by sudden topic shifts, we propose using multiple robots in a conversation, so that even an actually irrelevant, sudden topic shift may sound as if it has some relevance shared among the participants of the ongoing conversation. To verify this, we conducted an experiment in which subjects experienced a conversation with either one or two robots and then evaluated their impressions of the conversations. The experimental results showed that the subjects who talked with two robots felt less ignored by the robots, and had less difficulty in continuing the conversation with them, than those who talked with a single robot. Further analysis considering the subjects' social skills raised the possibility of an additional effect on the perceived coherence of the robots. Finally, we discuss a new disruption-tolerant conversational system design using multiple robots based on the experimental results.

Keywords

Conversational robot · Multiple robots · Sense of conversation · Multi-party conversation

Notes

Compliance with Ethical Standards

Funding

This study was partially supported by JSPS KAKENHI Grant Numbers JP25220004 and JP24680022.

Conflict of interest

Y. Yoshikawa and H. Ishiguro serve as consultants to Vstone Co., Ltd.

Ethical Approval

This study received ethical approval from the Graduate School of Engineering Science, Osaka University.


Copyright information

© Springer Science+Business Media B.V., part of Springer Nature 2018

Authors and Affiliations

  • Tsunehiro Arimoto (1, 2)
  • Yuichiro Yoshikawa (1, 2)
  • Hiroshi Ishiguro (1, 2)
  1. Osaka University, Osaka, Japan
  2. JST ERATO, Osaka, Japan
