
“Don’t Neglect the User!” – Identifying Types of Human-Chatbot Interactions and their Associated Characteristics

Published in Information Systems Frontiers

Abstract

Interactions with conversational agents (CAs) are becoming increasingly common in our daily lives. While research on human-CA interactions provides insights into the role of CAs, the active role of users has mostly been neglected. We addressed this void by applying a thematic analysis approach and analysed 1000 interactions between a chatbot and customers of an energy provider. Informed by the concepts of social presence and social cues, and using abductive logic, we identified six human-chatbot interaction types that differ according to salient characteristics, including direction, social presence, the social cues used by customers and the chatbot, and customer effort. We found that bi-directionality, a medium degree of social presence, and selected social cues used by the chatbot and customers are associated with desirable outcomes in which customers mostly obtain the requested information. The findings help us understand the nature of human-CA interactions in a customer service context and inform the design and evaluation of CAs.
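To make the coding dimensions concrete, the sketch below shows one way the characteristics named above (direction, social presence, social cues, customer effort and outcome) could be captured as an annotation record for a single human-chatbot interaction. It is a minimal illustrative sketch only: the class, field names and category labels are assumptions made for exposition and do not reproduce the authors' actual coding scheme.

from dataclasses import dataclass, field
from enum import Enum

# Illustrative only: the labels below are assumptions loosely based on the
# characteristics named in the abstract, not the study's actual coding scheme.

class Direction(Enum):
    UNI_DIRECTIONAL = "uni-directional"
    BI_DIRECTIONAL = "bi-directional"

class SocialPresence(Enum):
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"

@dataclass
class InteractionAnnotation:
    interaction_id: str
    direction: Direction
    social_presence: SocialPresence
    chatbot_social_cues: list[str] = field(default_factory=list)   # e.g. greeting, emoji
    customer_social_cues: list[str] = field(default_factory=list)  # e.g. politeness markers
    customer_effort: str = "medium"      # low / medium / high (hypothetical scale)
    request_fulfilled: bool = False      # did the customer obtain the requested information?

# Example record: a bi-directional exchange with a medium degree of social
# presence, the configuration the abstract associates with desirable outcomes.
example = InteractionAnnotation(
    interaction_id="case-0001",
    direction=Direction.BI_DIRECTIONAL,
    social_presence=SocialPresence.MEDIUM,
    chatbot_social_cues=["greeting", "thanking"],
    customer_social_cues=["greeting"],
    customer_effort="low",
    request_fulfilled=True,
)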


Notes

  1. Rapp et al. (2021) conducted a systematic literature review with a focus on how users interact with text-based chatbots and provided a summary of chatbot domains of application (see Table 5 in their study). In another study, Chaves and Gerosa (2021) surveyed the literature and developed a conceptual model of social characteristics for chatbots. They also juxtaposed domains of application with the desirable social characteristics of chatbots identified in the literature (see Table 2 in their study).

  2. Social cues are referred to as anthropomorphic design cues in some studies on CAs (e.g., Adam et al., 2020).

  3. Social cues are presented in bold text.

References

  • Adam, M., Wessel, M., & Benlian, A. (2020). AI-based chatbots in customer service and their effects on user compliance. Electronic Markets, 31(2), 427–445.

  • Adamopoulou, E., & Moussiades, L. (2020). Chatbots: History, technology, and applications. Machine Learning with Applications, 2, 100006.

  • Araujo, T. (2018). Living up to the chatbot hype: The influence of anthropomorphic design cues and communicative agency framing on conversational agent and company perceptions. Computers in Human Behavior, 85, 183–189.

  • Biocca, F. (1997). The cyborg’s dilemma: Progressive embodiment in virtual environments. Journal of Computer-Mediated Communication, 3(2), JCMC324.

  • Biocca, F., Harms, C., & Burgoon, J. K. (2003). Toward a more robust theory and measure of social presence: Review and suggested criteria. Presence: Teleoperators & Virtual Environments, 12(5), 456–480.

  • Bitner, M. J., Brown, S. W., & Meuter, M. L. (2000). Technology infusion in service encounters. Journal of the Academy of Marketing Science, 28(1), 138–149.

  • Brandtzaeg, P. B., & Følstad, A. (2017). Why people use chatbots. In International Conference on Internet Science (pp. 377–392).

  • Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101.

  • Brink, H. (1993). Validity and reliability in qualitative research. Curationis, 16(2), 35–38.

  • Bryan, J. (2019). Don’t miss the opportunity for cost savings offered by self-service. Gartner. Retrieved April 20, 2021, from https://www.gartner.com/smarterwithgartner/dont-miss-the-opportunity-for-cost-savings-offered-by-self-service/

  • Bulu, S. T. (2012). Place presence, social presence, co-presence, and satisfaction in virtual worlds. Computers & Education, 58(1), 154–161.

  • Car, L. T., Dhinagaran, D. A., Kyaw, B. M., Kowatsch, T., Joty, S., Theng, Y. L., & Atun, R. (2020). Conversational agents in health care: Scoping review and conceptual analysis. Journal of Medical Internet Research, 22(8), e17158.

  • Carolus, A., Schmidt, C., Schneider, F., Mayr, J., & Muench, R. (2018). Are people polite to smartphones? Lecture Notes in Computer Science, 10902, 500–511.

  • Cassell, J. (2000). More than just another pretty face: Embodied conversational interface agents. Communications of the ACM, 43(4), 70–78.

  • Chai, Y., & Liu, G. (2018). Utterance censorship of online reinforcement learning chatbot. In International Conference on Tools with Artificial Intelligence (pp. 358–362). IEEE.

  • Chaves, A. P., & Gerosa, M. A. (2021). How should my chatbot interact? A survey on social characteristics in human–chatbot interaction design. International Journal of Human-Computer Interaction, 37(8), 729–758.

  • Choi, J., Lee, H. J., & Kim, Y. C. (2011). The influence of social presence on customer intention to reuse online recommender systems: The roles of personalisation and product type. International Journal of Electronic Commerce, 16(1), 129–154.

  • Choque-Diaz, M., Armas-Aguirre, J., & Shiguihara-Juarez, P. (2018). Cognitive technology model to enhanced academic support services with chatbots. In 2018 IEEE XXV International Conference on Electronics, Electrical Engineering and Computing (INTERCON) (pp. 1–4).

  • Ciechanowski, L., Przegalinska, A., Magnuski, M., & Gloor, P. (2019). In the shades of the uncanny valley: An experimental study of human-chatbot interaction. Future Generation Computer Systems, 92, 539–548.

  • Denzin, N. K. (1970). The research act: A theoretical introduction to sociological methods. Aldine Publishing Co.

  • Diederich, S., Brendel, A. B., Lichtenberg, S., & Kolbe, L. (2019a). Design for fast request fulfillment or natural interaction? Insights from an experiment with a conversational agent. In Proceedings of the 27th European Conference on Information Systems (ECIS) (pp. 1–17).

  • Diederich, S., Janßen-Müller, M., Brendel, A., & Morana, S. (2019b). Emulating empathetic behavior in online service encounters with sentiment-adaptive responses: Insights from an experiment with a conversational agent. In The 40th International Conference on Information Systems.

  • Diederich, S., Lembcke, T., Brendel, A. B., & Kolbe, L. M. (2020). Not human after all: Exploring the impact of response failure on user perception of anthropomorphic conversational service agents. In The 28th European Conference on Information Systems (ECIS2020) (pp. 110–126).

  • Diederich, S., Lembcke, T., Brendel, A. B., & Kolbe, L. (2021). Understanding the impact that response failure has on how users perceive anthropomorphic conversational service agents: Insights from an online experiment. AIS Transactions on Human-Computer Interaction, 13(1), 82–103.

  • Dinakar, C., Aaltonen, V., Rämö, A., & Vilermo, M. (2010). Talk to me: The influence of audio quality on the perception of social presence. In Proceedings of the 24th BCS Interaction Specialist Group Conference.

  • Edwards, A., Edwards, C., & Gambino, A. (2019). The social pragmatics of communication with social robots: Effects of robot message design logic in a regulative context. International Journal of Social Robotics, 12, 945–957. https://doi.org/10.1007/s12369-019-00538-7

  • Elvir, M., Gonzalez, A. J., Walls, C., & Wilder, B. (2017). Remembering a conversation – A conversational memory architecture for embodied conversational agents. Journal of Intelligent Systems, 26(1), 1–21.

  • Feine, J., Gnewuch, U., Morana, S., & Maedche, A. (2019a). A taxonomy of social cues for conversational agents. International Journal of Human-Computer Studies, 132, 138–161.

  • Feine, J., Morana, S., & Gnewuch, U. (2019b). Measuring service encounter satisfaction with customer service chatbots using sentiment analysis. In The 14th International Conference on Wirtschaftsinformatik (pp. 1115–1129).

  • Fernández, W. D. (2004). Using the Glaserian approach in grounded studies of emerging business practices. Electronic Journal of Business Research Methods, 2(2), 83–94.

  • Følstad, A., Nordheim, C. B., & Bjørkli, C. A. (2018). What makes users trust a chatbot for customer service? An exploratory interview study. In International Conference on Internet Science (pp. 194–208).

  • Gambino, A., Fox, J., & Ratan, R. A. (2020). Building a stronger CASA: Extending the computers are social actors paradigm. Human-Machine Communication, 1, 71–85.

  • Gnewuch, U., Morana, S., Adam, M., & Maedche, A. (2018). Faster is not always better: Understanding the effect of dynamic response delays in human-chatbot interaction. In The 26th European Conference on Information Systems (ECIS2018), UK.

  • Goel, L., Johnson, N., Junglas, I., & Ives, B. (2013). Predicting users’ return to virtual worlds: A social perspective. Information Systems Journal, 23(1), 35–63.

  • Goffman, E. (1963). Behavior in public places: Notes on the social organization of gatherings. Free Press.

  • Graesser, A. C., Li, H., & Forsyth, C. (2014). Learning by communicating in natural language with conversational agents. Current Directions in Psychological Science, 23(5), 374–380.

  • Gunawardena, C. N. (1995). Social presence theory and implications for interaction and collaborative learning in computer conferences. International Journal of Educational Telecommunications, 1(2/3), 147–166.

  • Hess, T. J., Fuller, M., & Campbell, D. E. (2009). Designing interfaces with social presence: Using vividness and extraversion to create social recommendation agents. Journal of the Association for Information Systems, 10(12), 889–919.

  • Hill, J., Ford, W. R., & Farreras, I. G. (2015). Real conversations with artificial intelligence: A comparison between human–human online conversations and human–chatbot conversations. Computers in Human Behavior, 49, 245–250.

  • Hurwitz, J. S., Kaufman, M., & Bowles, A. (2015). Cognitive computing and big data analytics. Wiley.

  • Janssen, A., Passlick, J., Cardona, D. R., & Breitner, M. H. (2020). Virtual assistance in any context: A taxonomy of design elements for domain-specific chatbots. Business and Information Systems Engineering, 62(3), 211–225.

  • Larivière, B., Bowen, D., Andreassen, T. W., Kunz, W., Sirianni, N. J., Voss, C., Wunderlich, N. V., & De Keyser, A. (2017). “Service encounter 2.0”: An investigation into the roles of technology, employees and customers. Journal of Business Research, 79, 238–246.

  • Lee, S. A., & Liang, Y. (2019). Robotic foot-in-the-door: Using sequential-request persuasive strategies in human-robot interaction. Computers in Human Behavior, 90, 351–356.

  • Lee, K. M., & Nass, C. (2003). Designing social presence of social actors in human computer interaction. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 289–296).

  • Lewicki, R. J., & Litterer, R. (1995). Negotiation. Irwin.

  • Li, M., & Mao, J. (2015). Hedonic and utilitarian? Exploring the impact of communication style alignment on user’s perception of virtual health advisory services. International Journal of Information Management, 35, 229–243.

  • Liu, B., Xu, Z., Sun, C., Wang, B., Wang, Z., Wong, D. F., & Zhang, M. (2018). Content-oriented user modeling for personalized response ranking. IEEE/ACM Transactions on Audio, Speech, and Language Processing, 26(1), 122–133.

  • Meuter, M. L., Ostrom, A. L., Roundtree, R. I., & Bitner, M. J. (2000). Self-service technologies: Understanding customer satisfaction with technology-based service encounters. Journal of Marketing, 64, 50–64.

  • Michaud, L. N. (2018). Observations of a new chatbot: Drawing conclusions from early interactions with users. IT Professional, 20(5), 40–47.

  • Moore, S. (2018). Gartner says 25 percent of customer service operations will use virtual customer assistants by 2020. Gartner. Retrieved April 20, 2021, from https://www.gartner.com/en/newsroom/press-releases/2018-02-19-gartner-says-25-percent-of-customer-service-operations-will-use-virtual-customer-assistants-by-2020

  • Moore, R. J., Ducheneaut, N., & Nickell, E. (2007). Doing virtually nothing: Awareness and accountability in massively multiplayer online worlds. Computer Supported Cooperative Work, 16(3), 265–305.

  • Morana, S., Gnewuch, U., Jung, D., & Granig, C. (2020). The effect of anthropomorphism on investment decision-making with robo-advisor chatbots. In The 28th European Conference on Information Systems (ECIS2020), Marrakech, Morocco.

  • Mori, M., MacDorman, K. F., & Kageki, N. (2012). The uncanny valley [from the field]. IEEE Robotics & Automation Magazine, 19(2), 98–100.

  • Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56(1), 81–103.

  • Nass, C., & Steuer, J. (1993). Voices, boxes, and sources of messages: Computers and social actors. Human Communication Research, 19, 504–527.

  • Nowak, K. L., & Biocca, F. (2003). The effect of the agency and anthropomorphism on users’ sense of telepresence, copresence, and social presence in virtual environments. Presence: Teleoperators & Virtual Environments, 12(5), 481–494.

  • Nuruzzaman, M., & Hussain, O. K. (2018). A survey on chatbot implementation in customer service industry through deep neural networks. In 2018 IEEE 15th International Conference on E-Business Engineering (ICEBE) (pp. 54–61).

  • Oh, C. S., Bailenson, J. N., & Welch, G. F. (2018). A systematic review of social presence: Definition, antecedents, and implications. Frontiers in Robotics and AI, 5, 114.

  • Poser, M., Singh, S., & Bittner, E. A. (2021). Hybrid service recovery: Design for seamless inquiry handovers between conversational agents and human service agents. In The 54th Hawaii International Conference on System Sciences (pp. 1181–1190).

  • Radziwill, N. M., & Benton, M. C. (2017). Evaluating quality of chatbots and intelligent conversational agents. arXiv preprint arXiv:1704.04579. Retrieved April 20, 2021, from https://arxiv.org/ftp/arxiv/papers/1704/1704.04579.pdf

  • Ramesh, K., Ravishankaran, S., Joshi, A., & Chandrasekaran, K. (2017). A survey of design techniques for conversational agents. In International Conference on Information, Communication and Computing Technology (pp. 336–350).

  • Rapp, A., Curti, L., & Boldi, A. (2021). The human side of human-chatbot interaction: A systematic literature review of ten years of research on text-based chatbots. International Journal of Human-Computer Studies, 151, 1–24.

  • Rettie, R. (2005). Presence and embodiment in mobile phone communication. PsychNology Journal, 3(1), 16–34.

  • Riva, G., & Mantovani, F. (2014). Extending the self through the tools and the others: A general framework for presence and social presence in mediated interactions. In G. Riva, J. Waterworth, & D. Murray (Eds.), Interacting with presence (pp. 9–31). De Gruyter Open Poland.

  • Salomonson, N., Allwood, J., Lind, M., & Alm, H. (2013). Comparing human-to-human and human-to-AEA communication in service encounters. Journal of Business Communication, 50(1), 87–116.

  • Sangaiah, A. K., Thangavelu, A., & Sundaram, V. M. (2018). Cognitive computing for big data systems over IoT: Frameworks, tools and applications. Springer.

  • Sarker, S., Xiao, X., Beaulieu, T., & Lee, A. S. (2018). Learning from first-generation qualitative approaches in the IS discipline: An evolutionary view and some implications for authors and evaluators (PART 1/2). Journal of the Association for Information Systems, 19(8), 752–774.

  • Schuetzler, R. M., Grimes, G. M., Giboney, J. S., & Nunamaker Jr., J. F. (2018). The influence of conversational agents on socially desirable responding. In Proceedings of the 51st Hawaii International Conference on System Sciences (pp. 283–292).

  • Schultze, U., & Brooks, J. A. M. (2019). An interactional view of social presence: Making the virtual other “real”. Information Systems Journal, 29(3), 707–737.

  • Seeber, I., Waizenegger, L., Seidel, S., Morana, S., Benbasat, I., & Lowry, P. B. (2020). Collaborating with technology-based autonomous agents: Issues and research opportunities. Internet Research, 30(1), 1–18. https://doi.org/10.1108/INTR-12-2019-0503

  • Short, J., Williams, E., & Christie, B. (1976). The social psychology of telecommunications. Wiley.

  • Srinivasan, V., & Takayama, L. (2016). Help me please: Robot politeness strategies for soliciting help from people. In Proceedings of the SIGCHI ‘16 Human Factors in Computing Systems (pp. 4945–4955).

  • Valtolina, S., Barricelli, B. R., & Gaetano, S. D. (2019). Communicability of traditional interfaces VS chatbots in healthcare and smart home domains. Behaviour & Information Technology, 39(1), 108–132.

  • Van Hoek, R., Aronsson, H., Kovács, G., & Spens, K. M. (2005). Abductive reasoning in logistics research. International Journal of Physical Distribution & Logistics Management, 35(2), 132–144.

  • Wagner, K., Nimmermann, F., & Schramm-Klein, H. (2019). Is it human? The role of anthropomorphism as a driver for the successful acceptance of digital voice assistants. In Proceedings of the 52nd Hawaii International Conference on System Sciences (pp. 1386–1395).

  • Waizenegger, L., Seeber, I., Dawson, G., & Desouza, K. C. (2020). Conversational agents – Exploring generative mechanisms and second-hand effects of actualized technology affordances. In Proceedings of the 53rd Hawaii International Conference on System Sciences (pp. 5180–5189).

  • Walsham, G. (1995). The emergence of interpretivism in IS research. Information Systems Research, 6(4), 376–394.

  • Xu, A., Liu, Z., Guo, Y., Sinha, V., & Akkiraju, R. (2017). A new chatbot for customer service on social media. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (pp. 3506–3510).

  • Yang, L., Qiu, M., Qu, C., Guo, J., Zhang, Y., Croft, W. B., Huang, J., & Chen, H. (2018). Response ranking with deep matching networks and external knowledge in information-seeking conversation systems. In The 41st International ACM SIGIR Conference on Research & Development in Information Retrieval (pp. 245–254).

  • Zamora, J. (2017). I’m sorry, Dave, I’m afraid I can’t do that: Chatbot perception and expectations. In Proceedings of the 5th International Conference on Human Agent Interaction (pp. 253–260).

  • Zhao, S. (2003). Toward a taxonomy of copresence. Presence: Teleoperators & Virtual Environments, 12(5), 445–455.

  • Zierau, N., Wambsganss, T., Janson, A., Schöbel, S., & Leimeister, J. M. (2020). The anatomy of user experience with conversational agents: A taxonomy and propositions of service clues. In The 41st International Conference on Information Systems (pp. 1–17).

Author information

Corresponding author

Correspondence to Lena Waizenegger.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

All authors contributed equally to this work.

Appendices

Appendix 1

Table 3 Operationalisation table

Appendix 2

Table 4 Open codes and patterns

Appendix 3

Table 5 Chatbot’s social cues

Appendix 4

Table 6 Customers’ social cues

Appendix 5

Table 7 Interaction types and outcomes

About this article

Cite this article

Nguyen, T.H., Waizenegger, L. & Techatassanasoontorn, A.A. “Don’t Neglect the User!” – Identifying Types of Human-Chatbot Interactions and their Associated Characteristics. Inf Syst Front 24, 797–838 (2022). https://doi.org/10.1007/s10796-021-10212-x
