My Tutor is an AI: The Effects of Involvement and Tutor Type on Perceived Quality, Perceived Credibility, and Use Intention

  • Conference paper
  • First Online:
Artificial Intelligence in HCI (HCII 2022)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 13336)

Abstract

With the advancement of AI technology, AI tutors are already being used in classrooms around the world as teaching aides, tutors, and peer learning specialists. AI tutors can facilitate a variety of teaching and learning practices within and outside the classroom and support students 24/7. However, little is known about whether AI tutors can be as effective as human tutors in language learning, and about what factors drive differences in student learning outcomes between human and AI tutors, especially for students who are not engaged in learning. The current study addressed these questions by examining the combination of two factors, involvement (high vs. low) and tutor type (human tutor vs. AI tutor), under two writing quality conditions (weak and high). The findings indicate an interaction effect between user involvement and tutor type: when user involvement was low, the AI tutor was perceived to have higher writing quality than the human tutor; when user involvement was high, tutor type did not affect perceived writing quality. The human tutor was preferred because it was perceived to have a higher level of controllability than the AI tutor. Tutors' writing quality also affected their perceived credibility.
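To make the reported interaction concrete, the sketch below shows how an interaction effect in a 2 (involvement: low vs. high) × 2 (tutor type: human vs. AI) design could be tested with a two-way ANOVA. This is an illustrative sketch only, not the authors' analysis; the use of pandas/statsmodels, the variable names, and all rating values are hypothetical.

# Minimal sketch of a 2x2 factorial ANOVA on perceived writing quality.
# All data, variable names, and the analysis choice are hypothetical, not from the paper.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Hypothetical ratings of perceived writing quality (e.g., on a 7-point scale)
data = pd.DataFrame({
    "involvement": ["low"] * 4 + ["high"] * 4,
    "tutor_type":  ["human", "human", "ai", "ai"] * 2,
    "quality":     [4.1, 4.3, 5.6, 5.4,   # low involvement: AI rated higher
                    5.2, 5.0, 5.1, 5.3],  # high involvement: little difference
})

# The C(involvement):C(tutor_type) row of the ANOVA table is the interaction effect
model = ols("quality ~ C(involvement) * C(tutor_type)", data=data).fit()
print(sm.stats.anova_lm(model, typ=2))

A significant interaction term would correspond to the pattern described in the abstract: the effect of tutor type on perceived quality depends on the level of user involvement.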

Author information

Corresponding author

Correspondence to Mo Chen.

Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Chen, M., Liu, F., Lee, Y.H. (2022). My Tutor is an AI: The Effects of Involvement and Tutor Type on Perceived Quality, Perceived Credibility, and Use Intention. In: Degen, H., Ntoa, S. (eds) Artificial Intelligence in HCI. HCII 2022. Lecture Notes in Computer Science (LNAI), vol 13336. Springer, Cham. https://doi.org/10.1007/978-3-031-05643-7_15

  • DOI: https://doi.org/10.1007/978-3-031-05643-7_15

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-05642-0

  • Online ISBN: 978-3-031-05643-7

  • eBook Packages: Computer Science, Computer Science (R0)
