Abstract
Most intelligent tutoring systems (ITSs) provide a means of assisting the student, either on student request or when the tutor determines assistance would be effective. Presumably, ITS designers include such assistance because they believe it benefits students. However, whether, and how, help actually helps students has not been well studied in the ITS community. In this paper we present three approaches for evaluating the efficacy of the Reading Tutor's help: creating experimental trials from data, learning decomposition, and Bayesian Evaluation and Assessment, an approach that uses dynamic Bayesian networks. We find that experimental trials and learning decomposition both estimate a negative benefit for help; that is, help hurts! However, the Bayesian Evaluation and Assessment framework finds that help both promotes long-term student learning and provides additional scaffolding on the current problem. We discuss why these approaches give divergent results, and argue that the Bayesian Evaluation and Assessment framework is the strongest of the three. In addition to introducing Bayesian Evaluation and Assessment, a method for simultaneously assessing students and evaluating tutorial interventions, this paper describes how help can both scaffold the current problem attempt and teach the student knowledge that transfers to later problems.
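To make the second approach concrete: learning decomposition fits an exponential learning curve in which different kinds of practice opportunities receive different weights, so the fitted weight estimates how much a help-assisted encounter is worth relative to an unassisted one. The sketch below is illustrative only, not the paper's actual model or data: all values are synthetic, the functional form is the standard exponential curve, and the fit is a simple grid search over the weight parameter.

```python
import math

# Learning decomposition (illustrative sketch, synthetic data):
# model error rate as an exponential learning curve, weighting
# help-assisted practice separately from plain practice:
#   error = A * exp(-b * (n_plain + beta * n_helped))
# beta < 1 would suggest a help-assisted encounter teaches less than a
# plain one; beta > 1 would suggest it teaches more.

A_TRUE, B_TRUE, BETA_TRUE = 0.9, 0.1, 0.5  # hypothetical ground truth

def error_rate(n_plain, n_helped, A, b, beta):
    return A * math.exp(-b * (n_plain + beta * n_helped))

# Synthetic practice histories: (plain encounters, help-assisted encounters)
histories = [(p, h) for p in range(10) for h in range(10)]
observed = [error_rate(p, h, A_TRUE, B_TRUE, BETA_TRUE) for p, h in histories]

def sse(beta):
    # Squared error of a candidate beta, holding A and b fixed for simplicity.
    return sum((error_rate(p, h, A_TRUE, B_TRUE, beta) - y) ** 2
               for (p, h), y in zip(histories, observed))

# Grid search over beta in [0, 2] for the best-fitting weight.
best_beta = min((i / 100 for i in range(201)), key=sse)
print(f"estimated beta = {best_beta:.2f}")
```

On this noiseless synthetic data the search recovers the generating weight of 0.5; in practice all three parameters would be fit jointly (e.g. by nonlinear least squares) on real per-student encounter data.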
© 2008 Springer-Verlag Berlin Heidelberg
Beck, J.E., Chang, Km., Mostow, J., Corbett, A. (2008). Does Help Help? Introducing the Bayesian Evaluation and Assessment Methodology. In: Woolf, B.P., Aïmeur, E., Nkambou, R., Lajoie, S. (eds) Intelligent Tutoring Systems. ITS 2008. Lecture Notes in Computer Science, vol 5091. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-69132-7_42