Why should an ITS bother with students' explanations?
PROBIT (PROBability Intelligent Tutor) served as a test bed for examining the potential roles of students' explanations in an ITS. PROBIT's environment was modified to enable students to explain their formal-language answers in natural language, and to give examples of pertinent erroneous answers they expected other students to make. We examined the formal-language inputs in relation to the natural-language ones, and we compared the automatic diagnosis performed by PROBIT on the basis of the formal language alone with the diagnosis we conducted manually on the basis of both the formal language and the natural-language explanations and examples. Examples from protocols are discussed, with emphasis on the interrelationships between the two goals of an ITS learning environment: providing the student with useful tools for learning, and providing the system with useful information for student modeling. Finally, we discuss the relevance of research on producing cooperative explanations to the incorporation of students' explanations.