Towards Predicting Future Transfer of Learning

  • Ryan S. J. d. Baker
  • Sujith M. Gowda
  • Albert T. Corbett
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6738)

Abstract

We present an automated detector that can predict a student’s future performance on a transfer post-test — a post-test involving skills related to, but different from, the skills studied in the tutoring system — within an Intelligent Tutoring System for College Genetics. We show that this detector predicts transfer better than Bayesian Knowledge Tracing, a measure of student learning in intelligent tutors that has been shown to predict performance on paper post-tests of the same skills studied in the intelligent tutor. We also find that this detector needs only a limited amount of student data (the first 20% of a student’s data from a tutor lesson) in order to reach near-asymptotic predictive power.
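The baseline the detector is compared against, Bayesian Knowledge Tracing (Corbett & Anderson, 1995), estimates the probability that a student has learned a skill from a sequence of correct/incorrect responses. A minimal sketch of the standard BKT update is below; the parameter values are illustrative defaults, not the ones fitted in this paper.

```python
# Sketch of standard Bayesian Knowledge Tracing (Corbett & Anderson, 1995).
# Parameters p_transit, p_slip, p_guess are illustrative, not fitted values.

def bkt_update(p_learned, correct, p_transit=0.1, p_slip=0.1, p_guess=0.2):
    """One BKT step: condition P(L) on the observed response, then
    apply the chance of learning at this practice opportunity."""
    if correct:
        num = p_learned * (1 - p_slip)
        denom = num + (1 - p_learned) * p_guess
    else:
        num = p_learned * p_slip
        denom = num + (1 - p_learned) * (1 - p_guess)
    p_posterior = num / denom
    # Transition from unlearned to learned with probability p_transit.
    return p_posterior + (1 - p_posterior) * p_transit

def predict_correct(p_learned, p_slip=0.1, p_guess=0.2):
    """P(correct) on the next opportunity given the current P(L)."""
    return p_learned * (1 - p_slip) + (1 - p_learned) * p_guess

# Example: start from an illustrative prior P(L0) and trace a response sequence.
p = 0.3
for obs in [True, True, False, True]:
    p = bkt_update(p, obs)
```

BKT’s predictions have been validated against paper post-tests of the same skills; the contribution here is predicting transfer to related but different skills, which BKT does not directly model.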

Keywords

Transfer · Bayesian Knowledge Tracing · Educational Data Mining · Student Modeling · Robust Learning


References

  1. Aleven, V., Koedinger, K.R.: An effective metacognitive strategy: learning by doing and explaining with a computer-based Cognitive Tutor. Cognitive Science 26, 147–179 (2002)
  2. Aleven, V., McLaren, B., Roll, I., Koedinger, K.: Toward meta-cognitive tutoring: A model of help seeking with a Cognitive Tutor. International Journal of Artificial Intelligence and Education 16, 101–128 (2006)
  3. Baker, R.S.J.d., Corbett, A.T., Gowda, S.M., Wagner, A.Z., MacLaren, B.M., Kauffman, L.R., Mitchell, A.P., Giguere, S.: Contextual Slip and Prediction of Student Performance After Use of an Intelligent Tutor. In: Proceedings of the 18th Annual Conference on User Modeling, Adaptation, and Personalization, pp. 52–63 (2010)
  4. Baker, R.S.J.d., Corbett, A.T., Roll, I., Koedinger, K.R.: Developing a Generalizable Detector of When Students Game the System. User Modeling and User-Adapted Interaction 18(3), 287–314 (2008)
  5. Baker, R.S.J.d., Goldstein, A.B., Heffernan, N.T.: Detecting the moment of learning. In: Aleven, V., Kay, J., Mostow, J. (eds.) ITS 2010. LNCS, vol. 6094, pp. 25–34. Springer, Heidelberg (2010)
  6. Bransford, J.D., Schwartz, D.: Rethinking transfer: A simple proposal with multiple implications. Review of Research in Education 24, 61–100 (1999)
  7. Butcher, K.R.: How Diagram Interaction Supports Learning: Evidence from Think Alouds during Intelligent Tutoring. In: Goel, A.K., Jamnik, M., Narayanan, N.H. (eds.) Diagrams 2010. LNCS, vol. 6170, pp. 295–297. Springer, Heidelberg (2010)
  8. Corbett, A.T., Anderson, J.R.: Knowledge Tracing: Modeling the Acquisition of Procedural Knowledge. User Modeling and User-Adapted Interaction 4, 253–278 (1995)
  9. Corbett, A., Bhatnagar, A.: Student Modeling in the ACT Programming Tutor: Adjusting a Procedural Learning Model with Declarative Knowledge. In: User Modeling: Proceedings of the 6th International Conference, pp. 243–254 (1997)
  10. Corbett, A.T., Kauffman, L., MacLaren, B., Wagner, A., Jones, E.: A Cognitive Tutor for Genetics Problem Solving: Learning Gains and Student Modeling. Journal of Educational Computing Research 42(2), 219–239 (2010)
  11. Efron, B., Gong, G.: A leisurely look at the bootstrap, the jackknife, and cross-validation. American Statistician 37, 36–48 (1983)
  12. Gong, Y., Beck, J.E., Heffernan, N.T.: Comparing Knowledge Tracing and Performance Factor Analysis by Using Multiple Model Fitting Procedures. In: Aleven, V., Kay, J., Mostow, J. (eds.) ITS 2010. LNCS, vol. 6094, pp. 35–44. Springer, Heidelberg (2010)
  13. Koedinger, K.R., Corbett, A.T., Perfetti, C.: The Knowledge-Learning-Instruction (KLI) Framework: Toward Bridging the Science-Practice Chasm to Enhance Robust Student Learning (manuscript under review)
  14. Martin, J., VanLehn, K.: Student Assessment Using Bayesian Nets. International Journal of Human-Computer Studies 42, 575–591 (1995)
  15. Pavlik, P.I., Anderson, J.R.: Using a Model to Compute the Optimal Schedule of Practice. Journal of Experimental Psychology: Applied 14(2), 101–117 (2008)
  16. Pavlik, P.I., Cen, H., Koedinger, K.R.: Performance Factors Analysis – A New Alternative to Knowledge Tracing. In: Proceedings of the 14th International Conference on Artificial Intelligence in Education, pp. 531–540 (2009)
  17. Salden, R.J.C.M., Koedinger, K.R., Renkl, A., Aleven, V., McLaren, B.M.: Accounting for Beneficial Effects of Worked Examples in Tutored Problem Solving. Educational Psychology Review 22, 379–392 (2010)
  18. Shih, B., Koedinger, K.R., Scheines, R.: A response time model for bottom-out hints as worked examples. In: Proc. 1st Int’l Conf. on Educational Data Mining, pp. 117–126 (2008)
  19. Shute, V.J.: SMART: Student modeling approach for responsive tutoring. User Modeling and User-Adapted Interaction 5(1), 1–44 (1995)
  20. Singley, M.K., Anderson, J.R.: The Transfer of Cognitive Skill. Harvard University Press, Cambridge (1989)

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Ryan S. J. d. Baker (1)
  • Sujith M. Gowda (1)
  • Albert T. Corbett (2)
  1. Department of Social Science and Policy Studies, Worcester Polytechnic Institute, Worcester, USA
  2. Human-Computer Interaction Institute, Carnegie Mellon University, Pittsburgh, USA