
Why Are Algebra Word Problems Difficult? Using Tutorial Log Files and the Power Law of Learning to Select the Best Fitting Cognitive Model

  • Ethan A. Croteau
  • Neil T. Heffernan
  • Kenneth R. Koedinger
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3220)

Abstract

Some researchers have argued that algebra word problems are difficult for students because they have difficulty comprehending English. Others have argued that because algebra is a generalization of arithmetic, and generalization is hard, it is the use of variables, per se, that causes difficulty for students. Heffernan and Koedinger [9][10] presented evidence against both of these hypotheses. In this paper we show how tutorial log files from an intelligent tutoring system can be used to help answer such questions. We take advantage of the Power Law of Learning, which predicts that error rates should fit a power function, to search for the best-fitting mathematical model of whether a student will get a question correct. We decompose the question “Why are algebra word problems difficult?” into two parts. First, is there evidence for the existence of the articulation skill that Heffernan and Koedinger argued for? Second, is there evidence for the existence of a “composed articulation” skill as the best way to model the “composition effect” that Heffernan and Koedinger discovered?
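For context, the Power Law of Learning states that error rates decline as a power function of the number of practice opportunities, roughly error(n) = a · n^(−b). The sketch below is not taken from the paper; the data values, function names, and variable names are hypothetical. It only illustrates, under those assumptions, how such a curve might be fit to error rates aggregated from tutor log files, so that competing skill models can be compared by how well their predicted learning curves fit the observed data.

    # Minimal sketch (hypothetical data, not the authors' code): fitting the
    # Power Law of Learning to error rates aggregated from tutor log files.
    import numpy as np
    from scipy.optimize import curve_fit

    def power_law(opportunity, a, b):
        # Predicted error rate at the n-th practice opportunity: a * n^(-b)
        return a * np.power(opportunity, -b)

    # Hypothetical aggregated log data: mean error rate per practice opportunity
    opportunities = np.arange(1, 11)
    error_rates = np.array([0.62, 0.48, 0.41, 0.36, 0.33,
                            0.30, 0.29, 0.27, 0.26, 0.25])

    (a, b), _ = curve_fit(power_law, opportunities, error_rates, p0=[0.6, 0.5])

    # A fit statistic such as the residual sum of squares lets alternative
    # skill decompositions (e.g. with or without a separate "articulation"
    # skill) be compared by how well each one's learning curves fit the data.
    rss = np.sum((error_rates - power_law(opportunities, a, b)) ** 2)
    print(f"intercept a = {a:.3f}, learning rate b = {b:.3f}, RSS = {rss:.4f}")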

Keywords

Item Response Theory · Question Type · Composition Effect · Learning Parameter · Knowledge Component


References

  1. Anderson, J.R., Lebiere, C.: The Atomic Components of Thought. Lawrence Erlbaum Associates, Mahwah (1998)
  2. Baker, R.S., Corbett, A.T., Koedinger, K.R.: Statistical Techniques for Comparing ACT-R Models of Cognitive Performance. Presented at the 10th Annual ACT-R Workshop (2003)
  3. Corbett, A.T., Anderson, J.R.: Knowledge tracing in the ACT programming tutor. In: Proceedings of the 14th Annual Conference of the Cognitive Science Society (1992)
  4. Corbett, A.T., Anderson, J.R., O’Brien, A.T.: Student modeling in the ACT programming tutor. In: Nichols, P., Chipman, S., Brennan, R. (eds.) Cognitively Diagnostic Assessment, ch. 2. Erlbaum, Hillsdale (1995)
  5. Draney, K.L., Pirolli, P., Wilson, M.: A measurement model for a complex cognitive skill. In: Nichols, P., Chipman, S., Brennan, R. (eds.) Cognitively Diagnostic Assessment. Erlbaum, Hillsdale (1995)
  6. Embretson, S.E., Reise, S.P.: Item Response Theory for Psychologists. Lawrence Erlbaum Associates (2000)
  7. Heffernan, N.T.: Intelligent Tutoring Systems have Forgotten the Tutor: Adding a Cognitive Model of an Experienced Human Tutor. Dissertation & Technical Report, Carnegie Mellon University, Computer Science (2001), http://www.algebratutor.org/pubs.html
  8. Heffernan, N.T.: Web-Based Evaluations Showing both Cognitive and Motivational Benefits of the Ms. Lindquist Tutor. In: 11th International Conference on Artificial Intelligence in Education, Sydney, Australia (2003)
  9. Heffernan, N.T., Koedinger, K.R.: The composition effect in symbolizing: the role of symbol production versus text comprehension. In: Proceedings of the Nineteenth Annual Conference of the Cognitive Science Society, pp. 307–312. Lawrence Erlbaum Associates, Hillsdale (1997)
  10. Heffernan, N.T., Koedinger, K.R.: A developmental model for algebra symbolization: The results of a difficulty factors assessment. In: Proceedings of the Twentieth Annual Conference of the Cognitive Science Society, pp. 484–489. Lawrence Erlbaum Associates, Hillsdale (1998)
  11. Junker, B., Koedinger, K.R., Trottini, M.: Finding improvements in student models for intelligent tutoring systems via variable selection for a linear logistic test model. Presented at the Annual North American Meeting of the Psychometric Society, Vancouver, BC, Canada (2000), http://lib.stat.cmu.edu/~brian/bjtrs.html
  12. Koedinger, K.R., Junker, B.: Learning Factors Analysis: Mining student-tutor interactions to optimize instruction. Presented at the Social Science Data Infrastructure Conference, New York University, November 12-13 (1999)
  13. Koedinger, K.R., MacLaren, B.A.: Developing a pedagogical domain theory of early algebra problem solving. CMU-HCII Tech Report 02-100 (2002), http://reports-archive.adm.cs.cmu.edu/hcii.html
  14. Nathan, M.J., Kintsch, W., Young, E.: A theory of algebra-word-problem comprehension and its implications for the design of learning environments. Cognition & Instruction 9(4), 329–389 (1992)
  15. Nathan, M.J., Koedinger, K.R.: Teachers’ and researchers’ beliefs about the development of algebraic reasoning. Journal for Research in Mathematics Education 31, 168–190 (2000)
  16. Newell, A., Rosenbloom, P.: Mechanisms of skill acquisition and the law of practice. In: Anderson, J.R. (ed.) Cognitive Skills and Their Acquisition. Erlbaum, Hillsdale (1981)
  17. Raftery, A.E.: Bayesian model selection in social research. In: Marsden, P.V. (ed.) Sociological Methodology, pp. 111–196. Blackwells, Cambridge (1995)

Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Ethan A. Croteau (1)
  • Neil T. Heffernan (1)
  • Kenneth R. Koedinger (2)

  1. Computer Science Department, Worcester Polytechnic Institute, Worcester, USA
  2. School of Computer Science, Carnegie Mellon University, Pittsburgh, USA
