The effect of language modification of mathematics story problems on problem-solving in online homework
Students’ grasp of the non-mathematical language in a mathematics story problem—such as vocabulary and syntax—may have an important effect on their problem-solving, and this may be particularly true for students with weaker language skills. However, little experimental research has examined which individual language features influence students’ performance while solving problems—much of the existing research has been correlational or has combined multiple language features into a single measure. In the present study, we manipulated six language features of algebra story problems—number of sentences, pronouns, word concreteness, word hypernymy, consistency of sentences, and problem topic—and examined how systematically varying readability demands affects student performance. We examined both accuracy and response time, using an assignment for learning linear functions in the ASSISTments online problem-solving environment. We found little evidence that individual language features have a considerable effect on word problem solving performance for a general population of students. However, sentence consistency reduced response time, and problems about motion or travel had shorter response times than problems about business or work. In addition, students appear to benefit from or be harmed by language modifications depending on their familiarity with ASSISTments. Implications for the role of language in math word problems are discussed.
Keywords: Readability · Word problems · Reading demands · Math problems
We would like to thank Neil Heffernan, Cristina Heffernan, and Korinn Ostrow for their assistance in setting up this study. We also thank NSF for their support in making ASSISTments available to conduct this research, through NSF Cyberinfrastructure Award 1440753, SI2-SSE: Adding Research Accounts to the ASSISTments Platform: Helping Researchers do Randomized Controlled Studies with Thousands of Students.