
Assessing and tracking students’ problem solving performances in anchored learning environments

  • Development Article
  • Published:
Educational Technology Research and Development

Abstract

The purpose of this randomized experiment was to compare the performance of high-, average-, and low-achieving middle school students who were assessed with parallel versions of a computer-based test (CBT) or a paper-pencil test (PPT). Tests delivered in interactive, immersive environments like the CBT may have the advantage of providing teachers with diagnostic tools that can lead to instruction tailored to the needs of students at different achievement levels. To test the feasibility of CBT, students were randomly assigned to the CBT or PPT test conditions to measure what they had learned from an instructional method called enhanced anchored math instruction. Both assessment methods showed that students benefited from instruction and differentiated students by achievement status. The navigation maps generated from the CBT revealed that the low-achieving students were able to navigate the test, spent about the same amount of time solving the subproblems as the more advanced students, and made use of the learning scaffolds.



Acknowledgements

The research reported in this article was supported by a grant from the U. S. Department of Education, Institute of Education Sciences Cognition and Student Learning (CASL) Program, Award No. R305H040032. Any opinions, findings, or conclusions are those of the authors and do not necessarily reflect the views of the supporting agency.

Author information

Corresponding author

Correspondence to Brian A. Bottge.

About this article

Cite this article

Bottge, B.A., Rueda, E., Kwon, J.M. et al. Assessing and tracking students’ problem solving performances in anchored learning environments. Education Tech Research Dev 57, 529–552 (2009). https://doi.org/10.1007/s11423-007-9069-y


Keywords

Navigation