
Technology-Based Diagnostic Assessments for Identifying Early Mathematical Learning Difficulties

Chapter in: International Handbook of Mathematical Learning Difficulties

Abstract

This chapter summarises the latest developments in the field of educational assessment, focusing on the transition from paper-based to computer-based and from summative to formative and diagnostic assessment to realise efficient testing for personalised learning. The chapter introduces an online diagnostic assessment system that has been developed for the first 6 years of primary school to assess students’ progress in three main domains, including mathematics. Mathematics has always been one of the most challenging school subjects, and early failures may cause problems later in other domains of schooling as well. Making mathematics learning successful for every student requires early identification of difficulties, continuous monitoring of development, and individualised support for those who need it in the first school years. The diagnostic assessment is based on a scientifically established, detailed framework which outlines mathematics learning and development in three dimensions: (1) psychological development or mathematical reasoning, (2) application of knowledge or mathematical literacy, and (3) disciplinary mathematics or curricular content. A complex online platform, eDia, has been constructed to support the entire assessment process, from item writing through item banking, test delivery, and data storage and analysis to the provision of feedback. This chapter outlines the foundations of the framework and shows how it has been mapped into a large set of items. Finally, it shows some early results from the empirical scaling and validation processes and indicates further possibilities for the application of the diagnostic system.
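To give a concrete, if simplified, picture of how a three-dimensional framework can be mapped onto an item bank, the sketch below is purely illustrative: the Item record, the dimension labels, and the dimension_scores helper are assumptions invented for this example and are not the eDia data model or API. Each item is tagged with the dimension it primarily measures, and one student's responses are aggregated into per-dimension proportions of the kind a diagnostic feedback report could build on.

```python
# Illustrative sketch only: a hypothetical item-bank record tagged by framework
# dimension and a simple per-dimension aggregation of one student's responses.
# None of these names come from eDia; they are assumptions made for the example.
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class Item:
    item_id: str
    dimension: str  # "reasoning", "application", or "content" (assumed labels)
    grade: int      # target grade, 1-6


def dimension_scores(bank: dict[str, Item], responses: dict[str, bool]) -> dict[str, float]:
    """Proportion of correct answers per dimension for one student."""
    correct: dict[str, int] = defaultdict(int)
    attempted: dict[str, int] = defaultdict(int)
    for item_id, is_correct in responses.items():
        dim = bank[item_id].dimension
        attempted[dim] += 1
        correct[dim] += int(is_correct)
    return {dim: correct[dim] / attempted[dim] for dim in attempted}


if __name__ == "__main__":
    bank = {
        "M101": Item("M101", "reasoning", 1),
        "M102": Item("M102", "application", 1),
        "M103": Item("M103", "content", 1),
    }
    print(dimension_scores(bank, {"M101": True, "M102": False, "M103": True}))
    # {'reasoning': 1.0, 'application': 0.0, 'content': 1.0}
```

An operational system such as eDia would of course scale scores psychometrically (e.g. with IRT models) rather than report raw proportions; the sketch only shows how dimension tagging can drive per-dimension feedback.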



Acknowledgement

This study was funded by OTKA K115497.

Author information

Correspondence to Gyöngyvér Molnár.


Copyright information

© 2019 Springer International Publishing AG, part of Springer Nature

About this chapter


Cite this chapter

Molnár, G., Csapó, B. (2019). Technology-Based Diagnostic Assessments for Identifying Early Mathematical Learning Difficulties. In: Fritz, A., Haase, V.G., Räsänen, P. (eds) International Handbook of Mathematical Learning Difficulties. Springer, Cham. https://doi.org/10.1007/978-3-319-97148-3_40


  • DOI: https://doi.org/10.1007/978-3-319-97148-3_40


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-97147-6

  • Online ISBN: 978-3-319-97148-3

  • eBook Packages: Education, Education (R0)
