Mathematical Competency Demands of Assessment Items: A Search for Empirical Evidence
The implementation of mathematical competencies in school curricula requires assessment instruments that are aligned with this new view of mathematical mastery. However, there are concerns about whether existing assessments capture the wide variety of cognitive skills and abilities that constitute mathematical competence. The current study applied an explanatory item response modelling approach to investigate how teacher-rated mathematical competency demands account for variation in item difficulty for mathematics items from the Programme for International Student Assessment (PISA) 2012 survey and a Norwegian national grade 10 exam. The results show that the rated competency demands explain slightly more than half of the variance in item difficulty for the PISA items and slightly less than half for the exam items. This provides some empirical evidence for the relevance of the mathematical competencies to solving the assessment items. The results also show that for the Norwegian exam, only two of the competencies, Reasoning and argument and Symbols and formalism, appear to influence item difficulty, which calls into question the extent to which the exam items capture the variety of cognitive skills and abilities that constitute mathematical competence. We argue that this type of empirical evidence from psychometric modelling should be used to improve assessments and assessment items, as well as to inform, and possibly further develop, theoretical concepts of mathematical competence.
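The core idea of the explanatory item response approach described above is that item difficulty is decomposed into a weighted sum of rated competency demands, and the share of difficulty variance those ratings explain indicates their relevance. The following is a minimal sketch of that decomposition using simulated, hypothetical rating data and ordinary least squares; the study itself used marginal-likelihood estimation of mixed models (lme4 in R), and all names and numbers below are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 40 items, each rated on 6 competency demand
# scales (levels 0-3), analogous to the teacher ratings described above.
n_items, n_comp = 40, 6
Q = rng.integers(0, 4, size=(n_items, n_comp)).astype(float)

# Assumed "true" competency weights: item difficulty is a linear
# combination of the rated demands plus noise (the model's assumption).
beta_true = np.array([0.5, 0.3, 0.0, 0.4, 0.1, 0.0])
b = Q @ beta_true + rng.normal(0.0, 0.5, n_items)

# Fit the linear decomposition by least squares (a stand-in for the
# mixed-model estimation an explanatory IRT package would perform).
X = np.column_stack([np.ones(n_items), Q])  # intercept + demand ratings
coef, *_ = np.linalg.lstsq(X, b, rcond=None)

# Proportion of item-difficulty variance explained by the rated demands,
# the quantity the abstract reports as "about half".
resid = b - X @ coef
r2 = 1.0 - resid.var() / b.var()
print(round(r2, 2))
```

Under this setup, weights near zero (the third and sixth competencies here) mimic the study's finding for the Norwegian exam, where most competencies appeared not to influence difficulty.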
Keywords: Assessment items · Explanatory item response modelling · Mathematical competency demand · Sources of item difficulty
The authors would like to thank Ross Turner for his support, feedback and contribution of material during this study. We would also like to thank the teachers and prospective teachers for their contribution, the Norwegian PISA Group for allowing access to the PISA material, and the Norwegian Directorate for Education and Training for access to the national exam material.