
Mathematical Competency Demands of Assessment Items: a Search for Empirical Evidence

International Journal of Science and Mathematics Education

Abstract

The implementation of mathematical competencies in school curricula requires assessment instruments to be aligned with this new view of mathematical mastery. However, there are concerns over whether existing assessments capture the wide variety of cognitive skills and abilities that constitute mathematical competence. The current study applied an explanatory item response modelling approach to investigate how teacher-rated mathematical competency demands could account for the variation in item difficulty for mathematics items from the Programme for International Student Assessment (PISA) 2012 survey and a Norwegian national grade 10 exam. The results show that the rated competency demands explain slightly more than half of the variance in item difficulty for the PISA items, but less than half for the exam items. This provides some empirical evidence for the relevance of the mathematical competencies to solving the assessment items. The results also show that for the Norwegian exam, only two of the competencies, Reasoning and argument and Symbols and formalism, appear to influence the difficulty of the items, which calls into question the extent to which the exam items capture the variety of cognitive skills and abilities that constitute mathematical competence. We argue that this type of empirical evidence from psychometric modelling should be used to improve assessments and assessment items, as well as to inform and possibly further develop theoretical concepts of mathematical competence.
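The core idea of the explanatory modelling described above can be sketched in miniature: regress item difficulty on rated competency demands and ask how much variance the ratings explain. The sketch below is a hypothetical illustration only; the item counts, rating scale, and weights are invented, and the paper itself fits full explanatory item response models (via lme4 in R) rather than this simplified least-squares analogue.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 40 items rated on 6 competencies, with demand
# levels 0-3 (mirroring the kind of rating scheme used for PISA items).
n_items, n_comp = 40, 6
Q = rng.integers(0, 4, size=(n_items, n_comp)).astype(float)

# Invented "true" competency weights plus item-specific noise:
# two competencies here contribute nothing to difficulty.
beta_true = np.array([0.5, 0.3, 0.0, 0.4, 0.1, 0.0])
difficulty = Q @ beta_true + rng.normal(0, 0.5, n_items)

# Regress item difficulty on the rated demands (intercept + 6 weights).
X = np.column_stack([np.ones(n_items), Q])
coef, *_ = np.linalg.lstsq(X, difficulty, rcond=None)

# Proportion of difficulty variance explained by the ratings.
pred = X @ coef
r2 = 1 - np.var(difficulty - pred) / np.var(difficulty)
print(f"R^2 = {r2:.2f}")
```

An R² around one half would correspond to the kind of result reported in the abstract: the ratings account for a substantial, but far from complete, share of the variation in item difficulty.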


Notes

  1. In this paper the term mathematical competence refers to a general definition of what it means to master mathematics (e.g. the description provided by Niss and Højgaard), while the term mathematical competency (or competencies in plural) refers to one or a set of the constituent parts of mathematical competence.

  2. AIC, Akaike's information criterion (Akaike, 1973), balances absolute fit to the data against model complexity in terms of the number of parameters. The best model is one that is parsimonious but still fits the data well.
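The criterion in Note 2 is the standard formula AIC = 2k − 2 ln(L̂), where k is the number of parameters and L̂ the maximised likelihood. A minimal sketch with invented log-likelihoods:

```python
def aic(log_likelihood: float, n_params: int) -> float:
    """AIC = 2k - 2*ln(L); lower values indicate a better
    trade-off between model fit and complexity."""
    return 2 * n_params - 2 * log_likelihood

# Two hypothetical models: B fits slightly better (higher
# log-likelihood) but spends three extra parameters.
aic_a = aic(log_likelihood=-120.0, n_params=4)  # 248.0
aic_b = aic(log_likelihood=-119.0, n_params=7)  # 252.0
# Model A has the lower AIC, so it is preferred: the marginal
# gain in fit does not justify the added complexity.
```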

References

  • Akaike, H. (1973). Information theory and an extension of the maximum likelihood principle. In B. N. Petrov & F. Csáki (Eds.), Second international symposium on information theory (pp. 267–281). Budapest, Hungary: Akademiai Kiado.

  • Bates, D., Mächler, M., Bolker, B., & Walker, S. (2015). Fitting linear mixed-effects models using lme4. Journal of Statistical Software, 67, 1–48. https://doi.org/10.18637/jss.v067.i01.

  • Blömeke, S., Gustafsson, J.-E., & Shavelson, R. J. (2015). Beyond dichotomies: Viewing competence as a continuum. Zeitschrift für Psychologie, 223(1), 3–13.


  • Boesen, J., Helenius, O., Bergqvist, E., Bergqvist, T., Lithner, J., Palm, T., & Palmberg, B. (2014). Developing mathematical competence: From the intended to the enacted curriculum. The Journal of Mathematical Behavior, 33, 72–87.


  • Boesen, J., Lithner, J. & Palm, T. (2016). Assessing mathematical competencies: An analysis of Swedish national mathematics tests. Scandinavian Journal of Educational Research, 1–16. https://doi.org/10.1080/00313831.2016.1212256.

  • Daniel, R. C., & Embretson, S. E. (2010). Designing cognitive complexity in mathematical problem-solving items. Applied Psychological Measurement, 34(5), 348–364.


  • De Boeck, P., Cho, S. J., & Wilson, M. (2016). Explanatory item response models. In A. A. Rupp & J. P. Leighton (Eds.), The handbook of cognition and assessment: Frameworks, methodologies, and applications (pp. 249–268). Hoboken: Wiley.

  • Duval, R. (2006). A cognitive analysis of problems of comprehension in a learning of mathematics. Educational Studies in Mathematics, 61(1), 103–131.


  • Elia, I., Panaoura, A., Eracleous, A., & Gagatsis, A. (2007). Relations between secondary pupils’ conceptions about functions and problem solving in different representations. International Journal of Science and Mathematics Education, 5(3), 533–556.


  • Embretson, S. E., & Daniel, R. C. (2008). Understanding and quantifying cognitive complexity level in mathematical problem solving items. Psychology Science, 50(3), 328–344.


  • Embretson, S. E., & Gorin, J. (2001). Improving construct validity with cognitive psychology principles. Journal of Educational Measurement, 38(4), 343–368.


  • Enright, M. K., Morley, M., & Sheehan, K. M. (2002). Items by design: The impact of systematic feature variation on item statistical characteristics. Applied Measurement in Education, 15(1), 49–74.


  • Feeley, T. H. (2002). Comment on halo effects in rating and evaluation research. Human Communication Research, 28(4), 578–586. https://doi.org/10.1111/j.1468-2958.2002.tb00825.x.

  • Gorin, J. S., & Embretson, S. E. (2006). Item difficulty modeling of paragraph comprehension items. Applied Psychological Measurement, 30(5), 394–411.


  • Graf, E. A., Peterson, S., Steffen, M., & Lawless, R. (2005). Psychometric and cognitive analysis as a basis for the design and revision of quantitative item models (No. RR-05-25). Princeton: Educational Testing Service.

  • Harks, B., Klieme, E., Hartig, J., & Leiss, D. (2014). Separating cognitive and content domains in mathematical competence. Educational Assessment, 19(4), 243–266.


  • Janssen, R., Schepers, J., & Peres, D. (2004). Models with item and item group predictors. In P. De Boeck & M. Wilson (Eds.), Explanatory item response models (pp. 189–212). New York: Springer.

  • Kilpatrick, J. (2014). Competency frameworks in mathematics education. In S. Lerman (Ed.), Encyclopedia of mathematics education (pp. 85–87). Dordrecht, The Netherlands: Springer.

  • Koedinger, K. R., & Nathan, M. J. (2004). The real story behind story problems: Effects of representations on quantitative reasoning. The Journal of the Learning Sciences, 13(2), 129–164.


  • Koeppen, K., Hartig, J., Klieme, E., & Leutner, D. (2008). Current issues in competence modeling and assessment. Zeitschrift für Psychologie, 216(2), 61–73.


  • Lane, S. (2004). Validity of high-stakes assessment: Are students engaged in complex thinking? Educational Measurement: Issues and Practice, 23(3), 6–14.


  • Messick, S. (1995). Validity of psychological assessment: Validation of inferences from persons’ responses and performances as scientific inquiry into score meaning. American Psychologist, 50(9), 741–749.


  • National Council of Teachers of Mathematics (NCTM). (2000). Principles and standards for school mathematics. Reston: NCTM.

  • Niss, M. (2007). Reflections on the state of and trends in research on mathematics teaching and learning. In F. K. J. Lester (Ed.), Second handbook of research on mathematics teaching and learning (pp. 1293–1312). Charlotte, NC: Information Age.


  • Niss, M. (2015). Mathematical competencies and PISA. In K. Stacey & R. Turner (Eds.), Assessing mathematical literacy: The PISA experience (pp. 35–55). Heidelberg: Springer.

  • Niss, M., Bruder, R., Planas, N., Turner, R., & Villa-Ochoa, J. A. (2016). Survey team on: Conceptualisation of the role of competencies, knowing and knowledge in mathematics education research. ZDM, 48(5), 611–632.


  • Niss, M., & Højgaard, T. (Eds.). (2011). Competencies and mathematical learning. Denmark: Roskilde University.


  • Norwegian Directorate for Education and Training [Utdanningsdirektoratet]. (2014). Eksamensveiledning - om vurdering av eksamensbesvarelser. MAT0010 Matematikk. Sentralt gitt skriftlig eksamen. Grunnskole [Manual - to be used to assess exam papers. MAT0010 Mathematics. National written exam, end of compulsory education]. Oslo: Utdanningsdirektoratet.

  • Organization for Economic Co-operation and Development (OECD). (2013). PISA 2012 Assessment and analytical framework: Mathematics, reading, science, problem solving and financial literacy. Paris: OECD Publishing. https://doi.org/10.1787/9789264190511-en.

  • Organization for Economic Co-operation and Development (OECD). (2014). PISA 2012 technical report. Paris: OECD Publishing. Retrieved from https://www.oecd.org/pisa/pisaproducts/PISA-2012-technical-report-final.pdf

  • Pettersen, A., & Nortvedt, G. A. (2017). Identifying competency demands in mathematical tasks: recognising what matters. International Journal of Science and Mathematics Education. https://doi.org/10.1007/s10763-017-9807-5.

  • R Core Team. (2016). R: A language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing. Retrieved from http://www.R-project.org

  • Stylianou, D. A. (2011). An examination of middle school students’ representation practices in mathematical problem solving through the lens of expert work: Towards an organizing scheme. Educational Studies in Mathematics, 76(3), 265–280.


  • Turner, R., Blum, W., & Niss, M. (2015). Using competencies to explain mathematical item demand: A work in progress. In K. Stacey & R. Turner (Eds.), Assessing mathematical literacy: The PISA experience (pp. 85–115). New York: Springer.

  • Turner, R., Dossey, J., Blum, W., & Niss, M. (2013). Using mathematical competencies to predict item difficulty in PISA: A MEG study. In M. Prenzel, M. Kobarg, K. Schöps, & S. Rönnebeck (Eds.), Research on PISA (pp. 23–37). New York: Springer.

  • Valenta, A., Nosrati, M., & Wæge, K. (2015). Skisse av den «ideelle læreplan i matematikk» [Draft of the «ideal curriculum in mathematics»]. Trondheim: Nasjonalt senter for matematikk i opplæringen. Retrieved from https://nettsteder.regjeringen.no/fremtidensskole/files/2014/05/Skisse-av-den-ideellel%C3%A6replanen-i-matematikk.pdf.

  • Wilson, M., De Boeck, P., & Carstensen, C. H. (2008). Explanatory item response models: A brief introduction. In E. Klieme & D. Leutner (Eds.), Assessment of competencies in educational contexts: State of the art and future prospects (pp. 91–120). Göttingen: Hogrefe & Huber.

  • Zlatkin-Troitschanskaia, O., Shavelson, R. J., & Kuhn, C. (2015). The international state of research on measurement of competency in higher education. Studies in Higher Education, 40(3), 393–411.



Acknowledgements

The authors would like to thank Ross Turner for his support, feedback and contribution of material during this study. Further, we would also like to thank the teachers and prospective teachers for their contribution, the Norwegian PISA Group for allowing access to the PISA material and the Norwegian Directorate for Education and Training for access to the national exam material.

Author information


Corresponding author

Correspondence to Andreas Pettersen.


About this article


Cite this article

Pettersen, A., Braeken, J. Mathematical Competency Demands of Assessment Items: a Search for Empirical Evidence. Int J of Sci and Math Educ 17, 405–425 (2019). https://doi.org/10.1007/s10763-017-9870-y

