Identifying Competency Demands in Mathematical Tasks: Recognising What Matters



In a growing number of countries, the goal of mathematics education extends beyond procedural and conceptual knowledge to the development of students’ mathematical competence. Students therefore need to engage in a rich variety of tasks that draw on competencies such as communication, reasoning and problem-solving. In addition, assessment tasks should be constructed to measure particular aspects of mathematical competence so that assessment is aligned with the curriculum. Teachers thus need to be able to recognise the competency demands of the mathematical tasks they want to use for teaching and assessment purposes, which might prove difficult. The aim of this study was to analyse the outcome of 5 teachers’ and prospective teachers’ use of an item analysis tool. For each of 141 assessment tasks, the teachers and prospective teachers individually applied the tool to rate the demand the task placed on each of 6 mathematical competencies on a scale from 0 to 3. Overall, the analysis reveals high consistency in their ratings. However, the teachers and prospective teachers utilised only a restricted range of the scale, rarely judging a task to demand a high level of competence. This indicates that the 5 teachers and prospective teachers can use the tool to identify which of the 6 competencies are at play in solving a task, but can differentiate only to a limited extent between tasks that demand a low level of a competency and those that demand a high level. In conclusion, we propose that an item analysis tool could be useful to teachers and prospective teachers as a means of analysing and selecting appropriate tasks that enhance the development of mathematical competencies.
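The two findings reported above (high rater consistency, but restricted use of the 0–3 scale) can be made concrete with a small sketch. The data below are simulated for illustration only, not the study’s actual ratings; the function names and the weighting of scale points are assumptions.

```python
# Hypothetical sketch: consistency and scale usage of competency-demand ratings.
# Assumes 5 raters each score 141 tasks on a 0-3 scale for one competency.
from itertools import combinations
import random

random.seed(1)
N_TASKS, N_RATERS = 141, 5

# Simulated restricted-range behaviour: raters mostly use the low scale points.
ratings = [[random.choices([0, 1, 2, 3], weights=[50, 35, 12, 3])[0]
            for _ in range(N_RATERS)]
           for _ in range(N_TASKS)]

def pairwise_exact_agreement(ratings):
    """Mean proportion of tasks on which a pair of raters gives identical scores."""
    pairs = list(combinations(range(len(ratings[0])), 2))
    agree = [sum(task[i] == task[j] for task in ratings) / len(ratings)
             for i, j in pairs]
    return sum(agree) / len(agree)

def scale_usage(ratings):
    """Fraction of all ratings at each scale point 0-3; reveals a restricted range."""
    flat = [r for task in ratings for r in task]
    return [flat.count(level) / len(flat) for level in range(4)]

print(round(pairwise_exact_agreement(ratings), 2))
print([round(p, 2) for p in scale_usage(ratings)])
```

In this toy setup, agreement is inflated precisely because most ratings cluster at 0 and 1, which is one reason reliability statistics for such ratings are usually complemented by an inspection of how much of the scale the raters actually used.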


Keywords: Mathematical competence · Teachers’ task analysis · Task complexity · Cognitive demand in tasks · Task analysis tool



The authors would like to thank Mogens Niss and Ross Turner from the PISA MEG for their support and discussions prior to the data collection for this study. Further, the teachers and prospective teachers are thanked for their contribution, the Norwegian PISA Group for allowing access to the PISA material, and the Norwegian Directorate for Education and Training for access to the National Exam material. A short version of this article has been accepted for presentation at the 40th Annual Meeting of the International Group for the Psychology of Mathematics Education (PME 40), Szeged, Hungary (Pettersen & Nortvedt, 2016).



Copyright information

© Ministry of Science and Technology, Taiwan 2017

Authors and Affiliations

  1. Department of Teacher Education and School Research, University of Oslo, Oslo, Norway
