# Identifying Competency Demands in Mathematical Tasks: Recognising What Matters

## Abstract

In a growing number of countries, the goal of mathematics education extends beyond procedural and conceptual knowledge to the development of students’ mathematical competence. Students therefore need to engage with a rich variety of tasks that draw on competencies such as communication, reasoning and problem-solving, and assessment tasks should be constructed to measure these particular aspects of mathematical competence so that assessment is aligned with the curriculum. Teachers need to be able to recognise the competency demand of the mathematical tasks they intend to use for teaching and assessment, which can prove difficult. The aim of this study was to analyse the outcome of five teachers’ and prospective teachers’ use of an item analysis tool. For each of 141 assessment tasks, the teachers and prospective teachers individually applied the tool to rate the competency demand of the task on a scale from 0 to 3 for each of six mathematical competencies. Overall, the analysis reveals high consistency among the raters. However, they used only a restricted range of the scale, rarely judging a task to demand a high level of competence. This indicates that the five teachers and prospective teachers can use the tool to identify which of the six competencies are at play in solving a task, but can differentiate only to a limited extent between tasks that demand a low level of competence and those that demand a high level. In conclusion, we propose that an item analysis tool could be useful to teachers and prospective teachers as a means of analysing and selecting tasks that enhance the development of mathematical competencies.
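The rating design described above — several raters independently scoring each task from 0 to 3 on each competency — lends itself to standard ordinal agreement statistics. As an illustrative sketch only (not the study’s actual analysis, and with hypothetical ratings), the following Python function computes quadratic-weighted Cohen’s kappa for two raters on a 0–3 scale; the quadratic weights penalise disagreements more the further apart the two ratings are, which suits an ordered scale like this one.

```python
from itertools import product

def quadratic_weighted_kappa(a, b, categories=4):
    """Quadratic-weighted Cohen's kappa for two raters on an ordinal 0..categories-1 scale."""
    n = len(a)
    # Observed contingency counts: obs[i][j] = times rater 1 gave i and rater 2 gave j
    obs = [[0] * categories for _ in range(categories)]
    for x, y in zip(a, b):
        obs[x][y] += 1
    # Marginal totals for each rater
    row = [sum(obs[i]) for i in range(categories)]
    col = [sum(obs[i][j] for i in range(categories)) for j in range(categories)]
    # Quadratic disagreement weights: 0 on the diagonal, growing with rating distance
    observed = 0.0   # weighted observed disagreement
    expected = 0.0   # weighted disagreement expected by chance (from the marginals)
    for i, j in product(range(categories), repeat=2):
        w = (i - j) ** 2
        observed += w * obs[i][j] / n
        expected += w * row[i] * col[j] / (n * n)
    return 1.0 - observed / expected

# Hypothetical ratings of ten tasks (0-3) by two raters
r1 = [0, 1, 1, 2, 0, 3, 1, 2, 0, 1]
r2 = [0, 1, 2, 2, 0, 2, 1, 2, 1, 1]
print(round(quadratic_weighted_kappa(r1, r2), 3))  # -> 0.795
```

Values near 1 indicate near-perfect weighted agreement, values near 0 agreement no better than chance; for the full study design, with more than two raters, an intraclass correlation coefficient would be the analogous choice.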

### Keywords

Mathematical competence · Teachers’ task analysis · Task complexity · Cognitive demand in tasks · Task analysis tool