Metacognition in mathematics: do different metacognitive monitoring measures make a difference?

  • Original Article

Abstract

Metacognitive monitoring in educational contexts is typically measured by calibration indicators, which are based on the correspondence between cognitive performance and metacognitive confidence judgments. Despite this common rationale, monitoring research uses a variety of alternative methods to assess performance and judgment data and to calculate calibration indicators from them. However, the impact of these methodological differences on the partly incongruent picture of monitoring research has hardly been considered. The goal of the present study is therefore to examine the effects of such methodological choices in the context of mathematics education. To do so, the study first compares the effects on confidence judgments of two judgment scales (Likert scale vs. visual analogue scale), two response formats (open-ended vs. closed), the information base of the judgment (prospective vs. retrospective), and students' achievement level. Second, the study contrasts measures of three calibration constructs, namely absolute accuracy (Absolute Accuracy Index, Hamann Coefficient), relative accuracy (Gamma, d′), and diagnostic accuracy (sensitivity and specificity). One hundred and nine seventh-grade students completed a set of 20 mathematical problems and rated their confidence in a correct solution for each problem both prospectively and retrospectively. Our results show a pervasive overconfidence of students across achievement levels. Monitoring was more precise for retrospective judgments and for the visual analogue scale. Gamma, sensitivity, and specificity proved susceptible to boundary values caused by the general overconfidence in the sample. Measures of absolute accuracy were affected by the response format of the task and by the judgment scale, with higher accuracy for the closed response format and the visual analogue scale. We observed substantial correlations within the three calibration constructs and comparatively low correlations between indicators of different constructs, confirming three interrelated aspects of monitoring accuracy. The low correlations between corresponding prospective and retrospective calibration indicators suggest different underlying calibration processes. Implications for studies on calibration and for mathematics education are discussed.
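The calibration indicators contrasted in the abstract can be made concrete through their common textbook definitions. The sketch below is illustrative only, assuming per-item confidence judgments rescaled to [0, 1] and binary correctness scores; the dichotomization threshold of 0.5 and the omission of d′ are simplifying assumptions, not the authors' exact computation.

```python
# Illustrative calibration indices under the assumptions stated above.
# conf: per-item confidence judgments in [0, 1]; perf: 1 = correct, 0 = incorrect.

def absolute_accuracy_index(conf, perf):
    """Mean squared deviation between confidence and performance (0 = perfect)."""
    return sum((c - p) ** 2 for c, p in zip(conf, perf)) / len(conf)

def hamann(conf, perf, threshold=0.5):
    """Hamann coefficient on dichotomized judgments: (matches - mismatches) / n."""
    hits = sum(1 for c, p in zip(conf, perf) if (c >= threshold) == bool(p))
    n = len(conf)
    return (hits - (n - hits)) / n

def gamma(conf, perf):
    """Goodman-Kruskal gamma: relative accuracy from concordant/discordant item pairs."""
    concordant = discordant = 0
    for i in range(len(conf)):
        for j in range(i + 1, len(conf)):
            product = (conf[i] - conf[j]) * (perf[i] - perf[j])
            if product > 0:
                concordant += 1
            elif product < 0:
                discordant += 1
    if concordant + discordant == 0:
        # Undefined at boundary values, e.g. uniformly high confidence on every item.
        return float("nan")
    return (concordant - discordant) / (concordant + discordant)

def sensitivity_specificity(conf, perf, threshold=0.5):
    """Diagnostic accuracy: P(high confidence | correct), P(low confidence | incorrect)."""
    correct = [c for c, p in zip(conf, perf) if p == 1]
    wrong = [c for c, p in zip(conf, perf) if p == 0]
    sens = sum(c >= threshold for c in correct) / len(correct) if correct else float("nan")
    spec = sum(c < threshold for c in wrong) / len(wrong) if wrong else float("nan")
    return sens, spec
```

Note how Gamma becomes undefined when every pairwise comparison ties, which mirrors the boundary-value susceptibility reported for overconfident samples.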



Corresponding author

Correspondence to Klaus Lingel.


Cite this article

Lingel, K., Lenhart, J. & Schneider, W. Metacognition in mathematics: do different metacognitive monitoring measures make a difference?. ZDM Mathematics Education 51, 587–600 (2019). https://doi.org/10.1007/s11858-019-01062-8
