Advances in Health Sciences Education, Volume 17, Issue 3, pp 311–323

Self-monitoring and its relationship to medical knowledge

  • Meghan M. McConnell
  • Glenn Regehr
  • Timothy J. Wood
  • Kevin W. Eva

Abstract

In the domain of self-assessment, researchers have begun to draw distinctions between summative self-assessment activities (i.e., making an overall judgment of one’s ability in a particular domain) and self-monitoring processes (i.e., an “in the moment” awareness of whether one has the necessary knowledge or skills to address a specific problem with which one is faced). Indeed, previous research has shown that, when responding to both short-answer and multiple-choice questions, individuals are able to assess the likelihood of answering questions correctly on a moment-by-moment basis, even though they are not able to generate an accurate self-assessment of overall performance on the test. These studies, however, were conducted in the context of low-stakes tests of general “trivia”. The purpose of the present study was to further this line of research by investigating the relationship between self-monitoring and performance in the context of a high-stakes test assessing medical knowledge. Using a recent administration of the Medical Council of Canada Qualifying Examination Part I, we examined three measures intended to capture self-monitoring: (1) the time taken to respond to each question, (2) the number of questions a candidate flagged as needing to be considered further, and (3) the likelihood of changing one’s initial answer. Differences in these measures as a function of the accuracy of the candidate’s response were treated as indices of each candidate’s ability to judge his or her likelihood of responding correctly. The three self-monitoring indices were compared for candidates at three different levels of overall performance on the exam. Relative to correct responses, when examinees initially responded incorrectly, they spent more time answering the question, were more likely to flag the question for future consideration, and were more likely to change their answer before committing to a final answer. These measures of self-monitoring were modulated by candidate performance in that high-performing examinees showed greater differences on these indices than did low-performing examinees. Furthermore, reliability analyses suggest that these difference measures hold promise for reliably differentiating self-monitoring at the level of individuals, at least within a given content area. The results suggest that examinees were self-monitoring their knowledge and skills on a question-by-question basis and altering their behavior appropriately in the moment. High-performing individuals showed stronger evidence of accurate self-monitoring than did low-performing individuals, and the reliability of these measures suggests that they have the potential to differentiate between individuals. How these findings relate to performance in actual clinical settings remains to be seen.
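
To make the difference measures described above concrete, the following is a minimal sketch of how candidate-level self-monitoring indices of this kind might be computed from item-level response records. The data layout, the column names (candidate_id, time_sec, flagged, changed_answer, initially_correct), and the use of simple mean differences are illustrative assumptions only; this is not the authors' analysis code and not the actual format of the MCCQE Part I response data.

# Hypothetical sketch: per-candidate self-monitoring indices from
# item-level response records. Column names and data layout are assumed
# for illustration; they do not reflect the MCCQE Part I data format.
import pandas as pd

# One row per candidate-item: response time (seconds), whether the item
# was flagged for review, whether the initial answer was later changed,
# and whether the *initial* response was correct.
responses = pd.DataFrame({
    "candidate_id":      [1, 1, 1, 1, 2, 2, 2, 2],
    "time_sec":          [45, 90, 50, 110, 60, 65, 55, 70],
    "flagged":           [0, 1, 0, 1, 0, 0, 1, 0],
    "changed_answer":    [0, 1, 0, 0, 0, 1, 0, 0],
    "initially_correct": [1, 0, 1, 0, 1, 0, 1, 1],
})

def self_monitoring_indices(df: pd.DataFrame) -> pd.DataFrame:
    """For each candidate, contrast behaviour on initially incorrect versus
    initially correct items. Larger (incorrect minus correct) differences are
    read as stronger evidence of in-the-moment self-monitoring."""
    means = (
        df.groupby(["candidate_id", "initially_correct"])[
            ["time_sec", "flagged", "changed_answer"]
        ]
        .mean()
        .unstack("initially_correct")
    )
    # Difference scores: mean behaviour on incorrect (0) minus correct (1) items.
    return pd.DataFrame({
        "time_diff":   means[("time_sec", 0)] - means[("time_sec", 1)],
        "flag_diff":   means[("flagged", 0)] - means[("flagged", 1)],
        "change_diff": means[("changed_answer", 0)] - means[("changed_answer", 1)],
    })

print(self_monitoring_indices(responses))

Under this framing, the comparison across performance levels reported in the abstract would amount to relating such candidate-level difference scores to total examination scores, and the reliability analyses to estimating how consistently those difference scores differentiate candidates within a given content area.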

Keywords

Medical education · Medical student · Physician competency · Self-assessment · Self-monitoring

Copyright information

© Springer Science+Business Media B.V. 2011

Authors and Affiliations

  • Meghan M. McConnell (1)
  • Glenn Regehr (2)
  • Timothy J. Wood (1)
  • Kevin W. Eva (2)

  1. Medical Council of Canada, Ottawa, Canada
  2. Centre for Health Education Scholarship, University of British Columbia, Vancouver, Canada
