
The Educational Effects of a Summative Diagnostic Reasoning Examination Among Second-Year Medical Students

Original Research

Medical Science Educator

Abstract

The medical education literature is beginning to address the educational effects of summative closed-response (multiple-choice question, MCQ) and open-response (short-answer) examination formats. This study examines the student-reported educational effects of these two types of testing. From 2013 to 2016, an open-response summative “diagnostic reasoning examination” (DxRx) was administered in the Reproductive Systems Module. The DxRx consists of unfolding cases that require short answers and a final explanation of pathophysiology; the module also uses a National Board of Medical Examiners (NBME) MCQ examination. Each year, the authors queried students about their preparation for these examinations in post-course surveys. Narrative responses were categorized using three domains from a validated framework: cognitive processing, resources, and content. The average survey response rate was 72.5% (n = 343). The percentage of respondents indicating that their study strategy for the DxRx differed from that for the NBME examination ranged from 81.6% to 97.9% across years, and 90% of respondents provided comments. Regarding cognitive processing, 38% reported re-organizing course content by clinical presentation (rather than studying diseases in isolation), 18% described developing and/or practicing the generation of differential diagnoses, and 14% reported shifting to group study. Students reported using case-based material and focusing on content directly related to clinical problem-solving. Ten percent of students volunteered metacognitive insights suggesting more robust learning from DxRx preparation. The majority of students reported approaches to studying for the DxRx, distinct from their MCQ preparation, that support the construction of new knowledge and the cognitive skills required for clinical problem-solving. We believe these findings represent an important step in exploring the educational effects of these two types of summative testing.



Acknowledgements

Melissa Ward-Peterson is supported by the National Institute on Minority Health and Health Disparities grant (1U54MD012393-01) for FIU-RCMI.

Author information

Corresponding author

Correspondence to Carla S. Lupi.

Ethics declarations

Conflict of Interest

The authors declare that they have no conflict of interest.

Ethical Approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards. For this type of study, formal consent is not required.

About this article


Cite this article

Lupi, C.S., Tempest, H.G., Ward-Peterson, M. et al. The Educational Effects of a Summative Diagnostic Reasoning Examination Among Second-Year Medical Students. Med Sci Educ 28, 667–673 (2018). https://doi.org/10.1007/s40670-018-0610-x
