Investigating the validity of web-enabled mechanistic case diagramming scores to assess students’ integration of foundational and clinical sciences

Published in: Advances in Health Sciences Education

Abstract

As medical schools have changed their curricula to address foundational and clinical sciences in a more integrated fashion, teaching methods such as concept mapping have been incorporated in small group learning settings. Methods that can assess students’ ability to apply such integrated knowledge are not as developed, however. The purpose of this project was to assess the validity of scores on a focused version of concept maps called mechanistic case diagrams (MCDs), which are hypothesized to enhance existing tools for assessing integrated knowledge that supports clinical reasoning. The data were from the medical school graduating class of 2018 (N = 136 students). In 2014–2015 we implemented a total of 16 case diagrams in case analysis groups within the Mechanisms of Health and Disease (MOHD) strand of the pre-clinical curriculum. These cases were based on topics being taught during the lectures and small group sessions for MOHD. We created an overall score across all 16 cases for each student. We then correlated these scores with performance in the preclinical curriculum [as assessed by overall performance in MOHD integrated foundational basic science courses and overall performance in the Clinical and Professional Skills (CAPS) courses], and standardized licensing exam scores [United States Medical Licensing Exam (USMLE)] Step 1 (following core clerkships) and Step 2 Clinical Knowledge (at the beginning of the fourth year of medical school). MCD scores correlated with students’ overall basic science scores (r = .46, p = .0002) and their overall performance in Clinical and Professional Skills courses (r = .49, p < .0001). In addition, they correlated significantly with standardized exam measures, including USMLE Step 1 (r = .33, p ≤ .0001), and USMLE Step 2 CK (r = .39, p < .0001). These results provide preliminary validity evidence that MCDs may be useful in identifying students who have difficulty in integrating foundational and clinical sciences.
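The analysis described above reduces each student's 16 case-diagram scores to a single overall MCD score and then computes Pearson correlations against other performance measures. A minimal sketch of that computation follows; the `mcd_case_scores` and `step1` values are entirely hypothetical, invented for illustration, and are not the study's data.

```python
import math
from statistics import mean

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient of two paired score lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-case MCD scores (one row per student, 16 cases each),
# reduced to one overall score per student as described in the abstract.
mcd_case_scores = [
    [70, 80, 75, 90, 65, 85, 88, 72, 79, 81, 77, 83, 69, 91, 74, 86],
    [55, 60, 58, 62, 50, 64, 59, 57, 61, 53, 56, 63, 52, 65, 54, 60],
    [90, 95, 92, 98, 88, 96, 94, 91, 97, 89, 93, 99, 87, 96, 92, 95],
]
overall_mcd = [mean(row) for row in mcd_case_scores]

# Hypothetical comparison measure for the same three students
# (e.g., USMLE Step 1 scores).
step1 = [228, 210, 252]

print(round(pearson_r(overall_mcd, step1), 2))  # → 0.98
```

In the study itself this correlation was computed over the full class (N = 136) for each outcome measure (overall basic science performance, CAPS performance, USMLE Step 1, and Step 2 CK); a statistical package would also supply the reported p values.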

Figures 1–3 appear in the full article.



Acknowledgements

This project was funded by the Stemmler Fund of the National Board of Medical Examiners. Development of the software was funded by grants from the University of Iowa’s Innovations in Teaching with Technology Fund and from the Office of Consultation and Research in Medical Education’s Educational Development Fund. Approval for this project was obtained from the University of Iowa Carver College of Medicine Umbrella IRB on June 21, 2016 (Project ID # 201509).

Author information

Correspondence to Kristi J. Ferguson.

Ethics declarations

Conflict of interest

All authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Ferguson, K.J., Kreiter, C.D., Franklin, E. et al. Investigating the validity of web-enabled mechanistic case diagramming scores to assess students’ integration of foundational and clinical sciences. Adv in Health Sci Educ 25, 629–639 (2020). https://doi.org/10.1007/s10459-019-09944-y

