Development of a Clinical Reasoning Documentation Assessment Tool for Resident and Fellow Admission Notes: a Shared Mental Model for Feedback

Abstract

Background

Residents and fellows receive little feedback on their clinical reasoning documentation. Barriers include the lack of a shared mental model and variability in the reliability and validity of existing assessment tools. Of the existing tools, the IDEA assessment tool offers a robust assessment of clinical reasoning documentation focusing on four elements (interpretive summary, differential diagnosis, explanation of reasoning for the lead diagnosis, and explanation of alternative diagnoses) but lacks descriptive anchors, threatening its reliability.

Objective

Our goal was to develop a valid and reliable assessment tool for clinical reasoning documentation, building on the IDEA assessment tool.

Design, Participants, and Main Measures

The Revised-IDEA assessment tool was developed by four clinician educators through iterative review of admission notes written by medicine residents and fellows, and was subsequently piloted with additional faculty to ensure response process validity. A random sample of 252 notes written by 30 trainees between July 2014 and June 2017, spanning several chief complaints, was rated. Three raters rated 20% of the notes to demonstrate internal structure validity. A quality cut-off score was determined using the Hofstee standard-setting method.
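
The Hofstee method compromises between judges' acceptable cut scores and acceptable failure rates by finding where the observed failure-rate curve crosses the line connecting the judges' extreme points. As a minimal illustration only, the Python sketch below implements that computation; the function name, the judge bounds, and the simulated note scores are hypothetical and are not values from this study.

```python
import numpy as np

def hofstee_cutoff(scores, c_min, c_max, f_min, f_max):
    """Hofstee standard setting: find the cut score where the observed
    failure rate crosses the judges' compromise line running from
    (c_min, f_max) to (c_max, f_min)."""
    cuts = np.linspace(c_min, c_max, 201)
    # Compromise line: highest tolerable failure rate at the lowest
    # acceptable cut score, lowest tolerable rate at the highest.
    line = f_max + (f_min - f_max) * (cuts - c_min) / (c_max - c_min)
    # Observed failure rate: fraction of notes scoring below each cut.
    observed = np.array([(scores < c).mean() for c in cuts])
    return cuts[np.argmin(np.abs(observed - line))]

# Hypothetical example on the tool's 0-10 scale (not the study's data).
rng = np.random.default_rng(0)
note_scores = rng.integers(0, 11, size=252)
cut = hofstee_cutoff(note_scores, c_min=4, c_max=7, f_min=0.2, f_max=0.6)
print(f"Hofstee cut score: {cut:.2f}")
```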

Key Results

The Revised-IDEA assessment tool retains the four domains of the IDEA assessment tool and adds more detailed descriptive prompts and new Likert-scale anchors, with a score range of 0–10. The intraclass correlation coefficient for the notes rated by all three raters was high: 0.84 (95% CI 0.74–0.90). Scores ≥6 were determined to indicate high-quality clinical reasoning documentation. Only 53% of the notes (134/252) met this threshold.
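
For readers who want to reproduce a reliability figure like the one above, the sketch below computes a two-way random-effects, average-measures intraclass correlation (Shrout–Fleiss ICC(2,k)) from a notes-by-raters matrix. The paper does not state which ICC form was used, so the form chosen here and the toy data are assumptions, not the study's analysis.

```python
import numpy as np

def icc_2k(x):
    """Shrout-Fleiss ICC(2,k): two-way random effects, reliability of the
    mean of k raters. x is an (n_notes, k_raters) array of ratings."""
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)  # per-note means
    col_means = x.mean(axis=0)  # per-rater means
    msr = k * ((row_means - grand) ** 2).sum() / (n - 1)  # between notes
    msc = n * ((col_means - grand) ** 2).sum() / (k - 1)  # between raters
    resid = x - row_means[:, None] - col_means[None, :] + grand
    mse = (resid ** 2).sum() / ((n - 1) * (k - 1))        # residual error
    return (msr - mse) / (msr + (msc - mse) / n)

# Toy data: ~20% of 252 notes (50), each scored 0-10 by three raters.
rng = np.random.default_rng(1)
quality = rng.integers(0, 11, size=50).astype(float)
ratings = np.clip(quality[:, None] + rng.normal(0.0, 1.5, size=(50, 3)), 0, 10)
print(f"ICC(2,k) = {icc_2k(ratings):.2f}")
```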

Conclusions

The Revised-IDEA assessment tool is reliable and easy to use for giving feedback on clinical reasoning documentation in resident and fellow admission notes, and its descriptive anchors facilitate a shared mental model for that feedback.



Contributors

All individuals who contributed to this work meet the criteria for authorship; there are no additional contributors to acknowledge.

Funding

This work was supported by internal grant funding at the NYU Grossman School of Medicine with a grant from the Program for Medical Education Innovations and Research.

Author information

Corresponding author

Correspondence to Verity Schaye, MD, MHPE.

Ethics declarations

Conflict of Interest

The authors declare that they do not have a conflict of interest.



Cite this article

Schaye, V., Miller, L., Kudlowitz, D. et al. Development of a Clinical Reasoning Documentation Assessment Tool for Resident and Fellow Admission Notes: a Shared Mental Model for Feedback. J GEN INTERN MED (2021). https://doi.org/10.1007/s11606-021-06805-6


KEY WORDS

  • clinical reasoning
  • documentation
  • assessment
  • feedback