Abstract
Introduction
Undergraduate medical education has necessarily evolved with the increasing utilization of technology and the growing availability of ancillary resources developed for medical students. However, medical educational resources are expensive, and few studies have validated whether these resources significantly modify student exam performance.
Methods
A post-exam survey was devised to evaluate medical students for resource usage, student-perceived preparedness, and exam performance.
Results
Students who felt more prepared for exams performed better than students who felt less prepared (p = .017). Students who watched didactic lectures online and those who utilized peer-to-peer tutoring outperformed students who did not use these resources (p = .035 and p = .008, respectively). Analyses of the data show that none of the purchased resources utilized significantly improved student exam performance. The majority of students used between six and eight resources for exam preparation. There may be a slightly negative association between the quantity of resources used and exam scores (p = .18).
Discussion
Contrary to traditional confidence studies that correlate overconfidence with underperformance, medical students who reported feeling more prepared for exams performed better than students who felt less prepared.
Conclusion
Medical students may have a more complete grasp of their knowledge base and deficits, which may enable a more accurate match between exam expectations and academic performance. This post-exam survey method can be customized and applied to evaluate resource utility as it pertains to specific undergraduate medical education curricula at individual institutions.
Introduction
Medical school curricula and available student resources have evolved drastically alongside advancements in technology and the internet [1, 2]. Today’s medical students are not the medical students of previous generations; they must constantly review new information as it becomes available. With the current rate at which medical knowledge is updated, it is crucial to ensure that future physicians are comfortable synthesizing and applying new evidence as it unfolds. This has been observed in the recent transition to a “competency-based” approach in which students are taught to seek out and apply new information readily [3]. By the same token, the exponential growth in the availability of online medical educational resources over the past decade has placed the onus primarily on medical students to ascertain the most effective method of knowledge acquisition outside of their formal curricula [2, 4].
Scheduled progress exams serve to assess a student’s breadth of knowledge in the preclinical setting. It is thus of great interest to medical educators and students alike to identify which modifiable factors yield the highest utility for exam preparation. Prior studies have found a negative correlation between student overconfidence and examination success [5]. Determining which resources instill a favorable level of confidence while still best preparing students is important for developing a curriculum that both meets the expectations of students and fulfills the goals of the educators [6]. It has also been suggested that the total number of resources used for preparation may affect exam scores [7]. The present study examines how post-exam preparedness relates to test performance for first-year students at a new allopathic school. The most commonly utilized preclinical resources were evaluated for how they influenced test performance and student confidence on exams. Lastly, we posit that increasing the number of resources utilized leads to poorer performance outcomes.
Materials and Methods
Our study evaluates the resource usage of medical students in the preclinical phase of a newly formed medical school. Our curriculum utilizes an organ systems-based approach to exams using the National Board of Medical Examiners’ (NBME®) Customized Assessment Services. This tool allows educators to develop exams by selecting from a large pool of NBME® questions with content specific to the current curricular testing block. Tests are timed and administered as biweekly progress exams to a cohort of 60 students. Scoring is measured on a 100-point scale, and educator-facing features allow educators to compare examinee performance with that of medical schools throughout the USA.
We developed an anonymous post-exam survey that assessed for the following components:
1. A “Preparedness Score” (PS), which asked students how prepared they felt about their exam performance on a scale of 1 to 10;
2. A list of potential resources students may have used to prepare, which included:
   a) Formal curriculum resources recommended in the institution’s preclinical syllabus: in-person lectures, online recorded lectures, in-person anatomy sessions, and assigned textbook readings;
   b) Extracurricular resources outside of the syllabus: peer-to-peer tutoring sessions with above-level medical students; online medical education video-based supplemental curricula (Boards & Beyond®, Pathoma™); practice question banks (Kaplan™ Step 1 QBank, AMBOSS©); visual learning platforms (SketchyMedical©); flashcard review programs (Anki); and study preparation books (First Aid for the USMLE® Step 1);
3. A “Resource-Specific Score” (RSS; scale of 1 to 10) that ascertained how well students felt each utilized resource helped prepare them for each exam;
4. A free response comment section for additional written feedback.
Figure 1 shows a portion of the survey administered to students. The survey was voluntary and was administered over eight systems-based NBME® examinations. All participants in this study were informed of survey administration prior to their exams and were given the option to opt out of participation. Surveys were conducted after completion of each exam and before exam scores (ES) were released. Examinees submitted survey results with an anonymized testing ID number. An exam proctor blinded to student responses paired each final ES with its associated survey results; only fully completed surveys with a relevant testing ID number were included in this study. Data analysis included a linear regression of ES and PS using the Pearson correlation coefficient, an unpaired t test comparing the ES/PS of students who used each resource versus those who did not, and a regression analysis of the total number of resources used against overall ES/PS.
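As an illustration of the analysis described above, the following Python sketch computes a Pearson correlation between ES and PS, unpaired t tests of ES for users versus non-users of each resource, and correlations between the total number of resources used and ES/PS. The CSV file, resource list, and column names (e.g., exam_score, used_online_lecture) are hypothetical placeholders, not the study's actual dataset or variable names.

```python
# Hypothetical sketch of the statistical analysis described above; the file,
# column names, and resource list are illustrative assumptions.
import pandas as pd
from scipy import stats

# Each row: one completed, anonymized survey paired with its exam score (ES).
df = pd.read_csv("post_exam_survey.csv")  # hypothetical file

# 1) Correlation between exam score (ES) and Preparedness Score (PS).
r_ps, p_ps = stats.pearsonr(df["preparedness"], df["exam_score"])
print(f"ES vs PS: r = {r_ps:.2f}, p = {p_ps:.3f}")

# 2) Unpaired (independent-samples) t test of ES: users vs non-users of each resource.
resources = ["online_lecture", "tutoring", "kaplan_qbank", "anki"]  # example subset
for res in resources:
    users = df.loc[df[f"used_{res}"] == 1, "exam_score"]
    non_users = df.loc[df[f"used_{res}"] == 0, "exam_score"]
    t, p = stats.ttest_ind(users, non_users)
    print(f"{res}: t = {t:.2f}, p = {p:.3f}")

# 3) Correlation between the total number of resources used and ES / PS.
df["n_resources"] = df[[f"used_{res}" for res in resources]].sum(axis=1)
for outcome in ["exam_score", "preparedness"]:
    r, p = stats.pearsonr(df["n_resources"], df[outcome])
    print(f"n_resources vs {outcome}: r = {r:.2f}, p = {p:.3f}")
```

Note that the degrees of freedom reported in the Results (e.g., r(9), r(11)) suggest the reported correlations were computed over grouped means rather than individual responses; the sketch would be adapted accordingly.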
The post-examination questionnaire utilized in this study is included as an online supplement file to provide a model for faculty who wish to incorporate similar studies at their institutions.
Results
There were 161 survey responses collected across eight separately administered examinations. Of the 60 students eligible for each exam session, between 15% and 55% responded to the post-exam survey, with an average response rate of 34% per exam. The mean ES among respondents was 78.7 (ranging from 71.1 to 85.2 across exams), compared to 81.7 across all examinees. The average reported PS was 6.0 (range 4.7 to 7.6 across exams). Table 1 summarizes respondents’ ES and PS alongside the overall class ES averages for all exams.
Perception of Test Performance Versus Exam Outcome
We used an alpha level of 0.05 for all statistical tests. Among survey respondents, the reported PS and mean ES results were observed to be strongly positively correlated, r(9) = 0.83, p = 0.017. There was a minimum average ES of 67.5 and maximum of 82.0 (for a PS of 0 and 9, respectively). Figure 2 shows the correlation between mean ES and PS.
Relationship Between Specific Resources Utilized and Exam Performance
Table 2 summarizes the individual use of specific resources for exam preparation and the associated average ES, PS, and RSS. Examinees who attended online lectures had a 4% increase in ES compared to non-attendees (p = 0.035); those who utilized our curriculum’s peer-to-peer tutoring had a 7% improvement (p = 0.008). Use of Kaplan’s QBank to prepare for exams was associated with a modest improvement in overall exam performance (ES of 79 vs 76, p = 0.08). Six of the resources utilized were associated with a score higher than the overall mean ES: in-person lecture attendance, online lecture attendance, assigned textbook readings, Kaplan’s QBank, tutoring sessions, and Pathoma. Mean ES for students who did not use Boards & Beyond or AMBOSS were slightly higher than 78.7.
When compared to the overall mean PS, survey respondents reported a marginal increase in perceived preparedness when using the Kaplan QBank, Sketchy, and Anki. Conversely, PS was slightly higher among students who did not attend in-person lectures or anatomy sessions, did not complete assigned textbook readings, or did not use Pathoma. No significant differences in PS were observed when comparing use versus non-use of any individual resource.
On average, examinees rated the specific resources higher than their overall perceived preparedness for an exam (i.e., RSS > PS), with the exception of attending online lectures. The resources with the highest reported RSS were First Aid, Sketchy, tutoring sessions, and Anki.
Number of Resources Used and Exam Performance
The overall average number of resources used for each exam was seven, and the distribution was approximately normal (standard deviation of 2.03; Fig. 3). Higher exam scores were moderately associated with lower resource utilization (Fig. 4), r(11) = −0.43, p = 0.18, although this did not reach significance. There was no correlation between the number of resources and average PS, r(11) = 0.05, p = 0.87. More than half of respondents (55%) used between six and eight resources for exam preparation. The most utilized resource by far was Boards & Beyond, followed by First Aid and the Kaplan QBank (n = 151, 135, and 131, respectively).
Discussion
Educational preparation resources available to undergraduate medical students have seen substantial growth over the past decade [2, 4, 8]. In particular, advancements in technology have paved the way for increased use of digital resources in medical education [1, 2, 9]. As a result, increased pressure is placed on educators and students alike to identify which resources are most effective for their respective curricula and individual study habits. Determining whether certain resources lead to scholastic improvement is critical to refining a developing medical curriculum, and our assessment of how medical student resource utilization relates to academic results is an attempt to better understand the overall effectiveness of these resources.
The Dunning-Kruger effect describes the phenomenon in which individuals with lower skill levels on a specific task overestimate their predicted performance [10]. This has concerning ramifications in the formal education setting, as students demonstrating suboptimal aptitude may underperform in test-taking scenarios. Low performance secondary to overconfidence has been attributed to diminished exam preparation strategies, poor restudy decisions, and personal desires for higher scores biasing grade predictions [11,12,13,14]. Our study shows a significant positive correlation between student perceived preparedness and exam outcomes, which contrasts with these traditional findings of overconfidence predicting worse academic performance. This suggests that the relationship between confidence and testing outcomes may apply differently to medical students, who may have greater self-awareness of their knowledge deficits after matriculation into a medical program. This skill is essential to succeeding in medical education given the rigorous coursework and high performance expectations [15]. In contrast to performance in other academic settings [12,13,14], medical students may not be as susceptible to “the planning fallacy” in terms of underestimating how much dedicated study time is needed to obtain a high exam score [16]. As a result, post-exam expectations of how prepared students felt may more accurately represent their actual overall performance.
To our knowledge, no studies have compared undergraduate medical education resource usage with preclinical knowledge-based progress tests. We hypothesized that increasing the number of resources used would result in poorer outcomes, as students would struggle to filter and prioritize the amount of available content. Our findings indicate a slightly negative correlation between exam performance and the number of resources used. However, these results were not statistically significant, which aligns with other studies that did not find a major impact of the number of resources used on exam performance [17]. It is of note that the reported number of resources followed a normal distribution, implying that there is a comfortable range for most medical students.
A heavily debated aspect of undergraduate medical education is the utility of traditional in-person didactic lectures. Attendance of in-class lectures has shown a marked decline over time and is a poor predictor of exam performance [18, 19]. When student attendance declines, faculty report decreased enthusiasm for their work and lower job satisfaction. Students cite many reasons for preferring online lectures, including efficiency and schedule flexibility [20]. In our results, the only resources that yielded significant exam performance improvements were peer-to-peer tutoring and the use of lecture recordings. The latter finding is consistent with some previous studies [21] but stands in contrast to other reported findings [22]. While our results show a clear positive association between the use of online lectures and exam performance, students rated the online lectures poorly on their perceived preparedness utility, with a mean reported RSS of 5.1. This may be due to the varying teaching styles of educators at the institution, as rarely did any one individual teach across multiple organ systems.
Near-peer tutoring serves as an important adjunct to improve upon a student’s base medical knowledge. The results of our study showed a significant improvement in ES by students who utilized the school’s tutoring services. This may reflect the advantage of directly interfacing with peers who are more advanced in their medical education. Those who participate in these programs are afforded additional learning and revision opportunities in a low-stress, small group learning environment [23]. Participants receive the added benefit of obtaining insight and advice from their experienced peers [24]. There is unfortunately a dearth of published research on peer-assisted learning in medical education, an area that could benefit from future longitudinal studies [25].
None of the other resources in this study showed significant improvements in ES, which suggests that, for this specific curriculum, none is superior for improving student exam performance. Most students reported using between six and eight resources to prepare for exams, which makes it difficult to evaluate the true utility of each resource in isolation. Indeed, certain resources may be synergistic when used in conjunction, or may have a proportionally greater impact for certain topics than for others [17]. For example, traditional textbook or lecture-based learning paired with expanded-retrieval platforms like Anki may lead to greater studying efficiency and improved overall outcomes [26]. It is possible that the students who performed better on exams in our study also utilized specific resources differently than those with lower ES [27].
Medical students also report preferring resources that provide high-yield and easily accessible content [17]. Question banks have previously been reported to be the most frequently used resource by medical students. It has also been suggested that, given the increased usage of online question banks, medical schools should independently confirm the utility and validity of such resources with respect to their curricula when offering them to students [6]. The results of our study did show a modest improvement in exam scores with the use of the Kaplan™ QBank. This may be partly due to the question bank format interfacing well with the longitudinal progress test format of our school’s curriculum [28]. Future studies with larger sample sizes could determine whether question bank use has a truly significant impact on exam performance.
A number of factors may influence student resource choice, cost being one of the most important considerations [8, 17]. A recent study found that the mean cost of study materials and exam fees during undergraduate medical education totaled nearly $7,500 [29]. The prohibitive expenses of medical school attendance, coupled with growing student debt, highlight the importance of being selective with effective, often non-budget-friendly resources [29]. As the number of available resources has increased, medical students have reported feeling overwhelmed by the myriad of options. Our students reported utilizing the school-provided resources much more frequently than independently purchased resources, which is reflected in the individual resource utilization sample sizes.
The survey used in this study provides a versatile method for institutions to identify potential gaps in their curriculum and determine which resources may most benefit their specific student populations. The increasing utilization of online learning by medical students prompts an investigation of how well resources are being used [30]. Our findings have helped inform our institution’s approach to how future student cohorts may best prepare for preclinical examinations, including selecting an ideal number of resources for study and utilizing near-peer sessions. However, we recognize that medical curricula, student compositions, and faculty backgrounds differ across academic institutions. To this end, we provide the survey so that it may be customized to fit a specific institution’s needs. Because the survey is administered post-exam and before students have access to their scores, responses are not influenced by overall exam performance. Anonymizing survey results also allows students to provide candid responses regarding the resources used for study, including those developed for a school’s formal curriculum. Finally, the free response section offers an opportunity for students to suggest how faculty may improve future exams. The relative newness of our curriculum served as an opportunity to identify learning gaps early when preparing students for preclinical knowledge mastery. Evaluating which resources improve student performance may also help reduce the financial burden of the preclinical years by providing students with curriculum-specific validated resources, and it may allow faculty to determine which resources best supplement their structured educational programs.
This study has limitations. Because ES were released to students at varying times after exams, some students may have taken the survey after seeing their score, which could alter their perception of their own preparedness and introduce response bias. We limited this occurrence by making the survey immediately accessible post-exam in the same room in which testing was administered. The survey was not mandatory, so respondents who felt more confident in their performance may have been more likely to participate. Fewer students completed the survey with each successive exam, so the sample size progressively decreased over the course of the study. Our institution’s curriculum is designed around organ systems, so resource usage throughout the year likely changed depending on the subject. The questionnaire was modified as the year progressed to include new resources used by students, which may have limited the data set for certain resources. Furthermore, our survey included an “Other” category for resources not directly listed when the survey was conducted, which may lead to an over- or underrepresentation of a resource’s utility. Time spent with in-house learning support specialists (academic counselors, course directors) was a resource not considered in our initial study design; it may have affected overall confidence and academic performance and warrants future study.
Conclusions
Contrary to traditional norms widely reported in the literature, our findings suggest that medical students who feel more prepared for exams tend to outperform those who feel less prepared. Purchased resources and the total number of resources used do not appear to have a significant impact on how well students perform on exams. Students who watched lectures online and those who utilized peer-to-peer tutoring services scored higher on exams than students who did not utilize these tools. No improvement in exam score was seen for students who attended lectures in person. A post-exam survey evaluating specific resource utilization and exam performance may help medical school administrations select supplementary resources for preclinical education. These findings may allow medical schools to modify their curricula and methods of disseminating learning materials based on what is most accessible and feasible for each school. Evaluating resources against a school’s individual curriculum has the potential to reduce financial burden while reassuring students that they are using validated resources to prepare most effectively for their exams.
Availability of Data and Material
The datasets generated during and/or analyzed during the current study are available from the corresponding author on reasonable request.
References
Ruiz JG, Mintzer MJ, Leipzig RM. The impact of E-learning in medical education. Acad Med. 2006;81(3):207–12. https://doi.org/10.1097/00001888-200603000-00002. PMID: 16501260.
Scott K, Morris A, Marais B. Medical student use of digital learning resources. Clin Teach. 2018;15(1):29–33. https://doi.org/10.1111/tct.12630. Epub 2017 Mar 16. PMID: 28300343.
Laird-Fick HS, Solomon DJ, Parker CJ, Wang L. Attendance, engagement and performance in a medical school curriculum: early findings from competency-based progress testing in a new medical school curriculum. PeerJ. 2018;6:e5283. https://doi.org/10.7717/peerj.5283.
Egle JP, Smeenge DM, Kassem KM, Mittal VK. The internet school of medicine: use of electronic resources by medical trainees and the reliability of those resources. J Surg Educ. 2014;72(2):316–20. https://doi.org/10.1016/j.jsurg.2014.08.005.
Bell P, Volckmann D. Knowledge surveys in general chemistry: confidence, overconfidence, and performance. J Chem Educ. 2011;88(11):1469–76. https://doi.org/10.1021/ed100328c.
Wynter L, Burgess A, Kalman E, Heron JE, Bleasel J. Medical students: what educational resources are they using? BMC Med Educ. 2019;19(1):36. https://doi.org/10.1186/s12909-019-1462-9.
Volk AS, Rhudy AK, Marturano MN, Ott L, DuCoin C. Best study strategy for the NBME clinical science surgery exam. J Surg Educ. 2019;76(6):1539–45. https://doi.org/10.1016/j.jsurg.2019.05.012.
Choi-Lundberg DL, Low TF, Patman P, Turner P, Sinha SN. Medical student preferences for self-directed study resources in gross anatomy. Anat Sci Educ. 2016;9(2):150–60. https://doi.org/10.1002/ase.1549. Epub 2015 Jun 1. PMID: 26033851.
Egarter S, Mutschler A, Tekian A, Norcini J, Brass K. Medical assessment in the age of digitalisation. BMC Med Educ. 2020;20(1):101. https://doi.org/10.1186/s12909-020-02014-7. PMID: 32234051; PMCID: PMC7110637.
Ehrlinger J, Johnson KL, Banner M, Dunning DA, Kruger J. Why the unskilled are unaware: further explorations of (absent) self-insight among the incompetent. Organ Behav Hum Decis Process. 2008;105:98–121. https://doi.org/10.1016/j.obhdp.2007.05.002.
Dunlosky J, Serra MJ, Matvey G, Rawson KA. Second-order judgments about judgments of learning. J Gen Psychol. 2005;132:335–46. https://doi.org/10.3200/GENP.132.4.335-346.
Grimes PW. The overconfident principles of economics student: an examination of a metacognitive skill. J Econ Educ. 2002;33:15–30. https://doi.org/10.1080/00220480209596121.
Serra MJ, DeMarree KG. Unskilled and unaware in the classroom: college students’ desired grades predict their biased grade predictions. Mem Cogn. 2016;44:1127–37. https://doi.org/10.3758/s13421-016-0624-9.
Shanks LL, Serra MJ. Domain familiarity as a cue for judgments of learning. Psychon Bull Rev. 2014;21:445–53. https://doi.org/10.3758/s13423-013-0513-1.
Kötter T, Wagner J, Brüheim L, Voltmer E. Perceived Medical School stress of undergraduate medical students predicts academic performance: an observational study. BMC Med Educ. 2017;17(1):256. https://doi.org/10.1186/s12909-017-1091-0. PMID: 29246231; PMCID: PMC5732510.
Buehler R, Griffin D, Ross M. Inside the planning fallacy: the causes and consequences of optimistic time predictions. In: Gilovich T, Griffin D, Kahneman D, editors. Heuristics and biases: The psychology of intuitive judgment. Cambridge, UK: Cambridge University Press; 2002. p. 251–70.
Taylor JA, Shaw CM, Tan SA, Falcone JL. Are the kids alright? review books and the internet as the most common study resources for the general surgery clerkship. Am J Surg. 2017;215(1):191–5. https://doi.org/10.1016/j.amjsurg.2017.01.036.
Ikonne U, Campbell AM, Whelihan KE, Bay RC, Lewis JH. Exodus From the classroom: student perceptions, lecture capture technology, and the inception of on-demand preclinical medical education. J Am Osteopath Assoc. 2018;118(12):813–23. https://doi.org/10.7556/jaoa.2018.174. PMID: 30476993.
Kauffman CA, Derazin M, Asmar A, Kibble JD. Relationship between classroom attendance and examination performance in a second-year medical pathophysiology class. Adv Physiol Educ. 2018;42(4):593–598. https://doi.org/10.1152/advan.00123.2018.
Zazulia AR, Goldhoff P. Faculty and medical student attitudes about preclinical classroom attendance. Teach Learn Med. 2014;26(4):327–34. https://doi.org/10.1080/10401334.2014.945028. PMID: 25318026.
Tang B, Coret A, Qureshi A, Barron H, Ayala AP, Law M. Online lectures in undergraduate medical education: scoping review. JMIR Med Educ. 2018;4(1):e11. Published 2018 Apr 10. https://doi.org/10.2196/mededu.9091.
Doggrell SA. No apparent association between lecture attendance or accessing lecture recordings and academic outcomes in a medical laboratory science course. BMC Med Educ. 2020;20(1):207. https://doi.org/10.1186/s12909-020-02066-9. PMID: 32605579; PMCID: PMC7329538.
Burgess A, Dornan T, Clarke AJ, Menezes A, Mellis C. Peer tutoring in a medical school: perceptions of tutors and tutees. BMC Med Educ. 2016;16:85. https://doi.org/10.1186/s12909-016-0589-1. PMID: 26956642; PMCID: PMC4784332.
Menezes A, Burgess A, Clarke AJ, Mellis C. Peer-assisted learning in medical school: tutees’ perspective. Adv Med Educ Pract. 2016;7:31–8. https://doi.org/10.2147/AMEP.S94570. PMID: 26848282; PMCID: PMC4723028.
Akinla O, Hagan P, Atiomo W. A systematic review of the literature describing the outcomes of near-peer mentoring programs for first year medical students. BMC Med Educ. 2018;18(1):98. https://doi.org/10.1186/s12909-018-1195-1. Erratum in: BMC Med Educ. 2018 Jul 13;18(1):167. PMID: 29739376; PMCID: PMC5941612.
Pumilia CA, Lessans S, Harris D. An evidence-based guide for medical students: how to optimize the use of expanded-retrieval platforms. Cureus. 2020;12(9):e10372. https://doi.org/10.7759/cureus.10372. PMID: 33062495; PMCID: PMC7550004.
Jayakumar KL. Applying feedback lessons to online medical question banks. J Grad Med Educ. 2018;10(1):109. https://doi.org/10.4300/JGME-D-17-00621.1. PMID: 29467987; PMCID: PMC5821013.
Freeman A, Nicholls A, Ricketts C, Coombes L. Can we share questions? Performance of questions from different question banks in a single medical school. Med Teach. 2010;32(6):464–6. https://doi.org/10.3109/0142159X.2010.486056. PMID: 20515373.
Bhatnagar V, Diaz SR, Bucur PA. The cost of board examination and preparation: an overlooked factor in medical student debt. Cureus. 2019;11(3):e4168. https://doi.org/10.7759/cureus.4168.
Sheehy R. This is not your grandfather’s medical school: novel tools to enhance medical education. Mo Med. 2019;116(5):371–5.
Funding
No specific funding was received from any bodies in the public, commercial, or not-for-profit sectors to carry out the work described in this article.
Author information
Contributions
All authors contributed to the completion of this manuscript. Study and survey conceptualization was achieved by JB and AA; survey administration was performed by VR; data collection and analysis were performed by VR, AD, TM, RN. The final draft was written by JB and AA; all authors provided revisions on previous versions of the manuscript. All authors read and approved the final manuscript.
Ethics declarations
Ethics Approval
This study has been approved through the UNLV Institutional Review Board: [1030906–1] School of Medicine use of program evaluation data for research dated April 3, 2017.
Consent to Participate
Verbal consent was obtained from all individual participants included in the study. No identifying information about participants was included in the final manuscript.
Competing Interests
The authors declare no competing interests.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Bauzon, J., Alver, A., Ravikumar, V. et al. The Impact of Educational Resources and Perceived Preparedness on Medical Education Performance. Med.Sci.Educ. 31, 1319–1326 (2021). https://doi.org/10.1007/s40670-021-01306-x