Abstract
Physician-scientists are uniquely positioned to achieve significant biomedical advances to improve the human condition. Their clinical and scientific training allows them to bridge fields and contribute to cutting-edge, clinically relevant research. The need for a highly skilled physician-scientist workforce has never been more acute. We propose a competency-guided program design (CGPD) framework that focuses on core skills to enhance the physician-scientist training curriculum. In partnership with clinical and graduate curricula, the CGPD framework can be employed as a tool to meaningfully integrate physician-scientist training, address barriers to attract and sustain the physician-scientist workforce, and avoid overprogramming that detracts from a solid foundation of clinical and graduate research training.
Background
As clinical and research disciplines have become more specialized and sophisticated, the training of physician-scientists has evolved, better preparing trainees for increasingly diverse careers. The NIH Physician-Scientist Workforce Working Group Report released in 2014 and numerous subsequent publications have documented the remaining challenges facing physician-scientists [1,2,3]. These challenges include funding research programs, mentorship, and personal development and wellness.
Physician-scientist curricula are changing in response to documented challenges to attract and sustain the workforce as well as benchmarks established by competing programs, student-led initiatives, institutional requirements, Liaison Committee on Medical Education (LCME) clinical training requirements, training program evaluation, and the blending of updated clinical and graduate training activities. While these adaptations have led to numerous innovations, they can result in diminished curriculum integration, overprogramming, redundant or irrelevant training activities, a loss of training focus, and increased administrative burden. While innovations to physician-scientist training activities help optimize skill development, increase recruitment, and maximize retention, curricular design and refinement should be guided by systematic approaches based on overall program training objectives [2].
Competency-Based Education
Competency-based education (CBE) de-emphasizes time-based training and provides a learner the opportunity to advance at their own pace based on their ability to master particular skills [4, 5]. Competencies include explicit, measurable, transferable learning objectives that empower trainees to understand what is expected for mastery and advancement. In addition, an essential component of CBE is timely, personalized support for each student based on individual learning needs.
In 1999, the Accreditation Council for Graduate Medical Education (ACGME) and the American Board of Medical Specialties (ABMS) endorsed six core clinical competency domains: Patient Care, Medical Knowledge, Professionalism, Interpersonal and Communication Skills, Practice-Based Learning and Improvement, and Systems-Based Practice. The Vanderbilt University School of Medicine (VUSM) implemented the ACGME competency-based assessment in 2013 at the undergraduate medical education (UME) level, including evidence-based curricula [6, 7]. Competency-based education carefully aligns assessment with the acquisition of competencies expected of graduating physicians.
Graduate training programs have been slower than medical schools to implement CBE, but change is imminent [8, 9]. The NIH recently modified the requirements for graduate training programs to “identify training needs and objectives (i.e., specific, measurable outcomes the program intends to achieve), and develop plans to implement evidence-based training activities.” The new funding opportunity announcement (FOA) also challenges funded programs “to provide evidence of accomplishing the training objectives.” The initial step in accomplishing these overarching goals is to define these training objectives in terms of attained competencies.
Physician-scientists address critical medical care and human health issues by integrating clinical practice and research activities into their careers. Consequently, their training curriculum must comprise a strong core education in clinical medicine and intensive training in scientific inquiry. Based on our experience, we developed a Competency-Guided Program Design (CGPD) framework to align our curriculum with nine core competencies (Fig. 1) that prepare physician-scientists to translate clinically driven research activities into medical advances.
Physician-Scientist Competencies
The Vanderbilt University Medical Scientist Training Program (MSTP) integrates the clinical and graduate educational programs by providing a framework for trainees to develop and practice competencies deemed critical for successful physician-scientist careers. The map in Fig. 2 depicts our training program in which students complete the requirements to receive both the M.D. and Ph.D. degrees. MSTP students typically complete the Foundations of Medical Knowledge (medical school year 1, M1) and Foundations of Clinical Care (medical school year 2, M2) phases, engage in graduate studies (graduate years, G), and then return for a single remaining medical school year (M4), the Immersion Phase. Although each training phase has a primary core competency focus, there is intentional overlap embedded as the application of multiple competencies is uniquely viewed through the physician and scientist lenses and is implemented longitudinally throughout training (i.e., mentoring/teaching and career/personal development).
The nine core competency domains (CD, Fig. 1) were derived from desired competencies published for graduate and medical education [9, 10]. Importantly, trainees provide longitudinal feedback on competency domains and reinforce their use when planning programmatic activities, including student-led initiatives. Within each of these nine domains, we address multiple skill sets such as oral and written communication, translation of clinical questions to research, management of challenges and conflict resolution, teamwork and group dynamics, wellness, self-knowledge, the balance of clinical and research responsibilities, best mentoring practices, scientific promotion, advocacy, and policy. Although some of these domains may already be addressed in both medical (e.g., patient care) and graduate (e.g., communication skills) curricula, including these domains in the physician-scientist curricula reinforces their importance and allows for deeper integration of these skills across various training experiences and learning environments.
Advantages of Competency-Guided Program Design
There are several advantages to using a competency-guided program design (CGPD) approach. First, a CGPD approach facilitates the intentional configuration of learning activities to maximize trainee competency acquisition. CGPD provides a framework to deliberately map programming activities to specific competency domains to ensure alignment with desired training objectives (Fig. 3). Second, the CGPD approach facilitates quality improvement by exposing gaps in curriculum design that result in insufficient training in the desired competency domain. Conversely, it aids in identifying programming misaligned with the essential competencies necessary to train physician-scientists and allows those activities to be retired. CGPD provides a tool to avoid overprogramming, a natural outcome given the heavy emphasis on innovation and enhancement during program review processes. Third, mapping proposed new activities to specific competencies serves as a litmus test to determine whether new programming is advantageous and justified. Fourth, the CGPD framework provides valuable information to trainees as they seek to develop their physician-scientist careers. Fifth, CGPD can aid program evaluation, a mandatory component of NIH T32-supported training programs, by clearly articulating the desired program outcomes (short-, mid-, and long-term) aligned with the necessary resources (inputs) and educational activities. Finally, CGPD aids in identifying opportunities for greater integration between clinical and graduate training curricula, thus reinforcing and translating learned skills in different professional environments.
Overall, by addressing the issues listed above, the CGPD framework allows programs to directly tackle challenges and barriers associated with physician-scientist careers [1,2,3]. For example, programs can be intentional in addressing funding challenges by establishing a grantsmanship curriculum specifically designed to develop skills critical to securing research funding as trainees’ careers progress. Programs can also facilitate mentor development by implementing formalized mentoring programs, including “mentoring the mentor” training. Finally, CGPD can raise the importance of personal development and wellness skills that will facilitate work-life balance and minimize burnout.
The Vanderbilt University MSTP uses CGPD to fine-tune our curriculum. After defining the competency domains, we mapped all core curriculum and programmatic activities to each domain (Fig. 3) and then adjusted programming as indicated. We provide changes made to the Vanderbilt MSTP curriculum to illustrate the CGPD approach.
We applied CGPD to the weekly MSTP Seminar Series, the flagship course of our educational program, which extends throughout the medical and graduate phases of training, by mapping the series’ learning objectives to the core competencies for physician-scientists (Fig. 3). The series is a weekly, student-led, literature-based course consisting of journal club, research-in-progress, and clinically based presentations. The objectives of the series are to (a) foster development of critical-thinking skills through appraisal of contemporary scientific literature; (b) enhance scientific creativity through discussion of experimental approaches and techniques; (c) develop oral communication skills; (d) afford weekly opportunities for interaction between MSTP trainees and the Vanderbilt physician-scientist community; and (e) provide practical leadership and peer-peer mentoring experiences through organizing and running the series. Mapping these learning objectives to the core competency domains facilitated refining the series format to ensure programming achieved the series objectives. For example, first-year MSTP student journal club talks were modified to strengthen mentor–mentee relationships (mentoring/teaching), skills in appraising a scientific paper (discipline-specific foundational knowledge), and presentation skills (communication skills).
Our competency-based curriculum map identified a gap in our Career and Personal Development domain, which led to deliberate efforts to implement wellness and women-in-training programming to address the mental health and gender equity needs specific to the MSTP community and physician-scientist training. To close the identified gap, we developed faculty-guided, student-led committees tasked with enhancing the wellness of the student body through programming that decreases mental health stigma, provides resources, educates trainees about the challenges facing physician-scientists, and provides social interactions to foster near-peer and peer-peer mentoring relationships. These highly rated programs are now in their third year.
The CGPD framework with curriculum mapping has also identified overprogramming. For example, with the implementation of the MSTP Mock Study Section (a student-led, faculty-moderated fellowship review committee modeled after NIH review panels) and increasing grantsmanship training through formal coursework and seminars in graduate school, we retired our annual MSTP Grant Writing Workshop.
Finally, CGPD can be a guiding mechanism in evaluating training programs. The NIH MSTP T32 funding mechanism now requires formal program evaluation using evidence-based approaches. The NIH defines program evaluation as a “systematic assessment of the operations and/or outcomes of a program, compared to a set of explicit or implicit standards, as a means of contributing to the improvement of the program” [12, 13]. The NIH also requires every funded M.D.-Ph.D. training program to “describe an evaluation or assessment process to determine whether the overall program is effective in meeting its training mission and objectives” [14]. Formal program evaluation models offer numerous benefits by (1) providing a clear definition of the program’s scope; (2) improving planning, management, and quality improvement; and (3) augmenting intentionality and purpose [15, 16]. The CGPD offers a framework to define an evaluation model based on program components (e.g., resources, activities, outputs) and desired outcomes mapped to short-, intermediate-, and long-term competencies [17].
The CGPD is a conceptual overlay for any program that trains physician-scientists, and its implementation does not require significant infrastructure. Implementation typically consists of incorporating the framework into existing programmatic structures and workflows as curricular changes are considered. Overall, implementing CGPD optimizes and streamlines program development, thus focusing activities on maximizing the physician-scientist training experience.
Future Directions
Refining Competencies
The competency domains designed to guide medical student training, first adopted in 1999 by the ACGME, focus on the physician’s role and the ability to provide high-quality patient care [11]. Since their inception, the ACGME core competencies have continued to evolve, and other frameworks have been developed, such as Entrustable Professional Activities (EPAs) [18]. The proposed nine core competencies for physician-scientists (Fig. 1) should also evolve to ensure that they continue to include all of the essential skills for success in the physician-scientist career path. Therefore, we believe that a robust national conversation regarding physician-scientists’ core competencies, rooted in desired outcomes, should commence, balancing the efficiency of training with skill development throughout the continuum of physician-scientist education (from trainees to faculty).
Assessment
As mentioned above, defining and adopting core competencies is an initial step in CGPD. To assess trainee competency attainment, the development of validated milestone-based instruments is an essential next step in the competency-based education of physician-scientists. Milestones are developmental behavioral descriptors of the learning trajectory within the core competencies that align expectations between the learner and the faculty. Currently, milestones are utilized primarily in graduate medical education (GME) to assess the development of residents and, to a lesser extent, in undergraduate medical education (UME) [7, 19, 20]. Biomedical graduate training programs lack validated milestone-based assessment methods and have only begun to explore the milestone framework [9]. CGPD provides an opportunity for graduate doctoral programs to respond proactively to the NIH’s call for the development and implementation of “effective, evidence-based approaches to biomedical graduate training” [19]. While the individual development plan (IDP) is one tool that has been successfully implemented to help support, plan, and track trainees’ career development and learning opportunities and to facilitate communication between mentees and mentors [21], there is an opportunity to complement the IDP with milestone-based assessment. However, these methods differ in their primary approach. While the IDP is primarily used by mentees and mentors to plan and communicate progress, milestone-based assessments require multiple observers and learning settings (e.g., classroom, laboratory, department). In addition, they are designed to monitor and improve educational outcomes at the individual learner level by modifying the training program structure. Thus, the natural companion to CGPD is building in assessment tools to track trainee progress towards competency attainment to a level beyond the stage of proficiency.
Conclusions
Why should you consider the CGPD approach described here to refine your program design? We have observed that the CGPD framework facilitates optimization of physician-scientist education anchored on core competencies that build upon a solid integrated foundation of clinical and graduate research training. It does this by (1) ensuring that activities align with desired outcomes, (2) exposing gaps in training, (3) avoiding overprogramming, (4) informing new programming, (5) guiding program evaluation, and (6) identifying opportunities for greater integration. Notably, the trainees also benefit from CGPD because the program’s expectations (and career outcomes) are clearly articulated, thereby allowing trainees to assess their progress and design personalized learning goals. Although this manuscript focuses on the application of CGPD to dual-degree (M.D./Ph.D.) programs, we posit that it can be applied to other graduate programs, including master’s-level degree programs (e.g., master’s in clinical investigation, master’s in biomedical informatics, master’s in public health) and late-bloomer physician-scientist programs (e.g., research in residency).
To conclude, we hope that the CGPD provides a springboard for critical national discussions on the core competencies that all trainees need to attain to succeed in their future careers as physician-scientists, on effective assessment strategies to track trainees’ progress, and on evaluation models that lead to intentional and impactful quality improvement efforts.
References
Salata RA, Geraci MW, Rockey DC, et al. U.S. physician-scientist workforce in the 21st century: recommendations to attract and sustain the pipeline. Acad Med. 2018;93(4):565–73. https://doi.org/10.1097/ACM.0000000000001950.
Williams CS, Iness AN, Baron RM, et al. Training the physician-scientist: views from program directors and aspiring young investigators. JCI Insight. 2018;3(23). https://doi.org/10.1172/jci.insight.125651.
Feldman AM. The National Institutes of Health Physician-Scientist Workforce Working Group report: a roadmap for preserving the physician-scientist. Clin Transl Sci. 2014;7(4):289–90. https://doi.org/10.1111/cts.12209.
Frank JR, Mungroo R, Ahmad Y, Wang M, De Rossi S, Horsley T. Toward a definition of competency-based education in medicine: a systematic review of published definitions. Med Teach. 2010;32(8):631–7. https://doi.org/10.3109/0142159X.2010.500898.
Patrick S, Sturgis C. Necessary for success: building mastery of world-class skills. A state policymakers’ guide to competency education. Accessed August 5, 2020. http://www.aurora-institute.org/wp-content/uploads/necessary-for-success.pdf
Lomis KD, Carpenter RO, Miller BM. Moral distress in the third year of medical school: a descriptive review of student case reflections. Am J Surg. 2009;197(1):107–12. https://doi.org/10.1016/j.amjsurg.2008.07.048.
Pettepher CC, Lomis KD, Osheroff N. From theory to practice: utilizing competency-based milestones to assess professional growth and development in the Foundational Science Blocks of a Pre-Clerkship Medical School Curriculum. Med Sci Educ. 2016;26(3):491–7. https://doi.org/10.1007/s40670-016-0262-7.
Barnett JV, Harris RA, Mulvany MJ. A comparison of best practices for doctoral training in Europe and North America. FEBS Open Bio. 2017;7(10):1444–52. https://doi.org/10.1002/2211-5463.12305.
Verderame MF, Freedman VH, Kozlowski LM, McCormack WT. Competency-based assessment for the training of PhD students and early-career scientists. Elife. 2018;7. https://doi.org/10.7554/eLife.34801.
Guyadeen D, Seasons M. Evaluation theory and practice: comparing program evaluation and evaluation in planning. J Plan Educ Res. 2018;38(1):98–110. https://doi.org/10.1177/0739456x16675930.
ACGME. Accreditation Council for Graduate Medical Education (ACGME) Outcome Project. 2003.
Weiss CH. Evaluation: methods for studying programs and policies. 2nd ed. Prentice Hall; 1998.
Mertens DM, Wilson AT. Program evaluation theory and practice. 2nd ed. Guilford Publications; 2018.
Porteous N, Sheldrick B, Stewart P. Introducing program teams to logic models: facilitating the learning process. Canadian J Prog Eval. 2002;17(3):113–41.
Wholey JS, Hatry HP, Newcomer KE. Handbook of practical program evaluation. Wiley; 2010.
ten Cate O. Entrustability of professional activities and competency-based training. Med Educ. 2005;39:1176–7.
Lomis KD, Russell RG, Davidson MA, et al. Competency milestones for medical students: design, implementation, and analysis at one medical school. Med Teach. 2017;39(5):494–504. https://doi.org/10.1080/0142159x.2017.1299924.
Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system–rationale and benefits. N Engl J Med. 2012;366(11):1051–6. https://doi.org/10.1056/NEJMsr1200117.
PAR-19-036: Medical Scientist Training Program (T32). Accessed August 4, 2020. https://grants.nih.gov/grants/guide/pa-files/par-19-036.html
Individual Development Plan for Postdoctoral Fellows. FASEB. Accessed August 1, 2020. http://www.faseb.org/portals/0/pdfs/opa/idp.pdf
Bills JL, Davidson M, Dermody TS. Effectiveness of a clinical intervention for MD/PhD students re-entering medical school. Teach Learn Med. 2013;25(1):77–83. https://doi.org/10.1080/10401334.2012.741539.
Acknowledgements
We thank the Vanderbilt MSTP Leadership Team (Danny Winder, Ph.D.; Ambra Pozzi, Ph.D.; Sally York, M.D., Ph.D.; Bryn Sierra, M.S., M.Ed.; and Melissa Krasnove, M.Ed.) for their group efforts to define core competencies for physician-scientists. We also thank Steven J. Weissenburger, M.Ed., for his expertise in facilitating the team’s discussion to define these competencies. In addition, we thank Dean Jeffrey Balser for the vision, leadership, and support of the Vanderbilt MSTP. Finally, we also thank our students, who actively engage in improving physician-scientist training at Vanderbilt and nationally.
Funding
This work was supported by the NIGMS of the National Institutes of Health under award number T32GM007347.
Ethics declarations
Ethics Approval
Not applicable.
Consent to Participate
Not applicable.
Conflict of Interest
The authors declare no competing interests.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Estrada, L., Williams, M.A. & Williams, C.S. A Competency-Guided Approach to Optimizing a Physician-Scientist Curriculum. Med.Sci.Educ. 32, 523–528 (2022). https://doi.org/10.1007/s40670-022-01525-w