Journal of General Internal Medicine, Volume 23, Issue 7, pp 969–972

Impact of a 360-degree Professionalism Assessment on Faculty Comfort and Skills in Feedback Delivery


  • R. Stark
    • Department of Medicine, Montefiore Medical Center/Albert Einstein College of Medicine
  • Deborah Korenstein
    • Division of General Internal Medicine, Mount Sinai School of Medicine
  • Reena Karani
    • Brookdale Department of Geriatrics and Adult Development, Mount Sinai School of Medicine
Brief Report

DOI: 10.1007/s11606-008-0586-0

Cite this article as:
Stark, R., Korenstein, D. & Karani, R. J GEN INTERN MED (2008) 23: 969. doi:10.1007/s11606-008-0586-0



Background

Professionalism is an identified competency of resident education. The best approaches to teaching and evaluating professionalism are unknown, but feedback about professionalism is necessary to change practice and behavior. Faculty discomfort with professionalism may limit their delivery of feedback to residents.


Objective

A pilot program to implement a 360-degree evaluation of observable professional behaviors and to determine how its use impacts faculty feedback to residents.


Design

Internal Medicine (IM) residents were evaluated during ambulatory rotations using a 360-degree assessment of professional behaviors developed by the National Board of Medical Examiners®. Faculty used evaluation results to provide individual feedback to residents.


Participants

Fifteen faculty members.

Measurements and Main Results

Faculty completed pre- and post-intervention surveys. Using a 7-point Likert scale, faculty reported increased skill in giving general feedback (4.85 vs 4.36, p < .05) and feedback about professionalism (4.71 vs 3.57, p < .01) after the implementation of the 360-degree evaluation. They reported increased comfort giving feedback about professionalism (5.07 vs 4.35, p < .05) but not about giving feedback in general (5.43 vs 5.50).


Conclusions

A 360-degree professionalism evaluation instrument used to guide feedback to residents improves faculty comfort and self-assessed skill in giving feedback about professionalism.


Key Words: professionalism, feedback, 360-degree evaluation, internship, residency


Introduction

Since publication of the Physician Charter on Medical Professionalism,1 and the inclusion of professionalism among the six core competencies of the Accreditation Council for Graduate Medical Education (ACGME) Outcomes Project,2 educators have focused on incorporating the training and assessment of professionalism into residency education.

Previously, professionalism was viewed as an innate quality that could not be taught. The medical education community now recognizes the importance of formal education about professionalism throughout the continuum of medical education3–6; yet, the best approaches to teaching professionalism remain uncertain. Simple didactic sessions may not be effective teaching modalities for advanced learners and may not be the optimal approach to teaching a concept that requires internalization through authentic experiences.7

Informal and formal discussions between faculty and learners about professional behaviors and values are largely absent when faculty are observed during their usual routines.8 Despite excellent guidelines for giving effective feedback about learners’ general performance,9–11 it is likely that faculty rarely engage learners in formal feedback about professionalism. Such conversations may be an important first step in enhancing professionalism in the medical training environment.

In this pilot study, we set out to determine if an educational intervention involving recognition of professional behaviors and subsequent feedback about these behaviors is a feasible modality to emphasize professionalism in our Internal Medicine (IM) residency program. We utilized an instrument under development by the National Board of Medical Examiners® (NBME), which employs a 360-degree evaluation of the observable professional behaviors of trainees. Residents in our program were evaluated by faculty and staff using the instrument.

We used pre- and post-intervention surveys to determine if implementation of this instrument in the outpatient setting impacts the experience of faculty who deliver feedback to trainees during ambulatory rotations. Specifically, we hypothesized that sustained use of the professionalism evaluation instrument would improve faculty comfort delivering feedback and discussing professional behaviors with trainees.



Methods

Participants

Clinician–educator faculty in the Division of General Internal Medicine at Mount Sinai Medical Center were the subjects (n = 15).

Description of the Educational Intervention

Behaviors of Professionalism Evaluation Instrument

The professionalism evaluation form was provided by the NBME as part of the Assessment of Professional Behaviors Field Trial.12 The instrument is a 360-degree assessment through which multiple observers evaluate an individual on identical items. It includes 21 observable behaviors, 6 relational items, 3 global performance ratings, and free-text comments. Data collected from all evaluators were compiled into an “Individual Summary Report.”

Site and Timing of the Professionalism Assessments

Evaluations and feedback occurred in the IM continuity clinic during residents’ 4-week ambulatory rotation. Every IM resident was evaluated by all faculty and staff members (receptionists, nurses, medical assistants) with whom they work regularly. During the third week of the rotation, evaluators received an e-mail with a web link to appropriate evaluations.

Project Implementation

In June 2006, the project was introduced to the faculty and they were instructed on the use of the online form. Expectations were outlined, including completion of one online professionalism evaluation for each resident during the third week of his or her outpatient rotation, and delivery of feedback to assigned residents during the fourth week of the rotation using the “Individual Summary Report.”

The project was addressed separately with receptionists, nurses, and medical assistants. Staff were instructed in the use of the online evaluation form, and expectations for their participation were clarified.


Upon entering residency, each trainee is assigned an outpatient advisor. Individual faculty serve as advisors to 4–8 residents during their training. These assignments were made prior to initiating this study.

During week 4 of a resident’s ambulatory rotation, a paper copy of his or her “Individual Summary Report” was given to the appropriate faculty advisor. The report was accompanied by a letter instructing faculty to use the report to guide formative feedback. No other specific instructions were given.


Faculty completed pre- and post-intervention surveys. Surveys used a Likert scale to determine frequency with which feedback is delivered to trainees, importance of feedback, and level of comfort/skill providing feedback (Appendix). Faculty completed pre-intervention surveys in June 2006, prior to any introduction to the project. They completed post-intervention surveys 6 months into the project. This study was reviewed by the Mount Sinai School of Medicine Institutional Review Board and received exemption.

Data Analysis

SPSS software (version 14.0; SPSS, Chicago, IL, USA) was used for statistical analyses. The Wilcoxon signed ranks test was used to identify differences in faculty responses on pre- and post-intervention surveys.
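The Wilcoxon signed-rank test compares paired pre- and post-intervention ratings without assuming normality, which suits ordinal Likert data. The sketch below is a minimal pure-Python version of the statistic using hypothetical 7-point ratings (the study's per-faculty responses are not reproduced here); the actual analysis was run in SPSS.

```python
# Minimal sketch of the Wilcoxon signed-rank statistic for paired
# Likert ratings. All data below are hypothetical, for illustration only.

def wilcoxon_signed_rank(pre, post):
    """Return W = min(sum of positive ranks, sum of negative ranks)."""
    # Paired differences; zero differences are discarded (Wilcoxon's method)
    diffs = [b - a for a, b in zip(pre, post) if b != a]
    # Rank |differences| from smallest to largest, averaging tied ranks
    abs_sorted = sorted(abs(d) for d in diffs)

    def avg_rank(v):
        idxs = [i + 1 for i, x in enumerate(abs_sorted) if x == v]
        return sum(idxs) / len(idxs)

    w_pos = sum(avg_rank(abs(d)) for d in diffs if d > 0)
    w_neg = sum(avg_rank(abs(d)) for d in diffs if d < 0)
    return min(w_pos, w_neg)

# Hypothetical ratings from 15 faculty on one survey item (1-7 scale)
pre  = [4, 3, 5, 4, 3, 4, 5, 3, 4, 4, 3, 5, 4, 3, 4]
post = [5, 4, 6, 5, 4, 5, 5, 4, 5, 6, 4, 6, 5, 4, 5]
print(wilcoxon_signed_rank(pre, post))  # a small W indicates a consistent shift
```

In practice one would use `scipy.stats.wilcoxon`, which also returns a p value; the hand-rolled function above only shows how the statistic itself is formed.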


Results

Faculty and staff completed most of their assigned monthly evaluations (74%). Compliance rates by job title are listed in Table 1. In the first 6 months of implementation, 1,264 evaluations were generated; 749 were completed, and 256 were suspended, indicating that the evaluator had too little contact with the resident to complete the evaluation. The remainder were left incomplete.
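The reported 74% completion rate is consistent with excluding suspended evaluations from the denominator, an assumption (not stated explicitly in the text) that the quick check below makes explicit:

```python
# Consistency check on the reported evaluation counts, assuming that
# suspended evaluations are excluded from the completion-rate denominator.
generated = 1264
completed = 749
suspended = 256

eligible = generated - suspended            # evaluations that could be completed
rate = completed / eligible * 100
print(f"{rate:.0f}% of eligible evaluations completed")  # ~74%

incomplete = generated - completed - suspended
print(f"{incomplete} evaluations left incomplete")
```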
Table 1

Compliance Rates with Professionalism Evaluations at 6 Months into Implementation

Job title                  Compliance (%)
Medical assistant
Physician/faculty member

All faculty members (n = 15) responded to pre- and post-intervention surveys. Following use of the professionalism evaluation form, faculty reported more skill in providing general feedback (p < .05) and feedback specifically about professionalism (p < .01) compared with pre-intervention responses (Table 2). Faculty also reported more comfort giving feedback about professionalism (p < .05), but no change in comfort giving general feedback to trainees. Post-intervention surveys identified a nonsignificant trend toward reporting increased frequency of feedback delivery about professional behaviors (p = .086). There was no change in faculty responses to non-professionalism-related items, such as the importance of feedback about medical knowledge or interpersonal communication skills or comfort in providing feedback in these areas.
Table 2

Results of Faculty Pre- and Post-Intervention Surveys

Survey item                                                                  Mean Pre*  Mean Post  p value
Self-reported skill in giving general feedback to interns/residents             4.36       4.85     <.05
Self-reported skill in feedback about professionalism to interns/residents      3.57       4.71     <.01
Comfort giving general feedback                                                 5.50       5.43      NS
Comfort giving feedback about
 Medical knowledge                                                                                   NS
 Interpersonal communication skills                                                                  NS
 Professionalism                                                                4.35       5.07     <.05
Frequency giving general feedback
Frequency giving feedback about
 Medical knowledge
 Interpersonal communication skills
 Professionalism                                                                                    .086
Importance of feedback about
 Medical knowledge                                                                                   NS
 Interpersonal communication skills                                                                  NS
 Professionalism
NS = not significant when p > .10

*Means are based on data from a 7-point Likert scale, with higher scores being more favorable in each category

Data analyzed using the Wilcoxon signed ranks test


Discussion

In this pilot study, we implemented a 360-degree evaluation of the professional behaviors of IM residents in our outpatient practice. Our goals were twofold: to determine the feasibility of use of the instrument in our training program and to determine if using the instrument to both identify professional behaviors and guide feedback would impact faculty skills and attitudes about feedback on professionalism.

First, we were able to introduce and implement the 360-degree evaluation with little resistance from faculty or staff and obtain high evaluation completion rates. Receptionists had the lowest response rate (48%), with all other groups having a high response rate. The instrument is most valuable if it provides information from multiple evaluators, and we have shown that this is possible with little effort, even in our practice with over 100 trainees.

While the best approaches to teaching and evaluating professionalism remain unclear, it has been noted that 360-degree assessments of professionalism are useful for the evaluation of residents.13 Other groups have demonstrated the feasibility of implementing 360-degree evaluations in residency programs for the assessment of a variety of ACGME competencies,14–17 and our findings reinforce the tool’s feasibility in a training environment. To our knowledge, this study is the first to demonstrate that information from 360-degree evaluations can impact feedback delivery in a learning environment.

Using the professionalism evaluation instrument to guide feedback to IM residents positively impacted our faculty’s experience giving feedback. Specifically, utilization of the NBME’s Assessment of Professional Behaviors evaluation and summary report increased faculty’s self-reported skill and comfort in giving feedback about professionalism.

Faculty who participated in the study were exposed to the instrument several times a month for six consecutive months and gave multiple monthly feedback sessions to residents. It is likely that repetitive exposure to the evaluation and feedback process explains the positive impact on faculty, even in the absence of targeted faculty training.

Improving faculty comfort and skill in providing feedback about professionalism is an imperative first step in increasing conversations with trainees about professional behaviors. This is also a necessary step in overcoming the many barriers that prevent faculty from providing trainees with feedback in general, and specifically about professionalism.

Measuring and enhancing the professionalism of physicians-in-training is important for several reasons. It has been demonstrated that unprofessional behavior in medical school, particularly irresponsibility and poor adaptability, predicts subsequent disciplinary action by state medical boards.18,19 Identifying unprofessional behaviors during training may allow for remediation and future monitoring. Further, the ACGME requires residency programs to implement curricula on professionalism and demonstrate that their graduates have professionalism “proficiency.”2

Our study has several important limitations. First, our intervention took place at a single academic center and generalizability to other settings is unknown. Second, our sample size was small and we lacked statistical power to detect small differences in attitudes. Despite these limitations, we had an excellent response rate to our survey and we were able to demonstrate a significant effect of our intervention.

Perhaps the greatest limitation to our study is the lack of a control group, which was omitted for logistical reasons. The number of core faculty who provide resident feedback was too small to divide further into control and intervention groups without severely limiting our statistical power to determine differences between groups. With our encouraging preliminary results, designing a follow-up study to include a control group is an anticipated next step.

This pilot program is a first step in evaluating and improving the professionalism of IM residents in our program and in developing an expectation that faculty give feedback about professional behaviors. Moving forward, we will evaluate the impact of professionalism assessments on resident attitudes. We hope that ongoing measurement of professional behaviors will continue to increase discussions about professionalism and positively impact faculty and resident attitudes.


Acknowledgments

The authors thank the NBME® for use of the Assessment of Professional Behaviors instrument and Dr. Stephen Clyman, Margaret Farrell, and Dr. Matthew Holtman for their assistance in the planning, implementation, and design of this project. We are grateful for the assistance of the staff and faculty at Internal Medicine Associates and the residents in the Internal Medicine Residency Program at Mount Sinai Medical Center.

Conflicts of interest

None disclosed.


Dr. Stark was supported by grants from the Empire Clinical Research Investigator Program, New York State Department of Health and U.S. Department of Health and Human Services, Health Resources and Services Administration CFDA 93–895.

Copyright information

© Society of General Internal Medicine 2008