Journal of General Internal Medicine, Volume 31, Issue 7, pp 755–761

Communication Skills Training for Physicians Improves Patient Satisfaction

  • Adrienne Boissy
  • Amy K. Windover
  • Dan Bokar
  • Matthew Karafa
  • Katie Neuendorf
  • Richard M. Frankel
  • James Merlino
  • Michael B. Rothberg
Original Research

ABSTRACT

BACKGROUND

Skilled physician communication is a key component of patient experience. Large-scale studies of exposure to communication skills training and its impact on patient satisfaction have not been conducted.

OBJECTIVE

We aimed to examine the impact of experiential relationship-centered physician communication skills training on patient satisfaction and physician experience.

DESIGN

This was an observational study.

SETTING

The study was conducted at a large, multispecialty academic medical center.

PARTICIPANTS

Participants included 1537 attending physicians who participated in, and 1951 physicians who did not participate in, communication skills training between 1 August 2013 and 30 April 2014.

INTERVENTION

An 8-h block of interactive didactics, live or video skill demonstrations, and small group and large group skills practice sessions using a relationship-centered model.

MAIN MEASURES

Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS), Clinician and Group Consumer Assessment of Healthcare Providers and Systems (CGCAHPS), Jefferson Scale of Empathy (JSE), Maslach Burnout Inventory (MBI), self-efficacy, and post-course satisfaction.

KEY RESULTS

Following the course, adjusted overall CGCAHPS scores for physician communication were higher for intervention physicians than for controls (92.09 vs. 91.09, p = 0.03). No significant interactions were noted between physician specialty or baseline CGCAHPS scores and improvement following the course. Significant improvement in the post-course HCAHPS Respect domain adjusted mean was seen in the intervention versus control group (91.08 vs. 88.79, p = 0.02), and a smaller, non-statistically significant improvement was seen in adjusted overall HCAHPS communication scores (83.95 vs. 82.73, p = 0.22). Physicians reported high course satisfaction and showed significant improvement in empathy (116.4 ± 12.7 vs. 124.1 ± 11.9, p < 0.001) and in all three burnout measures (emotional exhaustion, depersonalization, and personal accomplishment). Less depersonalization and greater personal accomplishment were sustained for at least 3 months.

CONCLUSIONS

System-wide relationship-centered communication skills training improved patient satisfaction scores, physician empathy, and self-efficacy, and reduced physician burnout. Further research is necessary to examine the longer-term sustainability of such interventions.

KEY WORDS

communication; patient experience; patient satisfaction; CGCAHPS; HCAHPS; burnout; empathy; physician

BACKGROUND

There is growing evidence that patient experience impacts clinical health outcomes and, in turn, how organizations deliver care. This is driven in part by the Centers for Medicare & Medicaid Services (CMS) requirement to publicly report patient experience scores (Hospital Consumer Assessment of Healthcare Providers and Systems; HCAHPS) as a condition for full reimbursement of hospital services.1 HCAHPS scores, which measure inpatient care, are tied to the discharging physician. Because the average inpatient sees at least 3.6 physicians during a hospital stay,2 and patients are frequently unaware of different physicians’ roles in their care,3 HCAHPS may be a poor measure of experience and satisfaction with a specific physician. In contrast, the outpatient experience survey (Clinician and Group Consumer Assessment of Healthcare Providers and Systems; CGCAHPS) is provider-specific, may soon be required, and may provide more reliable data on which to base improvement efforts.4 Because experience scores are an important element of determining reimbursement under value-based purchasing, hospitals and physicians have a strong incentive to improve HCAHPS and CGCAHPS scores. However, there are few proven methods for doing so.

Physician communication constitutes one important element of both HCAHPS and CGCAHPS scores, and it is the only metric that directly relates to the care provided by physicians. Multiple studies have shown that communication skills can be improved with effective training, and that effective communication improves medical outcomes, safety, patient adherence, patient satisfaction, and provider satisfaction and efficiency.5, 6, 7, 8, 9, 10, 11, 12, 13, 14 Organization-wide communication skills improvement programs are rare and reports are limited to case studies.15 We designed and implemented an experiential, relationship-centered communication skills course and measured its impact on patient satisfaction, physician empathy, burnout, and self-efficacy in a large, multispecialty academic medical center.

METHODS

Participants

We included all employed, attending physicians at the Cleveland Clinic who were mandated to attend an 8-h, internally offered, experiential communication skills training, between 1 August 2013 and 30 April 2014, during their regular work hours. The Cleveland Clinic is a nonprofit multispecialty academic medical center that employs approximately 3,220 physicians and scientists and 1,793 residents and fellows in training. Physicians who had participated in an earlier version of the 8-h experiential training (1 August 2010 to 31 July 2013) and those who had not yet taken the course were included as controls. Physicians were excluded if they did not have direct patient contact (e.g., pathologists), were residents or fellows, or if they did not have at least five pre- and five post-HCAHPS scores (for the HCAHPS analysis), or five pre- and five post-CGCAHPS scores (for the CGCAHPS analysis) (Fig. 1).
Figure 1.

Eligible study and control populations. *Maslach Burnout Inventory (MBI), Jefferson Scale of Empathy (JSE), Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS), Clinician and Group Consumer Assessment of Healthcare Providers and Systems (CGCAHPS).

Data were entered into a registry approved by the Cleveland Clinic Institutional Review Board. All participants had the option to exclude their data from the registry. The study, which used existing data, was deemed exempt.

Training Intervention

In 2013, the Cleveland Clinic Center for Excellence in Healthcare Communication developed an 8-h experiential communication skills training called R.E.D.E. (pronounced “ready”) to Communicate: Foundations of Healthcare Communication (FHC)SM. FHC is based on the R.E.D.E.SM model, a conceptual framework for teaching and evaluating relationship-centered healthcare communication (eFigure 1) that emphasizes genuine relationship as a vital therapeutic agent. The R.E.D.E.SM model applies empirically validated communication skills to three phases of Relationship: Establishment, Development, and Engagement.16 FHC (eFigure 2) is a CME-accredited program that focuses on experiential skills practice, the elements of which have been shown to be effective in improving physician communication.17,18 Each course was co-facilitated by two practicing clinicians trained in relationship-centered communication, adult learning theory, performance assessment, and group facilitation. Each group contained no more than 12 participants who proceeded through a series of interactive didactic presentations, live or video-based skill demonstrations, and small group skills practice sessions aligned with the three phases of the model. A lengthier skills practice integrating all three R.E.D.E.SM phases followed, and was based on communication challenges experienced in participants’ clinical practices.

Measures

Physicians were asked to complete pre- and post-course surveys on the day of training and at 3 months post-course. The surveys included demographic information, self-assessment of communication skills, knowledge and attitudes, the Jefferson Scale of Empathy (JSE), the Maslach Burnout Inventory (MBI), and post-course satisfaction. The JSE and MBI were used with permission.

For each participant, we also extracted physician information from a database maintained by our office of professional staff affairs that included gender, race/ethnicity, years in practice, specialty/subspecialty, and setting.

We conducted separate analyses for HCAHPS scores and CGCAHPS scores. Physicians who had both types of scores were included in both analyses. For each analysis, we controlled for secular changes associated with other patient experience initiatives by creating a comparison group of physicians who did not take the course during the specified time period. Each control physician was matched by assigning a pseudo course date to correspond with an intervention physician’s course date. We then collected HCAHPS and CGCAHPS scores for physician communication for 6 months before and 6 months after the course date (or corresponding pseudo course date) using their National Provider Identifier (NPI) numbers.
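To make the windowing concrete, the sketch below shows one way to attach a course (or pseudo course) date to each physician's surveys, keep only surveys within 6 months on either side of that date, and apply the five-survey eligibility rule. This is an illustrative Python/pandas sketch, not the SAS workflow used in the actual analysis, and the column names (npi, course_date, survey_date, top_box) are hypothetical.

```python
import numpy as np
import pandas as pd

def pre_post_top_box(surveys: pd.DataFrame, course_dates: pd.DataFrame) -> pd.DataFrame:
    """Per-physician top-box percentages for the 6 months before and after
    each physician's course (or pseudo course) date."""
    df = surveys.merge(course_dates, on="npi")
    days = (pd.to_datetime(df["survey_date"]) - pd.to_datetime(df["course_date"])).dt.days

    # Label each survey as pre (within 6 months before) or post (within 6 months after)
    df["window"] = np.select(
        [days.between(-183, -1), days.between(0, 183)], ["pre", "post"], default="")
    df = df[df["window"] != ""]

    # Percent of "top box" responses per physician and window
    pct = df.groupby(["npi", "window"])["top_box"].mean().mul(100).unstack()

    # Mirror the eligibility rule: at least five surveys before and five after the date
    counts = df.groupby(["npi", "window"]).size().unstack()
    return pct[(counts["pre"] >= 5) & (counts["post"] >= 5)]
```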

STATISTICAL ANALYSIS

To assess the impact of our intervention, we compared CGCAHPS scores from the 6 months before and after the assigned date for the intervention and control groups. Differences in pre- and post-course scores were adjusted for baseline differences in gender, race, years in practice, and baseline scores.

We then performed two subgroup analyses to examine whether the impact of the course varied by specialty or initial CGCAHPS scores. We compared the adjusted mean values at 6 months for participants and controls and examined the interaction between intervention and baseline CGCAHPS tertile. A significant interaction would indicate a difference between baseline tertile groups as to the effect the intervention had on their adjusted 6-month CGCAHPS score. We performed a separate, similar analysis for specialty category.

To assess the impact of our intervention on hospitalized patient satisfaction, we performed the same analyses for HCAHPS.
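As an illustration of the adjusted comparison and interaction test described above, the sketch below fits an ANCOVA-style model and a nested interaction model with statsmodels on synthetic data. The authors performed their analysis in SAS, so the variable names, data, and output here are purely hypothetical and only demonstrate the general form of such an analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(0)
n = 900
df = pd.DataFrame({
    "intervention": rng.integers(0, 2, n),          # took the course (1) vs. control (0)
    "baseline": rng.normal(90, 5, n),                # pre-period top-box percentage
    "gender": rng.choice(["F", "M"], n),
    "race": rng.choice(["White", "Other"], n),
    "years_practice": rng.integers(1, 40, n),
})
df["tertile"] = pd.qcut(df["baseline"], 3, labels=["low", "mid", "high"])
df["post_score"] = df["baseline"] + 1.0 * df["intervention"] + rng.normal(0, 5, n)

# ANCOVA-style adjustment: post-course score regressed on group plus covariates
covars = "baseline + gender + race + years_practice"
main = smf.ols(f"post_score ~ intervention + {covars}", data=df).fit()
print(main.params["intervention"], main.pvalues["intervention"])

# Interaction test: does the course effect differ by baseline tertile?
# Compare nested models with an F test; a significant result indicates interaction.
reduced = smf.ols(f"post_score ~ intervention + C(tertile) + {covars}", data=df).fit()
full = smf.ols(f"post_score ~ intervention * C(tertile) + {covars}", data=df).fit()
print(anova_lm(reduced, full))
```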

Data are presented as mean ± standard deviation or median [25th, 75th percentiles] for continuous variables and N (%) for categorical variables. Univariate analysis was performed to compare pre- and post-course survey results for physicians using the Wilcoxon signed-rank test for continuous variables and the McNemar test for top box comparison of categorical variables. Top box refers to an “always” designation on a four-point Likert scale (HCAHPS) and a “Yes, definitely” designation on a three-point scale (CGCAHPS). We analyzed each physician’s score as the percent of top box scores they received for that category. Analysis was performed using SAS software (version 9.3; SAS Institute, Cary, NC). An overall significance level of 0.05 was used for all comparisons. When adjusting for multiple comparisons, a Bonferroni correction was used.
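A minimal sketch of the univariate comparisons named above, assuming small made-up paired samples: a Wilcoxon signed-rank test for continuous pre/post measures, a McNemar test on a paired top-box contingency table, and a Bonferroni-adjusted significance threshold. All values are illustrative, not study data.

```python
import numpy as np
from scipy.stats import wilcoxon
from statsmodels.stats.contingency_tables import mcnemar

# Paired pre/post values for the same physicians (hypothetical JSE-like scores)
pre = np.array([116, 110, 123, 98, 131, 105, 119, 127], dtype=float)
post = np.array([124, 115, 130, 105, 133, 112, 118, 129], dtype=float)
p_continuous = wilcoxon(pre, post).pvalue

# Paired top-box status pre vs. post: rows = pre (yes/no), columns = post (yes/no)
table = np.array([[30, 12],
                  [25, 20]])
p_categorical = mcnemar(table, exact=True).pvalue

# Bonferroni correction when testing k domains simultaneously
k = 13                                # e.g. the 13 self-efficacy domains
alpha_adjusted = 0.05 / k
print(p_continuous, p_categorical, alpha_adjusted)
```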

RESULTS

Between 1 August 2013 and 30 April 2014, 1543 participants completed the FHC course. Of these, six (0.4 %) declined to participate in the study, leaving a final sample of 1537 (response rate 99.6 %). Due to item-specific non-response, available response counts are listed in the results where appropriate. Table 1 shows the characteristics of FHC participants and controls for the HCAHPS and CGCAHPS analyses. In both the CGCAHPS and HCAHPS groups, intervention physicians had more years in practice and were more likely to be male.
Table 1.

Demographic Characteristics and Baseline Scores of Intervention and Control Groups

CGCAHPS cohort

| Factor | n | Intervention (N = 443) | n | Control (N = 478) | p value |
| --- | --- | --- | --- | --- | --- |
| Years In Practice | 438 | 20.5 [13.0, 30.0] | 378 | 15.0 [8.0, 25.0] | < 0.001b |
| Gender | 439 | | 400 | | 0.04c |
| – Female | | 95 (21.6) | | 111 (27.8) | |
| – Male | | 344 (78.4) | | 289 (72.3) | |
| Race | 438 | | 394 | | 0.09c |
| – White/Caucasian | | 345 (78.8) | | 298 (75.6) | |
| – African American | | 7 (1.6) | | 12 (3.0) | |
| – Hispanic/Latino | | 11 (2.5) | | 21 (5.3) | |
| – Other | | 75 (17.1) | | 63 (16.0) | |
| Specialty Category | 414 | | 400 | | 0.61c |
| – Surgical | | 153 (37.0) | | 141 (35.3) | |
| – Medical | | 261 (63.0) | | 259 (64.8) | |
| Baseline Score | 443 | | 478 | | 0.74c |
| – Bottom Tertile | | 137 (30.9) | | 152 (31.8) | |
| – Middle Tertile | | 147 (33.2) | | 166 (34.7) | |
| – Upper Tertile | | 159 (35.9) | | 160 (33.5) | |
| Are You a Surgeon? | 439 | | 400 | | 0.56c |
| – No | | 259 (59.0) | | 228 (57.0) | |
| – Yes | | 180 (41.0) | | 172 (43.0) | |

HCAHPS cohort

| Factor | n | Intervention (N = 204) | n | Control (N = 230) | p value |
| --- | --- | --- | --- | --- | --- |
| Years In Practice | 202 | 17.0 [10.0, 27.0] | 203 | 12.0 [6.0, 20.0] | < 0.001b |
| Gender | 202 | | 203 | | 0.05c |
| – Female | | 45 (22.3) | | 63 (31.0) | |
| – Male | | 157 (77.7) | | 140 (69.0) | |
| Race | 200 | | 201 | | 0.21c |
| – White/Caucasian | | 146 (73.0) | | 150 (74.6) | |
| – African American | | 4 (2.0) | | 5 (2.5) | |
| – Hispanic/Latino | | 5 (2.5) | | 12 (6.0) | |
| – Other | | 45 (22.5) | | 34 (16.9) | |
| Specialty Category | 187 | | 200 | | 0.11c |
| – Surgical | | 100 (53.5) | | 123 (61.5) | |
| – Medical | | 87 (46.5) | | 77 (38.5) | |
| Baseline Score | 204 | | 230 | | 0.20c |
| – Bottom Tertile | | 61 (29.9) | | 79 (34.3) | |
| – Middle Tertile | | 69 (33.8) | | 86 (37.4) | |
| – Upper Tertile | | 74 (36.3) | | 65 (28.3) | |
| Are You a Surgeon? | 201 | | 203 | | 0.11c |
| – No | | 86 (42.8) | | 71 (35.0) | |
| – Yes | | 115 (57.2) | | 132 (65.0) | |

*Values presented as Median [P25, P75] or N (column %)

p values: a = ANOVA, b = Kruskal-Wallis test, c = Pearson’s chi-square test, d = Fisher’s Exact test

After adjusting for gender, race/ethnicity, specialty, years in practice, and baseline scores, post-course-date overall mean CGCAHPS scores were higher for intervention physicians than for controls (92.09 vs. 91.09, p = 0.03) (Table 2). Specifically, the domains of ‘Conveyed clear information’ and ‘Know patient’s medical history’ achieved statistical significance. Subgroup analysis by specialty revealed no significant interaction between specialty and the impact of course exposure (p > 0.05) (eTable 1). A second subgroup analysis by tertile of baseline scores revealed no significant interaction between baseline scores and the impact of course exposure (eTable 2).
Table 2.

Multivariable Adjustment for 6-Month Post-CGCAHPS Scores

| 6 m CGCAHPS Score | Intervention Mean* (95 % CI) | Control Mean* (95 % CI) | p value |
| --- | --- | --- | --- |
| Clear Information | 91.76 (90.22, 93.31) | 90.09 (88.57, 91.60) | 0.01 |
| Spend Enough Time | 91.11 (89.83, 92.40) | 91.11 (89.85, 92.38) | 0.99 |
| Know Medical History | 88.17 (86.49, 89.85) | 86.30 (84.65, 87.95) | 0.01 |
| Explain | 92.61 (91.31, 93.91) | 91.55 (90.26, 92.83) | 0.06 |
| Listen | 93.81 (92.64, 94.97) | 92.97 (91.83, 94.12) | 0.10 |
| Respect | 94.92 (93.86, 95.98) | 94.13 (93.08, 95.17) | 0.09 |
| Total CGCAHPS | 92.09 (91.01, 93.17) | 91.09 (90.03, 92.15) | 0.03 |

*Adjusted for Gender, Ethnicity, Specialty, Years of Practice, and Baseline CGCAHPS score

For the HCAHPS analysis, intervention physicians were more likely to be male and had more years in practice and higher baseline scores than controls (Table 1). After adjusting for gender, race/ethnicity, specialty, years in practice, and baseline HCAHPS scores, intervention physicians had higher overall post-course date scores, but the difference was not statistically significant (83.95 vs. 82.73, p = 0.22) (Table 3). Intervention physicians did, however, show greater improvement in the domain of respect (91.08 vs. 88.79, p = 0.02).
Table 3.

Multivariable Adjustment for 6-Month Post-HCAHPS Scores

| 6 m HCAHPS Score | Intervention Mean* (95 % CI) | Control Mean* (95 % CI) | p value |
| --- | --- | --- | --- |
| Explain | 77.35 (74.13, 80.56) | 76.38 (73.23, 79.53) | 0.50 |
| Listen | 83.13 (80.37, 85.89) | 82.79 (80.11, 85.47) | 0.78 |
| Respect | 91.08 (88.94, 93.21) | 88.79 (86.71, 90.88) | 0.02 |
| Total HCAHPS | 83.95 (81.68, 86.22) | 82.73 (80.52, 84.94) | 0.22 |

*Adjusted for Gender, Ethnicity, Specialty, Years of Practice, and Baseline HCAHPS score

Figure 2 compares intervention physicians’ perceptions of the course before and after attending. Before taking the course, only 20 % of physicians ‘strongly agreed’ that the course would be a valuable use of their time, whereas after the course, 58 % ‘strongly agreed’ that it had been valuable. Fewer than 1 % reported after attending that the course had not been valuable. Pre-course communication self-efficacy was generally high, except for managing time and patient emotion (eTable 3). Despite this, post-course self-efficacy increased significantly in all 13 domains (p < 0.001).
Figure 2.

Participants’ satisfaction with the course. *All comparisons are highly significant (p < 0.001).

At baseline, physicians self-reported moderate levels of burnout and low levels of empathy (Table 4). Following the course, scores on all three domains of burnout (emotional exhaustion, depersonalization, and personal accomplishment) and on empathy improved significantly. NPI-matched follow-up scores were available at 3 months for 16 % of physicians. Improvements in all measures except emotional exhaustion were sustained at 3 months. There were no meaningful differences between responders and non-responders to the 3-month surveys (eTables 4.1–4.2).
Table 4.

Physician Empathy and Burnout Mean Scores Pre-, Post-, and 3 Months Post-Course

| Factor | n | Pre Class Mean ± SD | n | Post Class Mean ± SD | n | 3 m Follow-up Mean ± SD | Pre vs. Post p value | Post vs. 3 m p value |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Total Jefferson | 955 | 116.4 ± 12.7 | 955 | 124.1 ± 11.9 | 143 | 122.5 ± 13.1 | < 0.001 | 0.13 |
| Maslach Burnout | | | | | | | | |
| – Emotional Exhaustion | 947 | 20.2 ± 10.4 | 947 | 17.2 ± 10.9 | 147 | 19.3 ± 10.6 | < 0.001 | 0.04 |
| – Depersonalization | 942 | 6.1 ± 5.0 | 942 | 5.5 ± 4.5 | 147 | 5.7 ± 5.6 | 0.003 | 0.55 |
| – Accomplishment | 945 | 8.3 ± 6.1 | 945 | 7.7 ± 6.2 | 144 | 7.4 ± 5.4 | 0.04 | 0.65 |

*p values: ANOVA

DISCUSSION

In this observational study in a large health system, experiential, relationship-centered communication skills training effectively improved outpatient scores and one domain of inpatient scores. The course appeared equally effective for surgeons and non-surgeons, and the improvement did not differ depending on baseline patient experience scores. Physicians reported high levels of satisfaction with the course in terms of it being a valuable use of time, teaching skills that were relevant and feasible to implement in their practice, and changing their attitudes and enhancing their knowledge and skills. The course significantly improved physician self-efficacy related to performing specific relationship-centered communication skills that have been previously shown to improve patient and provider experience. In addition, physicians reported significant improvement in measures of empathy and burnout, which were sustained for at least 3 months following the course.

Previous research examining the impact of communication skills training on patient satisfaction has demonstrated modest but inconsistent improvement.19 A case study by Stein et al. (2005) reported statistically significant improvement in outpatient satisfaction scores, measured with a regional outpatient member/patient satisfaction survey (not CGCAHPS), for four of six provider cohorts (n ~ 483) who completed an intensive 5-day interactive communication skills course.15 However, the majority of studies have been small (< 130 physicians) and have not demonstrated statistically significant improvements in inpatient or outpatient experience or satisfaction measures.20, 21, 22, 23, 24 In addition, many of these training efforts have been confined to specific populations25, 26, 27, 28, 29, 30, 31, 32 or have involved trainees.33, 34, 35, 36, 37 To date, more complex interventions and/or courses aimed at specific conditions have shown the greatest likelihood of improving patient experience.19 To our knowledge, this is the first study of a communication skills training intervention implemented for all physicians in a large multispecialty setting, and the first to use CMS measures of patient experience. The current study demonstrates that a straightforward, short experiential communication skills training can improve provider-specific measures of patient satisfaction for up to 6 months. Whereas previous research found that providers with lower scores tended to benefit more than those with higher scores,15 our training model appeared to benefit all providers, irrespective of baseline patient satisfaction scores.

In the current healthcare climate, physicians are continually being asked to do more with less and are experiencing increased burnout as a result.38 Conscientious steps to improve physician satisfaction and engagement are therefore vital for improving the quality of patient care39 and conveying the important message that healthcare providers are valued and respected as persons. We employed the same process to build relationships with physicians attending the training as the one we encouraged them to use in relating to patients. As a result, physicians who attended the course reported significant and lasting improvements in burnout and empathy.

Our study has several important implications. For institutions working to raise their patient satisfaction scores or to impact the patient experience more broadly, investing in communication training may offer a good return on investment. Because of the narrow range in scores nationally, even small improvements may translate into large percentile changes. Absolute improvements of 1–2 points, as seen in our study, could translate into increases of up to 14 percentile points. Under value-based purchasing contracts, changes of this magnitude could be worth a sizeable percentage of Medicare revenue. Our findings also suggest that widespread training is beneficial regardless of baseline patient satisfaction scores, since physicians with the highest patient satisfaction scores were just as likely to show improvement as those with the lowest scores. Finally, unlike many innovations that add to pressures associated with job dissatisfaction and burnout, our intervention led to improvement in both empathy and burnout. Future studies should address ways to maintain and strengthen these gains.

Our study has a number of important limitations. First, due to its observational nature, we could not rule out other causes for the improvement in scores among those who took the course. We attempted to control for secular trends in patient experience by including a contemporary control group and adjusting for measured differences, but there may have been additional unmeasured confounders. Second, we included some self-reported outcomes, and reporting was not anonymous. Physicians may have exhibited a social desirability bias in their responses. This seems unlikely, as they were not hesitant to initially express their skepticism about the course. Third, our study included only one organization with a largely employed model. Not all organizations can mandate training, but the study nonetheless has important implications. Cleveland Clinic’s experience in bringing communication skills training to physicians can serve as a model for others considering similar initiatives. Large organizations that also use an employed physician model, including the Mayo Clinic, the Veterans Health Administration, and Kaiser Permanente, together care for millions of patients. Others have invested substantially in efforts to improve patient experience. Spread of a model similar to ours could bring substantial change to the quality and outcomes of communication and relationships in today’s medical practice environment. Whether the results can be generalized to other settings will need to be tested. Fourth, our sample size was limited by the number of physicians with sufficient numbers of HCAHPS or CGCAHPS surveys. Although it would have been desirable to have larger numbers of surveys per physician, and more physicians overall, these are the measures on which hospital reimbursement is currently based and they are the chief target of hospital administrators. Finally, our response rate at 3 months was low and may not be representative of all participants. Further work will be required to show whether the impact of the course on empathy and burnout is sustained.

In conclusion, an experiential communication skills training based on the R.E.D.E.SM model of relationship-centered communication successfully improved measures of patient satisfaction, as well as participating physicians’ self-reported empathy and burnout.

Notes

Acknowledgements

Contributors

Lu Wang assisted with early data analysis.

Funders

The study did not receive grant funding.

Prior Presentations

Neither this manuscript nor one with any part of its essential substance, tables or figures has been or will be published or submitted elsewhere.

Compliance with Ethical Standards

Conflict of Interest

All authors have worked or currently work for The Cleveland Clinic Foundation. The R.E.D.E. to CommunicateSM: Foundations of Healthcare Communication course is a commercial product of the Cleveland Clinic. No authors receive personal revenue from the sale of the course. James Merlino, MD is currently employed by Press Ganey Associates, Inc. as the President and Chief Medical Officer in Strategic Consulting.

Adrienne Boissy, MD, MA: Nothing to disclose

Amy K. Windover, PhD: Nothing to disclose

Dan Bokar: Nothing to disclose

Matthew Karafa, PhD: Nothing to disclose

Lu Wang: Nothing to disclose

Katie Neuendorf, MD: Nothing to disclose

Richard M. Frankel, PhD: Nothing to disclose

James Merlino, MD: Employed by Press Ganey Associates, Inc. as the President and Chief Medical Officer in Strategic Consulting

Michael B. Rothberg, MD, MPH: Nothing to disclose

Supplementary material

ESM 1 (DOCX 37.7 kb)

REFERENCES

  1. Hospital Consumer Assessment of Healthcare Providers and Systems. Centers for Medicare & Medicaid Services, Baltimore, MD. http://www.hcahpsonline.org. Accessed 1/14/2016.
  2. Stevens JP, Nyweide D, Maresh S, Zaslavsky A, Shrank W, Md MD, et al. Variation in inpatient consultation among older adults in the United States. J Gen Intern Med. 2015;30(7):992–9. doi:10.1007/s11606-015-3216-7.
  3. Windish DM, Olson DP. Association of patient recognition of inpatient physicians with knowledge and satisfaction. J Healthc Qual. 2011;33(3):44–9. doi:10.1111/j.1945-1474.2010.00123.x.
  4. Dyer N, Sorra JS, Smith SA, Cleary PD, Hays RD. Psychometric properties of the Consumer Assessment of Healthcare Providers and Systems (CAHPS®) Clinician and Group Adult Visit Survey. Med Care. 2012;50(Suppl):S28–34. doi:10.1097/MLR.0b013e31826cbc0d.
  5. Marvel MK, Epstein RM, Flowers K, Beckman HB. Soliciting the patient’s agenda: have we improved? JAMA. 1999;281(3):283–7.
  6. Brock DM, Mauksch LB, Witteborn S, Hummel J, Nagasawa P, Robins LS. Effectiveness of intensive physician training in upfront agenda setting. J Gen Intern Med. 2011;26(11):1317–23.
  7. Mauksch LB, Dugdale DC, Dodson S, Epstein R. Relationship, communication, and efficiency in the medical encounter: creating a clinical model from a literature review. Arch Intern Med. 2008;168(13):1387–95. doi:10.1001/archinte.168.13.1387.
  8. Abraham NS, Naik AD, Street RL Jr. Shared decision making in GI clinic to improve patient adherence. Clin Gastroenterol Hepatol. 2012;10(8):825–7. doi:10.1016/j.cgh.2012.06.001.
  9. Mazor KM, Beard RL, Alexander GL, Arora NK, Firneno C, Gaglio B, et al. Patients’ and family members’ views on patient-centered communication during cancer care. Psycho-Oncology. 2013;22(11):2487–95. doi:10.1002/pon.3317.
  10. Robinson JD, Hoover DR, Venetis MK, Kearney TJ, Street RL Jr. Consultations between patients with breast cancer and surgeons: a pathway from patient-centered communication to reduced hopelessness. J Clin Oncol. 2013;31(3):351–8. doi:10.1200/JCO.2012.44.2699.
  11. Stewart M, Brown JB, Donner A, McWhinney IR, Oates J, Weston WW, et al. The impact of patient-centered care on outcomes. J Fam Pract. 2000;49(9):796–804.
  12. Stewart M, Brown JB, Hammerton J, Donner A, Gavin A, Holliday RL, et al. Improving communication between doctors and breast cancer patients. Ann Fam Med. 2007;5(5):387–94.
  13. Roter DL, Hall JA, Kern DE, Barker LR, Cole KA, Roca RP. Improving physicians’ interviewing skills and reducing patients’ emotional distress. A randomized clinical trial. Arch Intern Med. 1995;155(17):1877–84.
  14. Kelley JM, Kraft-Todd G, Schapira L, Kossowsky J, Riess H. The influence of the patient-clinician relationship on healthcare outcomes: a systematic review and meta-analysis of randomized controlled trials. PLoS One. 2014;9(4):e94207. doi:10.1371/journal.pone.0094207.
  15. Stein T, Frankel RM, Krupat E. Enhancing clinician communication skills in a large healthcare organization: a longitudinal case study. Patient Educ Couns. 2005;58(1):4–12.
  16. Windover A, Boissy A, Rice T, Gilligan T, Velez V, Merlino J. The REDE model of healthcare communication: optimizing relationship as a therapeutic agent. J Patient Exp. 2014;1(1):8–13.
  17. Berkhof M, van Rijssen HJ, Schellart AJ, Anema JR, van der Beek AJ. Effective training strategies for teaching communication skills to physicians: an overview of systematic reviews. Patient Educ Couns. 2011;84(2):152–62. doi:10.1016/j.pec.2010.06.010.
  18. Merckaert I, Libert Y, Razavi D. Communication skills training in cancer care: where are we and where are we going? Curr Opin Oncol. 2005;17(4):319–30.
  19. Dwamena F, Holmes-Rovner M, Gaulden CM, Jorgenson S, Sadigh G, Sikorskii A, Lewin S, Smith RC, Coffey J, Olomu A, Beasley M. Interventions for providers to promote a patient-centred approach in clinical consultations. Cochrane Database Syst Rev. 2012;(12). doi:10.1002/14651858.CD003267.pub2.
  20. Frostholm L, Fink P, Oernboel E, Christensen KS, Toft T, Olesen F, et al. The uncertain consultation and patient satisfaction: the impact of patients’ illness perceptions and a randomized controlled trial on the training of physicians’ communication skills. Psychosom Med. 2005;67(6):897–905. doi:10.1097/01.psy.0000188403.94327.5b.
  21. O’Leary KJ, Darling TA, Rauworth J, Williams MV. Impact of hospitalist communication-skills training on patient-satisfaction scores. J Hosp Med. 2013;8(6):315–20. doi:10.1002/jhm.2041.
  22. Brown JB, Boles M, Mullooly JP, Levinson W. Effect of clinician communication skills training on patient satisfaction. A randomized, controlled trial. Ann Intern Med. 1999;131(11):822–9.
  23. Fossli Jensen B, Gulbrandsen P, Dahl FA, Krupat E, Frankel RM, Finset A. Effectiveness of a short course in clinical communication skills for hospital doctors: results of a crossover randomized controlled trial (ISRCTN22153332). Patient Educ Couns. 2011;84(2):163–9. doi:10.1016/j.pec.2010.08.028.
  24. Fujimori M, Shirai Y, Asai M, Kubota K, Katsumata N, Uchitomi Y. Effect of communication skills training program for oncologists based on patient preferences for communication when receiving bad news: a randomized controlled trial. J Clin Oncol. 2014;32(20):2166–72. doi:10.1200/JCO.2013.51.2756.
  25. Bays AM, Engelberg RA, Back AL, Ford DW, Downey L, Shannon SE, et al. Interprofessional communication skills training for serious illness: evaluation of a small-group, simulated patient intervention. J Palliat Med. 2014;17(2):159–66. doi:10.1089/jpm.2013.0318.
  26. Gulbrandsen P, Jensen BF, Finset A, Blanch-Hartigan D. Long-term effect of communication training on the relationship between physicians’ self-efficacy and performance. Patient Educ Couns. 2013;91(2):180–5. doi:10.1016/j.pec.2012.11.015.
  27. Fallowfield L, Lipkin M, Hall A. Teaching senior oncologists communication skills: results from phase I of a comprehensive longitudinal program in the United Kingdom. J Clin Oncol. 1998;16(5):1961–8.
  28. Schell JO, Green JA, Tulsky JA, Arnold RM. Communication skills training for dialysis decision-making and end-of-life care in nephrology. Clin J Am Soc Nephrol. 2013;8(4):675–80. doi:10.2215/CJN.05220512.
  29. Rao JK, Anderson LA, Inui TS, Frankel RM. Communication interventions make a difference in conversations between physicians and patients: a systematic review of the evidence. Med Care. 2007;45(4):340–9. doi:10.1097/01.mlr.0000254516.04961.d5.
  30. Levinson W, Roter D. The effects of two continuing medical education programs on communication skills of practicing primary care physicians. J Gen Intern Med. 1993;8(6):318–24.
  31. Canivet D, Delvaux N, Gibon AS, Brancart C, Slachmuylder JL, Razavi D. Improving communication in cancer pain management nursing: a randomized controlled study assessing the efficacy of a communication skills training program. Support Care Cancer. 2014. doi:10.1007/s00520-014-2357-2.
  32. Gibon AS, Merckaert I, Lienard A, Libert Y, Delvaux N, Marchal S, et al. Is it possible to improve radiotherapy team members’ communication skills? A randomized study assessing the efficacy of a 38-h communication skills training program. Radiother Oncol. 2013;109(1):170–7. doi:10.1016/j.radonc.2013.08.019.
  33. Smith RC, Lyles JS, Mettler J, Stoffelmayr BE, Van Egeren LF, Marshall AA, et al. The effectiveness of intensive training for residents in interviewing. A randomized, controlled study. Ann Intern Med. 1998;128(2):118–26.
  34. Yedidia MJ, Gillespie CC, Kachur E, Schwartz MD, Ockene J, Chepaitis AE, et al. Effect of communications training on medical student performance. JAMA. 2003;290(9):1157–65. doi:10.1001/jama.290.9.1157.
  35. Deveugele M, Derese A, De Maesschalck S, Willems S, Van Driel M, De Maeseneer J. Teaching communication skills to medical students, a challenge in the curriculum? Patient Educ Couns. 2005;58(3):265–70.
  36. Bragard I, Etienne AM, Merckaert I, Libert Y, Razavi D. Efficacy of a communication and stress management training on medical residents’ self-efficacy, stress to communicate and burnout: a randomized controlled study. J Health Psychol. 2010;15(7):1075–81. doi:10.1177/1359105310361992.
  37. Lienard A, Merckaert I, Libert Y, Bragard I, Delvaux N, Etienne AM, et al. Transfer of communication skills to the workplace during clinical rounds: impact of a program for residents. PLoS One. 2010;5(8):e12426. doi:10.1371/journal.pone.0012426.
  38. Dyrbye LN, West CP, Satele D, Boone S, Tan L, Sloan J, et al. Burnout among U.S. medical students, residents, and early career physicians relative to the general U.S. population. Acad Med. 2014;89(3):443–51. doi:10.1097/ACM.0000000000000134.
  39. Romani M, Ashkar K. Burnout among physicians. Libyan J Med. 2014;9:23556. doi:10.3402/ljm.v9.23556.

Copyright information

© Society of General Internal Medicine 2016

Authors and Affiliations

  • Adrienne Boissy (1)
  • Amy K. Windover (1)
  • Dan Bokar (1)
  • Matthew Karafa (2)
  • Katie Neuendorf (1)
  • Richard M. Frankel (1, 3)
  • James Merlino (4)
  • Michael B. Rothberg (5)

  1. Office of Patient Experience, Center for Excellence in Healthcare Communication, Cleveland Clinic, Cleveland, USA
  2. Quantitative Health Sciences, Cleveland Clinic, Cleveland, USA
  3. Indiana University School of Medicine, Indianapolis, USA
  4. Press Ganey Associates, Inc., Chicago, USA
  5. Center for Value-Based Care Research, Medicine Institute, Cleveland Clinic, Cleveland, USA
