Journal of General Internal Medicine, Volume 30, Issue 4, pp 476–482

Practice Context Affects Efforts to Improve Diabetes Care for Primary Care Patients: A Pragmatic Cluster Randomized Trial

  • L. Miriam Dickinson
  • W. Perry Dickinson
  • Paul A. Nutting
  • Lawrence Fisher
  • Marjie Harbrecht
  • Benjamin F. Crabtree
  • Russell E. Glasgow
  • David R. West
Original Research

Abstract

Background

Efforts to improve primary care diabetes management have assessed strategies across heterogeneous groups of patients and practices. However, there is substantial variability in how well practices implement interventions and achieve desired outcomes.

Objective

To examine practice contextual features that moderate intervention effectiveness.

Design

Secondary analysis of data from a cluster randomized trial of three approaches for implementing the Chronic Care Model to improve diabetes care.

Participants

Forty small to mid-sized primary care practices participated, with 522 clinician and staff member surveys. Outcomes were assessed for 822 established patients with a diagnosis of type 2 diabetes who had at least one visit to the practice in the 18 months following enrollment.

Main Measures

The primary outcome was a composite measure of diabetes process of care, ascertained by chart audit, regarding nine quality measures from the American Diabetes Association Physician Recognition Program: HgA1c, foot exam, blood pressure, dilated eye exam, cholesterol, nephropathy screen, flu shot, nutrition counseling, and self-management support. Data from practices included structural and demographic characteristics and Practice Culture Assessment survey subscales (Change Culture, Work Culture, Chaos).

Key Results

Across the three study arms (estimates given in the order RAP, CQI, SD), trajectories of change in diabetes process of care were associated, either directly or differentially by study arm, with demographic/structural characteristics [rural vs. urban: +.70 (p = .006), +2.44 (p < .001), −.75 (p = .004); Medicaid <20 % vs. ≥20 %: −.20 (p = .48), +.75 (p = .08), +.60 (p = .02); practice size <4 vs. ≥4 clinicians: +.56 (p = .02), +1.96 (p < .001), +.02 (p = .91)]; with practice Change Culture [high vs. low: −.86 (p = .048), +1.71 (p = .005), +.34 (p = .22)] and Work Culture [high vs. low: −.67 (p = .18), +2.41 (p < .001), +.67 (p = .005)]; and with variability in practice Change Culture [high vs. low: −.24 (p = .006), −.20 (p = .0771), −.44 (p = .0019)] and Work Culture [high vs. low: +.56 (p = .3160), −1.0 (p = .008), −.25 (p = .0216)].

Conclusions

This study supports the need for broader use of methodological approaches to better examine contextual effects on implementation and effectiveness of quality improvement interventions in primary care settings.

KEY WORDS

Diabetes; Contextual effects; Multilevel modeling

INTRODUCTION

For the majority of patients in the U.S. with type 2 diabetes, care is provided by primary care practices.1 Many of these patients have multiple conditions and require coordinated, comprehensive, patient-centered approaches to their healthcare in order to optimize health outcomes and improve or sustain quality of life.2,3 There is a large body of evidence to support the effectiveness of the Chronic Care Model (CCM) as a framework to guide practice redesign efforts to improve care for patients with chronic diseases.4, 5, 6, 7, 8

Although results from quality improvement (QI) efforts and cluster randomized trials are promising, implementation of practice transformations based on the CCM and patient-centered medical home (PCMH) in primary care practices has been challenging.9, 10, 11 The considerable heterogeneity in outcomes of practice-focused interventions suggests that practice context, such as structural and demographic features and practice culture, may influence successful program implementation.4,9,12, 13, 14, 15, 16, 17, 18, 19, 20 However, studies of primary care practices rarely follow up primary analyses with careful examination of how practice context interacts with intervention approaches to affect outcomes.

The Colorado Enhancing Practice, Improving Care (EPIC) cluster randomized trial was designed to compare the effectiveness of three approaches for implementing the Chronic Care Model in 40 small to mid-sized primary care practices in an effort to improve diabetes care: 1) practice facilitation using a Reflective Adaptive Process (RAP) approach; 2) practice facilitation using a Continuous Quality Improvement (CQI) approach; and 3) practice “self-direction” (SD) providing Chronic Care Model information and resources but no facilitation. There was improvement in all three arms; the greatest improvement was in the CQI arm, followed by SD and RAP.4 However, there was substantial variability among practices within the same arm, with some successfully adopting and sustaining change and others unable to do so. Consequently, in this report, we ask if practice context may explain some of this variation. Multilevel modeling, recommended as an important strategy for analyzing contextual effects, is used in this study20 to allow us to disentangle cluster level (e.g., practice, organization, etc.) effects from patient-level effects.12 Specifically, we hypothesized that aspects of inner and outer practice context as described by Damschroder et al.,19 including rural location, practice size, EHR presence, Medicaid percentage, and practice culture, would moderate intervention effectiveness. An additional innovative methodological feature includes analysis of the impact of variation in self-reported perspectives regarding practice culture among practice team members on patient-level process-of-care outcomes.21, 22, 23

METHODS

Study Design and Practice/Patient Recruitment

The setting for the study was 40 small to mid-sized independent mixed-payer primary care practices (n = 32) and community health centers (n = 8) in Colorado. Practices were recruited through multiple contact methods targeting interested primary care clinicians (family medicine and general internal medicine), especially those in the State Networks of Colorado Ambulatory Practices and Partners (SNOCAP), a collaborative of practice-based research networks. Practices were stratified by urban/rural location, practice size, and type (community health center vs. other), and randomized so that the distribution of practice characteristics would be similar across arms. Here, we report on patient chart audits at baseline and at 9 and 18 months following practice enrollment, and on baseline clinician and staff surveys. The study was approved by the University of Colorado Institutional Review Board and registered with ClinicalTrials.gov (NCT00414986). Further details of the study are reported elsewhere,4 but a brief description of the three study arms is given below.

Practice Interventions

Practices in the RAP arm received practice facilitation using the Reflective Adaptive Process change model based on complexity theory.9,24, 25, 26 The RAP approach focused on changing organizational functioning in order to improve diabetes care. The conceptual model underlying this approach assumes that improving organizational capacity to make and sustain change is fundamental in achieving practice improvements and implementing changes. The facilitation intervention for this arm lasted six months, with an average of 7.4 meetings (range 4–11), and facilitators were available for consultation for up to 12 months.4,9

Practices in the Continuous Quality Improvement (CQI) arm received practice facilitation based on the Model for Improvement.27, 28, 29, 30, 31 The CQI facilitators provided a structure and process for quality improvement using CQI tools that focused on sequential Plan-Do-Study-Act (PDSA)27 cycles guided by quality measurement data. Implementation of a system for obtaining reliable quality measures was a time-consuming first step, and the length of the intervention was allowed to vary (up to 18 months; mean, 15 months and 9.7 meetings), depending on practice needs. Although the CQI approach differed considerably from the RAP approach described above, conceptually, the importance of an organizational culture that supports the change process in the practice was emphasized in both. Facilitators in both the CQI and RAP arms were external to the practice.

Practices randomized to the self-directed (SD) arm received limited feedback on their baseline practice culture and level of implementation of the Chronic Care Model based on practice surveys, but no facilitation. Self-directed practices were then given access to a website with information about quality improvement and the Chronic Care Model28 for diabetes.

Measures

Patient Outcomes

Process of diabetes care, the primary outcome, consisted of documentation, ascertained by chart audit, of performance based on nine quality measures from the American Diabetes Association Physician Recognition Program: HgA1c, foot exam, blood pressure, dilated eye exam, cholesterol, nephropathy screen, flu shot, nutrition counseling, and self-management support.32 Each practice generated a list of all patients with diabetes who had had at least one visit to the practice in the 18 months prior to practice enrollment and at least one visit during the 18 months following enrollment. A random sample of charts was audited, with a target of at least 20–25 patients per practice, sufficient to provide >80 % power to detect a 0.44 effect size difference between any two arms, with an intraclass correlation as high as 10 %; 822 chart audits were completed. IRB-approved procedures for de-identifying the data were followed. Each item was considered up to date if it occurred within the 12 months before the end of each audit period (baseline, 9 months, 18 months). Although this timing resulted in some overlap in patient assessment periods (e.g., a test one month before baseline could count as up to date at nine months), nine-month time blocks were necessary to meet project timeline requirements. The composite score for diabetes process of care, ranging from 0 to 9, consisted of the total number of up-to-date ADA quality measures at the end of each audit period.
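The composite scoring described above reduces to counting how many of the nine ADA items were documented within the 12 months before the end of an audit period. A minimal sketch follows; the function name and the chart-data layout are illustrative, not the study's actual audit software.

```python
from datetime import date, timedelta

# The nine ADA Physician Recognition Program items named in the text.
ADA_ITEMS = [
    "hga1c", "foot_exam", "blood_pressure", "dilated_eye_exam",
    "cholesterol", "nephropathy_screen", "flu_shot",
    "nutrition_counseling", "self_management_support",
]

def composite_poc(last_done: dict, audit_end: date) -> int:
    """Count items documented within the 12 months before the audit end date.

    `last_done` maps item name -> date of most recent documentation (items
    never documented may simply be absent). Returns the 0-9 composite
    diabetes process-of-care score.
    """
    window_start = audit_end - timedelta(days=365)
    return sum(
        1 for item in ADA_ITEMS
        if item in last_done and window_start <= last_done[item] <= audit_end
    )
```

Because the 12-month window is anchored to the end of each audit period, a test performed shortly before baseline can still count as up to date at the 9-month audit, which is the overlap the text notes.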

Patient-Level Covariates

Patient characteristics collected in chart audits included age, gender, and chronic medical and psychiatric conditions. Race and ethnicity were generally not recorded in the medical record, and therefore were not included in the analyses.

Practice Characteristics

Each practice provided basic information at baseline, including practice setting and organization type, patient demographics, patient volume, and presence of an EHR. All clinicians and staff members (i.e., every employee in every practice) were asked to complete the Practice Culture Assessment (PCA) survey at baseline (response rate 64 %). Surveys were anonymous so that individual identity was protected. Twenty-nine percent of surveys were from clinicians (family physician, general internist, nurse practitioner, physician assistant), and the remainder from nursing staff and other practice employees. The PCA is a 22-item tool designed to measure perceptions of practice culture potentially related to practice functioning and successful implementation of practice quality improvement. Individual items and subscale development for the PCA are reported elsewhere; internal consistency is noted below.4 Subscales include: a) Change Culture (CC), 10 items that deal with how the practice does collaborative quality improvement, problem resolution, and change management (alpha = .91); b) Work Culture (WC), eight items assessing how the members of the practice work together to achieve a pleasant and productive practice environment with high-quality care (alpha = 0.91); and c) Chaos (CH), four items that assess the level of instability, disruption, and disorganization in the practice (alpha = .78). Subscale scores could range from 0 to 100, and higher scores indicate greater endorsement of the subscale dimension (i.e., more positive attitude toward practice change, better work culture, greater chaos). Means and standard deviations were computed across respondents within each practice to create practice-level variables.
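The aggregation of anonymous survey responses into practice-level variables (a mean and a within-practice standard deviation per subscale, the latter used below as a measure of respondent agreement) might look like the following sketch; the data layout and function name are illustrative assumptions.

```python
from statistics import mean, stdev

def practice_pca_summary(responses):
    """Aggregate individual PCA subscale scores (0-100) within one practice.

    `responses` is a list of dicts, one per clinician/staff respondent, with
    keys "change_culture", "work_culture", and "chaos". Returns, for each
    subscale, the practice-level mean and the within-practice SD; the SD
    captures (dis)agreement among respondents.
    """
    summary = {}
    for subscale in ("change_culture", "work_culture", "chaos"):
        scores = [r[subscale] for r in responses if subscale in r]
        summary[subscale] = {
            "mean": mean(scores),
            "sd": stdev(scores) if len(scores) > 1 else 0.0,
        }
    return summary
```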

Statistical Analysis

Descriptive statistics were generated for patient sociodemographic measures, diabetes process of care, and practice characteristics. To understand the potential for confounding due to associations among practice characteristics, bivariate relationships were examined using chi-squared tests and t tests.

Effects of Baseline Practice Characteristics

The outcome for all analyses was the patient-level diabetes process of care score (POC) over time, as described above, with repeated measures on patients at baseline and at 9 and 18 months (2,399 unique data points). Multilevel modeling (general linear mixed models that are both longitudinal and hierarchical, with random effects for patient and practice) was used for analysis (SAS PROC MIXED). Covariates included age, gender, and medical and psychiatric comorbidities. The effect of each practice characteristic on patient-level POC scores over time was examined in separate models. To determine whether baseline practice characteristics (PC) moderated intervention effectiveness, we included a three-way interaction term (PC x arm x time) and all relevant two-way interactions in the model (PC x arm, time x arm, PC x time).21, 22, 23 For PCA subscales, we provide estimates of the effect at the overall mean score and 10 points below and above the mean. We also examined the effects of variability in PCA scores, measured by the within-practice SD for each subscale, adjusting for the effects of practice-level mean subscale scores, to determine if lack of agreement among clinicians and staff affected practice trajectories.22 Analyses were performed using SAS version 9.4.
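The "effect at the mean and 10 points below and above" presentation can be illustrated with a small sketch: under a model with a PC x time interaction, the predicted change in POC per unit time is the time slope plus the interaction coefficient times the centered PC score. All coefficient values here are hypothetical, chosen only to show the arithmetic.

```python
def predicted_change(time_slope, pc_by_time, pc_score, pc_mean):
    """Predicted change in POC per unit time for a practice with a given
    practice-characteristic (PC) score, under the simplified model
    change = time_slope + pc_by_time * (pc_score - pc_mean)."""
    return time_slope + pc_by_time * (pc_score - pc_mean)

# Hypothetical coefficients for one study arm, with an overall mean PC of 67:
time_slope = 1.1   # change at the overall mean PC score
pc_by_time = 0.04  # additional change per PC point

for label, score in [("low (mean - 10)", 57.0), ("mean", 67.0), ("high (mean + 10)", 77.0)]:
    print(label, round(predicted_change(time_slope, pc_by_time, score, 67.0), 2))
```

The same centering logic extends to the three-way PC x arm x time term, where the interaction coefficient is allowed to differ by study arm.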

RESULTS

Practice and Patient Characteristics

Baseline practice and patient characteristics are described in Table 1. There were no significant bivariate relationships among rural vs. urban setting, practice size, presence of EHR, or Medicaid percentage. Nor did baseline Change Culture scores and Chaos scores differ significantly by EHR presence, Medicaid percentage, size, rural location, or respondent’s position (clinician vs. other). Baseline Work Culture scores did not differ significantly by EHR presence, Medicaid percentage, rural location, or respondent’s position (clinician vs. other), but smaller practices had higher Work Culture scores than larger practices (77.1 vs. 71.8, p = .05), and clinicians had higher Work Culture scores than other practice members (70.6 vs. 61.4, p < .001).
Table 1. Baseline Patient and Practice Characteristics

                                                  RAP (n = 15)    CQI (n = 10)    SD (n = 15)
Practices                                         N (%)           N (%)           N (%)
  Rural                                           4 (27 %)        2 (20 %)        4 (27 %)
  EHR                                             4 (27 %)        5 (50 %)        7 (47 %)
  Practice size (# clinicians)§
    <4                                            8 (53.3 %)      3 (30 %)        10 (66.7 %)
    ≥4                                            7 (46.7 %)      7 (70 %)        5 (33.3 %)
  Medicaid
    <20 %                                         11 (73 %)       8 (80 %)        10 (67 %)
    ≥20 %                                         4 (27 %)        2 (20 %)        5 (33 %)
Practice culture
  PCA Change Culture, mean (sd)‡                  64.0 (10.0)     69.6 (7.0)      68.2 (11.4)
  PCA Work Culture, mean (sd)‡                    68.1 (9.1)      68.8 (8.0)      67.9 (14.2)
  PCA Chaos, mean (sd)‡                           36.4 (10.7)     30.7 (11.7)     36.3 (12.9)

Patient chart audit sample                        RAP (N = 312)   CQI (N = 189)   SD (N = 321)
                                                  Mean (sd) or %  Mean (sd) or %  Mean (sd) or %
  Diabetes process-of-care score (range 0–9)*     4.54 (2.07)     3.58 (2.33)     3.63 (2.09)
  Gender, % male                                  44.2 %          52.9 %          50.5 %
  Age (years)                                     60.5 (12.6)     61.9 (12.1)     60.0 (13.2)
  Number of medical comorbidities**               2.1 (1.2)       2.0 (1.3)       2.0 (1.1)
  Presence of psychiatric comorbidity†            20.5 %          14.3 %          13.1 %
  Baseline HgA1c (n = 636)                        7.18 (1.59)     7.35 (1.76)     7.69 (2.00)
  Baseline systolic BP (n = 799)                  128.3 (16.4)    131.8 (17.7)    132.9 (19.7)
  Baseline diastolic BP (n = 799)                 76.9 (10.9)     78.5 (12.2)     78.0 (11.9)
  Baseline total cholesterol (n = 703)            174.5 (42.6)    185.8 (49.3)    184.8 (50.4)

*Number of up-to-date ADA performance items: HgA1c, foot exam, blood pressure, dilated eye exam, cholesterol, nephropathy screen, flu shot, nutrition counseling, and self-management support

**Includes arthritis, connective tissue disease, gastrointestinal problems, coronary disease, hyperlipidemia, hypertension, liver disease, pulmonary disease, neurological disease, PVD, renal disease, stroke, dementia, and cancer in the past three years

†Depression, substance abuse, or other psychiatric diagnosis

‡sd based on aggregated practice means (i.e., means of means)

§Number of clinicians: includes MD, DO, NP, PA


Demographic and Structural Practice Characteristics (PC)

Moderating effects of practice characteristics on patient-level diabetes POC were tested by examining the three-way interaction term (PC x arm x time) described previously. Model-based estimates shown in Table 2 are estimates of change in diabetes POC score by arm and practice characteristic. Rural vs. urban setting, Medicaid percentage, and practice size, but not presence of an EHR, significantly moderated intervention effects across study arms. Within the RAP and CQI arms, there was greater improvement in diabetes POC in rural vs. urban (RAP: +.70, CQI: +2.44) and smaller vs. larger practices (RAP: +.56; CQI: +1.96), whereas in SD practices, improvement was greater in urban settings (rural vs. urban: −.75). A lower percentage of Medicaid patients in the practice was associated with greater improvement in SD practices (<20 % vs. ≥20 %: +.60).
Table 2. Change in Diabetes Process of Care by Study Arm and Practice Characteristic‡

Practice characteristic        RAP     CQI     SD      Differential intervention effect*
Location
  Rural                        .78     3.11    .23     F(2,2352) = 26.06, p < .001
  Urban                        .08     .67     .98
  p value within arm           .006    <.001   .004
Size
  <4 clinicians                0.59    2.49    0.76    F(2,2349) = 11.36, p < .001
  ≥4 clinicians                .03     .53     0.74
  p value within arm           .02     <.001   .91
EHR
  No                           0.41    1.37    0.79    F(2,2352) = 0.26, p = .7720
  Yes                          0.06    1.24    0.69
  p value within arm           .17     .71     .62
% Medicaid
  <20 %                        0.23    1.51    0.92    F(2,2352) = 3.00, p = .0498
  ≥20 %                        0.43    0.76    0.32
  p value within arm           .48     .08     .02
Practice Culture Assessment
 Change Culture
  Low†                         .70     .28     .59     F(2,2349) = 6.74, p = .0012
  Medium†                      .28     1.14    .76
  High†                        −.16    1.99    .93
  p value within arm           .048    .005    .22
 Work Culture
  Low†                         .69     .13     .50     F(2,2349) = 9.07, p = .0001
  Medium†                      .36     1.33    .84
  High†                        .02     2.54    1.17
  p value within arm           .18     <.001   .005
 Chaos
  Low†                         .40     1.51    .87     F(2,2349) = 0.63, p = .5331
  Medium†                      .28     1.13    .78
  High†                        .11     .75     .69
  p value within arm           .62     .11     .53

*Three-way interaction term (PC x arm x time) indicates that the moderator had a differential effect on outcomes over time across the intervention arms

†For PCA scales, rows show estimated change at the overall practice mean score (Medium) and 10 points below (Low) and above (High) the mean

‡Covariates include age, gender, and medical and psychiatric comorbidities

Practice Culture

Baseline Change Culture and Work Culture significantly moderated intervention effects (Table 2). CQI practices with higher Change Culture scores had greater improvement, but the opposite relationship was observed in RAP practices. In both CQI and SD arms, practices with higher Work Culture scores had greater improvement. Chaos scores were not associated with improvement in POC, either overall or differentially by study arm.

Variability in Practice Culture

We also examined the effects of greater within-practice variability in baseline PCA scores (indicating a lower level of agreement among respondents from a particular practice) on improvement in diabetes POC over time, adjusting for the effects of mean scores. Figure 1 shows change in POC estimated at low (25th percentile) and high (75th percentile) variability on Change Culture and Work Culture subscales. Greater variability in Change Culture scores was associated with less improvement in POC in RAP (−.24, p = .006) and SD (−.44, p = .0019) practices. This effect did not achieve statistical significance in the CQI arm (−.20, p = .077), and the test of differential effects across study arms was not significant (PC x arm x time, p = .46). Variability in Work Culture scores significantly moderated intervention effectiveness (PC x arm x time: F(2,1561) = 8.93, p < .001). Within the CQI and SD arms, greater variability was associated with less improvement in POC (CQI: −1.0, p = .008; SD: −.25, p = .02), but not in the RAP arm (.56, p = .32). Variability in Chaos scores (not shown) did not moderate (p = .58) or have a direct effect on change in POC in any arm (RAP: p = .56, CQI: p = .40, SD: p = .50).
Fig. 1

Effects of Variability in Practice Culture Scores on Improvement in Diabetes POC. Bars represent change in diabetes POC estimates at the 25th (low variability) and 75th (high variability) percentiles in PCA subscale standard deviations.

DISCUSSION

Practice context plays an important role in successful implementation of effective interventions in primary care. In this cluster randomized trial of three approaches to implementing the CCM for diabetes, practice structural/demographic characteristics and practice culture significantly affected change in diabetes process of care, often with variation in effects by study arm. Specifically, smaller and rural CQI and RAP practices displayed greater improvement in diabetes POC, whereas rural and higher-percentage Medicaid SD practices displayed less improvement. CQI practices with higher Change Culture scores and CQI and SD practices with higher Work Culture scores had greater improvement; conversely, RAP practices with higher Change Culture scores showed less improvement in POC. In both the CQI and SD arms, practices with greater variability across respondents in Work Culture scores (i.e., less agreement) had less improvement in POC scores.

Other studies investigating relationships between diabetes/chronic disease care and practice characteristics (including EHR,33 Medicaid percentage,34 community health centers,35, 36, 37 and practice size38,39) have reported mixed results. Interestingly, we find that rural and smaller practices, which often do not have as many connections to resources as urban practices, showed significantly greater improvement in both the CQI- and RAP-facilitated arms, while urban practices, usually with more available resources, saw more improvement in the self-directed arm. This pattern highlights the importance that practice facilitation may have for smaller, rural, and independent practices.

The role of practice culture in successful implementation of practice-based interventions is particularly relevant for practice transformation initiatives in primary care. Previous work shows the importance of internal relationships and effective teams14,40, 41, 42, 43 for providing excellent chronic disease care and achieving the hallmarks of patient-centered medical homes. Positive Change Culture and Work Culture among clinicians and staff members may reflect practice capacities that are essential for successful implementation of practice transformation initiatives such as CQI approaches. The different findings in RAP practices suggest that other dynamics were at work in this arm. Notably, the CQI intervention focused on implementation of a diabetes registry, used registry quality measure data, when available, to guide QI efforts, and followed a fairly structured facilitation approach. The RAP intervention did not focus on quality measure data, and was expressly designed to improve the relationship systems within the practice25 as a pathway to better patient care and sustainable improvement. This approach allowed practices more latitude and imposed less structure on the change process. In several situations, this led to time and effort being spent on activities with laudable goals that were nonetheless somewhat tangential to the objectives of the project and, in some cases, ineffectively approached and poorly executed. This may help explain why RAP practices with higher Change Culture scores had less improvement.

Congruent with our findings, another study using the RAP approach to improve adherence to clinical guidelines in 25 primary care practices reported that successful RAP teams often initially addressed “unacknowledged tensions,” a process that resulted in improved communication in 12 practices,26 although primary outcomes did not improve significantly.44 A more focused and structured approach (with tailoring for individual practice culture, circumstances, and needs)—at least over the duration of our project—produced stronger improvement in diabetes quality measures than a less structured and more open-ended approach.

Importantly, these findings suggest that the relationship between practice culture and successful practice transformation efforts is complex and that further study is needed, with close attention to practice relationships and dynamics in light of the nature of the intervention and desired outcomes.

Variability in perceived practice culture among clinicians and staff within practices was also predictive in this study. Variability may indicate differences between line staff and practice leaders in perceptions of practice culture (as could be seen in hierarchical practices), or it could reflect disagreement among clinicians; this level of detail could not be analyzed in the present study. Methodologically, it is important to consider absolute levels (means) as well as variability (standard deviations) in responses to this type of practice culture assessment as potential contextual factors that may moderate intervention effects in future studies.

Several limitations are worthy of note. Although large for a cluster randomized trial, practice sample size was too small for a full exploration of complex interactions among practice characteristics. Targeting practices that are members of research networks may introduce selection bias and limit generalizability, as these practices are more likely to be early adopters of quality improvement innovations, although selection bias should be similar across study arms. Additionally, practice trajectories in response to the intervention were more homogeneous in the RAP arm, with less improvement overall, thus limiting our ability to detect differences by practice characteristics.4 Practice Culture Assessment data were from self-reports of clinicians and staff members and could have been subject to bias. In addition, analytic approaches are vulnerable to the usual issues of possible spurious findings, as well as the potential for missing important relationships due to low power for interactions. Finally, there are likely other factors that might have impacted intervention effectiveness at the patient or practice level that were not assessed in this study (e.g., literacy, engagement, distress, environmental support).

In conclusion, the results of this study offer some insight into which practices do better with certain practice transformation efforts and under what circumstances. Our results support findings from other studies in which practices with organizational attributes that support greater change capacity and better work relationships were likely to be better prepared to successfully implement practice transformation interventions.14,16,17 We need to more fully understand the role of practice context/culture—including variability in perceptions—to enhance practice transformation efforts. Our findings have major implications for PCMHs and other major primary care practice transformation initiatives and the methods used to evaluate the success of such initiatives.

Notes

Acknowledgments

Support

Funding for this work was supported by the National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK DF067083) and the National Institute of Mental Health (NIMH MH069809-04).

Prior Presentations

Some of these data were presented at the North American Primary Care Research Group meeting in New Orleans in November 2012.

Conflict of Interest

L. Miriam Dickinson has no conflict of interest.

W. Perry Dickinson has no conflict of interest.

Paul A. Nutting is employed by the Center for Research Strategies, a research and evaluation business in Denver.

Lawrence Fisher has no conflict of interest.

Marjie Harbrecht has no conflict of interest.

Benjamin F. Crabtree has no conflict of interest.

Russell E. Glasgow has no conflict of interest.

David R. West has no conflict of interest.

REFERENCES

  1. U.S. Department of Health and Human Services. National Ambulatory Medical Care Survey. http://www.cdc.gov/nchs/data/ad/ad346.pdf. 2004. Accessed 11/13/14.
  2. Wagner EH, Austin BT, Von Korff M. Improving outcomes in chronic illness. Manage Care Q. 1996;4:12–25.
  3. Bodenheimer T, Wagner EH, Grumbach K. Improving primary care for patients with chronic illness. JAMA. 2002;288(14):1775–1779.
  4. Dickinson WP, Dickinson LM, Nutting PA, Emsermann C, Tutt B, Crabtree BF, Fisher L, Harbrecht M, Gottsman A, West DR. Practice facilitation to improve diabetes care in primary care: a report from the EPIC randomized clinical trial. Ann Fam Med. 2014;12(1):8–16.
  5. Nutting PA, Dickinson WP, Dickinson LM, Nelson CC, King DK, Crabtree BF, Glasgow RE. Use of chronic care model elements is associated with higher-quality care for diabetes. Ann Fam Med. 2007;5(1):14–20.
  6. Parchman ML, Noel PH, Culler SD, Lanham HJ, Leykum LK, Romero RL, Palmer RF. A randomized trial of practice facilitation to improve the delivery of chronic illness care in primary care: initial and sustained effects. Implement Sci. 2013;8:93.
  7. Stellefson M, Dipnarine K, Stopka C. The chronic care model and diabetes management in US primary care settings. Prev Chronic Dis. 2013;10:E26.
  8. Coleman K, Austin BT, Brach C, Wagner EH. Evidence on the chronic care model in the new millennium. Health Aff. 2009;28(1):75–85.
  9. Shaw EK, Howard J, West DR, Crabtree BF, Nease DE, Tutt B, Nutting PA. The role of the champion in primary care change efforts. J Am Board Fam Med. 2012;25(5):676–685.
  10. Crabtree BF, Nutting PA, Miller WL, McDaniel RR, Stange KC, Jaen CR, Stewart EE. Primary care practice transformation is hard work: insights from a 15-year developmental program of research. Med Care. 2011;49:S28–S35.
  11. Danz MS, Hempel S, Lim YW, Shanman R, Motala A, Stockdale S, Shekelle P, Rubenstein L. Incorporating evidence review into quality improvement: meeting the needs of innovators. BMJ Qual Saf. 2013;22:931–939.
  12. Dickinson LM, Dickinson WP, Rost K, deGruy F, Emsermann C, Forshaug D, Nutting PA, Meredith L. Clinician burden and depression treatment: disentangling patient- and clinician-level effects of medical comorbidity. J Gen Intern Med. 2008;23(11):1763–1769.
  13. Li R, Simon J, Bodenheimer T, Gillies RR, Casalino L, Schmittdiel J, Shortell SM. Organizational factors affecting the adoption of diabetes care management processes in physician organizations. Diabetes Care. 2004;27(10):2312–2316.
  14. Noel PH, Lanham HJ, Palmer RF, Leykum LK, Parchman ML. The importance of relational coordination and reciprocal learning for chronic illness care within primary care teams. Health Care Manag Rev. 2013;38(1):20–28.
  15. Kaissi A, Kralewski J, Curoe A, Dowd B, Silversmith J. How does the culture of medical group practices influence the types of programs used to assure quality of care? Health Care Manag Rev. 2004;29(2):129–138.
  16. Bosch M, Dijkstra R, Wensing M, van der Weijden T, Grol R. Organizational culture, team climate and diabetes care in small office-based practices. BMC Health Serv Res. 2008;8:180.
  17. Nembhard IM, Singer SJ, Shortell SM, Rittenhouse D, Casalino LP. The cultural complexity of medical groups. Health Care Manag Rev. 2012;37(3):200–213.
  18. Shortell SM, Gillies R, Siddique J, Casalino LP, Rittenhouse D, Robinson JC, McCurdy RD. Improving chronic illness care: a longitudinal cohort analysis of large physician organizations. Med Care. 2009;47(9):932–939.
  19. Damschroder LJ, Aron DC, Keith RE, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.
  20. Alexander JA, Hearld LR. Methods and metrics challenges of delivery systems research. Implement Sci. 2012;7:15.
  21. Kraemer HC, Wilson GT, Fairburn CG, Agras WS. Mediators and moderators of treatment effects in randomized clinical trials. Arch Gen Psychiatry. 2002;59:877–883.
  22. Bryk AS, Raudenbush SW. Hierarchical Linear Models: Applications and Data Analysis Methods. 2nd ed. Newbury Park: Sage Publications; 2000.
  23. Hedeker D, Gibbons R. Longitudinal Data Analysis. Hoboken, New Jersey: Wiley & Sons; 2006:69–76.
  24. Stroebel CK, McDaniel RR, Crabtree BF, Miller WL, Nutting PA, Stange KC. How complexity science can inform a reflective process for improvement in primary care practices. Jt Comm J Qual Patient Saf. 2005;31:438–446.
  25. Miller WL, Crabtree BF, McDaniel R, Stange KC. Understanding change in primary care practice using complexity theory. J Fam Pract. 1998;46(5):369–376.
  26. 26.
    Balasubramanian BA, Chase SM, Nutting PA, Cohen DJ, Strickland PAO, Crosson JC, Miller WL, Crabtree BF. Using learning teams for reflective adaptation (ULTRA): insights from a team-based change management strategy in primary care. Ann Fam Med. 2010;8:425–432.CrossRefPubMedCentralPubMedGoogle Scholar
  27. 27.
    Institute for Healthcare Improvement. How to Improve. Available at: http://www.ihi.org/resources/Pages/HowtoImprove/default.aspx. Accessed 11/13/14.
  28. 28.
    Improving Chronic Illness Care. The Chronic Care Model. Available at: http://www.improvingchroniccare.org. Accessed 11/13/14.
  29. 29.
    Batalden PB, Stoltz PK. A framework for the continual improvement of health care: building and applying professional and improvement knowledge to test changes in daily work. Jt Comm J Qual Improv. 1993;19(10):424–447. discussion 448–452.PubMedGoogle Scholar
  30. 30.
    Berwick DM, Godfrey AB, Roessner J. Curing health care: new strategies for quality improvement. 1st ed. San Francisco: Jossey-Bass; 1990.Google Scholar
  31. 31.
    Shortell SM, O'Brien JL, Carman JM, et al. Assessing the impact of continuous quality improvement/total quality management: concept versus implementation. Health Serv Res. 1995;30(2):377–401.PubMedCentralPubMedGoogle Scholar
  32. 32.
    American Diabetes Association website: http://www.ncqa.org/tabid/139/Default.aspx. Last accessed 11/13/14.
  33. 33.
    Crosson JC, Ohman-Strickland PA, Cohen DJ, Clark EC, Crabtree BF. Typical electronic health in primary care practices and the quality of diabetes care. Ann Fam Med. 2012;10:221–227.CrossRefPubMedCentralPubMedGoogle Scholar
  34. 34.
    Carrier ER, Schneider E, Hoangmai H, Bach PB. Association between quality of care and the sociodemographic composition of physicians’ patient panels: A repeat cross-sectional analysis. J Gen Intern Med. 2011;26(9):987–94.CrossRefPubMedCentralPubMedGoogle Scholar
  35. 35.
    Bailey SR, O’Malley JP, Gold R, Heintzman J, Likumahuwa S, DeVoe JE. Diabetes care quality is highly correlated with patient panel characteristics. J Am Board Fam Med. 2013;26:669–679.CrossRefPubMedCentralPubMedGoogle Scholar
  36. 36.
    Russell GM, Dahrouge S, Hogg W, Geneau R, Muldoon L, Tuna M. Managing chronic disease in Ontario primary care: The impact of organizational factors. Ann Fam Med. 2009;7:309–318.CrossRefPubMedCentralPubMedGoogle Scholar
  37. 37.
    Damberg CL, Shortell SM, Raube K, Gillies RR, Rittenhouse D, McCurdy RK, Casaline LP, Adams J. Relationship between quality improvement processes and clinical performance. Am J Manage Care. 2010;16(8):601–606.Google Scholar
  38. 38.
    Beaulier MD, Haggerty J, Tousignant P, Barnsley J, Hogg W, Geneau R, Hudon E, Duplain R, Denis JL, Bonin L, del Grande C, Dragieva N. Characteristics of primary care practices associated with high quality of care. CMAJ. 2013;185(12):ES90–ES96.Google Scholar
  39. 39.
    Vamos EP, Pape UJ, Bottle A, Hamilton FL, Curcin V, Ng A, Molokhia M, Car J, Majeed A, Millett C. Association of practice size and pay-for-performance incentives with the quality of diabetes management in primary care. Can Med Assoc J. 2011;183(12):E809–E815.CrossRefGoogle Scholar
  40. 40.
    Miller WL, Crabtree BF, Nutting PA, Stange KC, Jaen CR. Primary Care practice development: A relationship-centered approach. Ann Fam Med. 2010;8:S68–S79.CrossRefPubMedCentralPubMedGoogle Scholar
  41. 41.
    Lanham HJ, McDaniel RR, Miller WL, Stange KC, Tallia AF, Nutting PA. How improving practice relationships among clinicians and nonclinicians can improve quality in primary care. Joint Comm J Qual Patient Saf. 2009;35(9):457–466.Google Scholar
  42. 42.
    Shortell SM, Marsteller JA, Lin M, Pearson ML, Wu SY, Mendel P, Cretin H, Rosen M. The role of perceived team effectiveness in improving chronic illness care. Med Care. 2004;42(11):1040–1048.CrossRefPubMedGoogle Scholar
  43. 43.
    Nutting PA, Crabtree BF, McDaniel RR. Small primary care practices face four hurdles – including a physician-centric mind-set –- in becoming medical homes. Health Aff. 2012;31(11):2417–2422.CrossRefGoogle Scholar
  44. 44.
    Balasubramanian BA, Ohman Strickland PA, Crabtree BF. Using learning teams for reflective adaptation: results from a quality improvement intervention to improve adherence to guidelines for diabetes and hypertension in primary care practices. In: North American Primary Care Research Group; October 23–27, 2007. Vancouver, BC: Family Medicine; 2008;40.Google Scholar

Copyright information

© Society of General Internal Medicine 2014

Authors and Affiliations

  • L. Miriam Dickinson (1)
  • W. Perry Dickinson (1)
  • Paul A. Nutting (2)
  • Lawrence Fisher (3)
  • Marjie Harbrecht (4)
  • Benjamin F. Crabtree (5)
  • Russell E. Glasgow (1)
  • David R. West (1)

  1. Department of Family Medicine, University of Colorado School of Medicine, Aurora, USA
  2. Center for Research Strategies, Denver, USA
  3. Department of Family and Community Medicine, University of California, San Francisco, San Francisco, USA
  4. HealthTeamWorks, Lakewood, USA
  5. Department of Family Medicine & Community Health, Robert Wood Johnson Medical School, The State University of New Jersey, New Brunswick, USA