Journal of General Internal Medicine, Volume 29, Issue 5, pp 715–722

The Association of Hospital Characteristics and Quality Improvement Activities in Inpatient Medical Services


  • J. D. Restuccia
    • Center for Organizational Leadership and Management Research (COLMR), Boston VA Healthcare System
    • Boston University School of Management
  • David Mohr
    • Center for Organizational Leadership and Management Research (COLMR), Boston VA Healthcare System
  • Mark Meterko
    • Center for Organizational Leadership and Management Research (COLMR), Boston VA Healthcare System
    • Boston University School of Public Health
  • Kelly Stolzmann
    • Center for Organizational Leadership and Management Research (COLMR), Boston VA Healthcare System
  • Peter Kaboli
    • The Comprehensive Access and Delivery Research and Evaluation (CADRE) Center, Iowa City VA Healthcare System
    • Division of General Internal Medicine, Department of Internal Medicine, University of Iowa Carver College of Medicine
Original Research

DOI: 10.1007/s11606-013-2759-8

Cite this article as:
Restuccia, J.D., Mohr, D., Meterko, M. et al. J GEN INTERN MED (2014) 29: 715. doi:10.1007/s11606-013-2759-8



Background

Quality of U.S. health care has been the focus of increasing attention, with deficiencies in patient care well recognized and documented. However, relatively little is known about the extent to which hospitals engage in quality improvement activities (QIAs) or factors influencing extent of QIAs.


Objectives

To identify 1) the extent of QIAs in Veterans Health Administration (VA) inpatient medical services; and 2) factors associated with widespread adoption of QIAs, in particular use of hospitalists, non-physician providers, and extent of goal alignment between the inpatient service and senior managers on commitment to quality.


Design

Cross-sectional, descriptive study of QIAs using a survey administered to Chiefs of Medicine (COM) at all 124 VA acute care hospitals. We conducted hierarchical regression, regressing QIA use on facility contextual variables, followed by use of hospitalists, non-physician providers, and goal alignment/quality commitment.


Main Measures

Outcome measures pertained to use of a set of 27 QIAs overall and to three dimensions identified among the 27 QIAs by factor analysis: infrastructure, prevention, and information gathering.


Key Results

Survey response rate was 90 % (111/124). Goal alignment/quality commitment was associated with more widespread use of all four QIA categories [infrastructure (b = 0.42; p < 0.001); prevention (b = 0.24; p < 0.001); information gathering (b = 0.28; p < 0.001); and overall QIA (b = 0.31; p < 0.001)], as was greater use of hospitalists [infrastructure (b = 0.55; p = 0.03); prevention (b = 0.61; p < 0.001); information gathering (b = 0.75; p = 0.01); and overall QIAs (b = 0.61; p < 0.001)]; higher occupancy rate was associated with greater use of infrastructure QIAs (b = 1.05, p = 0.02). Non-physician provider use, hospital size, university affiliation, and geographic region were not associated with QIAs.


Conclusions

As hospitals respond to changes in healthcare (e.g., pay for performance, accountable care organizations), this study suggests that practices such as use of hospitalists and leadership focus on goal alignment/quality commitment may lead to greater implementation of QIAs.


Key Words: quality improvement; hospital medicine; inpatient medicine; veterans


Introduction

Quality of U.S. health care has been the focus of increasing attention, with deficiencies in patient care well recognized and documented.1-6 The 2001 Institute of Medicine (IOM) report, Crossing the Quality Chasm, called for health care system reform to achieve the aims of safe, effective, efficient, timely, patient-centered, and equitable health care.3 In response, public and private organizations developed and implemented broad-based quality improvement (QI) initiatives, particularly within hospitals.7-10 In 2002, a consortium of organizations advocated developing and maintaining a national public database containing condition-specific measures of hospital performance to serve as indicators of hospital quality.11 The Agency for Healthcare Research and Quality (AHRQ) released the annual National Healthcare Quality Reports and launched an effort to develop reliable patient safety indicators.11,12 Beginning in the 1990s, the Veterans Health Administration (VA) also began a QI initiative that in many ways was a precursor of reforms recommended by the IOM.13

Despite a dramatic increase in attention and activity devoted to QI and patient safety after the release of the IOM report, progress towards a redesigned system has been slow and insufficient.12,14,15 In addition, relatively little is known about the extent to which hospitals are engaging in QI activities (QIAs) and pursuing structural and process changes to improve care delivery. One recent review highlighted the adoption of two specific QI approaches, Lean and Six Sigma, and reported that although both approaches can aid hospitals in addressing a variety of acute care issues, the impact was difficult to judge, with little evidence showing sustainability.16 Further, in a survey of 470 hospital quality managers, Cohen et al. found that many were actively engaged in improvement efforts, but these activities varied in extent, method, and impact. Hospitals with higher levels of perceived quality, as reflected in assessments by quality managers, were more likely to embrace QI as a strategic priority, and employed QIAs, practices, and strategies intended to improve patient outcomes.17

Beyond use of QI tools, little is known about factors influencing extent of QIAs, such as use of hospitalists and non-physician providers [e.g., nurse practitioners (NPs) and physician assistants (PAs)]. Hospitalists—physicians who specialize in the practice of hospital medicine—were first described in 1996 and the care model has been adopted widely to improve quality and efficiency [e.g., length of stay (LOS)].18,19 The Society of Hospital Medicine has promoted the involvement of hospitalists in general QI work20 and has mentored implementation (e.g., glucose control,21 venous thromboembolism,22 and care transitions23). A study surveying 179 California hospital Chief Executive Officers (CEOs) found that 72 % of hospitalists were involved in QI projects.24 In some hospitals, in particular VA, hospitalist job descriptions include participation in QIAs and physician performance-based compensation is tied to quality measures.25

Studies have supported the premise that hospitalists improve inpatient efficiency without harmful effects on quality or patient satisfaction. A systematic review of 65 articles by White and Glazier found that, in general, hospitalist care was associated with lower LOS and costs. However, based on process measures (26 studies) and clinical outcomes (51 studies), quality of care provided by hospitalists was similar to that by non-hospitalist physicians.26 Thus, there is mixed evidence on whether hospitalists improve quality, and no evidence on whether hospitalists lead to greater efforts to improve quality.

Another practice change for inpatient care is increased use of non-physician providers.27-29 While hospitalists often rotate “on” and “off” service, the consistent presence of non-physician providers on inpatient medical teams may be an important contributor to the extent of QIAs engaged in. Although QIAs by non-physician providers have been reported in surgery30 and their perceived value added to QI has been noted,31 an association between non-physician providers and QIAs has not been explored. Together, adoption of hospitalists and/or non-physician providers reflects the importance of coordination of care13,32 and of interprofessional and multidisciplinary teams33 in providing inpatient care, neither of which has been investigated in great detail in terms of its contribution to QIAs.

Other factors posited to facilitate QIAs are “top management commitment to quality”34 and “goal alignment.”35 Specifically, goal alignment between a medical center’s senior leaders and a clinical service (e.g., medicine) may lead to QIAs due to a commitment to quality through evaluation and/or compensation based on quality measure performance. However, with the exception of the study by Cohen,17 there is little empirical evidence that commitment and alignment increase efforts to improve quality. Thus, the overall objective of this study was to identify the extent of QIAs in VA inpatient medical services and factors associated with more widespread adoption of QIAs; in particular, use of hospitalists, non-physician providers, and extent of goal alignment and commitment to quality of care.


Methods

Study Design and Participants

We conducted a cross-sectional, descriptive study of quality improvement activities (QIAs) using a survey of Chiefs of Medicine (COM) at all 124 Department of Veterans Affairs (VA) acute care hospitals. COMs were selected because they are the physicians best positioned to understand hospital structure and processes of care, while also having insight into systems-level QIAs and informed opinions on senior leadership alignment and commitment. Email invitations were sent between June 2010 and September 2011 with a link to a confidential, electronic web-based survey. Up to four email reminders were sent, following a modified Dillman approach with multiple non-respondent contacts.36


The survey was developed from earlier surveys of hospital QIAs created by the study team, prior internal surveys, and content areas identified as potentially associated with inpatient quality.17,37-39 Survey questions covered four domains: 1) respondent characteristics; 2) inpatient medicine structure and staffing; 3) policies, procedures, and processes; and 4) perceptions of internal processes.

The 27 QIAs contained on the survey were primarily derived from the Quality Improvement Activities Survey developed by Cohen et al.,17 who included QIAs judged as the most important and/or frequent based on adoption of questions from established surveys;37,38 on their experience as evaluation team members for the Robert Wood Johnson Pursuing Perfection program; on consultation with clinicians, managers, and researchers involved in QI; and on the questions regarding QI activities endorsed by the Institute for Healthcare Improvement (IHI) in its 100,000 Lives Campaign.40 The research team on the current study made minor revisions to the list contained in the Cohen et al. article to reflect the QIAs felt to be most important in VA. We asked seven physicians with experience in VA and in quality improvement, and a researcher with expertise in organizational studies, to evaluate the instrument used in this study with regard to whether the QIAs were sufficiently clear, comprehensive in scope, and applicable to VAMCs. Only minor changes were indicated based on this evaluation.


Quality Improvement Activities (QIAs)

For the 27 QIAs, respondents rated the extent of use using a Likert scale ranging from 0 for “not used at all”, to 4 for “used hospital-wide,” as well as an option for “Don’t know/not sure.”

To reduce the number of variables and the attendant inflation in the probability of a chance significant finding, we used a two-step process to identify and score a set of factors. First, exploratory factor analysis (EFA) was used to identify a potentially reduced set of dimensions. Orthogonal rotation using the Varimax algorithm was applied, and squared multiple correlations were placed on the diagonal of the correlation matrix as initial estimates of the communality between items and dimensions. Two rules were applied to decide the number of factors to retain and rotate: the Kaiser-Guttman rule (i.e., retain factors with eigenvalues > 1) and Cattell's scree plot method; four factors were tentatively identified.
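The retention step can be illustrated with a small Python sketch. The data here are entirely synthetic stand-ins for the survey responses (the actual analysis used the 111 COM ratings); the sketch simply applies the Kaiser-Guttman rule to the eigenvalues of the item correlation matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the 27 QIA item ratings: 4 latent dimensions
# drive the items, plus noise (hypothetical data, not the study's).
n_respondents, n_items = 111, 27
latent = rng.normal(size=(n_respondents, 4))
loadings = rng.normal(size=(4, n_items)) * 0.8
items = latent @ loadings + rng.normal(size=(n_respondents, n_items))

# Kaiser-Guttman rule: retain factors whose eigenvalues of the
# item correlation matrix exceed 1.
corr = np.corrcoef(items, rowvar=False)
eigenvalues = np.linalg.eigvalsh(corr)[::-1]  # sorted descending
n_retain = int(np.sum(eigenvalues > 1))
print(n_retain, "factors retained by the Kaiser-Guttman rule")
```

In practice a scree plot of the same eigenvalues is inspected alongside the eigenvalue-greater-than-one count, as the paper describes.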

Second, we applied multi-trait scaling analysis (MTA) to the four-factor model.41,42 Based on the logic of the multitrait-multimethod strategy,43 MTA involves careful examination of the relationships among sets of items hypothesized to represent multiple dimensions. Specifically, correlations are computed for each item both with its own assigned scale and to all other proposed dimensions. The pattern of convergence and discrimination among these correlations provides evidence for the validity of the hypothesized dimensions.

MTA flags instances of probable or definite scaling failures (i.e., items that are, respectively, more or significantly more highly correlated with some other scale(s) than with their initially assigned scale). Based on this empirical evidence, items are then re-assigned to other dimensions or dropped from the analysis so as to improve convergent validity (consistency within dimensions) and discriminant validity (distinctions between dimensions). MTA is then run again using the revised item-to-scale assignments, and the cycle is repeated until no further improvements in validity can be achieved. Cronbach's alpha coefficients are computed as part of MTA and monitored throughout the process of item re-assignment to ensure adequate scale internal consistency reliability for group comparisons (α ≥ 0.70).
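The convergent/discriminant logic of MTA can be sketched as follows. This uses synthetic two-scale data purely for illustration (the actual analysis covered 27 items across four proposed dimensions); each item's corrected item-total correlation with its own scale is compared against its correlation with the competing scale:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 111  # roughly the number of survey respondents

# Hypothetical item responses: two proposed 4-item scales, each driven
# by its own latent dimension (synthetic data for illustration only).
latent_a, latent_b = rng.normal(size=n), rng.normal(size=n)
items_a = np.column_stack([latent_a + rng.normal(size=n) for _ in range(4)])
items_b = np.column_stack([latent_b + rng.normal(size=n) for _ in range(4)])

# MTA-style check: each item should correlate more strongly with its own
# scale (computed without the item itself) than with the other scale.
r_own, r_other = [], []
for j in range(items_a.shape[1]):
    own_score = np.delete(items_a, j, axis=1).mean(axis=1)  # corrected item-total
    other_score = items_b.mean(axis=1)
    r_own.append(np.corrcoef(items_a[:, j], own_score)[0, 1])
    r_other.append(np.corrcoef(items_a[:, j], other_score)[0, 1])
    if r_other[-1] > r_own[-1]:
        print(f"item {j}: probable scaling failure")

print([f"{a:.2f}/{b:.2f}" for a, b in zip(r_own, r_other)])
```

An item flagged as a scaling failure would, per the procedure above, be re-assigned to the scale it correlates with most strongly or dropped.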

We applied MTA to the original four factors identified by EFA, followed by three iterations during which four items were re-assigned to different dimensions. The final set of item-to-scale assignments demonstrated 100 % convergent validity, 75 % significant item discriminant validity, and 100 % overall (significant and non-significant) discriminant validity; internal consistency reliabilities were excellent, ranging from 0.87 to 0.91. We derived scale scores for each of these four dimensions by computing the un-weighted average of the relevant items; only respondents who answered half or more of the items in a given scale received a score for that dimension. Finally, we also computed an overall index of QIA as the un-weighted average of all 27 items. This was done only for those respondents who answered half or more of the items.
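The reliability and scale-scoring rules described above are straightforward to make concrete. The following sketch (hypothetical ratings; the function names are ours) implements Cronbach's alpha and the paper's rule of scoring a scale only when at least half of its items were answered:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

def scale_score(row):
    """Unweighted mean of answered items, computed only when at least half
    of the items were answered (the paper's rule); otherwise NaN."""
    row = np.asarray(row, dtype=float)
    answered = ~np.isnan(row)
    if answered.sum() < len(row) / 2.0:
        return float("nan")
    return float(row[answered].mean())

# Example with hypothetical 0-4 ratings on a 4-item scale.
ratings = np.array([[3, 4, 3, 4],
                    [1, 0, 1, 1],
                    [2, 2, 3, 2]], dtype=float)
print(round(cronbach_alpha(ratings), 2))               # prints 0.95
print(scale_score([3.0, float("nan"), 4.0, float("nan")]))  # half answered -> 3.5
```

The overall QIA index in the paper is the same unweighted-average computation applied to all 27 items.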

Hospitalist use was determined by the question “Does your VAMC use Hospitalists on the inpatient medical service?” We created a dichotomous variable for each of the following three response options: “Yes—Only hospitalists care for medical inpatients;” “Yes—Hospitalists and other attendings care for medical inpatients;” and “No—Hospitalists do not care for medical inpatients.” These categories were based upon prior work that described use of hospitalists in VA.19 Non-physician provider use was based on whether the COM indicated using non-physician providers on the inpatient medicine service.

Senior leadership included a Director, Associate Director for Patient Care, Associate Director for Operations, and Chief of Staff. A scale focusing on “alignment and commitment” among senior leadership was determined using responses to four items: 1) commitment to highest quality of care; 2) clear sense of direction among senior leaders; 3) goal alignment between senior leadership and inpatient medical service; and 4) goal agreement between senior leadership and inpatient medicine service. Responses were rated on a 5-point Likert scale ranging from 1 for “strongly disagree” to 5 for “strongly agree.” A scale score was computed when respondents answered at least two of four questions (α = 0.91).

Models also included contextual variables that could influence the extent of implementation of QIAs, drawn from survey data and other VA administrative sources.11,44-47 Hospital-level variables included number of inpatient medicine service operating beds, occupancy rate (i.e., the percentage of available inpatient medicine service beds occupied), and hospital operating years, as age may influence adoption of QI projects.47 Teaching affiliation was modeled dichotomously from membership in the Council of Teaching Hospitals (COTH). Two variables for geographic characteristics were included: urban or rural area (using hospital zip code) and four geographical regions (using census definitions).

Statistical Analysis

MTA was performed on the set of QIAs, with subsequent examination of correlations among the resulting dimensions for convergent validity. Descriptive statistics of relevant study variables were examined. Ordinary least squares (OLS) regression was used, in which individual QIA composite scale scores were regressed on non-physician provider use, hospitalist use, alignment and commitment, and facility characteristics. We used a four-step hierarchical regression model in which the first step included the set of contextual variables, followed by non-physician provider use, hospitalist use, and alignment and commitment in the second, third, and fourth steps, respectively. After entering the contextual variables, we entered the predictor variables of primary interest in ascending order of their expected contribution to the overall model.

Hierarchical regression allows examination of the increase in the proportion of variance explained (R2 statistic) by the additional variables at each step. The contribution of each subsequent variable and the change in the R2 statistic identify the influence of that variable on the overall model beyond the variables previously entered. Collinearity statistics were assessed. All analyses were performed using SAS Version 9.2 (SAS Institute, Cary, NC). The study was approved by the Institutional Review Boards (IRB) of the Boston and Iowa City VA Healthcare Systems.
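The hierarchical-entry logic can be sketched in Python. The data here are entirely synthetic, and numpy's least-squares routine stands in for the SAS procedure actually used; the point is only to show how the incremental R2 is computed as each block of predictors enters:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 111  # number of responding hospitals

# Hypothetical predictor blocks mirroring the four entry steps
# (synthetic data; variable choices are illustrative, not the study's).
contextual = rng.normal(size=(n, 3))                         # step 1: contextual variables
nonphys = rng.integers(0, 2, size=(n, 1)).astype(float)      # step 2: non-physician providers
hospitalist = rng.integers(0, 2, size=(n, 2)).astype(float)  # step 3: two hospitalist dummies
alignment = rng.normal(size=(n, 1))                          # step 4: alignment/commitment

# Simulated outcome (a QIA scale score) driven mostly by steps 3 and 4.
y = 0.4 * hospitalist[:, 0] + 0.3 * alignment[:, 0] + rng.normal(size=n)

def r_squared(X, y):
    """R^2 from an OLS fit with an intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    tss = (y - y.mean()) @ (y - y.mean())
    return 1.0 - (resid @ resid) / tss

# Enter each block in turn and record the increment in R^2.
X = np.empty((n, 0))
r2_prev, deltas = 0.0, []
for block in [contextual, nonphys, hospitalist, alignment]:
    X = np.column_stack([X, block])
    r2 = r_squared(X, y)
    deltas.append(r2 - r2_prev)
    r2_prev = r2
print([round(d, 3) for d in deltas])
```

Because the models are nested, R2 never decreases from one step to the next, so each delta is non-negative and reflects the variance uniquely attributable to that block given everything entered before it.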


Results

Of the 124 COMs, 118 responded to the survey; seven were excluded for not answering at least half of the items in one or more dimensions. The effective survey response rate was 90 % (111/124), yielding a highly representative sample of the extent of QIAs among VA hospitals. Respondents also completed the survey at a high rate, answering 88.7 % of all core questions. Facility-level characteristics are displayed in Table 1. Four QIA dimensions were initially derived from the MTA process; however, Pearson correlation coefficients suggested overlap between two dimensions, so we combined their items to form a single dimension, for a total of three QIA dimensions (infrastructure, prevention, and information gathering). Table 2 presents the items and their factor structure.
Table 1

Descriptive Characteristics of VHA Hospitals

  Medical operating beds, mean (SD): 57.87 (34.19)
  Occupancy rate, mean (SD): 0.75 (0.19)
  Operating years, mean (SD): 63.1 (15.54)
  Geographic region, n (%): 18 (16 %), 25 (23 %), 45 (41 %), 23 (21 %)
  Location, n (%): urban 91 (82 %); rural 20 (18 %)
  COTH affiliation, n (%): yes 64 (58 %); no 47 (42 %)
  Other provider usea, n (%): yes 47 (42 %); no 64 (58 %)
  Use of hospitalists, n (%):
    Only hospitalists: 36 (32 %)
    Mix of attendings and hospitalists: 59 (53 %)
    No hospitalists: 16 (15 %)
  Alignment and commitment, mean (SD): 2.67 (0.92)
  Quality improvement activities, mean (SD):
    Prevention: 3.38 (0.65)
    Infrastructure: 2.27 (0.80)
    Information gathering: 2.25 (0.81)
    Overall quality improvement activities: 2.69 (0.65)

aOther provider includes nurse practitioners and physician assistants

Table 2

Items Composing the Quality Improvement Activities (QIAs) Dimensions and the Alignment and Commitment Dimension

Infrastructure (k = 8; Use scalea)
  • Chronic disease registries
  • Clinical (improvement) collaboratives
  • Planned care for chronic illness (Wagner's chronic disease model)
  • Shared clinical governance by nurses and physicians
  • Case manager, social worker, or other clinical staff to coordinate or manage patient care
  • Going on multidisciplinary rounds
  • Evidence-based practice guidelines/clinical pathways
  • Disease-specific or condition-specific improvement projects

Prevention (k = 10; Use scalea)
  • Taking actions to prevent central line infections
  • Taking actions to prevent decubitus ulcers
  • Taking actions to prevent surgical site infections
  • Taking actions to prevent ventilator-associated pneumonia
  • Taking actions to prevent falls
  • Order sets
  • Rapid response teams
  • Taking actions to prevent adverse drug events
  • Medication reconciliation
  • Pharmacists placed in patient care units

Information Gathering (k = 9; Use scalea)
  • Work process redesign or re-engineering [e.g., Six Sigma, Lean, or Rapid Process Improvement Workshops (RPIW)]
  • Activities to improve workforce recruitment, retention, and development
  • Benchmarking within the hospital
  • Benchmarking with other hospitals
  • Learning best practices from other industries
  • Patient flow improvement strategies
  • Patient advisory groups
  • Profiling of individual provider performance
  • Management ‘walk-arounds’ to identify quality problems or issues

Goal Alignment/Quality Commitment (k = 4; Agree and Extent scalesa)
  • A clear sense of direction exists among the senior leadership (Quadrad) at this facility.
  • Goals of senior leadership at the facility (Quadrad) level and the inpatient medicine service are aligned.
  • This facility is committed to the highest quality patient care.
  • How much agreement do you perceive between the goals of senior leadership at the facility (Quadrad) level and those of the inpatient medicine service?

aAll items used 5-point Likert-type response scales as follows:

Use: 0 = Not used at all / 1 = Used minimally / 2 = Used moderately / 3 = Used widely / 4 = Used hospital-wide. A “Don’t know/not sure” option was also available; this was scored as missing data.

Agree: 1 = Strongly disagree / 2 = Disagree / 3 = Neither agree nor disagree / 4 = Agree / 5 = Strongly agree.

Extent: 1 = No or almost no agreement / 2 = A little agreement / 3 = Some agreement / 4 = A great deal of agreement / 5 = Complete or almost complete agreement.

The three QIA dimensions were identified as infrastructure, prevention, and information gathering. Infrastructure (k = 8; α = 0.89) involved QIAs focused on internal design activities (e.g., chronic disease registries, governance structures). Prevention (k = 10; α = 0.92) was characterized by activities aimed at reducing negative incidents (e.g., central line infections, surgical site infections), and information gathering (k = 9; α = 0.88) by benchmarking and learning from best practices. The overall QIA dimension consisted of all items and showed high internal consistency (k = 27; α = 0.94). The most frequently used QIAs were those related to prevention (mean = 3.38), followed by infrastructure (mean = 2.27) and information gathering (mean = 2.25) (Table 1). Pearson correlations among the measures of QIAs ranged from r = 0.53 to 0.68.

Initial analyses regressed each of the three QIA dimensions and the overall QIA dimension on the facility contextual variables, followed by each of the three predictor variable types of interest (Table 3). Among the contextual variables, only occupancy rate was significantly associated with greater use of infrastructure QIAs (b = 1.05, p = 0.02). In the second step of the hierarchical regression, parameter estimates for other provider use were negative, ranging from -0.15 to -0.19, but none was significant. In the third step, we entered the two hospitalist use variables; significant positive associations were found between one or both measures and QIA scores. In comparison to facilities not using any hospitalists, facilities that used a mix of hospitalists and non-hospitalist physicians showed significantly higher scores on prevention (b = 0.62; p < 0.001), information gathering (b = 0.63; p < 0.001), and overall QIA (b = 0.51; p < 0.001), but not on infrastructure (b = 0.36; p = 0.12). Further, in comparison to facilities not using any hospitalists on the inpatient medicine service, facilities that used only hospitalists showed a positive association with all four QIA categories [infrastructure (b = 0.55; p = 0.03); prevention (b = 0.61; p < 0.001); information gathering (b = 0.75; p = 0.01); and overall QIAs (b = 0.61; p < 0.001)]. The hospitalist step explained an additional 0.05 to 0.09 proportion of variance.
Table 3

Quality Improvement Activities (QIAs) Regressed on Hospital Characteristics

Parameter estimate (standard error); columns: Infrastructure / Prevention / Information gathering / Overall QIA

Step 1 – Contextual
  Intercept:                 1.43 (0.55)*  / 4.00 (0.44)**  / 2.46 (0.59)**  / 2.67 (0.45)**
  Medical operating beds:    0.00 (0.00)   / 0.00 (0.00)    / 0.00 (0.00)    / 0.00 (0.00)
  Occupancy rate:            1.05 (0.43)*  / -0.55 (0.34)   / 0.32 (0.46)    / 0.25 (0.35)
  Operating years:           0.00 (0.01)   / -0.01 (0.00)   / -0.01 (0.01)   / 0.00 (0.00)
  Northeast region:          -0.06 (0.25)  / -0.10 (0.20)   / -0.26 (0.27)   / -0.14 (0.21)
  Midwest region:            0.21 (0.23)   / -0.02 (0.19)   / 0.12 (0.25)    / 0.12 (0.19)
  Southern region:           0.39 (0.21)   / 0.08 (0.17)    / 0.15 (0.23)    / 0.25 (0.18)
  Urban area:                0.06 (0.22)   / 0.22 (0.18)    / -0.04 (0.24)   / 0.08 (0.18)
  COTH affiliation:          0.30 (0.20)   / 0.30 (0.16)    / 0.17 (0.22)    / 0.30 (0.17)
  R2 step 1

Step 2
  Non-physician providera:   -0.15 (0.16)  / -0.11 (0.13)   / -0.19 (0.17)   / -0.16 (0.13)
  ΔR2 step 2

Step 3b
  Only hospitalists:         0.55 (0.25)*  / 0.61 (0.20)**  / 0.75 (0.28)**  / 0.61 (0.21)**
  Mixed hospitalists:        0.36 (0.23)   / 0.62 (0.18)**  / 0.63 (0.25)*   / 0.51 (0.19)**
  ΔR2 step 3

Step 4
  Alignment and commitment:  0.42 (0.07)** / 0.24 (0.06)**  / 0.28 (0.09)**  / 0.31 (0.06)**
  ΔR2 step 4

aNon-physician provider includes nurse practitioners and physician assistants

bReferent group: hospitals without hospitalists

*p < 0.05; **p < 0.01

In the fourth and final step of the regression model, alignment and commitment was entered. It was significantly associated with higher scores on all QIA measures: infrastructure (b = 0.42; p < 0.001); prevention (b = 0.24; p < 0.001); information gathering (b = 0.28; p < 0.001); and overall QIA (b = 0.31; p < 0.001). Alignment and commitment explained the largest amount of incremental variance among the predictor variables, ranging from 0.08 to 0.21 across the four QIA dimensions.


Discussion

In this study of VA hospitals, we found three major associations with QIAs. First, although most VA hospitals have devoted substantial effort to QIAs associated with “prevention,” fewer QIAs associated with “infrastructure” and “information gathering” were reported. Interestingly, five years earlier, Cohen et al. also found prevention QIAs to be the most frequently used in private sector hospitals.17 It is unclear why prevention QIAs are used more often than infrastructure development or information gathering. One reason may be that most of the indicators of hospital medical quality contained on public websites such as Hospital Compare tend to be services that are preventive in nature. This finding suggests that public reporting of quality indicators may have spurred hospital efforts to improve quality, a view supported in the literature.48,49 Lower rates of adoption of infrastructure (e.g., multidisciplinary rounds, clinical pathways) and information gathering (e.g., benchmarking, patient advisory groups) QIAs may represent opportunities for targeted evidence-based interventions.

A second major finding was that the facility’s commitment to quality and the alignment between senior medical center leadership and inpatient medicine service leadership was the strongest predictor of QIAs in our models. This finding suggests that goal alignment helps enable implementation and widespread use of QI efforts, and it identifies another area with room for improvement in VA hospitals lacking high-level alignment. Use of hospitalists has been viewed as a means of helping to achieve alignment. For example, an article in the Journal of Hospital Medicine concludes, “As our system and its incentives continue to progress toward alignment with value-based high-quality care, hospitalists should lead change and facilitate solutions to transform our health care system to one that provides high-value care to all.”20 We asked COMs that used hospitalists to rate 11 items on how much influence each had on the decision to initiate or expand use of hospitalists; the top three rated items were all related to QI (improving clinical outcomes, coordination of care, and patient satisfaction).

The third major finding was that medical centers that employed hospitalists to provide inpatient medical care were also more involved in QIAs. Hospitalists’ availability on inpatient units, specialization in the provision of inpatient care, and familiarity with the institution’s processes and personnel may facilitate physician involvement in such activities. In contrast, use of non-physician providers had no association with QIAs, perhaps because they are employed primarily to provide patient care rather than for QI. Among the hospitals studied, only nine reported using non-physician providers to champion QI projects. We did not ask the COMs whether hospitalists were used to champion QI projects, but 51 COMs (46 %) did respond that improving clinical outcomes was a “large” or “major” influence on the decision to use hospitalists, and 83 (75 %) said hospitalists had a “positive” or “highly positive” impact on quality of care.


Limitations

This study has a number of limitations. First, the cross-sectional nature of these data can demonstrate only associations, not causality. Second, there was no direct measure of the intensity of QIAs or of when they were initiated; in addition, the extent of QIA use in the hospital was not measured by observation or by means other than the COM survey. Third, although we had a 90 % response rate, the survey depended upon only one key informant, the COM. However, because the COM is not part of the senior leadership, respondents were not in the position of commenting on their own commitment. Fourth, QIAs are largely dependent upon multidisciplinary teams, which were not measured directly and of which hospitalists and non-physician providers are only one component. Fifth, this study was conducted in VA and may not be generalizable to other settings. Lastly, while our models explain a substantial amount of variance in QIAs (with R2 ranging from 0.22 to 0.41), substantial inter-hospital variation remains and the majority of variance is unexplained. We hypothesize that much of it could be due to unmeasured variables such as resource allocation to quality improvement, a culture emphasizing innovation and teamwork,50 and hospital adaptation of VHA’s electronic health information system to support QIAs.51


Conclusions

In this study of the largest integrated healthcare system in the U.S., use of a hospitalist model of care was associated with greater implementation of QIAs across three separate domains: infrastructure, prevention, and information gathering. Further, use of QIAs was relatively higher in hospitals that employed only hospitalists than in hospitals that used a mix of hospitalists and non-hospitalists on the inpatient medicine service. The other key finding was that goal alignment between senior managers and the inpatient medical service was associated with greater use of QIAs, suggesting the benefit of efforts to communicate goals and align them throughout the organization. Future research using a longitudinal study design is needed to better determine causality and to investigate the effects of contextual factors and QIAs on quality performance.


Acknowledgements

The work reported here was supported by the Department of Veterans Affairs, Veterans Health Administration, Health Services Research and Development Service (REA 09-220), the Comprehensive Access & Delivery Research and Evaluation (CADRE) Center at the Iowa City VAMC (HFP 04-149), and the Center for Organizational Leadership and Management Research (COLMR) at the Boston VA Healthcare System (HFP 04-145). We acknowledge and appreciate the intellectual contribution made by Dr. Alan Cohen, Dr. Michael Shwartz, and Jed Horwitt who, along with Dr. Restuccia, developed the Quality Improvement Activities Survey from which many of the questions in our survey were drawn. We also wish to thank Dr. Caitlin Brennan and Dr. James Burgess for their careful review of the manuscript and insightful comments. Views expressed in this article are those of the authors and do not necessarily represent the views of the Department of Veterans Affairs.

Conflict of Interest

The authors declare that they do not have a conflict of interest.

Copyright information

© Society of General Internal Medicine 2014