Source of Data
Claims from the years 2005 to 2009 for 100 % of Texas Medicare beneficiaries were used, including Medicare beneficiary summary files, Medicare Provider Analysis and Review (MedPAR) files, Outpatient Standard Analytical Files (OutSAF), and Medicare Carrier files. Diagnosis related group (DRG)-associated information, including weights, Major Diagnostic Categories (MDC), and geometric mean lengths of stay, was obtained from the Centers for Medicare & Medicaid Services (https://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/index.html).
Identification of Hospitalists
Hospitalists were defined as generalist physicians (general practitioner, family physician, internist or geriatrician) who had at least 100 evaluation-and-management (E&M) billings in a given year and generated at least 90 % of their total E&M billings in that year from inpatient services.17 Inpatient E&M billings were identified by Current Procedural Terminology (CPT) codes 99221-99223, 99231-99233 and 99251-99255; outpatient E&M billings were identified by CPT codes 99201-99205, 99211-99215 and 99241-99245 from Carrier files.17 In sensitivity analyses, we varied the minimum number of E&M billings required for identification of hospitalists, as well as the required percentage of those billings from inpatient services. This had a relatively small effect on the number of hospitalists identified. For example, raising the minimum number of E&M charges from 100 to 200 decreased the number of hospitalists identified from 1,099 to 1,068, while lowering the required percentage of inpatient E&M charges from 90 % to 75 % increased the number from 1,099 to 1,123.
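The classification rule above can be sketched as follows. This is an illustrative reconstruction, not the study's code: the function name, parameter defaults and input format are our assumptions, and per-physician annual billing counts are assumed to have already been tallied from the Carrier files.

```python
# CPT E&M code ranges from the text (Python ranges exclude the upper bound,
# so each stop value is one past the last code in the inclusive CPT range).
INPATIENT_EM = set(range(99221, 99224)) | set(range(99231, 99234)) | set(range(99251, 99256))
OUTPATIENT_EM = set(range(99201, 99206)) | set(range(99211, 99216)) | set(range(99241, 99246))

def is_hospitalist(inpatient_count: int, outpatient_count: int,
                   min_billings: int = 100, inpatient_share: float = 0.90) -> bool:
    """A generalist qualifies as a hospitalist if total E&M volume meets
    min_billings and the inpatient share of billings meets inpatient_share."""
    total = inpatient_count + outpatient_count
    if total < max(min_billings, 1):  # guard against division by zero
        return False
    return inpatient_count / total >= inpatient_share
```

Varying `min_billings` and `inpatient_share` reproduces the sensitivity analyses described above (e.g., 200 billings, or a 75 % inpatient share).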
Establishment of the Study Cohort
This process is outlined in Table 1. From the 2008 and 2009 MedPAR files, we started with all admissions to acute care hospitals in Texas and selected those with a medical DRG. We excluded admissions involving obstetric services, major trauma or intensive care unit (ICU) services; admissions with ICU stays were excluded because the algorithm for identifying hospitalists cannot distinguish regular hospitalists from generalist physicians who work full time as intensive care physicians. We next identified admissions cared for by hospitalists. To do so, we first identified all treating physicians for each hospitalization by linking inpatient E&M billings in the Carrier files to the admission record in the MedPAR files. If all of the E&M billings by generalist physicians for a given admission came from hospitalists, the admission was classified as cared for by hospitalists. Among those admissions, we selected those in which one hospitalist was responsible for > 50 % of all hospitalist charges. For patients with more than one admission in a given year, we randomly selected one admission per patient per year, in order to avoid clustering at the patient level. In additional analyses with 30-day readmission rate as the outcome, we included all admissions for patients with multiple admissions in a year; the results were almost identical. We further excluded patients who were enrolled in health maintenance organizations (HMOs) or did not have continuous Medicare Parts A and B coverage in the 12 months prior to the admission of interest, because such individuals may have incomplete information on covariates (such as comorbidity). This resulted in 138,761 admissions in the initial study cohort. From these, we selected admissions associated with a major hospitalist who cared for at least 30 admissions during the study period, leaving 131,710 admissions and 1,099 hospitalists.
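The two admission-level rules (all generalist E&M billings from hospitalists; one hospitalist responsible for > 50 % of hospitalist charges) can be sketched as below. The input format and function name are hypothetical, chosen only to make the logic concrete.

```python
from collections import defaultdict

def dominant_hospitalist(billings):
    """billings: list of (physician_id, is_generalist, is_hospitalist, charge)
    tuples for one admission. Returns the hospitalist responsible for >50% of
    hospitalist charges, or None if the admission fails the inclusion rules."""
    # Rule 1: every generalist E&M billing must come from a hospitalist.
    if any(is_gen and not is_hosp for _, is_gen, is_hosp, _ in billings):
        return None
    # Sum charges per hospitalist.
    charges = defaultdict(float)
    for pid, _, is_hosp, charge in billings:
        if is_hosp:
            charges[pid] += charge
    total = sum(charges.values())
    if total == 0:
        return None
    # Rule 2: a single hospitalist must account for >50% of hospitalist charges.
    pid, top = max(charges.items(), key=lambda kv: kv[1])
    return pid if top / total > 0.5 else None
```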
Hofer et al.35 have shown that provider-level performance measures achieve a reliability greater than 0.8 for a panel of 100 patients when the intraclass correlation coefficient (ICC) is 0.04. Depending on the particular outcome, additional selection criteria described in the Study Outcomes section were applied. We also built a cohort in the same manner from the 2006 and 2007 MedPAR files, in order to assess the consistency of hospitalists' performance across the two time periods.
We categorized beneficiaries by age, gender and ethnicity using Medicare beneficiary summary files. We used the Medicaid indicator as a proxy for low income. Information on weekday vs. weekend admission, emergent admission, and DRG was obtained from MedPAR files. Elixhauser medical conditions were identified using claims from the MedPAR, Carrier and OutSAF files in the year prior to that of the admission of interest.36 We also assessed whether a patient had a primary care physician (PCP). A PCP was defined as a general practitioner, family physician, internist or geriatrician who saw the patient on three or more occasions in an outpatient setting (CPT E&M codes 99201-99205 and 99211-99215) in the prior year.37 Total hospitalizations and outpatient visits in the prior year were identified from MedPAR files and Carrier files, respectively.
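As one concrete example of covariate construction, the PCP flag defined above can be sketched as follows; the input format is an assumption for illustration.

```python
from collections import Counter

# Outpatient E&M codes used in the PCP definition (inclusive CPT ranges
# 99201-99205 and 99211-99215; Python ranges exclude the stop value).
OUTPATIENT_PCP_CODES = set(range(99201, 99206)) | set(range(99211, 99216))

def has_pcp(prior_year_visits):
    """prior_year_visits: list of (generalist_physician_id, cpt_code) pairs.
    Returns True if any single generalist saw the patient on three or more
    outpatient occasions in the prior year."""
    counts = Counter(pid for pid, cpt in prior_year_visits
                     if cpt in OUTPATIENT_PCP_CODES)
    return any(n >= 3 for n in counts.values())
```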
Study Outcomes
Hospital length of stay was obtained from MedPAR files. For each admission, we calculated a difference in length of stay by subtracting the geometric mean length of stay for that DRG, obtained from the Centers for Medicare & Medicaid Services, from the actual length of stay. This measure intrinsically controls for case mix among hospitalists, because the geometric mean length of stay differs for each DRG. We excluded outliers more than three standard deviations from the mean in order to approximate a normal distribution and analyze the outcome with a hierarchical general linear model, leaving 129,491 admissions and 1,099 hospitalists.
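The outcome construction above, difference from the DRG geometric mean with a three-standard-deviation trim, can be sketched as below; variable names are illustrative, not the study's.

```python
import statistics

def los_differences(admissions, drg_geo_mean):
    """admissions: list of (drg, actual_los) pairs.
    drg_geo_mean: dict mapping DRG -> geometric mean length of stay (CMS).
    Returns per-admission LOS differences with >3-SD outliers excluded."""
    diffs = [los - drg_geo_mean[drg] for drg, los in admissions]
    if not diffs:
        return []
    mu = statistics.mean(diffs)
    sd = statistics.pstdev(diffs)
    # Exclude outliers more than three standard deviations from the mean.
    return [d for d in diffs if abs(d - mu) <= 3 * sd]
```

Because each admission is compared with its own DRG's geometric mean, the resulting differences are comparable across hospitalists with different case mixes.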
Mortality within 30 days of admission was calculated from the date of death in the Medicare beneficiary summary file. These analyses included all 131,710 admissions and 1,099 hospitalists in the cohort. We chose mortality within 30 days of admission, rather than within 30 days of discharge, to avoid bias from differences in hospital length of stay among hospitalists. However, analyses of 30-day post-discharge mortality produced almost identical results.
We calculated the rate of admissions discharged home and the rate discharged to a Skilled Nursing Facility (SNF), obtained from MedPAR files. We excluded patients who died in the hospital, were transferred to another acute care hospital, or had stayed in a nursing facility at any time in the three months prior to the admission of interest, leaving 99,522 admissions and 990 hospitalists.
ER visits were identified by CPT E&M codes 99281-99285 and 99288 from Carrier files. To study readmissions and ER visits within 30 days of discharge, we excluded patients who died in the hospital, were transferred to another acute care hospital, or died within 30 days of discharge without an event (readmission or ER visit), leaving 108,547 admissions and 1,019 hospitalists in the study cohort for 30-day readmissions, and 108,226 admissions and 1,018 hospitalists for 30-day ER visits. Readmissions and ER visits were not mutually exclusive; indeed, most readmissions also involved an ER visit.
Multilevel analyses were used to account for the clustering of patients within hospitalists and of hospitalists within hospitals. For differences in length of stay, a hierarchical general linear model was used; for the other outcomes, we used hierarchical generalized linear models with a binomial distribution. Hospitalist-specific estimates were derived from two-level models adjusted for patient characteristics and then plotted by rank, and from three-level models that also included hospitals. Patient characteristics included age, race/ethnicity, gender, Medicaid eligibility, emergency admission, weekend admission, DRG weight, MDC, Elixhauser medical conditions (29 individual indicators), number of hospitalizations, number of physician visits, and having a PCP in the year prior to the admission of interest. DRG weight was not adjusted for in the model analyzing differences in length of stay, because that outcome is a within-DRG comparison. Because some hospitalists cared for admissions at more than one hospital, in the three-level models we assigned each hospitalist to the hospital in which > 50 % of his or her E&M charges occurred and excluded that hospitalist's admissions to other hospitals. All analyses were performed with SAS version 9.2 (SAS Institute Inc., Cary, NC). The threshold models for the partitioned variances were fit with MLwiN version 2.02.38
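As a sketch of the model structure (our notation, not taken from the study), the three-level model for a binary outcome such as 30-day mortality can be written as:

```latex
% y_{ijk}: outcome for patient i of hospitalist j at hospital k;
% x_{ijk}: the patient characteristics listed above.
\operatorname{logit}\{\Pr(y_{ijk}=1)\}
  = \mathbf{x}_{ijk}^{\top}\boldsymbol{\beta} + u_{jk} + v_{k},
\qquad u_{jk} \sim N(0,\sigma_u^{2}), \quad v_{k} \sim N(0,\sigma_v^{2})
```

Dropping the hospital random effect $v_k$ gives the two-level model used for the rank plots; for the length-of-stay difference, the logit link is replaced by an identity link (a hierarchical general linear model).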