INTRODUCTION

Although the fundamental principles of practice-based learning and improvement (PBLI) have been discussed for decades,1 the inclusion of PBLI as 1 of 6 core competencies specified by the Accreditation Council for Graduate Medical Education (ACGME) has sparked attention within residency programs across the United States. PBLI competence includes the ability to “analyze practice experience and perform practice-based improvement activities...” and “obtain and use information about their own population of patients...”2 The potential value of data describing the clinical practice experiences of trainees is clear. In describing the elements necessary to facilitate PBLI training, Manning includes “a database making it possible to study practice” and “the opportunity to discuss practice data with colleagues”.3 If competence is a “habit”,4,5 then use of such databases, coupled with multiple occasions to discuss the findings, will be an important component of a PBLI curriculum. Ogrinc6 notes that access to health data for setting quality improvement priorities facilitates PBLI teaching. Most clinical information systems, however, are optimized to access individual patient records; although it may be possible to use them to “study practice”, it is not easy.

We previously conducted a needs assessment and created a Web-based prototype with mock population-based reports describing residents’ panel patients.7 In this paper, we describe the development, implementation, and evaluation of an innovative tool to facilitate resident PBLI learning by providing aggregate views of ambulatory clinical practice experiences.

AIMS

Our project included 3 aims: (1) develop an information system to support PBLI training by providing aggregate patient data to residents, (2) introduce this tool into the ambulatory curriculum of our Internal Medicine residency program, and (3) evaluate the tool’s usefulness and impact on short-term learning.

SETTING

UMA Clinic

The Internal Medicine (IM) residency program at the University of Virginia Health System (UVaHS) provides training for approximately 90 residents, each of whom cares for a panel of 50–80 ambulatory patients. The University Medical Associates (UMA) clinic serves as the residents’ primary care practice site; resident clinics are organized into firms, each with a dedicated attending physician.

Clinical Data Repository

The Clinical Data Repository (CDR) at the UVaHS is a Web-enabled, enterprise-wide data warehouse containing records on more than 900,000 patients and 12 million inpatient and outpatient encounters.8 The CDR integrates administrative (e.g., financials, claims) and clinical (e.g., laboratory results, medications) data from multiple UVaHS information systems, enabling authorized users to directly query the database using a locally developed interface. Patient identifiers are stored on a highly secure server requiring 2-factor authentication, and a separate server stores de-identified data. Although the CDR was created to facilitate clinical investigation, we have expanded its use to support quality assessment and education.9,10

PROGRAM DESCRIPTION

SPARC Development

The Systems and Practice Analysis for Resident Competencies (SPARC) application is a Web-based information system that provides de-identified, population-based reports on residents’ patient panels. Our original specification was based on a needs assessment conducted with 14 Internal Medicine residents and several faculty members. Housestaff identified a variety of information that would be helpful for evaluating their clinical experiences, including demographic information for their patient panels and population-based disease screening and management reports with information about compliance with therapy.7 For SPARC, a physician informaticist (JL) and programmer (KS) worked with 2 IM faculty members (JV, MN) serving as domain experts to create the measurement reports and build the interface for a fully operational information system. We obtained periodic feedback from a larger group of stakeholders as needed.

Although the CDR contained most of the required data, we drew on additional sources. Panel membership data linking patients to residents were obtained from the clinic scheduling system, and we supplemented the CDR with a local diabetes registry managed within the resident clinic: a Microsoft Access database containing clinically rich data, including blood pressure and body mass measurements.
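To illustrate the linkage step, the join below is a minimal SQL sketch; the table and column names (scheduling_panel, diabetes_registry, patient_key) are hypothetical stand-ins for illustration, not SPARC’s actual schema.

```sql
-- Hypothetical sketch of the panel-linkage step; table and column
-- names are illustrative, not SPARC's actual schema.
-- Assign each de-identified patient to a resident's panel using the
-- scheduling-system extract, attaching registry vitals where present.
SELECT s.resident_id,
       s.patient_key,          -- de-identified surrogate key
       r.last_systolic_bp,
       r.last_diastolic_bp,
       r.body_mass_index
FROM   scheduling_panel   AS s
LEFT JOIN diabetes_registry AS r
       ON r.patient_key = s.patient_key;
```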

System development took 6 months and required 0.3 full-time equivalents (FTE) of programmer time, 0.05 FTE of clinical faculty time, and 0.2 FTE of physician informaticist time. Development costs were low because we were able to use hardware and software already available through the CDR. Open-source tools were used extensively in developing SPARC: data were linked and transformed with Perl and SQL and stored in a MySQL database on Linux machines running the Tomcat application server. Servlets were written in Java, and JavaScript-enhanced Hypertext Markup Language (HTML) pages were generated dynamically from Velocity templates.
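As a further sketch of the storage layer, de-identified measure results could be staged in MySQL roughly as follows; again, the schema is an assumption for illustration, not SPARC’s published design.

```sql
-- Hypothetical MySQL staging table for quarterly measure results
-- (illustrative only).
CREATE TABLE measure_result (
    resident_id  INT         NOT NULL,
    measure_code VARCHAR(32) NOT NULL,  -- e.g., 'BREAST_CA_SCREEN'
    quarter_end  DATE        NOT NULL,  -- supports the quarterly trend views
    numerator    INT         NOT NULL,  -- panel patients meeting the criteria
    denominator  INT         NOT NULL,  -- eligible panel patients
    PRIMARY KEY (resident_id, measure_code, quarter_end)
);
```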

The SPARC Tool

SPARC reports are divided into 3 categories: (1) aggregate reports of demographic and clinical patient characteristics; (2) preventive medicine reports on immunizations, cholesterol testing, and breast, colorectal, and cervical cancer screening; and (3) disease management reports for diabetes, congestive heart failure, and chronic obstructive pulmonary disease (see Fig. 1). Residents can compare their data with those of residents in the same firm, residents in the same training year, or all UMA residents and faculty.

Figure 1

Breast Cancer Screening Report. Actual data for 1 disguised resident are shown, displaying the denominator for the measure and the number of patients meeting the numerator criteria. Residents can view historic rates (updated quarterly) for their panel to assess change over time.
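Assuming a staging table like the hypothetical measure_result sketched earlier, a Figure 1-style report could be generated by a query of roughly this form; the resident ID, measure code, and date are placeholders.

```sql
-- Hypothetical query behind a Figure 1-style report: one resident's
-- screening rate alongside the all-UMA comparison rate for the quarter.
SELECT m.resident_id,
       m.numerator,
       m.denominator,
       100.0 * m.numerator / m.denominator           AS resident_pct,
       100.0 * SUM(a.numerator) / SUM(a.denominator) AS uma_pct
FROM   measure_result AS m
JOIN   measure_result AS a
       ON  a.measure_code = m.measure_code
      AND  a.quarter_end  = m.quarter_end
WHERE  m.resident_id  = 42                    -- requesting resident
  AND  m.measure_code = 'BREAST_CA_SCREEN'
  AND  m.quarter_end  = '2006-06-30'
GROUP  BY m.resident_id, m.numerator, m.denominator;
```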

SPARC includes reports of the diagnoses most commonly coded for each panel, based on ICD9-CM codes drawn from the CDR. To facilitate query and reporting in the CDR, we supplement the ICD9-CM hierarchy with the Clinical Classifications Software (CCS) categories.11 For SPARC, we needed a classification with fewer categories to allow easier comparison between residents; by elimination and aggregation, we reduced the number of CCS categories from 250 to 139. Users can “drill down” to the individual ICD9-CM codes that make up each category.
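The category roll-up and drill-down might look like the following, assuming a hypothetical icd9_to_ccs crosswalk table holding the reduced set of 139 category labels and a panel_diagnosis table of coded diagnoses; all names here are illustrative.

```sql
-- Hypothetical roll-up of a resident's panel diagnoses into the
-- reduced CCS categories (table names are illustrative).
SELECT c.ccs_category,
       COUNT(*) AS diagnosis_count
FROM   panel_diagnosis AS d
JOIN   icd9_to_ccs     AS c ON c.icd9_code = d.icd9_code
WHERE  d.resident_id = 42
GROUP  BY c.ccs_category
ORDER  BY diagnosis_count DESC;

-- "Drill down": the individual ICD9-CM codes behind one category.
SELECT d.icd9_code,
       COUNT(*) AS n
FROM   panel_diagnosis AS d
JOIN   icd9_to_ccs     AS c ON c.icd9_code = d.icd9_code
WHERE  d.resident_id = 42
  AND  c.ccs_category = 'Diabetes mellitus without complication'
GROUP  BY d.icd9_code
ORDER  BY n DESC;
```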

We provide detailed definitions for all measures in the tool to facilitate interpretation. Because SPARC’s focus is to support PBLI learning rather than patient care, we display only de-identified data. If needed (e.g., for a focused chart review), residents may request identifiers through a separate, manual CDR process that provides them for quality improvement initiatives.

SPARC data are limited to events that occurred within UVaHS and are typically 2–3 months out of date, primarily because of the lag time for data transfer to the CDR. Some desired data, including prescription-related information, were unavailable within our institution.

Use in the Curriculum

As a required small-group ambulatory block activity during the first 2 training years, residents complete an 11-session systems-based practice and PBLI seminar series of graduated difficulty. In 3 PGY-1 seminars, residents learn the principles of quality improvement, systems thinking, and human factors analysis. In 8 seminars spread over the PGY-2 year, residents complete an error investigation, conduct root cause analyses, discuss health policy, practice billing, and participate in a clinical health economics simulation.12 As the final curriculum experience, learners use SPARC to conduct a population-based analysis and design a quality improvement initiative. In this multipart exercise, faculty members introduce the principles of population-based medicine, and the SPARC project director, an informatics-trained physician, demonstrates the tool. Trainees then use SPARC to analyze their practice data and identify specific strengths and weaknesses of their ambulatory practices. Based on this analysis, they prepare a process flow map and root cause analysis of a self-identified deficiency and develop a formal Quality Improvement (QI) proposal. The projects culminate in a presentation to peers and a written 4- to 5-page report. In this first SPARC year, third-year residents, for whom the tool had not been available as second-year residents, also performed the complete exercise described above.

PROGRAM EVALUATION

Methods

To assess SPARC’s usefulness, residents received an anonymous Web-based survey 2 weeks after completing the module; the survey used Likert-scale and open-ended questions to measure ease of use and the system’s utility as a teaching tool and for quality improvement. To assess the impact on PBLI learning, our analysis of SPARC is primarily contained within pre-, mid-, and postcurriculum assessments of resident competency, including a 33-item knowledge assessment and short-answer QI scenarios. The complete assessment package, including the SPARC subcomponent, is currently undergoing final-stage validation. We conducted an initial analysis of short-term learning from SPARC using an 8-item, 5-point Likert-scale questionnaire targeting self-rated competence in specific PBLI tasks, administered before and after the module. Pre-post differences were analyzed with the distribution-free sign test. Study completers were compared with noncompleters with respect to gender and year of training (Pearson’s chi-square test) and pretest scores (Mann–Whitney test).
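For reference, the sign test takes the following standard form: among the n non-tied pre-post pairs, if S pairs show an increase, then under the null hypothesis S follows a Binomial(n, 1/2) distribution, and the two-sided p value is

$$p = \min\left\{1,\ 2\min\left(\sum_{k=0}^{S}\binom{n}{k}\left(\tfrac{1}{2}\right)^{n},\ \sum_{k=S}^{n}\binom{n}{k}\left(\tfrac{1}{2}\right)^{n}\right)\right\}.$$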

RESULTS

During the 6-month intervention period, 51 of 61 second- and third-year residents participated in the curriculum. Thirty-three residents (65%) completed the usefulness survey. Twenty-six residents (51%) completed both pre- and postquestionnaires; the remaining 25 trainees (noncompleters) submitted only the prequestionnaire.

Usefulness Survey

The 33 respondents included 16 second-year and 17 third-year residents. Most respondents (94%) agreed that SPARC was a useful tool for learning about PBLI and that it would help them improve their patient care. Only 46% of respondents believed the data in SPARC “accurately reflect my practice”. Respondents described several perceived strengths of the system: “eye-opening data documenting exactly where your practice is lacking” and “Ability to compare one’s practice with those of one’s peers...” They also identified weaknesses: “It was difficult to know how to interpret data for smaller numbers of patients...” and “It assumes people get all their medical care at UVA and look at performance based on that.”

Impact on Learning

A comparison of pre- vs. postscores showed statistically significant (p < .001) increases on all 8 items of the instrument. Mean scores with standard deviations are shown in Table 1. Noncompleters did not differ from completers with regard to gender (p = .918, Pearson’s chi-square), year of training (p = .877, Pearson’s chi-square), or mean responses to the pretest questions (p = .312, Mann–Whitney).

Table 1 Impact on PBLI Learning, Pre vs. Post (n = 26)

DISCUSSION

We developed and implemented SPARC, a Web-based tool with which residents evaluate their outpatient continuity practice experiences. Data from our enterprise-wide CDR were supplemented with local data sources that link patients to residents and provide detailed clinical data for diabetic patients.

Residents found SPARC useful for learning about PBLI and believed it would improve the care they provide to their patients. They also reported increased confidence in their ability to conduct PBLI tasks. At the same time, they expressed skepticism about the accuracy of the data, recognizing the limitations of information gathered for administrative purposes. Although our results reached statistical significance, interpretation is limited by self-assessment bias, the substantial proportion of noncompleters, and the lack of a control group. Most noncompleters were unable to complete the study because of scheduling conflicts. Our evaluation is also limited by the lack of long-term outcome data on resident knowledge and performance, which will be the focus of further study.

We focused our outcomes on resident learning rather than clinical outcomes for several reasons. The ultimate aim of PBLI is improved patient outcomes, but validly attributing patient prevention or chronic illness outcomes to an individual resident’s work is confounded by a variety of issues. These include the validity of drawing conclusions from small resident panels and the time lags inherent in measuring an individual resident’s performance within a 3-year residency: PGY-1 residents’ outcomes may be contaminated by the prior resident’s contribution, whereas advanced residents may be close to program completion by the time they have accumulated enough validly attributable data, too late to use the information to effect change in many patient outcomes of interest. Future studies focused on short-duration medical illness outcomes may remedy these limitations. Despite these problems, ambulatory profiling may offer one of the best opportunities for individual attribution, because episodes of care can be linked to a specific provider, unlike the current team-based inpatient setting, where no single resident can be held solely accountable.

Our findings may be especially useful for organizations that have existing clinical data warehouses or are seeking to justify their adoption. We used widely available electronic data combined with no-cost, open-source tools, increasing the likelihood that our approach can be adopted by others. Given mounting pressure from professional societies, payors, and educational accreditation groups, residents need to acquire the skills and knowledge to regularly analyze and act upon their practice data. We believe tools like SPARC may be an important component of a PBLI curriculum that directly engages residents and helps them acquire these important skills.