Table 1 lists the roles and note types we defined at our institution. For the initial implementation of the Radiologue system, we chose to create a “wet reading” system. A purpose-built wet reading system already existed elsewhere in our institution, and the radiologists (residents and attendings) were already familiar with the wet reading workflow. On-call residents and fellows use this system to communicate to healthcare providers their impressions of CTs and MRIs performed at night on weekdays and on weekends. The residents enter their “wet readings” on a web form accessible directly from their PACS stations. The form allows the residents to mark the urgency of the note as Critical, Medium Priority, or Low Priority (the default setting). Healthcare providers (residents and attendings in the Emergency Department) access the system directly from the Electronic Medical Record (EMR), which displays the worklist of pending studies for their patient. Healthcare providers open “wet readings” by clicking on color-coded buttons next to the study of interest. Critical results are displayed in bright red to attract attention.
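The urgency scheme described above can be sketched as a small data model. This is only an illustration of the concept; the class, field, and color names (other than bright red for critical results) are our assumptions, not the actual Radiologue implementation.

```python
from enum import Enum

# Illustrative sketch of the wet-reading urgency levels described in
# the text. Identifiers and non-critical colors are assumptions.
class Urgency(Enum):
    CRITICAL = "Critical"
    MEDIUM = "Medium Priority"
    LOW = "Low Priority"  # the default setting

# Color-coded worklist buttons; critical results are shown in bright
# red to attract attention. The non-critical colors are assumed.
BUTTON_COLOR = {
    Urgency.CRITICAL: "bright-red",
    Urgency.MEDIUM: "amber",
    Urgency.LOW: "green",
}

def new_wet_reading(text, urgency=Urgency.LOW):
    """Create a wet-reading note; urgency defaults to Low Priority."""
    return {"text": text, "urgency": urgency,
            "button_color": BUTTON_COLOR[urgency]}
```

A note created without an explicit urgency is filed as Low Priority, mirroring the form's default.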
Table 1 The roles defined within Radiologue, with their associated note types and permitted actions
Attending radiologists participate in the “wet reading” system by entering “review” notes. When entering a “review”, the attending radiologist chooses one of four ratings for the “wet reading” note: “Agree”, “Minor Discrepancy”, “Major Discrepancy”, or “Good Job”. When there is a disagreement, attending radiologists provide the correct interpretation and teaching points in the text of the review note. The application displays these notes to the healthcare providers in the same fashion as the “wet reading” notes. Reviews marked “Major Discrepancy” appear on the worklist as bright-red buttons.
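The review ratings admit a similarly minimal sketch, under the same caveat that the identifiers and the non-flagged button color are our assumptions:

```python
from enum import Enum

# The four review ratings named in the text; identifiers are assumed.
class ReviewRating(Enum):
    AGREE = "Agree"
    MINOR_DISCREPANCY = "Minor Discrepancy"
    MAJOR_DISCREPANCY = "Major Discrepancy"
    GOOD_JOB = "Good Job"

def review_button_color(rating):
    # Only "Major Discrepancy" reviews are flagged bright red on the
    # worklist; the neutral color for the others is an assumption.
    return ("bright-red" if rating is ReviewRating.MAJOR_DISCREPANCY
            else "neutral")
```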
The application contains a Quality Assurance function, which allows residents to query the wet readings with which the attending radiologists disagreed and to read the associated “review” notes. This educational function helps residents learn from their mistakes. It is used both for individual learning and in the formal Resident Quality Assurance conference, held monthly at our institution. The time-stamped access log generated by the system has also proved very helpful in discussions of problem cases at hospital-wide Trauma Quality Assurance conferences. Usually, a clear sequence of events (including the timing, content, and recipient of each communication) can be reconstructed from these logs. We found such clarity helpful in defending radiologists’ actions.
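The QA query and the log-based reconstruction of events reduce to two simple operations, shown here as an in-memory sketch; the record layout and field names (review_rating, timestamp) are our assumptions, not the real schema.

```python
from datetime import datetime

# Ratings that count as a disagreement for QA purposes.
DISCREPANT = {"Minor Discrepancy", "Major Discrepancy"}

def discrepant_cases(cases):
    """Select wet readings whose attending review recorded a
    discrepancy, so residents can revisit them with the review text."""
    return [c for c in cases if c["review_rating"] in DISCREPANT]

def reconstruct_timeline(log_entries):
    """Order time-stamped access-log entries so the sequence of events
    (timing, content, recipient) can be read off chronologically."""
    return sorted(log_entries, key=lambda e: e["timestamp"])
```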
The data collected in the Radiologue application constitute records of formal quality assurance proceedings and are therefore protected from discovery in court under California Evidence Code Section 1157. In addition to the security measures mandated by legislation governing protected health information (HIPAA), the application includes several measures to prevent accidental disclosure of sensitive quality assurance information. Although the application uses identifying information to keep track of the cases, the identifying data are stripped for purposes of presentation. Also, outside the quality assurance context, the “wet reading” and “review” notes are hidden from users once the final report for the radiological exam has been entered into the medical record. This final report, usually dictated by a resident or fellow and approved by an attending radiologist, constitutes the ultimate diagnostic result and the summary of the preceding communication.
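These two safeguards (stripping identifiers for presentation, and hiding notes once a final report exists) can be sketched as follows; the field names and the set of identifying fields are assumptions for illustration only.

```python
# Assumed set of identifying fields; the real application may track more.
PHI_FIELDS = {"patient_name", "mrn", "date_of_birth"}

def strip_identifiers(record):
    """Remove identifying data before a case is presented."""
    return {k: v for k, v in record.items() if k not in PHI_FIELDS}

def visible_notes(notes, final_report_entered, qa_context=False):
    """Outside the QA context, wet-reading and review notes are hidden
    once the final report has been entered into the medical record."""
    if final_report_entered and not qa_context:
        return []
    return notes
```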
The application incorporates a dashboard that displays workflow information derived from the access log (Fig. 5). Access to this dashboard is strictly limited to the departmental management. The number of critical results, the number of major disagreements with wet readings, and the delay in providing wet readings are tracked and graphed in real time. Because the application uses tracking data from the RIS, the dashboard can also calculate and display derived operational metrics, such as throughput. Patient-specific protected health information is stripped from all data used for management purposes.
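The tracked quantities amount to simple aggregations over the de-identified records. The following sketch computes them under assumed field names; it is not the Radiologue or RIS schema.

```python
from datetime import datetime  # used in the example records
from statistics import mean

def dashboard_metrics(cases):
    """Aggregate the dashboard quantities named in the text:
    critical-result count, major-discrepancy count, and mean
    wet-reading delay in minutes. Field names are assumptions."""
    delays = [c["wet_reading_time"] - c["exam_time"] for c in cases]
    return {
        "critical_results": sum(c["urgency"] == "Critical" for c in cases),
        "major_discrepancies": sum(
            c["review_rating"] == "Major Discrepancy" for c in cases),
        "mean_delay_minutes": mean(d.total_seconds() / 60 for d in delays),
    }
```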
As of this writing, Radiologue has been in continuous operation for 19 months. It has been readily adopted by the Radiology staff and by the clinicians. Currently, the system has 1,822 users, including 149 radiologists (rotating residents, 14 at any one time; rotating fellows, seven at any one time; and 14 attendings), 13 radiology managers (including the chief technologists), and 1,650 clinicians (residents, attendings, and nurses). The system has been accessed 39,674 times by radiologists and 34,926 times by clinicians.