Use of a Handheld Computer Application for Voluntary Medication Event Reporting by Inpatient Nurses and Physicians
Dollarhide, A.W., Rutledge, T., Weinger, M.B., et al. J Gen Intern Med (2008) 23:418. doi:10.1007/s11606-007-0404-0
OBJECTIVE
To determine the feasibility of capturing self-reported medication events using a handheld computer-based Medication Event Reporting Tool (MERT).
DESIGN AND PARTICIPANTS
Handheld computers operating the MERT software application were deployed among volunteer physician (n = 185) and nurse (n = 119) participants on the medical wards of four university-affiliated teaching hospitals. Participants were encouraged to complete confidential reports on the handheld computers for medication events observed during the study period.
MEASUREMENTS AND MAIN RESULTS
Demographic variables including age, gender, education level, and clinical experience were recorded for all participants. Each MERT report included details on the provider, location, timing and type of medication event recorded. Over the course of 2,311 days of clinician participation, 76 events were reported; the median time for report completion was 231 seconds. The average event reporting rate for all participants was 0.033 reports per clinician shift. Nurses had a significantly higher reporting rate compared to physicians (0.045 vs 0.026 reports/shift, p = .02). Subgroup analysis revealed that attending physicians reported events more frequently than resident physicians (0.042 vs 0.021 reports/shift, p = .03), and at a rate similar to that of nurses (p = .80). Only 5% of MERT medication events were reported to require increased monitoring or treatment.
CONCLUSIONS
A handheld-based event reporting tool is a feasible method to record medication events in inpatient hospital care units. Handheld reporting tools may hold promise to augment existing hospital reporting systems.
KEY WORDS: incident reporting; medication errors; computers, handheld
Since the publication of the Institute of Medicine’s (IOM) To Err Is Human, national attention has been focused on medical errors and especially hospital safety.1 However, progress toward developing safer health care systems has been slow, impeded in large part by the difficulty in detecting hazards using conventional monitoring or reporting mechanisms.2 In its review of the scope and impact of medication error in hospitals, the IOM estimates that 400,000 preventable drug-related injuries still occur each year, the majority of which are undetected.3 New methods of identifying safety hazards are needed to generate the data to inform system redesign.4
Methods currently available for monitoring patient safety in hospitalized patients include clinician event reporting, direct observation, patient surveys, and extraction of practice data from medical records or computerized databases. Direct observation has been shown to have the highest rate of event detection, but may be prohibitively expensive for routine monitoring.2,5 Patient surveys can identify certain types of events such as adverse drug reactions, but are insensitive to potential events or medication events that are unrecognizable to patients.6 Chart review relies on the retrospective analysis of medical record documents, which may limit the rate of event detection because of deficiencies in documentation.7 In addition, although automated database trigger tools may increase the yield of event detection, they are often limited to lab-related measures and depend on computerized medical record databases not yet available in many clinical settings.8, 9, 10, 11, 12, 13, 14, 15, 16
Medication event reporting remains a central element of many hospital safety monitoring systems. Event reports (sometimes called “incident reports”) offer providers at the sharp end of patient care a means to describe and document safety events that result from system failures.17,18 Clinician-reported events are rarely false-positive, the influence of recall bias is limited,2 and they offer the greatest amount of clinically relevant data gathered at the time of the actual event. However, consistent underreporting by practicing clinicians limits the effectiveness of this component of safety monitoring.19, 20, 21 Analysis of currently available self-report methods demonstrates that such reports may capture as little as 0.04% of recognizable hospital events.2,22
Emerging handheld technologies could serve to augment clinician self-reporting of medication events in the hospital setting. Handheld computers offer a number of potential advantages over stationary computers including convenience, portability, confidentiality, and point-of-care reporting. Yet, despite an increasing number of clinical applications,23 handheld technology has been used on a very limited basis to facilitate medical event reporting.24,25 To determine the feasibility of a handheld computer-assisted reporting system, we report on the development and deployment of a Medication Event Reporting Tool (MERT) among physicians and nurses on inpatient medical units.
A total of 119 nurses and 185 physicians were recruited from four university-affiliated hospitals. Physician participants were enrolled from both general medicine and pediatric service wards and included attending and resident physicians. Physicians were recruited in rotating units of ward teams that were typically comprised of attendings, residents, and interns. Although teams were recruited to participate as a unit, individual clinician participation was strictly voluntary. Nurse participants were enrolled from medicine, surgery, and intensive care units from three of these hospitals. All participant reports were voluntary and confidential, and did not replace the standard event reporting systems currently in place at participating hospital sites. All participants provided written informed consent, and Institutional Review Board (IRB) approval was obtained for data collection at each hospital.
Survey Structure for MERT Tool
1. Who was involved in this event? (provider type)
2. In regard to this event, I would rate my involvement as (responsibility by degree)
a. Completely responsible
b. Mostly responsible
c. Somewhat responsible
d. Minimally responsible
e. Not at all responsible
3. What was the result of this event?
a. Error occurred; med not given to patient
b. Error occurred; med given to patient
i. No harm to patient
ii. No harm but increased monitoring
iii. Temporary harm not requiring treatment
iv. Temporary harm requiring treatment
v. Temporary harm prolonging hospital stay
vi. Permanent harm
vii. Near-death event
viii. Patient death
4. The medication involved in this event was (medication list)
5. When did this event occur? (date/time record)
6. What route was used? (route list)
7. Which steps in the medication process were involved?
a. Prescribing (steps list)
b. Transcribing/documenting (steps list)
c. Dispensing (steps list)
d. Administration (steps list)
e. Delays (steps list)
f. Administration-pump specific
8. What causes or factors contributed to this event?
a. Communication (causes list)
b. Information systems (causes list)
c. Equipment or devices (causes list)
d. Patient or clinical context (causes list)
e. Clinician (causes list)
f. Staffing and workload (causes list)
g. Organizational (causes list)
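The survey items above can be represented as a structured record. The following is a minimal sketch of such a record; the class name, field names, and enumerated values are our own illustration, not the actual MERT schema.

```python
from dataclasses import dataclass, field
from datetime import datetime

# Illustrative enumerations mirroring survey items 2 and 3 above.
RESPONSIBILITY = ["completely", "mostly", "somewhat", "minimally", "not at all"]
OUTCOMES = [
    "error, med not given",
    "error, med given - no harm",
    "error, med given - no harm, increased monitoring",
    "error, med given - temporary harm, no treatment",
    "error, med given - temporary harm, treatment required",
    "error, med given - temporary harm, prolonged stay",
    "error, med given - permanent harm",
    "error, med given - near-death event",
    "error, med given - patient death",
]

@dataclass
class MERTReport:
    """Hypothetical record of one MERT event report (items 1-8 above)."""
    provider_type: str                                         # item 1
    responsibility: str                                        # item 2
    outcome: str                                               # item 3
    medication: str                                            # item 4
    occurred_at: datetime                                      # item 5
    route: str                                                 # item 6
    process_steps: list = field(default_factory=list)          # item 7
    contributing_factors: list = field(default_factory=list)   # item 8

    def __post_init__(self):
        # Reject responses outside the survey's fixed answer sets.
        if self.responsibility not in RESPONSIBILITY:
            raise ValueError(f"unknown responsibility: {self.responsibility}")
        if self.outcome not in OUTCOMES:
            raise ValueError(f"unknown outcome: {self.outcome}")
```

Constraining items 2 and 3 to fixed answer sets reflects the survey's multiple-choice design, which keeps reports short and machine-analyzable at the cost of narrative detail (a limitation the Discussion returns to).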
Nurses and physicians were recruited to carry a handheld computer for 1-week work intervals. Participants were eligible to participate for more than 1 study interval based on hospital staffing assignments. Participants received a nominal financial incentive of 4 dollars per day for their participation in the study, but there was no direct incentive tied to the completion of MERT reports. Baseline demographic and professional experience information, including participants' sex, age, professional experience, and educational level, was recorded for all participants by the handheld application through an initial sign-on process. The software prompted participants through an audible alarm, then elicited responses and collected survey data. Daily workload information was captured at the beginning of each shift. Random repeating daily surveys collected dynamic characteristics such as work activity, stress, and perceived work demands. These ecologic momentary assessments were elicited from study participants at random over 90-minute intervals throughout their work shifts (data reported separately). At the end of each computer alarm and survey, participants were offered an opportunity to launch the event reporting application. Participants were also instructed how to initiate the MERT application at any time during the study period to report an observed medication event immediately. Event reporting rates were determined as the number of reports generated per shift of clinician participation.
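The rate definition in the last sentence is simply reports divided by shifts of participation. A minimal sketch (the helper name is our own; the 76-report and 2,311-day figures come from the Results):

```python
def reporting_rate(n_reports: int, n_shifts: int) -> float:
    """Event reporting rate as defined in the study: reports per clinician shift."""
    if n_shifts <= 0:
        raise ValueError("n_shifts must be positive")
    return n_reports / n_shifts

# Overall study figures: 76 reports over 2,311 days of clinician participation.
overall = reporting_rate(76, 2311)
print(round(overall, 3))  # 0.033, the reported overall rate
```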
For continuous variables, Student’s t tests were used to determine statistically significant differences between participant groups. A chi-square test of proportions was performed to examine the association between education level and MERT use by nurse participants, and also to compare the response rate differences between participant groups. All statistical calculations were performed using SPSS version 12.0. Statistical comparisons were completed as 2-tailed tests, using an alpha level of 0.05 for determining significance.
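The study ran its proportion comparisons in SPSS; for a 2 × 2 table the chi-square test of proportions can be sketched with the standard library alone. The per-group report and shift counts below are illustrative only (they are not given in the text), chosen to approximate the reported nurse-versus-physician rates.

```python
import math

def chi_square_2x2(a: int, b: int, c: int, d: int) -> tuple:
    """Pearson chi-square test (1 df) for a 2x2 table [[a, b], [c, d]].

    Returns (statistic, two-sided p-value). With 1 df, the square root of a
    chi-square variate is |Z|, so the p-value is erfc(sqrt(stat / 2)).
    """
    n = a + b + c + d
    stat = 0.0
    # For each cell, expected count = (row total * column total) / n.
    for obs, row, col in [(a, a + b, a + c), (b, a + b, b + d),
                          (c, c + d, a + c), (d, c + d, b + d)]:
        exp = row * col / n
        stat += (obs - exp) ** 2 / exp
    p = math.erfc(math.sqrt(stat / 2))
    return stat, p

# Illustrative counts only: roughly 38 nurse reports in ~838 shifts versus
# 38 physician reports in ~1,473 shifts (reports vs. shifts without a report).
stat, p = chi_square_2x2(38, 800, 38, 1435)
print(round(stat, 2), p < 0.05)
```

With an alpha of 0.05 and 1 degree of freedom, any statistic above 3.84 is significant, which is the threshold the study's two-tailed comparisons effectively used.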
MERT “Reporter” Versus “Non-reporter” Provider Characteristics
Columns (Mean [SD]): All MD (n = 185) | Res MD (n = 144) | Attending MD (n = 41) | Nurse (n = 119)
Rows: Sex (% male); Average age (years)*; Average experience (mo)†; Average study days‡
MERT Event Reports by All Providers
Rows: Number of reports; Total days of participation; MERT reports per shift
Physician study participants were overall younger and less experienced on their units than nurse participants. Physician reporters had significantly more days of study participation than physicians who did not complete an event report (12.9 vs 7.3 days; p < .001). There was no difference in age or unit experience between physician reporters and nonreporters. Nurse reporters and nonreporters were similar in age, experience, and days of participation. There was a trend toward nurse reporters having a higher educational level than nurse nonreporters (p = .06). On average, physician reporters had more days of participation with the handheld tool than nurse reporters (12.9 vs 7.5 days; p = .016).
Nurses had significantly higher reporting rates than physicians overall (0.045 vs 0.026 reports/shift, p = .02). This difference was primarily due to the relatively lower reporting rate of resident physicians. Both attending physicians and nurses reported events significantly more often than resident physicians (0.042 vs 0.021 reports/shift, p = .03, and 0.045 vs 0.021 reports/shift, p = .003, respectively). In contrast, reporting rates for nurses and attending physicians were similar (0.045 vs 0.042 reports/shift, p = .80). MERT reports were distributed fairly evenly across the days of participation, although assessment of the distribution suggested a nonsignificant trend toward increased reporting during later days of participation (p = .07).
Events Reported by Physician and Nurse Subjects (N = 76)
Personally responsible: Yes (any degree)
Result of event: Error made, med not given; Error made, med given; No harm, monitoring; Temporary harm with treatment
Process step involved
Causes or factors: Equipment or devices; Patient or clinical context; Staffing and workload
The MERT software application was developed to enable convenient, real-time reporting of medication events on a handheld computer in the hospital setting. Study participants using the handheld tool required less than 4 minutes on average to complete an event report, a response burden that compares favorably to that described for other computerized event reporting systems.24 Age, clinical experience, and education level did not distinguish reporters from nonreporters, suggesting acceptance of this tool across a range of hospital clinicians.
Nurses used the handheld application to report medication events at a greater rate than the physicians in this study. This was not a surprising finding as hospital event reporting has historically suffered from significant participation bias, with nursing staff submitting up to 89% of all reports received in some settings,26 and typically very few reports gathered from senior physicians. Subgroup analysis of our data, however, demonstrated that attending physicians and nurses actually displayed very similar rates of event reporting. The lower rate of resident reporting largely accounted for the observed reporting rate difference between nurse and physician groups. These data would suggest that handheld systems may represent an opportunity to mitigate the impact of participation bias and gather event reports more broadly across provider disciplines, particularly from attending physicians.
Resident physicians filed event reports at a rate significantly lower than both attending physicians and nurses in our study. Extended-duration work shifts have put resident physicians at particular risk for medical error leading to adverse patient events.27 Although some studies demonstrate that house staff are willing to report events when appropriately queried,28,29 actively engaging resident physicians in event reporting has been a substantial challenge. Reporting systems are sometimes difficult for house staff to navigate, and lack of clinical experience may interfere with successful reporting.30 In addition, resident physicians experience substantially higher perceived work stress levels in the hospital environment,31,32 and fear of professional or legal reprisal may limit willingness to file event reports.33 Yet, resident physicians are perhaps more likely to utilize handheld computer technology,23 and adapting reporting tools to their needs could enhance their reporting. In addition, house staff were the group most likely to report events for which they were personally responsible (71%). These data underscore the importance of developing tools to facilitate event reporting from resident physicians, and reducing the barriers inherent to conventional reporting systems.
Event reports collected by our handheld computer application reflected a broad spectrum of both actual adverse drug events (ADEs) and near misses (i.e., potential ADEs). Sixty-nine percent of events captured by MERT were recorded before medication was administered, while 27% of events resulted in no discernible patient injury after the administration of medication. This observed reporting of near-miss medication events with MERT was greater than that reported by other monitoring systems, which have captured near misses in approximately 35% of events.34 Near misses are an important component of successful reporting systems, as they typically occur at a greater frequency than ADEs and allow for quantitative analysis of system failures that ultimately lead to adverse outcomes.18 Emphasizing the reporting of near-miss events tends to reduce the culture of blame when harmful events do occur, and promotes the study of recovery strategies to build more resilient safety systems.35,36
Finally, the results of this study suggest that this novel handheld tool may hold promise to improve critical event reporting in hospitals as part of an integrated health information technology. There is substantial evidence that inpatient medication events are grossly underreported,2,19,33 and traditional reporting methods are often not well integrated into hospital data systems. Handheld computers offer the potential to bundle clinical care applications on a single portable device, and to prompt users for critical event data reporting. As hospitals increasingly integrate electronic health information technology, bundling an event reporting mechanism into clinical applications offers a unique opportunity to simplify the process and potentially increase the yield of event reports gathered in hospital settings. Future study should critically evaluate the quantitative impact of a handheld event reporting tool in comparison to traditional reporting systems, as hospitals increasingly adopt novel health information technology.
There are limitations to interpreting the results and generalizability of this study, as this was an unblinded evaluation of a newly developed software application. The study participants were physicians and nurses on inpatient units at academic centers, and use in community hospitals will need to be evaluated. A handheld computer-based system would likely only be feasible at hospitals with IT resources to support data download, software modification and maintenance. Considerable attention would need to be given to encrypting and maintaining sensitive clinical information while more broadly integrating this novel technology, and the associated security and information technology (IT) costs of a larger application of MERT were not specifically evaluated in this study. The format of event reports captured on handheld computers also remains limited in its ability to collect narrative data, although advances in scripting and voice recognition software hold promise to allow expanded narrative event reporting on handheld devices in the near future. In addition, the overall percentage of events captured by this handheld event reporting application likely remains quite small, still representing only the “tip of the iceberg” of hospital events. An ideal event reporting system would also include incentives for voluntary reporting, place emphasis on a systems perspective for event analysis, and promote culture change to value event reporting as a nonpunitive means to provide critical feedback and improve patient safety.32,34
A statistical limitation of this study is that our sample size made it impractical to adjust for increases in family-wise error rates introduced by completing multiple tests. The failure to correct statistically for multiple comparisons makes it possible that some of our statistically significant findings occurred by chance. However, the importance of this limitation is mitigated by the consistent pattern of results we observed across multiple groups and the congruence of our results with those found in the literature. Observed reporting rates could also have been artificially inflated by the novelty of the handheld tool, by volunteer bias introduced by the recruitment of clinician volunteers for the study, and by the nature of the study itself. Data were not collected for clinicians who chose not to participate in this study. Comparison of MERT-generated reports with events identified by concurrent direct observation would help to establish the effectiveness of this event reporting modality in detecting medication events in the hospital setting. Future studies may determine whether event reporting rates would be sustained in routine practice with the use of the handheld device.
Notwithstanding these limitations, several features of the handheld MERT including ready accessibility and ease of use suggest that this approach may prove an effective complement to existing hospital event reporting systems. In our study, both physician and nursing participants were able to effectively utilize the handheld tool to generate medication event reports on inpatient hospital units. Event reports were collected with a reasonable response burden, and reflected a spectrum of both potential and actual medication events. Future work will need to examine if deploying MERT-enabled handheld devices could translate into a sustainable increase in overall medication event detection in hospital settings, and provide substantive opportunities to improve patient safety.
The authors wish to thank Tod Kuykendall and Jason Slagle for their invaluable technical assistance in the development of the handheld computer application. We also express our gratitude to project site leaders Erin Stucky, MD, Greg Maynard, MD, and Martha Shively, RN. This project was supported by grant number 1 UC1 HS014283-01 from the Agency for Healthcare Research and Quality (AHRQ).
Conflict of Interest