Abstract
Professional peer review of randomly selected prior radiologist interpretations is mandated by the Joint Commission on Accreditation of Healthcare Organizations (JCAHO), which expects documentation of random peer review for 5% of cases. Departments spend countless hours fulfilling these requirements. Integrating the peer-review process into the radiologist’s interpretation workflow was expected to increase the percentage of documented peer review while decreasing the time and effort spent on this documentation.

radStation clinical review workstations are deployed at every reading station. When a requisition is barcoded, radStation retrieves the patient’s clinical information and automatically displays the prior comparison report. If the radiologist agrees with the prior report, a single click on a “quality assurance” agree box documents the agreement. In the case of a discordance, an additional dialog box automatically appears; the radiologist enters the reason for disagreement and submits the case as a discrepancy. The system holds the discordance for 3 to 5 working days, then notifies the original radiologist via e-mail that a prior interpretation has been submitted for peer review, lists the submitted discrepancy reason, and provides a link to display the discordant report. The peer-review database is separate from the existing radiology information system (RIS). At the end of every month, summary reports of all peer-review activity are generated automatically.

Initial benchmarks of our deployed system anticipate a long-term documented random peer-review rate greater than 50% of interpreted cases. The system enhances the peer-review process by integrating it with the normal interpretation workflow. The time to complete peer review using radStation is less than 1 second for a concordant case and less than 60 seconds for a discordant case. The e-mail notification system and the data collection are fully automated.
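The hold-and-notify workflow described above (record each review, hold a discordance for 3 to 5 working days, then trigger an e-mail to the original radiologist) can be sketched as follows. This is a minimal illustration only, not the radStation implementation; all names (`PeerReview`, `reviews_due_for_notification`, the 5-day hold constant) are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

HOLD_BUSINESS_DAYS = 5  # discordances are held 3-5 working days before notification

@dataclass
class PeerReview:
    case_id: str
    original_radiologist: str
    agree: bool
    reason: str = ""  # reason for disagreement, entered only on discordance
    submitted: date = field(default_factory=date.today)

def add_business_days(start: date, days: int) -> date:
    """Advance a date by `days` working days (Mon-Fri)."""
    d = start
    while days > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:  # weekday() 0-4 are Mon-Fri
            days -= 1
    return d

def reviews_due_for_notification(reviews, today: date):
    """Return discordant reviews whose hold period has elapsed; each of
    these would trigger an e-mail to the original radiologist with the
    discrepancy reason and a link to the discordant report."""
    return [r for r in reviews
            if not r.agree
            and add_business_days(r.submitted, HOLD_BUSINESS_DAYS) <= today]
```

A concordant case is recorded with `agree=True` and never enters the notification queue, matching the single-click "agree" path in the abstract.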
This system has completely replaced a manual paper-based system. The integration of peer review directly into the radiologist’s interpretation workstation greatly enhances the capability to easily exceed JCAHO standards. The overall increase in peer-review documentation should continue to improve the ability to document a consistent high quality of patient care.
McEnery, K.W., Suitor, C.T., Hildebrand, S. et al. Integration of radiologist peer review into clinical review workstation. J Digit Imaging 13 (Suppl 1), 101–104 (2000). https://doi.org/10.1007/BF03167636