One Year’s Results from a Server-Based System for Performing Reject Analysis and Exposure Analysis in Computed Radiography
Rejected images represent both unnecessary radiation exposure to patients and inefficiency in the imaging operation. Rejected images are inherent to projection radiography, where patient positioning and alignment are integral components of image quality. Patient motion and artifacts unique to digital image receptor technology can also result in rejected images. We present a centralized, server-based solution for the collection, archival, and distribution of rejected image and exposure indicator data that automates the data collection process. Reject analysis program (RAP) and exposure indicator data were collected and analyzed over a 1-year period. RAP data were sorted both by reason for repetition and by body part examined, and were further stratified by clinical area for investigation. The monthly composite reject rate for our institution fluctuated between 8% and 10%. Positioning errors were the main cause of repeated images (77.3%). Stratification by clinical area revealed that areas where computed radiography (CR) is seldom used suffer higher reject rates than areas where it is used frequently. S values were log-normally distributed for examinations performed under either manual or automatic exposure control, and the distributions were positively skewed and leptokurtic. Decreases in S values associated with radiologic technology student rotations and with CR plate reader calibrations were observed. Our data demonstrate that reject analysis remains necessary and useful in the era of digital imaging. It is vital, though, that reject analysis be combined with exposure indicator analysis, because digital radiography is not self-policing in terms of exposure. When combined, the two programs are a powerful tool for quality assurance.
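To illustrate the kind of analysis summarized above, the sketch below shows how a monthly composite reject rate and the shape statistics of an S value distribution (skewness, excess kurtosis, log-normal fit) might be computed. This is a minimal example, not the authors' server-based RAP implementation; the record fields (`month`, `rejected`, `s_value`) are assumptions made for illustration only.

```python
"""Illustrative sketch of reject-rate and S value distribution analysis.

Assumes each exposure record carries a month label, a reject flag, and an
S value; these field names are hypothetical, not from the authors' system.
"""
import numpy as np
from scipy import stats


def monthly_reject_rate(records):
    """Return {month: rejected_fraction} from records with 'month' and 'rejected' keys."""
    totals, rejects = {}, {}
    for r in records:
        totals[r["month"]] = totals.get(r["month"], 0) + 1
        rejects[r["month"]] = rejects.get(r["month"], 0) + int(r["rejected"])
    return {m: rejects[m] / totals[m] for m in totals}


def s_value_summary(s_values):
    """Summarize the S value distribution.

    A positive skewness and a positive excess kurtosis (leptokurtic) are
    consistent with the log-normal behavior reported in the abstract for both
    manual and automatic exposure control examinations.
    """
    s = np.asarray(s_values, dtype=float)
    shape, loc, scale = stats.lognorm.fit(s, floc=0)  # log-normal fit, location fixed at 0
    return {
        "skewness": stats.skew(s),
        "excess_kurtosis": stats.kurtosis(s),  # Fisher definition: 0 for a normal distribution
        "lognorm_sigma": shape,
        "lognorm_median": scale,
    }
```

In practice, the same per-record data could also be grouped by reason for repetition, body part, or clinical area before computing rates, mirroring the stratifications described in the abstract.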
Key words: Computed radiography, data collection, data mining, quality assurance, quality control, radiography, statistical analysis, radiation dose, reject analysis, repeat analysis, exposure analysis
The authors would like to thank Dawn Chalaire for her editorial assistance.