Journal of Digital Imaging, Volume 24, Issue 2, pp 243–255

One Year’s Results from a Server-Based System for Performing Reject Analysis and Exposure Analysis in Computed Radiography

  • A. Kyle Jones
  • Raimund Polman
  • Charles E. Willis
  • S. Jeff Shepard

Abstract

Rejected images represent both unnecessary radiation exposure to patients and inefficiency in the imaging operation. Rejected images are inherent to projection radiography, where patient positioning and alignment are integral components of image quality. Patient motion and artifacts unique to digital image receptor technology can also result in rejected images. We present a centralized, server-based solution for the collection, archival, and distribution of rejected image and exposure indicator data that automates the data collection process. Reject analysis program (RAP) and exposure indicator data were collected and analyzed during a 1-year period. RAP data were sorted both by reason for repetition and by body part examined. Data were also stratified by clinical area for further investigation. The monthly composite reject rate for our institution fluctuated between 8% and 10%. Positioning errors were the main cause of repeated images (77.3%). Stratification of data by clinical area revealed that areas where computed radiography (CR) is seldom used suffer from higher reject rates than areas where it is used frequently. S values were log-normally distributed for examinations performed under either manual or automatic exposure control; the distributions were positively skewed and leptokurtic. We observed S value decreases attributable to radiologic technology student rotations and to CR plate reader calibrations. Our data demonstrate that reject analysis is still necessary and useful in the era of digital imaging. It is vital, though, that reject analysis be combined with exposure indicator analysis, as digital radiography is not self-policing in terms of exposure. When combined, the two programs are a powerful tool for quality assurance.
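To make the metrics above concrete, here is a minimal, purely illustrative Python sketch of the two analyses the abstract describes: computing a composite reject rate stratified by reason, and summarizing exposure indicator (S) values in log space, where a log-normal distribution appears approximately normal. The record values and field layout are invented for demonstration and do not reproduce the authors' server-based implementation or its data schema.

```python
# Illustrative sketch only: record values and layout are hypothetical,
# not the authors' server-based implementation or its data schema.
import math
import statistics

# Each record: (body part, reject reason or None if accepted, S value).
records = [
    ("chest",   None,          210.0),
    ("chest",   "positioning", 190.0),
    ("abdomen", None,          450.0),
    ("knee",    "motion",      380.0),
    ("chest",   None,          240.0),
    ("abdomen", None,          510.0),
]

# Composite reject rate: rejected images as a fraction of all images.
rejected = [r for r in records if r[1] is not None]
print(f"Composite reject rate: {100.0 * len(rejected) / len(records):.1f}%")

# Stratify rejects by reason (stratification by body part or clinical
# area works the same way, keyed on a different field).
by_reason = {}
for _, reason, _ in rejected:
    by_reason[reason] = by_reason.get(reason, 0) + 1
print("Rejects by reason:", by_reason)

# The paper reports S values as log-normally distributed, so summarize
# ln(S): its mean and standard deviation parameterize the fit, and a
# skewness near zero in log space is consistent with log-normality.
log_s = [math.log(s) for _, _, s in records]
mu = statistics.fmean(log_s)
sigma = statistics.stdev(log_s)
skew = sum((x - mu) ** 3 for x in log_s) / (len(log_s) * sigma ** 3)
print(f"ln(S): mean={mu:.3f}, stdev={sigma:.3f}, skewness={skew:.2f}")
print(f"Geometric mean S value: {math.exp(mu):.0f}")
```

In the system the paper describes, these fields are harvested automatically from the CR devices rather than entered by hand; the sketch shows only the shape of the computation.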

Key words

Computed radiography · data collection · data mining · quality assurance · quality control · radiography · statistical analysis · radiation dose · reject analysis · repeat analysis · exposure analysis

Copyright information

© Society for Imaging Informatics in Medicine 2009

Authors and Affiliations

  • A. Kyle Jones (1)
  • Raimund Polman (1)
  • Charles E. Willis (1)
  • S. Jeff Shepard (1)

  1. Department of Imaging Physics, The University of Texas M. D. Anderson Cancer Center, Houston, TX, USA
