Article

Journal of Digital Imaging, Volume 22, Issue 1, pp 89-98

Digital Radiography Reject Analysis: Data Collection Methodology, Results, and Recommendations from an In-depth Investigation at Two Hospitals

  • David H. Foos, Clinical Applications Research Laboratory, Carestream Health Inc. (corresponding author)
  • W. James Sehnert, Clinical Applications Research Laboratory, Carestream Health Inc.
  • Bruce Reiner, Department of Radiology, Maryland VA Healthcare System
  • Eliot L. Siegel, Department of Radiology, Maryland VA Healthcare System
  • Arthur Segal, Department of Radiology, Rochester General Hospital
  • David L. Waldman, Department of Imaging Sciences, University of Rochester Medical Center

Abstract

Reject analysis was performed on 288,000 computed radiography (CR) image records collected from a university hospital (UH) and a large community hospital (CH). Each record contains image information, such as body part and view position, exposure level, technologist identifier, and—if the image was rejected—the reason for rejection. Extensive database filtering was required to ensure the integrity of the reject-rate calculations. The reject rate for CR across all departments and across all exam types was 4.4% at UH and 4.9% at CH. The most frequently occurring exam types with reject rates of 8% or greater were found to be common to both institutions (skull/facial bones, shoulder, hip, spines, in-department chest, pelvis). Positioning errors and anatomy cutoff were the most frequently occurring reasons for rejection, accounting for 45% of rejects at CH and 56% at UH. Improper exposure was the next most frequently occurring reject reason (14% of rejects at CH and 13% at UH), followed by patient motion (11% of rejects at CH and 7% at UH). Chest exams were the most frequently performed exam at both institutions (26% at UH and 45% at CH) with half captured in-department and half captured using portable x-ray equipment. A ninefold greater reject rate was found for in-department (9%) versus portable chest exams (1%). Problems identified with the integrity of the data used for reject analysis can be mitigated in the future by objectifying quality assurance (QA) procedures and by standardizing the nomenclature and definitions for QA deficiencies.
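The reject-rate calculation described above can be sketched as follows. The record fields (`exam_type`, `rejected`) and the filtering criterion are illustrative assumptions for this sketch, not the study's actual database schema; the study's own filtering was considerably more extensive.

```python
from collections import Counter

def reject_rates(records):
    """Compute overall and per-exam-type reject rates from image records.

    Each record is assumed to be a dict with an 'exam_type' string and a
    'rejected' boolean; records missing either field are dropped, loosely
    mirroring the database filtering the study describes.
    """
    valid = [r for r in records
             if isinstance(r.get("rejected"), bool) and r.get("exam_type")]
    totals = Counter(r["exam_type"] for r in valid)
    rejects = Counter(r["exam_type"] for r in valid if r["rejected"])
    per_exam = {exam: rejects[exam] / totals[exam] for exam in totals}
    overall = sum(rejects.values()) / len(valid) if valid else 0.0
    return overall, per_exam

# Hypothetical example: 2 rejected out of 40 chest captures -> 5% reject rate
records = ([{"exam_type": "chest", "rejected": False}] * 38
           + [{"exam_type": "chest", "rejected": True}] * 2)
overall, per_exam = reject_rates(records)
print(round(overall, 3))  # 0.05
```

Filtering before counting matters: incomplete records inflate or deflate the denominator, which is why the study emphasizes data-integrity checks before computing rates.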

Key words

Reject analysis · Quality assurance · Digital radiography