Digital Radiography Reject Analysis: Data Collection Methodology, Results, and Recommendations from an In-depth Investigation at Two Hospitals
Cite this article as: Foos, D.H., Sehnert, W.J., Reiner, B., et al. J Digit Imaging (2009) 22:89. doi:10.1007/s10278-008-9112-5
Reject analysis was performed on 288,000 computed radiography (CR) image records collected from a university hospital (UH) and a large community hospital (CH). Each record contains image information, such as body part and view position, exposure level, technologist identifier, and—if the image was rejected—the reason for rejection. Extensive database filtering was required to ensure the integrity of the reject-rate calculations. The reject rate for CR across all departments and across all exam types was 4.4% at UH and 4.9% at CH. The most frequently occurring exam types with reject rates of 8% or greater were found to be common to both institutions (skull/facial bones, shoulder, hip, spines, in-department chest, pelvis). Positioning errors and anatomy cutoff were the most frequently occurring reasons for rejection, accounting for 45% of rejects at CH and 56% at UH. Improper exposure was the next most frequently occurring reject reason (14% of rejects at CH and 13% at UH), followed by patient motion (11% of rejects at CH and 7% at UH). Chest exams were the most frequently performed exam at both institutions (26% at UH and 45% at CH) with half captured in-department and half captured using portable x-ray equipment. A ninefold greater reject rate was found for in-department (9%) versus portable chest exams (1%). Problems identified with the integrity of the data used for reject analysis can be mitigated in the future by objectifying quality assurance (QA) procedures and by standardizing the nomenclature and definitions for QA deficiencies.
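The reject-rate figures above are simple proportions computed per category (exam type, reject reason) over filtered image records. A minimal sketch of that aggregation is shown below; the record fields and example values are hypothetical and do not reflect the study's actual database schema.

```python
from collections import defaultdict

# Hypothetical image records; field names are illustrative only,
# not the schema used in the study.
records = [
    {"exam": "chest",    "rejected": False, "reason": None},
    {"exam": "chest",    "rejected": True,  "reason": "positioning"},
    {"exam": "shoulder", "rejected": True,  "reason": "anatomy cutoff"},
    {"exam": "shoulder", "rejected": False, "reason": None},
    {"exam": "hip",      "rejected": False, "reason": None},
]

def reject_rates(records):
    """Return (overall_rate, per_exam_rates) as fractions of total exposures."""
    totals = defaultdict(int)   # exposures per exam type
    rejects = defaultdict(int)  # rejected exposures per exam type
    for r in records:
        totals[r["exam"]] += 1
        if r["rejected"]:
            rejects[r["exam"]] += 1
    overall = sum(rejects.values()) / sum(totals.values())
    per_exam = {exam: rejects[exam] / totals[exam] for exam in totals}
    return overall, per_exam

overall, per_exam = reject_rates(records)
print(f"overall reject rate: {overall:.1%}")  # 2 rejects of 5 exposures
```

In practice, the same grouping can be applied to the reject-reason field to reproduce the breakdown by positioning error, exposure error, and patient motion reported above.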