Investigation of the Variability in the Assessment of Digital Chest X-ray Image Quality


Abstract

A large database of digital chest radiographs was developed over a 14-month period. Ten radiographic technologists and five radiologists independently evaluated a stratified subset of images from the database for quality deficiencies and decided whether each image should be rejected. The evaluation results showed only moderate agreement between the radiographic technologists and the radiologists. Agreement between the radiologist and technologist groups was lower than the inter-reader agreement within either group. Radiologists were more accepting of limited-quality studies than technologists. Evidence from the study suggests that the technologists based their reject decisions more heavily on objective technical attributes, whereas the radiologists based theirs more heavily on diagnostic interpretability relative to the indication for the image. A suite of reject-detection algorithms was independently run on the images in the database. The algorithms detected 4% of postero-anterior chest exams that were accepted by the technologist who originally captured the image but that would have been rejected by the technologist peer group. When algorithm results were made available to the technologists during the study, there was no improvement in inter-reader agreement on whether to reject an image. The algorithm results do, however, provide new quality information that could be captured in a site-wide reject-tracking database and leveraged as part of a quality assurance (QA) program.
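The abstract does not specify the agreement statistic, but chance-corrected agreement between two readers on binary accept/reject decisions is conventionally summarized with Cohen's kappa. The sketch below is an illustrative Python implementation with made-up decision vectors; it is not code or data from the study, and the reader names and values are hypothetical.

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Chance-corrected agreement between two raters scoring the same items."""
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    # Observed agreement: fraction of items both raters labelled identically.
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Expected chance agreement, from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in set(labels_a) | set(labels_b)) / n ** 2
    return (p_o - p_e) / (1.0 - p_e)

# Hypothetical reject decisions for ten images (1 = reject, 0 = accept).
technologist = [1, 0, 0, 1, 0, 1, 0, 0, 1, 0]
radiologist  = [0, 0, 0, 1, 0, 0, 0, 0, 1, 0]
print(f"kappa = {cohens_kappa(technologist, radiologist):.2f}")  # ~0.55
```

On the Landis and Koch scale, kappa values between 0.41 and 0.60 are conventionally read as "moderate" agreement, the range suggested by the abstract's wording for the within-group comparisons.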





Corresponding author

Correspondence to Jacquelyn S. Whaley.


About this article

Cite this article

Whaley, J.S., Pressman, B.D., Wilson, J.R. et al. Investigation of the Variability in the Assessment of Digital Chest X-ray Image Quality. J Digit Imaging 26, 217–226 (2013). https://doi.org/10.1007/s10278-012-9515-1

