Radiological Physics and Technology, Volume 5, Issue 2, pp 213–221

Mammographic system performance using an image reading qualification method

Abstract

Our goal was to evaluate mammography systems based on readings of statistical phantom (ALVIM, model TRM 18-209, Nuclear Associates) images for the detection of microcalcifications and fibers. ALVIM phantom images were acquired under diverse exposure conditions on several mammography units, and five radiologists with similar expertise reported their findings. Reading performance in the detection of microcalcifications and fibers of different sizes was measured with simulated breast-equivalent tissue of 4.5 and 6.5 cm thickness. Kappa values, ROC curves, kappa probability densities, and detection rates were determined with dedicated software developed in-house. The statistical results yielded three kappa (K) ranges that quantify detection performance at three quality levels: unacceptable (K ≤ 0.64), acceptable (0.64 < K < 0.70), and achievable (K ≥ 0.70). An extensive database permitted a comparison of reading performance with 99.5 % reliability (p < 0.005). The comparison showed a larger dispersion of kappa values for low-contrast images produced by mammography equipment that was not properly calibrated, indicating that the method can detect the performance changes associated with loss of image quality.
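The abstract's evaluation rests on inter-reader agreement measured by Cohen's kappa, with the three quality levels defined by the stated thresholds. As a minimal illustrative sketch (not the authors' software, which is not publicly described here), the following Python code computes Cohen's kappa for two readers' detection reports and maps the result to the three levels; the function names are hypothetical.

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters
    scoring the same items (e.g. signal present/absent per phantom cell)."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items both raters scored identically.
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement under independence of the two raters' marginals.
    ca, cb = Counter(rater_a), Counter(rater_b)
    pe = sum((ca[c] / n) * (cb[c] / n) for c in set(ca) | set(cb))
    if pe == 1.0:          # degenerate case: both raters use one category
        return 1.0
    return (po - pe) / (1 - pe)

def quality_level(kappa):
    """Map a kappa value onto the three ranges given in the abstract."""
    if kappa <= 0.64:
        return "unacceptable"
    if kappa < 0.70:
        return "acceptable"
    return "achievable"      # K >= 0.70
```

For example, two readers agreeing on 3 of 4 cells (`[1, 1, 0, 0]` vs. `[1, 0, 0, 0]`) give `cohen_kappa(...) == 0.5`, which `quality_level` classifies as unacceptable.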

Keywords

Microcalcifications and fibers · Statistical phantom · Mammography · ROC curves · Kappa · Image quality


Copyright information

© Japanese Society of Radiological Technology and Japan Society of Medical Physics 2012

Authors and Affiliations

  1. Departamento de Informática em Saúde, Escola Paulista de Medicina, Universidade Federal de São Paulo, São Paulo, Brazil
  2. Departamento de Diagnóstico por Imagem, Escola Paulista de Medicina, Universidade Federal de São Paulo, São Paulo, Brazil
