Pediatric Radiology, Volume 37, Issue 12, pp 1201–1208

Observer variability assessing US scans of the preterm brain: the ELGAN study

  • Karl Kuban
  • Ira Adler
  • Elizabeth N. Allred
  • Daniel Batton
  • Steven Bezinque
  • Bradford W. Betz
  • Ellen Cavenagh
  • Sara Durfee
  • Kirsten Ecklund
  • Kate Feinstein
  • Lynn Ansley Fordham
  • Frederick Hampf
  • Joseph Junewick
  • Robert Lorenzo
  • Roy McCauley
  • Cindy Miller
  • Joanna Seibert
  • Barbara Specter
  • Jacqueline Wellman
  • Sjirk Westra
  • Alan Leviton
Original Article

Abstract

Background

Neurosonography can assist clinicians and provide researchers with documentation of brain lesions. Unfortunately, we know little about the reliability of sonographically derived diagnoses.

Objective

We sought to evaluate observer variability among experienced neurosonologists.

Materials and methods

We collected all protocol US scans of 1,450 infants born before the 28th postmenstrual week. Each set of scans was read by two independent sonologists for the presence of intraventricular hemorrhage (IVH) and moderate/severe ventriculomegaly, as well as hyperechoic and hypoechoic lesions in the cerebral white matter. Scans read discordantly for any of these four characteristics were sent to a tie-breaking third sonologist.
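The adjudication scheme amounts to accepting concordant reads and deferring discordant ones to the third reader. A minimal Python sketch of that logic (the finding names and data layout here are hypothetical placeholders, not the study's actual data handling):

```python
# Minimal sketch of the two-reader-plus-tie-breaker protocol described
# above. Finding names and the dict layout are hypothetical placeholders.

FINDINGS = ["ivh", "ventriculomegaly", "hyperechoic_wm", "hypoechoic_wm"]

def adjudicate(reader1, reader2, tie_breaker):
    """Consensus read for one infant's set of scans.

    reader1 and reader2 map each finding to True/False; tie_breaker is a
    callable consulted only for findings on which the two readers disagree,
    mirroring the referral of discordant scans to a third sonologist.
    """
    consensus = {}
    for finding in FINDINGS:
        if reader1[finding] == reader2[finding]:
            consensus[finding] = reader1[finding]      # concordant: accept
        else:
            consensus[finding] = tie_breaker(finding)  # discordant: third reader decides
    return consensus
```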

Results

Ventriculomegaly, hypoechoic lesions and IVH had similar rates of positive agreement (68–76%), negative agreement (92–97%), and kappa values (0.62–0.68). Hyperechoic lesions, however, had considerably lower positive agreement (48%), negative agreement (84%), and kappa (0.32). No sonologist identified all four abnormalities more or less often than his or her peers. Approximately 40% of the time, the tie-breaking reader agreed with the reader who identified IVH, ventriculomegaly, or a hypoechoic lesion in the white matter. Only about 25% of the time did the third reader agree with the reader who reported a white matter hyperechoic lesion.
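For reference, the positive agreement, negative agreement, and kappa figures above can all be computed from a 2×2 table of paired reads. A minimal sketch follows; the example counts are invented placeholders, not ELGAN data:

```python
# Agreement statistics for two readers from a 2x2 table of paired reads:
#   a = both readers positive, d = both negative, b and c = discordant cells.

def agreement_stats(a, b, c, d):
    n = a + b + c + d
    positive_agreement = 2 * a / (2 * a + b + c)  # agreement among positive reads
    negative_agreement = 2 * d / (2 * d + b + c)  # agreement among negative reads
    p_observed = (a + d) / n
    # Chance-expected agreement, from the two readers' marginal totals.
    p_expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2
    kappa = (p_observed - p_expected) / (1 - p_expected)
    return positive_agreement, negative_agreement, kappa

# Invented example counts (not study data):
pa, na, k = agreement_stats(a=76, b=24, c=24, d=876)
print(f"PA={pa:.2f}, NA={na:.2f}, kappa={k:.2f}")  # PA=0.76, NA=0.97, kappa=0.73
```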

Conclusion

Obtaining concordance between independent readers seems to be an acceptable way to assure the reasonably high quality of image readings needed for clinical research.

Keywords

Brain · Newborn · Premature

Notes

Acknowledgements

This work was funded by a cooperative agreement with the National Institute of Neurological Disorders and Stroke (1 U01 NS 40069-01A2) and a program project grant from the National Institute of Child Health and Human Development (NIH-P30-HD-18655). The authors are also grateful for the assistance of all their colleagues, and the cooperation of the families of the infants who are the focus of our attention.

Copyright information

© Springer-Verlag 2007

Authors and Affiliations

  • Karl Kuban (1)
  • Ira Adler (2)
  • Elizabeth N. Allred (3)
  • Daniel Batton (4)
  • Steven Bezinque (5)
  • Bradford W. Betz (5)
  • Ellen Cavenagh (6)
  • Sara Durfee (7)
  • Kirsten Ecklund (8)
  • Kate Feinstein (9)
  • Lynn Ansley Fordham (10)
  • Frederick Hampf (11)
  • Joseph Junewick (5)
  • Robert Lorenzo (12)
  • Roy McCauley (13)
  • Cindy Miller (14)
  • Joanna Seibert (15)
  • Barbara Specter (16)
  • Jacqueline Wellman (17)
  • Sjirk Westra (18)
  • Alan Leviton (19)
  1. Division of Pediatric Neurology, Boston University Medical Center, Boston University School of Medicine, Boston, USA
  2. Eastern Radiologists, Inc., Greenville, USA
  3. Neuroepidemiology Unit, Children’s Hospital Boston, Harvard Medical School and Harvard School of Public Health, Boston, USA
  4. Departments of Pediatrics and Neonatology, William Beaumont Hospital, Royal Oak, USA
  5. Department of Radiology, DeVos Children’s Hospital, Grand Rapids, USA
  6. Department of Radiology, Sparrow Hospital, Lansing, USA
  7. Department of Radiology, Brigham & Women’s Hospital, Harvard Medical School, Boston, USA
  8. Department of Radiology, Children’s Hospital Boston, Harvard Medical School, Boston, USA
  9. Department of Radiology, University of Chicago Hospital, University of Chicago, Chicago, USA
  10. Department of Radiology, University of North Carolina School of Medicine, Chapel Hill, USA
  11. Department of Radiology, Baystate Medical Center, Springfield, USA
  12. Department of Radiology, Children’s Healthcare of Atlanta, Emory University School of Medicine, Atlanta, USA
  13. Department of Radiology, Tufts-New England Medical Center, Tufts University School of Medicine, Boston, USA
  14. Department of Radiology, Yale-New Haven Hospital, Yale University School of Medicine, New Haven, USA
  15. Department of Radiology, Arkansas Children’s Hospital, University of Arkansas Medical School, Little Rock, USA
  16. Department of Radiology, Forsyth Hospital, Baptist Medical Center, Wake Forest University School of Medicine, Winston-Salem, USA
  17. Department of Radiology, Milford Regional Medical Center, Milford, USA
  18. Division of Pediatric Radiology, Massachusetts General Hospital, Harvard Medical School, Boston, USA
  19. Neuroepidemiology Unit, Children’s Hospital Boston, Harvard Medical School, Boston, USA
