
Clinical Orthopaedics and Related Research®, Volume 470, Issue 7, pp 2029–2034

Training Improves Interobserver Reliability for the Diagnosis of Scaphoid Fracture Displacement

  • Geert A. Buijze
  • Thierry G. Guitton
  • C. Niek van Dijk
  • David Ring
  • The Science of Variation Group
Clinical Research

Abstract

Background

The diagnosis of displacement in scaphoid fractures is notorious for poor interobserver reliability.

Questions/purposes

We tested whether training improves interobserver reliability, as well as the sensitivity, specificity, and accuracy of the diagnosis of scaphoid fracture displacement on radiographs and CT scans.

Methods

Sixty-four orthopaedic surgeons rated a set of radiographs and CT scans of 10 displaced and 10 nondisplaced scaphoid fractures for the presence of displacement, using a web-based rating application. Before rating, observers were randomized to a training group (34 observers) or a nontraining group (30 observers). The training group completed an online training module before the rating session; the nontraining group did not. Interobserver reliability in each group was assessed with Siegel's multirater kappa, and a Z-test was used to test whether the difference between groups was significant.
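
For readers who want to see how a multirater agreement statistic of this kind is obtained, the sketch below computes a Fleiss-type multirater kappa in the spirit of Siegel and Castellan. The rating table, the two categories (displaced/nondisplaced), and all variable names are illustrative assumptions; this is not the authors' analysis code, and their exact formulation may differ.

```python
import numpy as np

def multirater_kappa(counts):
    """Fleiss-type multirater kappa for categorical ratings.

    counts: (n_cases, n_categories) array; counts[i, j] is the number of
    observers who assigned case i to category j. Each row must sum to the
    same number of observers.
    """
    counts = np.asarray(counts, dtype=float)
    n_cases, _ = counts.shape
    n_raters = counts.sum(axis=1)[0]  # observers per case (assumed constant)

    # Per-case agreement: proportion of concordant observer pairs.
    p_case = (np.sum(counts ** 2, axis=1) - n_raters) / (n_raters * (n_raters - 1))
    p_observed = p_case.mean()

    # Chance agreement from the overall category proportions.
    p_cat = counts.sum(axis=0) / (n_cases * n_raters)
    p_expected = np.sum(p_cat ** 2)

    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical illustration: each row is one fracture rated by 30 observers,
# columns = [displaced, nondisplaced].
ratings = np.array([[26, 4], [28, 2], [5, 25], [3, 27]])
print(round(multirater_kappa(ratings), 2))  # a moderate kappa for these made-up counts
```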

Results

There was a small but significant difference in interobserver reliability for displacement ratings in favor of the training group compared with the nontraining group. Ratings of radiographs and CT scans combined showed moderate agreement in both groups. The average sensitivity, specificity, and accuracy for the diagnosis of scaphoid fracture displacement were 83%, 85%, and 84%, respectively, in the nontraining group and 87%, 86%, and 87% in the training group. Assuming a 5% prevalence of fracture displacement, the positive predictive value was 0.23 in the nontraining group and 0.25 in the training group; the negative predictive value was 0.99 in both groups.
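
The predictive values quoted above follow directly from sensitivity, specificity, and the assumed 5% prevalence via Bayes' rule. The short sketch below (illustrative only, not the authors' code) reproduces the reported figures.

```python
def predictive_values(sensitivity, specificity, prevalence):
    """Positive and negative predictive value from Bayes' rule."""
    tp = sensitivity * prevalence              # true positives per unit population
    fp = (1 - specificity) * (1 - prevalence)  # false positives
    tn = specificity * (1 - prevalence)        # true negatives
    fn = (1 - sensitivity) * prevalence        # false negatives
    return tp / (tp + fp), tn / (tn + fn)

# Nontraining group: sensitivity 83%, specificity 85%, assumed prevalence 5%.
print(predictive_values(0.83, 0.85, 0.05))  # ~ (0.23, 0.99)
# Training group: sensitivity 87%, specificity 86%.
print(predictive_values(0.87, 0.86, 0.05))  # ~ (0.25, 0.99)
```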

Conclusions

Our results suggest that training can improve interobserver reliability as well as the sensitivity, specificity, and accuracy of diagnosing scaphoid fracture displacement, although the improvements are slight. These findings are encouraging for future research on interobserver variation and how to reduce it further.

Keywords

Training; interobserver reliability; displaced fracture; scaphoid fracture; interobserver variation

References

  1. Amadio PC, Berquist TH, Smith DK, Ilstrup DM, Cooney WP III, Linscheid RL. Scaphoid malunion. J Hand Surg Am. 1989;14:679–687.
  2. Bain GI, Bennett JD, MacDermid JC, Slethaug GP, Richards RS, Roth JH. Measurement of the scaphoid humpback deformity using longitudinal computed tomography: intra- and interobserver variability using various measurement techniques. J Hand Surg Am. 1998;23:76–81.
  3. Bankier AA, Fleischmann D, De Maertelaer V, Kontrus M, Zontsich T, Hittmair K, Mallek R. Subjective differentiation of normal and pathological bronchi on thin-section CT: impact of observer training. Eur Respir J. 1999;13:781–786.
  4. Berg WA, D'Orsi CJ, Jackson VP, Bassett LW, Beam CA, Lewis RS, Crewson PE. Does training in the Breast Imaging Reporting and Data System (BI-RADS) improve biopsy recommendations or feature analysis agreement with experienced breast imagers at mammography? Radiology. 2002;224:871–880.
  5. Bernard SA, Murray PM, Heckman MG. Validity of conventional radiography in determining scaphoid waist fracture displacement. J Orthop Trauma. 2010;24:448–451.
  6. Bernstein J, Adler LM, Blank JE, Dalsey RM, Williams GR, Iannotti JP. Evaluation of the Neer system of classification of proximal humeral fractures with computerized tomographic scans and plain radiographs. J Bone Joint Surg Am. 1996;78:1371–1375.
  7. Brorson S, Bagger J, Sylvest A, Hrobjartsson A. Improved interobserver variation after training of doctors in the Neer system: a randomised trial. J Bone Joint Surg Br. 2002;84:950–954.
  8. Cohen J. A coefficient of agreement for nominal scales. Educ Psychol Meas. 1960;20:37–46.
  9. Cooney WP, Dobyns JH, Linscheid RL. Fractures of the scaphoid: a rational approach to management. Clin Orthop Relat Res. 1980;149:90–97.
  10. Dabezies EJ, Mathews R, Faust DC. Injuries to the carpus: fractures of the scaphoid. Orthopedics. 1982;5:1510–1515.
  11. de Vet HC, Koudstaal J, Kwee WS, Willebrand D, Arends JW. Efforts to improve interobserver agreement in histopathological grading. J Clin Epidemiol. 1995;48:869–873.
  12. Desai VV, Davis TR, Barton NJ. The prognostic value and reproducibility of the radiological features of the fractured scaphoid. J Hand Surg Br. 1999;24:586–590.
  13. Eddeland A, Eiken O, Hellgren E, Ohlsson NM. Fractures of the scaphoid. Scand J Plast Reconstr Surg. 1975;9:234–239.
  14. Landis JR, Koch GG. The measurement of observer agreement for categorical data. Biometrics. 1977;33:159–174.
  15. Lozano-Calderon S, Blazar P, Zurakowski D, Lee SG, Ring D. Diagnosis of scaphoid fracture displacement with radiography and computed tomography. J Bone Joint Surg Am. 2006;88:2695–2703.
  16. Lujan ME, Chizen DR, Peppin AK, Kriegler S, Leswick DA, Bloski TG, Pierson RA. Improving inter-observer variability in the evaluation of ultrasonographic features of polycystic ovaries. Reprod Biol Endocrinol. 2008;6:30.
  17. Magnan MA, Maklebust J. The effect of Web-based Braden Scale training on the reliability of Braden subscale ratings. J Wound Ostomy Continence Nurs. 2009;36:51–59.
  18. Patel AB, Amin A, Sortey SZ, Athawale A, Kulkarni H. Impact of training on observer variation in chest radiographs of children with severe pneumonia. Indian Pediatr. 2007;44:675–681.
  19. Posner KL, Sampson PD, Caplan RA, Ward RJ, Cheney FW. Measuring interrater reliability among multiple raters: an example of methods for nominal data. Stat Med. 1990;9:1103–1115.
  20. Ring D, Patterson JD, Levitz S, Wang C, Jupiter JB. Both scanning plane and observer affect measurements of scaphoid deformity. J Hand Surg Am. 2005;30:696–701.
  21. Ripsweden J, Mir-Akbari H, Brolin EB, Brismar T, Nilsson T, Rasmussen E, Ruck A, Svensson A, Werner C, Winter R, Cederlund K. Is training essential for interpreting cardiac computed tomography? Acta Radiol. 2009;50:194–200.
  22. Sanders WE. Evaluation of the humpback scaphoid by computed tomography in the longitudinal axial plane of the scaphoid. J Hand Surg Am. 1988;13:182–187.
  23. Siegel S, Castellan NJ. Nonparametric Statistics for the Behavioral Sciences. 2nd ed. New York, NY: McGraw-Hill; 1988.
  24. Stieber J, Quirno M, Cunningham M, Errico TJ, Bendo JA. The reliability of computed tomography and magnetic resonance imaging grading of lumbar facet arthropathy in total disc replacement patients. Spine (Phila Pa 1976). 2009;34:E833–E840.
  25. Szabo RM, Manske D. Displaced fractures of the scaphoid. Clin Orthop Relat Res. 1988;230:30–38.

Copyright information

© The Association of Bone and Joint Surgeons® 2012

Authors and Affiliations

  • Geert A. Buijze (1)
  • Thierry G. Guitton (1)
  • C. Niek van Dijk (1)
  • David Ring (2)
  • The Science of Variation Group

  1. Orthopaedic Research Center Amsterdam, Department of Orthopaedic Surgery, Academic Medical Centre, Amsterdam, The Netherlands
  2. Orthopaedic Hand and Upper Extremity Service, Massachusetts General Hospital, Harvard Medical School, Yawkey Center, Boston, MA, USA
