
Skeletal Radiology, Volume 39, Issue 2, pp 155–160

Inter- and intra-observer variation in classification systems for impending fractures of bone metastases

  • Moataz El-Husseiny
  • Nigel Coleman
Scientific Article

Abstract

Background

The study was designed to assess the reproducibility and reliability of Mirels' scoring system and of the conventional scoring system for impending pathological fractures. The results of both classification systems influence the choice of therapeutic procedure offered to patients with bone metastases.
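
For readers unfamiliar with the system, the sketch below shows how a Mirels score is assembled from its four components (site, pain, lesion type and lesion size), using the categories published by Mirels (reference 2 in the list below). It is a minimal illustration only; the function names and example values are not taken from the study.

    # Minimal sketch of Mirels' scoring (Mirels 1989, reference 2), for illustration only.
    # Each of four factors is scored 1-3; the total ranges from 4 to 12, and a total of
    # 9 or more is commonly taken to indicate an impending pathological fracture.
    MIRELS_POINTS = {
        "site":   {"upper limb": 1, "lower limb": 2, "peritrochanteric": 3},
        "pain":   {"mild": 1, "moderate": 2, "functional": 3},
        "lesion": {"blastic": 1, "mixed": 2, "lytic": 3},
        "size":   {"<1/3": 1, "1/3-2/3": 2, ">2/3": 3},  # fraction of bone diameter
    }

    def mirels_score(site, pain, lesion, size):
        """Return the total Mirels score (4-12) for a single lesion."""
        return (MIRELS_POINTS["site"][site] + MIRELS_POINTS["pain"][pain]
                + MIRELS_POINTS["lesion"][lesion] + MIRELS_POINTS["size"][size])

    # Example: a lytic peritrochanteric lesion with functional pain, >2/3 of the diameter.
    print(mirels_score("peritrochanteric", "functional", "lytic", ">2/3"))  # prints 12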

Methods

Eight independent observers (four orthopaedic surgeons and four radiologists with varying clinical experience) scored blinded plain radiographs from 47 patients with bone metastases. Each observer scored the radiographs according to both the Mirels and the conventional systems. After 12 weeks, the observers scored the radiographs again. Inter- and intra-observer agreement for both systems was assessed using weighted kappa coefficients.
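
As a worked illustration of the agreement statistic, the sketch below computes a weighted kappa coefficient between two observers' scores using scikit-learn's cohen_kappa_score. The ratings are invented, and the choice of linear weights is an assumption, since the abstract does not state the weighting scheme used.

    # Illustrative weighted kappa between two raters (invented data, not study data).
    from sklearn.metrics import cohen_kappa_score

    rater_1 = [7, 9, 8, 10, 6, 9, 11, 8]   # e.g. Mirels totals given by observer 1
    rater_2 = [8, 9, 7, 10, 7, 8, 11, 9]   # the same radiographs scored by observer 2

    # weights="linear" penalises disagreements in proportion to their size;
    # "quadratic" weighting is the other common choice.
    kappa = cohen_kappa_score(rater_1, rater_2, weights="linear")
    print(f"weighted kappa = {kappa:.3f}")

Intra-observer reproducibility is computed the same way, comparing each observer's first and second readings of the same radiographs.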

Results

For intra-observer reproducibility, kappa values for the conventional system had a mean of 0.499 (SD 0.074), indicating moderate agreement, while Mirels' scoring system had a mean of 0.396 (SD 0.101), indicating fair agreement. For inter-observer reliability, kappa values for the conventional scoring system were 0.322 for the first test and 0.47 for the second, giving fair and moderate agreement, respectively. For Mirels' scoring system, the inter-observer kappa coefficient was 0.183 for the first test and 0.218 for the second, giving poor and fair agreement, respectively.
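
The verbal labels attached to these kappa values (poor, fair, moderate) follow a conventional banding of the kind associated with Landis and Koch (reference 5 below). One commonly used version of that banding, consistent with the labels in the Results, is sketched here purely as a reading aid; the exact cut-offs used by the authors are not stated in the abstract.

    # A commonly used banding for interpreting kappa values; the cut-offs here are
    # an assumption chosen to match the labels reported in the Results.
    def agreement_label(kappa):
        if kappa <= 0.20:
            return "poor"
        if kappa <= 0.40:
            return "fair"
        if kappa <= 0.60:
            return "moderate"
        if kappa <= 0.80:
            return "good"
        return "very good"

    for k in (0.183, 0.218, 0.322, 0.396, 0.47, 0.499):
        print(k, agreement_label(k))  # poor, fair, fair, fair, moderate, moderate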

Conclusions

The conventional scoring system showed better inter- and intra-observer agreement than Mirels' scoring system. Both systems fail to take into account factors such as co-morbidities and prognosis. We believe the conventional system is a good screening tool, but a new scoring system is required for impending pathological fractures.

Keywords

Classification systems · Mirels' scoring system · Pathological fractures

Notes

Acknowledgements

We thank Mr. A. Chakrabarti, Dr. M. Sparks and Dr. M. Crowe for giving their time to rate both classification systems.

References

  1. Carnesale P. Malignant tumours of bone. In: Canale ST, editor. Campbell's operative orthopaedics. Vol. 1. 10th ed. Amsterdam: Elsevier; 2003. p. 551.
  2. Mirels H. Metastatic disease in long bones. A proposed scoring system for diagnosing impending pathologic fractures. Clin Orthop Relat Res. 1989;(249):256–64.
  3. Van der Linden YM, et al. Comparative analysis of risk factors for pathological fracture with femoral metastases. J Bone Joint Surg Br. 2004;86(4):566–73.
  4. Cohen J. A coefficient of agreement for nominal scales. Educ Psychol Meas. 1960;20(1):37–46.
  5. Landis JR, Koch GG. An application of hierarchical kappa-type statistics in the assessment of majority agreement among multiple observers. Biometrics. 1977;33(2):363–74.
  6. Van der Linden YM, et al. Simple radiographic parameter predicts fracturing in metastatic femoral bone lesions: results from a randomised trial. Radiother Oncol. 2003;69(1):21–31.
  7. Cumming D, et al. Metastatic bone disease: the requirement for improvement in a multidisciplinary approach. Int Orthop. 2009;33(2):493–6.
  8. Damron TA, et al. Critical evaluation of Mirels' rating system for impending pathologic fractures. Clin Orthop Relat Res. 2003;(415 Suppl):S201–7.
  9. Evans AR, et al. Mirels' rating for humerus lesions is both reproducible and valid. Clin Orthop Relat Res. 2008;466(6):1279–84.
  10. Martin JS, Marsh JL. Current classification of fractures. Rationale and utility. Radiol Clin North Am. 1997;35(3):491–506.
  11. Wainwright AM, Williams JR, Carr AJ. Interobserver and intraobserver variation in classification systems for fractures of the distal humerus. J Bone Joint Surg Br. 2000;82(5):636–42.
  12. Frandsen PA, et al. Garden's classification of femoral neck fractures. An assessment of inter-observer variation. J Bone Joint Surg Br. 1988;70(4):588–90.
  13. Brumback RJ, Jones AL. Interobserver agreement in the classification of open fractures of the tibia. The results of a survey of two hundred and forty-five orthopaedic surgeons. J Bone Joint Surg Am. 1994;76(8):1162–6.

Copyright information

© ISS 2009

Authors and Affiliations

  1. University College London Hospitals, London, UK
  2. Orthopaedics and Trauma Unit, Queen Elizabeth Hospital, King's Lynn NHS Trust, Norfolk, UK
