A Lightweight CNN and Joint Shape-Joint Space (\(JS^2\)) Descriptor for Radiological Osteoarthritis Detection
Knee osteoarthritis (OA) is a common progressive, degenerative musculoskeletal disease worldwide that creates a heavy burden on patients, through reduced quality of life, and on society, through its financial impact. Therefore, any attempt to reduce the burden of the disease could help both patients and society. In this study, we propose a fully automated novel method, based on a combination of joint shape and convolutional neural network (CNN) based bone texture features, to distinguish between knee radiographs with and without radiographic osteoarthritis. Moreover, we report the first attempt at describing the bone texture using a CNN. Knee radiographs from the Osteoarthritis Initiative (OAI) and Multicenter Osteoarthritis (MOST) studies were used in the experiments. Our models were trained on 8953 knee radiographs from OAI and evaluated on 3445 knee radiographs from MOST. Our results demonstrate that fusing the proposed shape and texture parameters achieves state-of-the-art performance in radiographic OA detection, yielding an area under the ROC curve (AUC) of \(95.21\%\).
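The fusion of shape and texture parameters described above can be illustrated with a minimal late-fusion sketch: concatenate the two feature vectors per knee and train a linear classifier, scoring with ROC AUC. This is only an assumed pipeline for illustration; the paper's actual \(JS^2\) shape descriptor and CNN texture embedding are replaced here with synthetic stand-in features.

```python
# Minimal sketch of concatenation-based (late) feature fusion for binary
# OA detection, evaluated with ROC AUC. All features are synthetic
# stand-ins; the real JS^2 shape parameters and CNN texture embeddings
# from the paper are NOT reproduced here.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
y = rng.integers(0, 2, size=n)  # 0 = no radiographic OA, 1 = OA

# Synthetic "shape" (8-D) and "texture" (32-D) features, each weakly
# shifted by the label so the classes are separable.
shape = rng.normal(size=(n, 8)) + 0.8 * y[:, None]
texture = rng.normal(size=(n, 32)) + 0.5 * y[:, None]
fused = np.hstack([shape, texture])  # simple concatenation fusion

X_tr, X_te, y_tr, y_te = train_test_split(
    fused, y, test_size=0.3, random_state=0
)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"Fused AUC: {auc:.3f}")
```

In practice, fusion can also be done with separate per-modality classifiers whose scores are averaged; concatenation is simply the most direct baseline.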
Keywords: Knee osteoarthritis · Joint space width · Joint shape · Bone texture
The OAI is a public-private partnership comprised of five contracts (N01-AR-2-2258; N01-AR-2-2259; N01-AR-2-2260; N01-AR-2-2261; N01-AR-2-2262) funded by the National Institutes of Health, a branch of the Department of Health and Human Services, and conducted by the OAI Study Investigators. Private funding partners include Merck Research Laboratories; Novartis Pharmaceuticals Corporation, GlaxoSmithKline; and Pfizer, Inc. Private sector funding for the OAI is managed by the Foundation for the National Institutes of Health. This manuscript was prepared using an OAI public use data set and does not necessarily reflect the opinions or views of the OAI investigators, the NIH, or the private funding partners.
Multicenter Osteoarthritis Study (MOST) Funding Acknowledgment. MOST is comprised of four cooperative grants (Felson – AG18820; Torner – AG18832, Lewis – AG18947, and Nevitt – AG19069) funded by the National Institutes of Health, a branch of the Department of Health and Human Services, and conducted by MOST study investigators. This manuscript was prepared using MOST data and does not necessarily reflect the opinions or views of MOST investigators.
We would like to acknowledge the strategic funding of the University of Oulu, Infotech Oulu.
We gratefully acknowledge the help received from Aleksei Tiulpin who extracted the landmarks using BoneFinder® and the support of NVIDIA Corporation with the donation of the Quadro P6000 GPU used in this research.