Automated assessment of BI-RADS categories for ultrasound images using multi-scale neural networks with an order-constrained loss function


Abstract

Ultrasound imaging is one of the most frequently used diagnostic tools for detecting and analyzing abnormalities of the breast. Recently proposed methods for the automated analysis of breast ultrasound images have shown great success, especially in the classification of breast abnormalities into malignant or benign lesions. In this study, we explore the use of a deep convolutional neural network with a multi-scale module for the automated assessment of the BI-RADS category of breast ultrasound images. We propose a multi-input region-of-interest (ROI) extraction method that extracts the breast region from ultrasound images while avoiding deformation of the ROI images, yielding high classification performance. Moreover, we propose an order-constrained loss function that fully accounts for the continuity between BI-RADS categories and outperforms traditional loss functions. A large annotated dataset containing 8246 breast ultrasound images was collected to train and evaluate the proposed methods, and ablation experiments were performed to validate their effectiveness. Experimental results indicate that our method can mimic experienced radiologists in assessing the BI-RADS category of breast ultrasound images and that the automated interpretations could be acceptable in routine clinical breast ultrasound examination reports.
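The abstract names an order-constrained loss that exploits the ordinal nature of BI-RADS categories but does not give its exact form. Purely as an illustrative sketch, and not the authors' formulation, the following hypothetical PyTorch snippet shows one common way such an order-aware objective could be written: standard cross-entropy augmented with a penalty proportional to the expected ordinal distance between the predicted distribution and the true category. The class name `OrderAwareLoss`, the six-category setting, and the unweighted sum of the two terms are all assumptions made for illustration.

```python
# Hypothetical sketch of an order-aware loss for ordinal BI-RADS categories.
# This is NOT the paper's exact loss, which is not specified in the abstract;
# it only illustrates penalising predictions that land far from the true
# category more heavily than adjacent-category confusions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class OrderAwareLoss(nn.Module):
    """Cross-entropy plus an expected-ordinal-distance penalty."""

    def __init__(self, num_classes: int = 6):
        super().__init__()
        # Pairwise |i - j| distances between the ordered categories.
        idx = torch.arange(num_classes, dtype=torch.float32)
        self.register_buffer("dist", (idx[None, :] - idx[:, None]).abs())

    def forward(self, logits: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        # logits: (batch, num_classes); target: (batch,) integer labels.
        probs = F.softmax(logits, dim=1)
        # Expected ordinal distance of the predicted distribution from the label.
        order_penalty = (probs * self.dist[target]).sum(dim=1)
        ce = F.cross_entropy(logits, target, reduction="none")
        return (ce + order_penalty).mean()


# Usage with hypothetical shapes: six ordered categories, batch of 4 images.
criterion = OrderAwareLoss(num_classes=6)
loss = criterion(torch.randn(4, 6), torch.tensor([0, 2, 3, 5]))
```

In a formulation of this kind, confusing an image one category away from its label is penalised less than confusing it with a distant category, which is the intuition behind treating BI-RADS categories as ordered rather than independent classes.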


Acknowledgements

This work was financially supported by the National Major Science and Technology Projects of China (2018AAA0100201) and the Sichuan Science and Technology Program of China (2020JDRC0042).

Author information

Corresponding authors

Correspondence to Dan Deng or Zhang Yi.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Pi, Y., Li, Q., Qi, X. et al. Automated assessment of BI-RADS categories for ultrasound images using multi-scale neural networks with an order-constrained loss function. Appl Intell 52, 12943–12956 (2022). https://doi.org/10.1007/s10489-021-03140-5

