
Multimedia Tools and Applications, Volume 78, Issue 1, pp 1067–1080

Method for real-time automatic setting of ultrasonic image parameters based on deep learning

  • Dongyue Wang
  • Junjie Tian
  • Taeg Keun Whangbo

Abstract

In this paper, we propose a method for automatically setting ultrasonic image parameter values based on deep-learning image classification. The method first classifies ultrasonic images with a convolutional neural network and then sets the gray map and Gain parameters accordingly to acquire high-quality images. In the classification step, we initially tried to classify the images using GoogLeNet. However, because GoogLeNet has a complicated structure and a low operating speed, this paper proposes a new convolutional neural network structure to classify the images. The results show that the customized classifier achieves faster recognition without compromising performance, thereby enabling rapid and automatic setting of ultrasonic image parameters.
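
As a rough illustration of the workflow described above, the sketch below classifies a single B-mode frame with a small placeholder CNN and then looks up a gray map and Gain preset for the predicted class. The network layout, class labels, and preset values are illustrative assumptions for this sketch, not the architecture or parameter tables reported in the paper.

```python
# Minimal sketch of the described pipeline: classify an ultrasound frame,
# then apply the gray map / Gain preset associated with the predicted class.
# The architecture and presets below are hypothetical placeholders.
import torch
import torch.nn as nn


class SmallUltrasoundCNN(nn.Module):
    """Hypothetical lightweight classifier standing in for the paper's custom network."""

    def __init__(self, num_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))


# Hypothetical lookup table: predicted exam class -> (gray map id, Gain in dB).
PRESETS = {0: ("map_A", 40), 1: ("map_B", 55), 2: ("map_C", 48), 3: ("map_D", 60)}


def auto_set_parameters(model: nn.Module, frame: torch.Tensor):
    """Return the gray map and Gain preset for one single-channel B-mode frame."""
    model.eval()
    with torch.no_grad():
        logits = model(frame.unsqueeze(0))  # add batch dimension
        cls = int(logits.argmax(dim=1))
    return PRESETS[cls]


if __name__ == "__main__":
    model = SmallUltrasoundCNN()
    frame = torch.rand(1, 128, 128)  # placeholder 128x128 grayscale frame
    gray_map, gain = auto_set_parameters(model, frame)
    print(f"Selected gray map: {gray_map}, Gain: {gain} dB")
```

In practice the class-to-preset mapping would be determined by the scanner's preset tables rather than hard-coded as above; the sketch only shows how a classification result can drive the parameter update in real time.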

Keywords

Ultrasonic image classification · Convolutional neural network · Deep learning

Notes

Acknowledgments

This work was supported by the GRRC program of Gyeonggi Province [GRRC-Gachon2017(B03), Development of Personalized Digital Support Technology based on Artificial Intelligence].


Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  1. Department of Computer Science, Gachon University, Seongnam-si, South Korea
