Vessel Detection in Ultrasound Images Using Deep Convolutional Neural Networks

  • Erik Smistad
  • Lasse Løvstakken
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10008)


Deep convolutional neural networks have achieved great results on image classification problems. In this paper, a new method using a deep convolutional neural network for detecting blood vessels in B-mode ultrasound images is presented. Automatic blood vessel detection may be useful in medical applications such as deep venous thrombosis detection, anesthesia guidance and catheter placement. The proposed method determines the position and size of the vessels in images in real-time. 12,804 subimages of the femoral region from 15 subjects were manually labeled. Leave-one-subject-out cross-validation gave an average accuracy of 94.5%, a major improvement over previous methods, which achieved 84% on the same dataset. The method was also validated on a dataset of the carotid artery, showing that it generalizes to blood vessels in other regions of the body; the accuracy on this dataset was 96%.
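The evaluation protocol named in the abstract, leave-one-subject-out cross-validation, can be sketched as follows. This is a hypothetical illustration, not the paper's code: all subimages from one subject are held out in turn, a model is fit on the remaining subjects, and the per-subject accuracies are averaged. The `majority_class` stub stands in for the paper's convolutional network, and the toy `data` list is invented for demonstration.

```python
from collections import Counter

def majority_class(labels):
    """Placeholder 'model': predicts the most common training label.
    In the paper this role is played by a deep convolutional network."""
    return Counter(labels).most_common(1)[0][0]

def loso_accuracy(samples):
    """samples: list of (subject_id, label) pairs.
    Returns mean accuracy over leave-one-subject-out folds."""
    subjects = sorted({s for s, _ in samples})
    accs = []
    for held_out in subjects:
        train = [lbl for s, lbl in samples if s != held_out]
        test = [lbl for s, lbl in samples if s == held_out]
        pred = majority_class(train)          # stand-in for CNN inference
        accs.append(sum(lbl == pred for lbl in test) / len(test))
    return sum(accs) / len(accs)

# Toy data: 3 subjects, binary vessel (1) / background (0) labels.
data = [(0, 1), (0, 1), (0, 0),
        (1, 1), (1, 0), (1, 0),
        (2, 1), (2, 1), (2, 1)]
print(round(loso_accuracy(data), 3))  # prints 0.667
```

Splitting by subject rather than by subimage matters here: subimages from the same subject are highly correlated, so a per-image split would overestimate how well the model transfers to unseen patients.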


Keywords: Graphics Processing Unit · Ultrasound Image · Convolutional Neural Network · Deep Neural Network · Vessel Segmentation



Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  1. Norwegian University of Science and Technology, Trondheim, Norway
  2. SINTEF Medical Technology, Trondheim, Norway
