Learning needle tip localization from digital subtraction in 2D ultrasound
This paper addresses localization of needles inserted both in-plane and out-of-plane in challenging ultrasound-guided interventions where the shaft and tip have low intensity. Our approach combines a novel digital subtraction scheme, which enhances the low-level intensity changes caused by tip movement in the ultrasound image, with a state-of-the-art deep learning scheme for tip detection.
As the needle tip moves through tissue, it causes subtle spatiotemporal variations in image intensity. Exploiting these intensity changes, we formulate a foreground detection scheme that enhances the tip from consecutive ultrasound frames. The enhanced tip is further regularized by solving a spatial total variation problem with the split Bregman method. Lastly, we filter out irrelevant motion events with an end-to-end, data-driven deep learning method that models the appearance of the needle tip in ultrasound images, yielding the final tip detection.
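The enhancement stage described above can be sketched in NumPy: consecutive frames are subtracted to expose motion-induced intensity changes, and the difference image is regularized with an anisotropic total variation model solved by split Bregman. This is a minimal illustration, not the authors' implementation; the periodic boundary conditions, FFT-based u-update, and parameter values (`mu`, `lam`, iteration count) are assumptions chosen for a self-contained example.

```python
import numpy as np

def grad(u):
    # forward differences with periodic boundaries
    return np.roll(u, -1, axis=1) - u, np.roll(u, -1, axis=0) - u

def div(px, py):
    # backward-difference divergence (negative adjoint of grad)
    return px - np.roll(px, 1, axis=1) + py - np.roll(py, 1, axis=0)

def shrink(x, t):
    # soft-thresholding (shrinkage) operator
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def tv_split_bregman(f, mu=10.0, lam=5.0, n_iter=30):
    """Anisotropic TV denoising, min_u |grad u|_1 + mu/2 ||u - f||^2,
    via split Bregman; the u-subproblem is solved exactly with the FFT."""
    u = f.copy()
    dx = np.zeros_like(f); dy = np.zeros_like(f)
    bx = np.zeros_like(f); by = np.zeros_like(f)
    ny, nx = f.shape
    # eigenvalues of the negative Laplacian under periodic boundaries
    wx = 2.0 * np.pi * np.fft.fftfreq(nx)
    wy = 2.0 * np.pi * np.fft.fftfreq(ny)
    lap = (2 - 2 * np.cos(wx))[None, :] + (2 - 2 * np.cos(wy))[:, None]
    denom = mu + lam * lap
    for _ in range(n_iter):
        # u-step: (mu - lam * Laplacian) u = mu f + lam div(b - d)
        rhs = mu * f + lam * div(bx - dx, by - dy)
        u = np.real(np.fft.ifft2(np.fft.fft2(rhs) / denom))
        # d-step: shrinkage on the shifted gradient
        ux, uy = grad(u)
        dx = shrink(ux + bx, 1.0 / lam)
        dy = shrink(uy + by, 1.0 / lam)
        # Bregman update
        bx += ux - dx
        by += uy - dy
    return u

def enhance_tip(prev_frame, curr_frame):
    # digital subtraction: motion-induced intensity change, then TV smoothing
    diff = np.abs(curr_frame.astype(float) - prev_frame.astype(float))
    return tv_split_bregman(diff)
```

The shrinkage threshold `1/lam` suppresses small, noise-like gradients in the difference image while the larger, coherent intensity change at the moving tip survives.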
The detection model is trained and evaluated on an extensive ex vivo dataset collected with 17G and 22G needles inserted in-plane and out-of-plane in bovine, porcine and chicken phantoms. We use 5000 images extracted from 20 video sequences for training and 1000 images from 10 sequences for validation. The overall framework is evaluated on 700 images from 20 sequences not used in training and validation, and achieves a tip localization error of 0.72 ± 0.04 mm and an overall processing time of 0.094 s per frame (~ 10 frames per second).
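The reported metrics reduce to straightforward arithmetic over the test frames. The sketch below shows, under assumed inputs, how a per-frame tip localization error in mm and a throughput figure could be computed; the coordinates and the 0.1 mm/pixel spacing are hypothetical illustration values, not data from the paper.

```python
import numpy as np

def tip_error_mm(pred_px, gt_px, mm_per_px):
    """Euclidean tip localization error in mm, one value per frame."""
    d = (np.asarray(pred_px, float) - np.asarray(gt_px, float)) * mm_per_px
    return np.linalg.norm(d, axis=1)

def summarize(errors_mm, frame_times_s):
    """Mean +/- std error and frames-per-second from per-frame timings."""
    mean = float(np.mean(errors_mm))
    std = float(np.std(errors_mm))
    fps = 1.0 / float(np.mean(frame_times_s))
    return mean, std, fps
```

For example, a mean processing time of 0.094 s per frame corresponds to 1/0.094 ≈ 10.6 frames per second, consistent with the ~10 fps figure above.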
The proposed method is faster and more accurate than the state of the art and is resilient to spatiotemporal redundancies. These promising results demonstrate its potential for accurate needle localization in challenging ultrasound-guided interventions.
Keywords: Needle tip localization · Ultrasound · Deep learning · Minimally invasive procedures
This work was accomplished with funding support from the North American Spine Society 2017 young investigator award.
Compliance with ethical standards
Conflict of interest
The authors declare that they have no conflict of interest.
This article does not contain any studies with human participants or animals performed by any of the authors.
This article does not contain patient data.
Supplementary material 1 (MP4 51745 kb)