Robust motion tracking in liver from 2D ultrasound images using supporters

  • Ece Ozkan
  • Christine Tanner
  • Matej Kastelic
  • Oliver Mattausch
  • Maxim Makhinya
  • Orcun Goksel
Original Article

Abstract

Purpose

The effectiveness of image-guided radiation therapy with precise dose delivery depends highly on accurate target localization; the target, however, may move during treatment due to, e.g., breathing and drift. It is therefore important to track this motion and adjust the radiation delivery accordingly. Tracking generally requires reliable target appearance and image features, whereas in ultrasound imaging acoustic shadowing and other artifacts may degrade target visibility, leading to substantial tracking errors. To minimize such errors, we propose a method based on so-called supporters, a tracking technique from computer vision. This allows us to leverage information from surrounding motion to improve the robustness of motion tracking in 2D ultrasound image sequences of the liver.

Methods

Image features that are potentially useful for predicting the target position are tracked individually, and a supporter model capturing the coupling between the motion of these features and that of the target is learned online. This model is then applied to predict the target position when the target cannot otherwise be tracked reliably.
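
The supporter idea can be illustrated with a minimal sketch: while the target is reliably tracked, each supporter feature keeps running statistics of its offset to the target; when the target is lost, these statistics yield a confidence-weighted prediction of its position. The snippet below is only an illustration under these assumptions; the exponential-forgetting update, the variance-based weighting, and all names and parameter values are ours and not the authors' exact formulation, which additionally relies on optic-flow feature tracking.

```python
import numpy as np

class Supporter:
    """One supporter feature and its (online) coupling to the target."""

    def __init__(self, pos, target_pos, forget=0.95):
        self.pos = np.asarray(pos, dtype=float)                 # current supporter position
        self.mean = np.asarray(target_pos, dtype=float) - self.pos  # running mean of target-supporter offset
        self.var = np.ones(2)                                   # running per-axis variance of the offset
        self.forget = forget                                    # exponential forgetting factor (assumed)

    def update(self, pos, target_pos):
        """Update the coupling statistics while the target is reliably tracked."""
        self.pos = np.asarray(pos, dtype=float)
        offset = np.asarray(target_pos, dtype=float) - self.pos
        diff = offset - self.mean
        self.mean = self.forget * self.mean + (1.0 - self.forget) * offset
        self.var = self.forget * self.var + (1.0 - self.forget) * diff ** 2

    def predict(self, pos):
        """Predict the target position and a confidence weight from this supporter."""
        pred = np.asarray(pos, dtype=float) + self.mean
        weight = 1.0 / (1.0 + self.var.sum())                   # tighter coupling -> higher weight
        return pred, weight


def predict_target(supporters, positions):
    """Confidence-weighted vote over all supporters, used when the target
    itself cannot be tracked reliably."""
    preds, weights = zip(*(s.predict(p) for s, p in zip(supporters, positions)))
    w = np.asarray(weights)
    return (np.asarray(preds) * w[:, None]).sum(axis=0) / w.sum()
```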

Results

The proposed method was evaluated on the Challenge on Liver Ultrasound Tracking (CLUST) 2015 dataset. Leave-one-out cross-validation was performed on the training set of 24 2D image sequences, each 1–5 min long. The method was then applied to the test set (24 2D sequences), where the results were evaluated by the challenge organizers, yielding a mean tracking error of 1.04 mm and a 95th-percentile error of 2.26 mm over all targets. We also devised a simulation framework to emulate acoustic shadowing artifacts from the ribs, with which we showed that tracking remains effective despite the shadows.
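
Conceptually, such a shadowing experiment darkens the image region distal to a rib along the beam direction. The sketch below does only that with a crude axial attenuation mask; the paper's framework is more sophisticated (ray-traced ultrasound simulation), so the function name, geometry, and parameters here are purely illustrative assumptions.

```python
import numpy as np

def add_rib_shadow(frame, rib_row, rib_col, half_width=15, attenuation=0.1):
    """Darken the region distal to a hypothetical rib cross-section.

    frame       : 2D ndarray, B-mode image with depth along axis 0.
    rib_row     : depth index of the rib surface.
    rib_col     : lateral index of the rib centre.
    half_width  : lateral half-width of the shadow, in pixels.
    attenuation : multiplicative intensity factor inside the shadow.
    """
    shadowed = frame.astype(float).copy()
    c0, c1 = max(0, rib_col - half_width), rib_col + half_width + 1
    shadowed[rib_row:, c0:c1] *= attenuation                  # everything below the rib is attenuated
    return shadowed

# Example: shadow a synthetic 400x300 frame behind a rib at depth 80, lateral 150.
frame = np.random.rand(400, 300)
shadowed = add_rib_shadow(frame, rib_row=80, rib_col=150)
```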

Conclusions

The results support the feasibility and demonstrate the advantages of using supporters. The proposed method improves on its baseline tracker, which uses optic flow and elliptic vessel models, and yields the state-of-the-art real-time tracking solution for the CLUST challenge.

Keywords

Tracking liver in ultrasound · Respiratory motion compensation · Image-guided radiation therapy · Supporters


Copyright information

© CARS 2017

Authors and Affiliations

  • Ece Ozkan¹
  • Christine Tanner¹
  • Matej Kastelic¹
  • Oliver Mattausch¹
  • Maxim Makhinya¹
  • Orcun Goksel¹

  1. Computer-Assisted Applications in Medicine Group, Computer Vision Lab, Department of Information Technology and Electrical Engineering, ETH Zurich, Zurich, Switzerland
