
Robust motion tracking in liver from 2D ultrasound images using supporters

  • Original Article
  • Published in: International Journal of Computer Assisted Radiology and Surgery

Abstract

Purpose

Effectiveness of image-guided radiation therapy with precise dose delivery depends highly on accurate target localization, yet targets may move during treatment due to, e.g., breathing and drift. It is therefore important to track this motion and adjust the radiation delivery accordingly. Tracking generally requires a reliable target appearance and image features, whereas in ultrasound imaging acoustic shadowing and other artifacts may degrade the visibility of a target, leading to substantial tracking errors. To minimize such errors, we propose a method based on so-called supporters, a computer-vision tracking technique that allows us to leverage information from surrounding motion to improve the robustness of motion tracking in 2D ultrasound image sequences of the liver.

Methods

Image features that are potentially useful for predicting the target position are individually tracked, and a supporter model capturing the coupling of motion between these features and the target is learned online. This model is then used to predict the target position whenever the target cannot otherwise be tracked reliably.
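The supporter mechanism described above can be sketched in a few lines: each supporter's offset to the target is learned online while the target is reliably tracked, and when tracking fails the target is predicted from the supporters, weighting those with the most stable coupling highest. This is a minimal illustrative formulation in the spirit of the supporter technique [7]; the class name, update rule, and inverse-variance weighting are our own assumptions, not the paper's exact model.

```python
import numpy as np

class SupporterModel:
    """Minimal supporter model (illustrative, not the paper's exact form).

    For each supporter i, the relative offset r_i = target - supporter_i
    is tracked with an exponential moving mean and a scalar variance;
    the target is predicted as an inverse-variance-weighted average of
    supporter_i + mean_offset_i.
    """

    def __init__(self, n_supporters, alpha=0.1):
        self.alpha = alpha                        # online learning rate
        self.mean = np.zeros((n_supporters, 2))   # mean offsets to target
        self.var = np.ones(n_supporters)          # offset variances

    def update(self, target, supporters):
        """Learn from a frame in which the target was tracked reliably."""
        r = target - supporters                   # observed offsets, (n, 2)
        err = np.sum((r - self.mean) ** 2, axis=1)
        self.mean += self.alpha * (r - self.mean)
        self.var += self.alpha * (err - self.var)

    def predict(self, supporters):
        """Predict the target from supporter positions alone."""
        w = 1.0 / (self.var + 1e-6)               # stable supporters weigh more
        w /= w.sum()
        return np.sum(w[:, None] * (supporters + self.mean), axis=0)
```

For a target moving rigidly with its supporters, the learned offsets converge to the true relative geometry, so the prediction follows the supporters even when the target itself becomes invisible.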

Results

The proposed method was evaluated using the Challenge on Liver Ultrasound Tracking (CLUST)-2015 dataset. Leave-one-out cross-validation was performed on the training set of 24 2D image sequences of 1–5 min each. The method was then applied to the test set (24 2D sequences), where the results were evaluated by the challenge organizers, yielding a mean tracking error of 1.04 mm and a 95th-percentile error of 2.26 mm over all targets. We also devised a simulation framework to emulate acoustic shadowing artifacts from the ribs, under which tracking remained effective despite the shadows.

Conclusions

The results support the feasibility of the approach and demonstrate the advantages of using supporters. The proposed method improves on its baseline tracker, which uses optic flow and elliptic vessel models, and yields a state-of-the-art real-time tracking solution for the CLUST challenge.


References

  1. Keall PJ, Mageras GS, Balter JM, Emery RS, Forster KM, Jiang SB, Kapatoes JM, Low DA, Murphy MJ, Murray BR et al (2006) The management of respiratory motion in radiation oncology report of AAPM Task Group 76. Med Phys 33(10):3874–3900

  2. Mageras GS, Yorke E (2004) Deep inspiration breath hold and respiratory gating strategies for reducing organ motion in radiation treatment. Semin Radiat Oncol 14(1):65–75

  3. De Luca V, Székely G, Tanner C (2015) Estimation of large-scale organ motion in B-mode ultrasound image sequences: a survey. Ultrasound Med Biol 41(12):3044–3062

  4. Vijayan S, Klein S, Hofstad EF, Lindseth F, Ystgaard B, Langø T (2013) Validation of a non-rigid registration method for motion compensation in 4D ultrasound of the liver. In: 2013 IEEE 10th international symposium on biomedical imaging, pp 792–795

  5. De Luca V, Tschannen M, Székely G, Tanner C (2013) A learning-based approach for fast and robust vessel tracking in long ultrasound sequences. In: International conference on medical image computing and computer-assisted intervention. Springer, pp 518–525

  6. Makhinya M, Goksel O (2015) Motion tracking in 2D ultrasound using vessel models and robust optic-flow. In: MICCAI 2015 challenge on liver ultrasound tracking

  7. Grabner H, Matas J, Van Gool L, Cattin P (2010) Tracking the invisible: learning where the object might be. In: International conference on computer vision and pattern recognition (CVPR), pp 1285–1292

  8. Yanagawa Y, Echigo T, Vu H, Okazaki H, Fujiwara Y, Arakawa T, Yagi Y (2012) Tracking abnormalities in video capsule endoscopy using surrounding features with a triangular constraint. In: International symposium on biomedical imaging (ISBI)

  9. Chakraborty A, Roy-Chowdhury AK (2015) Context aware spatio-temporal cell tracking in densely packed multilayer tissues. Med Image Anal 19(1):149–163


  10. Xia Y, Hussein S, Singh V, John M, Wu Y, Chen T (2016) Context region discovery for automatic motion compensation in fluoroscopy. Int J Comput Assist Radiol Surg 11(6):1–9


  11. Sun Z, Yao H, Zhang S, Sun X (2011) Robust visual tracking via context objects computing. In: 18th IEEE international conference on image processing, pp 509–512

  12. Xiong F, Camps OI, Sznaier M (2012) Dynamic context for tracking behind occlusions. In: European conference on computer vision (ECCV), pp 580–593

  13. Meng L, Jia Q (2013) Multi-target tracking based on level set segmentation and contextual information. Int J Signal Process Image Process Pattern Recognit 6(4):287–296


  14. Zhang L, Van Der Maaten L (2014) Preserving structure in model-free tracking. IEEE Trans Pattern Anal Mach Intell 36(4):756–769


  15. Meshgi K, Maeda S-I, Oba S, Skibbe H, Li Y-Z, Ishii S (2016) An occlusion-aware particle filter tracker to handle complex and persistent occlusions. Comput Vis Image Underst 150:81–94


  16. Samei G, Chlebus G, Székely G, Tanner C (2013) Adaptive confidence regions of motion predictions from population exemplar models. In: MICCAI workshop on computational and clinical challenges in abdominal imaging, pp 231–240

  17. De Luca V, Benz T, Kondo S, König L, Lübke D, Rothlübbers S, Somphone O, Allaire S, Lediju Bell M, Chung D, Cifor A, Grozea C, Günther M, Jenne J, Kipshagen T, Kowarschik M, Navab N, Rühaak J, Schwaab J, Tanner C (2015) The 2014 liver ultrasound tracking benchmark. Phys Med Biol 60(14):5571

  18. Mohr M, Abrams E, Engel C, Long WB, Bottlang M (2007) Geometry of human ribs pertinent to orthopedic chest-wall reconstruction. J Biomech 40(6):1310–1317


  19. Mattausch O, Goksel O (2016) Monte-Carlo ray-tracing for realistic interactive ultrasound simulation. In: Eurographics workshop on visual computing for biology and medicine

  20. Lucas B, Kanade T (1981) An iterative image registration technique with an application to stereo vision. In: Proceedings of imaging understanding workshop, pp 121–130

  21. Crimi A, Makhinya M, Baumann U, Thalhammer C, Székely G, Goksel O (2016) Automatic measurement of venous pressure using B-mode ultrasound. IEEE Trans Biomed Eng 63(2):288–299



Acknowledgements

This work is supported by the Swiss National Science Foundation.


Corresponding author

Correspondence to Ece Ozkan.

Ethics declarations

Conflict of interest

All the authors declare that they have no conflict of interest.

Appendix: Baseline method: tracking by Makhinya and Goksel (TMG)


Our previously developed tracker [6], the runner-up of the Challenge on Liver Ultrasound Tracking (CLUST)-2015 and based on optic flow and an elliptic vessel model, is employed as the object tracker for both the supporters and the target. The method is summarized below for completeness. Note that it can track several landmarks simultaneously in real time, faster than the US acquisition rate.

Overview In the initial frame, the method decides whether the target is vessel-like by matching it against ellipsoid vessel templates, and then integrates several tracking strategies. It uses reference tracking (RT) when the local appearance in the initial frame, \({\mathbf {I}}^{0}\), and the current frame, \({\mathbf {I}}^{f}\), are similar, and model-based iterative tracking (IT) when RT fails and the local appearance of the consecutive frames \({\mathbf {I}}^{f-1}\) and \({\mathbf {I}}^{f}\) is similar. Robust motion tracking is applied in either case; for vessel-like structures, it is further improved by model-based tracking.
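The choice between RT and IT can be sketched as a simple appearance-similarity test. Normalized cross-correlation and the threshold value here are illustrative assumptions; the actual similarity criterion of [6] may differ.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two image patches."""
    a = a - a.mean()
    b = b - b.mean()
    d = np.linalg.norm(a) * np.linalg.norm(b)
    return float((a * b).sum() / d) if d > 0 else 0.0

def choose_strategy(patch_init, patch_prev, patch_cur, tau=0.7):
    """RT if the current appearance still matches the initial frame I^0,
    IT if it only matches the previous frame I^{f-1}, else fall back."""
    if ncc(patch_init, patch_cur) >= tau:
        return "RT"
    if ncc(patch_prev, patch_cur) >= tau:
        return "IT"
    return "FALLBACK"
```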

Motion tracking Lucas–Kanade tracking [20] is applied to a set of regularly spaced grid points around each target. RT exploits the repetitive characteristic of breathing motion, while IT tracks the motion during the rest of the cycle, i.e., when RT fails. Each tracking strategy yields several motion vectors, which are then filtered for outliers. Finally, an affine transform is computed from the remaining motion vectors to provide a robust motion estimate for the target.
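The outlier filtering and affine estimation can be sketched as an iteratively re-fit least-squares transform that discards motion vectors with large residuals. The median-based threshold is an illustrative choice, not necessarily the exact filter used in [6].

```python
import numpy as np

def robust_affine(src, dst, n_iter=3, k=2.0):
    """Fit an affine transform mapping src -> dst points, iteratively
    discarding motion vectors whose residual exceeds k times the
    median residual (illustrative outlier filter)."""
    ones = np.ones((len(src), 1))
    keep = np.ones(len(src), dtype=bool)
    A = None
    for _ in range(n_iter):
        X = np.hstack([src[keep], ones[keep]])
        A, *_ = np.linalg.lstsq(X, dst[keep], rcond=None)
        res = np.linalg.norm(np.hstack([src, ones]) @ A - dst, axis=1)
        keep = res <= k * np.median(res) + 1e-9
    return A  # shape (3, 2): apply as [x, y, 1] @ A

def apply_affine(A, pts):
    """Apply the fitted transform to points of shape (n, 2)."""
    return np.hstack([pts, np.ones((len(pts), 1))]) @ A
```

A grid of optic-flow vectors contaminated by one bad track still yields the correct target motion once the outlier is rejected.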

Model-based tracking For vessel-like structures, model-based tracking uses an axis-aligned ellipse representation of vessels. For each frame \({\mathbf {I}}^{f}\), the center is first transformed by the affine transform determined by motion tracking (see above), and then the center and radii are re-estimated as in [21] using star edge detection, dynamic programming, model fitting, and binary template matching. The center of the resulting ellipse is used as the estimated target position in frame \({\mathbf {I}}^{f}\).
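The ellipse re-estimation step can be illustrated with a direct least-squares fit of an axis-aligned ellipse to candidate edge points. This is a simplified stand-in for the star-edge-detection, dynamic-programming, and model-fitting pipeline of [21], shown only to make the ellipse parameterization concrete.

```python
import numpy as np

def fit_axis_aligned_ellipse(pts):
    """Least-squares fit of an axis-aligned ellipse
    A*x^2 + B*y^2 + C*x + D*y = 1 to edge points of shape (n, 2);
    returns center (cx, cy) and radii (rx, ry)."""
    x, y = pts[:, 0], pts[:, 1]
    M = np.column_stack([x * x, y * y, x, y])
    A, B, C, D = np.linalg.lstsq(M, np.ones(len(pts)), rcond=None)[0]
    cx, cy = -C / (2 * A), -D / (2 * B)
    # Completing the square: A (x - cx)^2 + B (y - cy)^2 = F
    F = 1 + C * C / (4 * A) + D * D / (4 * B)
    return cx, cy, float(np.sqrt(F / A)), float(np.sqrt(F / B))
```

In the actual tracker, the motion-predicted center seeds the edge search; given clean edge points, the fit recovers center and radii directly.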


Cite this article

Ozkan, E., Tanner, C., Kastelic, M. et al. Robust motion tracking in liver from 2D ultrasound images using supporters. Int J CARS 12, 941–950 (2017). https://doi.org/10.1007/s11548-017-1559-8
