3D ultrasound registration-based visual servoing for neurosurgical navigation
We present a fully image-based visual servoing framework for neurosurgical navigation and needle guidance. The proposed servo-control scheme compensates for movements of the target anatomy, maintaining high navigational accuracy over time, and automatically aligns a needle guide for accurate manual insertions.
Our system comprises a motorized 3D ultrasound (US) transducer mounted on a robotic arm and equipped with a needle guide. It continuously registers US sweeps in real time with a pre-interventional plan based on CT or MR images and annotations. While a visual control law maintains anatomy visibility and alignment of the needle guide, a force controller maintains acoustic coupling and regulates tissue pressure. We validate the servoing capabilities of our method on a geometric gel phantom and on real human anatomy, and assess the needle targeting accuracy using CT images on a lumbar spine gel phantom under neurosurgical conditions.
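The paper does not state the control law explicitly; as a minimal sketch, the visual part of such a servo loop can be viewed as a proportional controller on the pose error between the current probe pose and the target pose recovered by registration. The sketch below assumes poses are given as 4×4 homogeneous matrices (translations in mm); all function names are illustrative, not from the authors' implementation.

```python
import numpy as np

def pose_error(T_current, T_target):
    """Relative transform taking the current probe pose to the target pose.

    Both arguments are 4x4 homogeneous matrices; the target pose would come
    from registering the latest US sweep to the pre-interventional plan.
    """
    return np.linalg.inv(T_current) @ T_target

def servo_velocity(T_err, gain=0.5):
    """Proportional control step: velocity command from the pose error.

    Returns a translational velocity (from the position error) and a
    rotational velocity (from the axis-angle form of the rotation error).
    """
    v = gain * T_err[:3, 3]                      # translational part
    R = T_err[:3, :3]
    cos_angle = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)
    angle = np.arccos(cos_angle)                 # rotation magnitude
    if angle < 1e-9:                             # no rotational error
        w = np.zeros(3)
    else:                                        # rotation axis from skew part
        axis = np.array([R[2, 1] - R[1, 2],
                         R[0, 2] - R[2, 0],
                         R[1, 0] - R[0, 1]]) / (2.0 * np.sin(angle))
        w = gain * angle * axis
    return v, w
```

For example, a target displaced 10 mm along the probe's x-axis yields a commanded velocity of 5 mm per control cycle with the default gain; in the real system this visual command would be blended with the force controller that keeps the transducer coupled to the skin.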
Despite the varying resolution of the acquired 3D sweeps, we achieved direction-independent translational and rotational positioning errors of \(0.35\pm 0.19\) mm and \(0.61^\circ \pm 0.45^\circ \), respectively. Our method is capable of compensating for movements of around 25 mm/s and works reliably on human anatomy with errors of \(1.45\pm 0.78\) mm. In all four manual insertions by an expert surgeon, the needle was successfully inserted into the facet joint, with an estimated targeting accuracy of \(1.33\pm 0.33\) mm, superior to the gold standard.
The experiments demonstrated the feasibility of robotic ultrasound-based navigation and needle guidance for neurosurgical applications such as lumbar spine injections.
Keywords: Registration-based visual servoing · 3D ultrasound · Neurosurgical navigation · Needle insertion
We thank ImFusion GmbH, Munich, Germany, for providing their image processing framework and their continuous support, and the department of nuclear medicine at Klinikum Rechts der Isar for several CT acquisitions. Furthermore, we wish to thank Julia Rackerseder for the production of the used phantoms and Rüdiger Göbl for his assistance during experiments.
Funding This work was partially funded by the Bayerische Forschungsstiftung Award Number AZ-1072-13 (project RoBildOR).
Compliance with ethical standards
Conflict of interest
The authors declare that they have no conflict of interest.
Informed consent was obtained from all individual participants included in the volunteer study. All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and national research committees.