An “eye-in-body” integrated surgery robot system for stereotactic surgery
Current stereotactic surgical robot systems rely on cumbersome operations such as calibration, tracking, and registration to establish an accurate intraoperative coordinate transformation chain, which makes them difficult to use. To overcome this problem, a novel stereotactic surgical robot system has been proposed and validated.
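For context, one typical link in such a coordinate transformation chain — registering the patient (or a reference tool) to the tracker from matched fiducial points — is commonly computed with an SVD-based rigid fit (the Arun/Kabsch method). The sketch below is illustrative of that standard technique only; the function name and interface are assumptions, not the paper's implementation.

```python
import numpy as np

def register_points(src, dst):
    """Rigid (rotation + translation) point-set registration via SVD.

    src, dst: (N, 3) arrays of corresponding fiducial points, e.g. marker
    positions in image space and in tracker space.
    Returns R, t such that R @ src[i] + t ~= dst[i].
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) solution
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])
    R = Vt.T @ D @ U.T
    t = cd - R @ cs
    return R, t
```

With four or more non-coplanar fiducials this yields the least-squares rigid transform in closed form, which is why variants of it underlie most patient-to-image registration pipelines.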
First, a hand–eye integrated scheme is proposed to avoid intraoperative calibration between the robot arm and the motion tracking system. Second, a special reference-tool-based patient registration and tracking method is developed to avoid intraoperative registration. Third, a model-free visual servo method is used to relax the accuracy requirements on the hand–eye relationship and the robot kinematic model. Finally, a prototype of the system is constructed, and performance tests and a pedicle screw drilling experiment are performed.
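Model-free (uncalibrated) visual servoing of the kind named above is often realized by estimating the image Jacobian online with a Broyden rank-one update instead of deriving it from a calibrated hand–eye transform and robot model. The following is a minimal sketch of that generic technique, not the paper's controller; the function names, gain, and interfaces are assumptions.

```python
import numpy as np

def broyden_visual_servo(measure_error, apply_motion, q0, J0,
                         gain=0.5, tol=1e-4, max_iter=100):
    """Uncalibrated visual servo loop with online Jacobian estimation.

    measure_error(q) -> image-space error vector at configuration q.
    apply_motion(q, dq) -> new configuration after commanding step dq.
    J0 is only a rough initial Jacobian guess; it is refined online,
    so neither the hand-eye transform nor the kinematic model must
    be accurate.
    """
    q = np.asarray(q0, dtype=float)
    J = np.asarray(J0, dtype=float)
    e = measure_error(q)
    for _ in range(max_iter):
        if np.linalg.norm(e) < tol:
            break
        # Damped Gauss-Newton step toward zero image error
        dq = -gain * np.linalg.pinv(J) @ e
        q = apply_motion(q, dq)
        e_new = measure_error(q)
        # Broyden rank-one update: correct J with the observed change
        de = e_new - e
        J = J + np.outer(de - J @ dq, dq) / (dq @ dq + 1e-12)
        e = e_new
    return q, e
```

Because the Jacobian is corrected from observed error changes at every step, a crude initial guess (even the identity matrix) suffices, which is exactly the property that tolerates hand–eye and kinematic model errors.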
The results show that the proposed system achieves acceptable accuracy. The in-plane target positioning errors were −0.68 ± 0.52 mm and 0.06 ± 0.41 mm, and the orientation error was 0.43 ± 0.25°. The pedicle screw drilling experiment shows that the system can complete accurate stereotactic surgery.
The stereotactic surgical robot system described in this paper can perform stereotactic surgery without intraoperative hand–eye calibration or manual registration, achieving acceptable position and orientation accuracy while tolerating errors in the hand–eye coordinate transformation and in the robot kinematic model.
Keywords: Stereotactic surgery · Surgical robotics · Model-free control · Patient tracking · Patient registration · Image-guided intervention
The authors acknowledge the support of the Ministry of Science and Technology of China (Grants 2017YFA0205904 and 2016YFC0105800).
Compliance with ethical standards
Conflict of interest
The authors declare that they have no conflict of interest.
This article does not contain any studies with human participants or animals performed by any of the authors.
This article does not contain patient data.