Perceptual Docking for Robotic Control
- Cite this paper as:
- Yang GZ., Mylonas G.P., Kwok KW., Chung A. (2008) Perceptual Docking for Robotic Control. In: Dohi T., Sakuma I., Liao H. (eds) Medical Imaging and Augmented Reality. MIAR 2008. Lecture Notes in Computer Science, vol 5128. Springer, Berlin, Heidelberg
In current robotic surgery, dexterity is enhanced by microprocessor-controlled mechanical wrists, which allow motion scaling for reduced gross hand movements and improved performance of micro-scale tasks. The continuing evolution of the technology, including force feedback and virtual immobilization through real-time motion adaptation, will permit complex procedures such as beating-heart surgery to be carried out under a static frame of reference. In pursuing more adaptive and intelligent robotic designs, the regulatory, ethical and legal barriers imposed on interventional surgical robots have given rise to the need for tightly integrated control between the operator and the robot when autonomy is considered. This paper outlines the general concept of perceptual docking for robotic control and how it can be used for learning and knowledge acquisition in robotic-assisted minimally invasive surgery, such that operator-specific motor and perceptual/cognitive behaviour is acquired through in situ sensing. A gaze-contingent framework is presented as an example to illustrate how saccadic eye movements and ocular vergence can be used for attention selection, recovery of 3D tissue deformation, and motor channelling during minimally invasive surgical procedures.
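The use of ocular vergence for 3D recovery rests on a simple geometric idea: the two eyes' gaze rays converge on the fixation point, so triangulating those rays yields a depth estimate of the tissue surface being attended to. The sketch below is an illustrative least-squares triangulation of a fixation point from two gaze rays; it is not the authors' implementation, and all function and variable names are hypothetical.

```python
import numpy as np

def fixation_from_vergence(left_origin, left_dir, right_origin, right_dir):
    """Estimate the 3D fixation point as the midpoint of the shortest
    segment between the two gaze rays (closest-point triangulation).

    Illustrative sketch only: origins are eye positions, dirs are gaze
    direction vectors (need not be unit length). Returns None when the
    gaze rays are near-parallel, i.e. no usable vergence signal.
    """
    d1 = np.asarray(left_dir, float) / np.linalg.norm(left_dir)
    d2 = np.asarray(right_dir, float) / np.linalg.norm(right_dir)
    o1 = np.asarray(left_origin, float)
    o2 = np.asarray(right_origin, float)

    # Standard closest-point-between-two-rays solution.
    w0 = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-9:
        return None  # rays near-parallel: fixation effectively at infinity
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    p1 = o1 + s * d1  # closest point on the left gaze ray
    p2 = o2 + t * d2  # closest point on the right gaze ray
    return 0.5 * (p1 + p2)
```

In a gaze-contingent setting, such a fixation estimate would be computed per-sample from a binocular eye tracker and fused over successive fixations to build up a deformation map of the tissue surface.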
Keywords: perceptual docking · minimally invasive surgery · perceptual feedback · eye tracking · machine vision · deformation recovery · 3D tracking · autonomous robot · robotic control · haptics · human-robot interfacing