Perceptual Docking for Robotic Control

  • Guang-Zhong Yang
  • George P. Mylonas
  • Ka-Wai Kwok
  • Adrian Chung
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5128)

Abstract

In current robotic surgery, dexterity is enhanced by microprocessor-controlled mechanical wrists that allow motion scaling, reducing gross hand movements and improving performance of micro-scale tasks. The continuing evolution of the technology, including force feedback and virtual immobilization through real-time motion adaptation, will permit complex procedures such as beating-heart surgery to be carried out under a static frame of reference. In pursuing more adaptive and intelligent robotic designs, the regulatory, ethical and legal barriers imposed on interventional surgical robots have given rise to the need for tightly integrated control between the operator and the robot whenever autonomy is considered. This paper outlines the general concept of perceptual docking for robotic control and how it can be used for learning and knowledge acquisition in robot-assisted minimally invasive surgery, such that operator-specific motor and perceptual/cognitive behaviour is acquired through in situ sensing. A gaze-contingent framework is presented as an example of how saccadic eye movements and ocular vergence can be used for attention selection, 3D tissue deformation recovery and motor channelling during minimally invasive surgical procedures.
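
To make the gaze-contingent framework concrete, the sketch below shows how a 3D fixation point might be triangulated from binocular gaze rays (exploiting ocular vergence) and how a simple attractive virtual fixture could channel the instrument tip toward that point. This is a minimal illustration under stated assumptions, not the authors' implementation: the function names, the midpoint triangulation of skew rays, and the saturated proportional force law are all assumptions made here for exposition.

```python
import numpy as np

def fixation_point(left_eye, left_dir, right_eye, right_dir):
    """Triangulate a 3D fixation point from two gaze rays.

    Ocular vergence means the visual axes of both eyes converge on the
    fixated target; tracked rays are usually skew, so the midpoint of
    the shortest segment between them serves as the fixation estimate.
    """
    d1 = np.asarray(left_dir, float)
    d1 /= np.linalg.norm(d1)
    d2 = np.asarray(right_dir, float)
    d2 /= np.linalg.norm(d2)
    o1 = np.asarray(left_eye, float)
    o2 = np.asarray(right_eye, float)
    w0 = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-9:            # near-parallel gaze: no vergence cue
        return None
    s = (b * e - c * d) / denom      # parameter along the left gaze ray
    t = (a * e - b * d) / denom      # parameter along the right gaze ray
    return 0.5 * ((o1 + s * d1) + (o2 + t * d2))

def channelling_force(tool_tip, target, k=5.0, f_max=2.0):
    """Gaze-contingent motor channelling as a simple virtual fixture:
    a proportional attractive force toward the fixation point,
    saturated so the haptic cue stays gentle (gains are assumed)."""
    f = k * (np.asarray(target, float) - np.asarray(tool_tip, float))
    n = np.linalg.norm(f)
    return f if n <= f_max else f * (f_max / n)

if __name__ == "__main__":
    # Example: eyes 6 cm apart, both fixating a point ~40 cm ahead.
    p = fixation_point([-0.03, 0, 0], [0.03, 0, 0.4],
                       [0.03, 0, 0], [-0.03, 0, 0.4])
    print("fixation point:", p)                        # ~[0, 0, 0.4]
    print("force on tool:", channelling_force([0.01, 0, 0.38], p))
```

In a full system the gaze rays would come from a calibrated binocular eye tracker expressed in the endoscopic camera frame, and the force would be rendered through the master haptic device; both calibration steps are omitted here.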

Keywords

perceptual docking · minimally invasive surgery · perceptual feedback · eye tracking · machine vision · deformation recovery · 3D tracking · autonomous robot · robotic control · haptics · human-robot interfacing

Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Guang-Zhong Yang¹
  • George P. Mylonas¹
  • Ka-Wai Kwok¹
  • Adrian Chung¹

  1. Royal Society/Wolfson Medical Image Computing Laboratory, Institute of Biomedical Engineering, Imperial College London, London, United Kingdom
