
Annals of Biomedical Engineering, Volume 42, Issue 12, pp 2600–2601

Letter to the Editor on “Designing a Wearable Navigation System for Image-Guided Cancer Resection Surgery”

  • Vincenzo Ferrari

I read with great interest the paper recently published in this prestigious journal,6 and I would like to initiate a discussion to clarify two topics that, in my opinion, are not clear.

Before posing the questions, however, I would like to offer new readers of this paper some information on this interesting and complex work. The goal of the paper was to describe and evaluate an original solution for visualizing the surgical margins that are visible under fluorescence imaging. As the authors state, current fluorescence imaging systems do not provide the surgeon with a natural point of view because the fluorescence images, which show anatomic structures and lesions not visible to the unaided eye, are generally reproduced on an additional tabletop or ceiling-mounted display. These solutions are ergonomic during endoscopic surgery because the surgeon usually operates while watching video images reproduced on a fixed display. To offer an ergonomic solution for open surgery, the implementation...
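For readers unfamiliar with this kind of display, the sketch below illustrates, in a purely hypothetical way, the image-fusion step such systems perform: a co-registered fluorescence channel is pseudo-colored and blended onto the visible-light camera frame so that otherwise invisible margins stand out. This is not the authors' implementation; the function name, parameters, and the use of OpenCV are assumptions made only for illustration.

```python
# Hypothetical sketch of fluorescence-on-RGB image fusion (not the paper's code).
import numpy as np
import cv2


def overlay_fluorescence(rgb_frame: np.ndarray,
                         nir_frame: np.ndarray,
                         alpha: float = 0.6,
                         threshold: int = 40) -> np.ndarray:
    """Blend a single-channel fluorescence image onto an RGB frame.

    rgb_frame : HxWx3 uint8 visible-light image
    nir_frame : HxW uint8 fluorescence intensity image (assumed co-registered)
    alpha     : blending weight of the pseudo-colored fluorescence signal
    threshold : intensity below which fluorescence is treated as background
    """
    # Pseudo-color the fluorescence signal so the margins stand out.
    colored = cv2.applyColorMap(nir_frame, cv2.COLORMAP_JET)
    # Blend only where the fluorescence signal exceeds the noise floor.
    mask = nir_frame > threshold
    blended = cv2.addWeighted(rgb_frame, 1.0 - alpha, colored, alpha, 0.0)
    fused = rgb_frame.copy()
    fused[mask] = blended[mask]
    return fused


if __name__ == "__main__":
    # Synthetic frames stand in for the camera and fluorescence inputs.
    rgb = np.full((240, 320, 3), 120, dtype=np.uint8)
    nir = np.zeros((240, 320), dtype=np.uint8)
    cv2.circle(nir, (160, 120), 40, 255, -1)  # a bright simulated "lesion"
    view = overlay_fluorescence(rgb, nir)
    print(view.shape, view.dtype)
```

In a wearable (head-mounted) configuration, the fused frame would be rendered in front of the surgeon's eyes rather than on a separate monitor, which is the ergonomic difference at stake in the paper under discussion.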


References

  1. Ferrari, V., G. Megali, E. Troia, A. Pietrabissa, and F. Mosca. A 3-D mixed-reality system for stereoscopic visualization of medical dataset. IEEE Trans. Biomed. Eng. 56:2627–2633, 2009.
  2. Genc, Y., F. Sauer, F. Wenzel, M. Tuceryan, and N. Navab. Optical see-through HMD calibration: a stereo method validated with a video see-through system. Proceedings of the IEEE and ACM International Symposium on Augmented Reality, 2000, pp. 165–174.
  3. Kellner, F., B. Bolte, G. Bruder, U. Rautenberg, F. Steinicke, M. Lappe, et al. Geometric calibration of head-mounted displays and its effects on distance estimation. IEEE Trans. Vis. Comput. Graph. 18:589–596, 2012.
  4. Pietrabissa, A., L. Morelli, M. Ferrari, A. Peri, V. Ferrari, A. Moglia, et al. Mixed reality for robotic treatment of a splenic artery aneurysm. Surg. Endosc. 24:1204, 2010.
  5. Sauer, F., A. Khamene, and S. Vogt. An augmented reality navigation system with a single-camera tracker: system design and needle biopsy phantom trial. International Conference on Medical Image Computing and Computer-Assisted Intervention, 2002, pp. 116–124.
  6. Shao, P., H. Ding, J. Wang, P. Liu, Q. Ling, J. Chen, et al. Designing a wearable navigation system for image-guided cancer resection surgery. Ann. Biomed. Eng. 2014.
  7. Sielhorst, T., C. Bichlmeier, S. M. Heining, and N. Navab. Depth perception – a major issue in medical AR: evaluation study by twenty surgeons. International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI). 9:364–372, 2006.

Copyright information

© Biomedical Engineering Society 2014

Authors and Affiliations

  1. EndoCAS Center, University of Pisa, Pisa, Italy
