Designing a Wearable Navigation System for Image-Guided Cancer Resection Surgery
A wearable surgical navigation system is developed for intraoperative imaging of surgical margins in cancer resection surgery. The system consists of an excitation light source, a monochromatic CCD camera, a host computer, and a wearable headset unit operating in either of two modes: head-mounted display (HMD) or Google Glass. In the HMD mode, a CMOS camera is mounted on a personal cinema system to capture the surgical scene in real time and transmit the images to the host computer through a USB port. In the Google Glass mode, a wireless connection is established between the glass and the host computer for image acquisition and data transport. A software program written in Python calls OpenCV functions for image calibration, co-registration, fusion, and augmented-reality display. The imaging performance of the surgical navigation system is characterized in a tumor-simulating phantom, and image-guided surgical resection is demonstrated in an ex vivo tissue model. Surgical margins identified by the wearable navigation system are coincident with those acquired by a standard small-animal imaging system, indicating the technical feasibility of intraoperative surgical margin detection. The proposed surgical navigation system combines the sensitivity and specificity of a fluorescence imaging system with the mobility of a wearable goggle. It can potentially be used by a surgeon to identify residual tumor foci and reduce the risk of recurrent disease without interfering with the regular resection procedure.
Keywords: Surgical resection margin · Fluorescence imaging · Google Glass · Surgical navigation · Head-mounted display · Augmented reality
This project was partially supported by the National Cancer Institute (R21CA15977) and the Fundamental Research Funds for the Central Universities. The authors are grateful to Ms. Chuangsheng Yin at the University of Science and Technology of China for her help with manuscript preparation, and to Drs. Edward Martin, Stephen Povoski, Michael Tweedle, and Alper Yilmaz at The Ohio State University for their technical and clinical help and suggestions.