Technology and computers are omnipresent in our daily lives, where they play a decisive role even when they remain invisible, seamlessly integrated into our surroundings and the objects we use every day. Image-guided therapy is no exception. Practically all imaging devices today rely on computer support, commonly for controlling medical devices, for post-processing acquired raw data into images, and for transmitting the resulting digital data for storage or further use. Furthermore, many therapeutic devices are equipped with sophisticated sensors and actuators that are likewise computer-controlled. Indeed, software support is today an intrinsic component of all phases of therapy.
In this chapter, we provide an overview of the tools and technologies that serve as the building blocks of almost every computational support system in image-guided therapy. The tools introduced include segmentation, registration, localization, simulation, model generation, visualization, robotic tools, and man–machine interfaces. Their use in preoperative planning and intraoperative surgical support is then described and illustrated with typical treatment scenarios.