
Instrumentation for Intraoperative Detection

  • Pat Zanzonico
Living reference work entry

Abstract

Focusing on nuclear and optical modalities, this chapter reviews the rapidly advancing technologies in intraoperative detection and imaging, covering not only currently routine intraoperative imaging technologies but also investigational technologies potentially adaptable to the intraoperative setting.

Keywords

Intraoperative imaging · Optical imaging · Gamma probes · Beta probes · Cerenkov imaging · Photoacoustic imaging · Raman effect

Glossary

[18F]FDG

2-deoxy-2-[18F]fluoro-d-glucose

ALA

5-aminolevulinic acid

CARS

Coherent anti-Stokes Raman scattering

CCD

Charge-coupled detector

CdTe

Cadmium telluride

CdZnTe (or CZT)

Cadmium zinc telluride

CMOS

Complementary metal oxide semiconductor

CsI(Na)

Sodium-doped cesium iodide

CsI(Tl)

Thallium-doped cesium iodide

CT

X-ray computed tomography

DOT

Diffuse optical tomography

EPR

Enhanced permeability and retention

FITC

Fluorescein isothiocyanate

FLI

Intraoperative fluorescence imaging

FLuc

Gene encoding for firefly luciferase

FOV

Field of view

FWHM

Full-width half-maximum

GFP

Green fluorescent protein

GSO

Cerium-doped gadolinium oxyorthosilicate

HgI2

Mercuric iodide

ICG

Indocyanine green

LED

Light-emitting diode

LEHR

Low-energy high-resolution

LSO

Cerium-doped lutetium oxyorthosilicate

MBq

Megabecquerel (10^6 becquerel)

MR

Magnetic resonance

MRI

Magnetic resonance imaging

MSOT

Multispectral optoacoustic tomography

NaI(Tl)

Thallium-doped sodium iodide

NBI

Narrowband imaging

NIR

Near-infrared

OCT

Optical coherence tomography

PA

Photoacoustic

PET

Positron emission tomography

PMT

Photomultiplier tube

POCI

Per-operative compact imager

PpIX

Protoporphyrin IX

RF

Radiofrequency

RGD

Tripeptide composed of L-arginine, glycine, and L-aspartic acid

SERS

Surface-enhanced Raman scattering (or surface-enhanced Raman spectroscopy)

SLN

Sentinel lymph node

SPECT

Single-photon emission computed tomography

SSGC

Small semiconductor gamma camera

US

Ultrasonography

Introduction

Focusing on nuclear and optical modalities, this chapter reviews the rapidly advancing technologies in intraoperative detection and imaging, covering not only currently routine intraoperative imaging technologies but also investigational technologies potentially adaptable to the intraoperative setting. The reader is referred to a recently published volume, “Imaging and Visualization in the Modern Operating Room: A Comprehensive Guide for Physicians,” which provides a detailed overview of the many logistical as well as technical considerations in modern intraoperative imaging [1].

Nuclear Counting and Imaging

Beginning with the pioneering studies of Sweet [2] 60 years ago, intraoperative probes (i.e., counters) have evolved into an important, well-established technology in the management of cancer [3, 4, 5, 6]. Such probes (see Fig. 1) are used in radioguided surgery to more expeditiously identify and localize sentinel lymph nodes and thereby reduce the extent and potential morbidity of surgical procedures and, to a much lesser extent, to identify and localize tumor margins as well as visually occult disease following systemic administration of a tumor-avid radiotracer (e.g., 2-deoxy-2-[18F]fluoro-d-glucose ([18F]FDG)).
Fig. 1

The general design and operating principles of an intraoperative gamma probe . The handheld probe (upper left panel) is comprised of a collimated, small-area (typically ~1 cm in diameter) scintillation or solid-state ionization (i.e., semiconductor) detector (right panel). The probe itself is connected to a control unit (lower left panel) which typically provides both a visual readout of the count rate and an audible signal related to the count rate, with the frequency of the latter signal increasing or decreasing in relation to the detected count rate [12, 13, 14]

Radionuclide-based detection and localization of tumors, especially small tumors, has several well-known limitations [7] which are mitigated through the use of intraoperative probes and, potentially, intraoperative gamma cameras. First, absolute tumor uptake of cancer-targeted radiotracers remains generally quite low, typically ~0.1% or less of the administered activity per gram. Second, overall radiation detection sensitivity in vivo is low as well, ranging from about 0.1% for gamma camera imaging (including SPECT) to ~10% for PET. This is exacerbated, of course, by the signal-degrading effect of attenuation of emitted radiation by overlying tissue. Third, a significant portion of the counts apparently emanating from a tumor or other targeted tissue may actually include counts originating elsewhere (i.e., from background activity in adjacent tissues) because of contrast- and resolution-degrading Compton scatter. However, because of the close proximity of a collimated detector to a tumor or sentinel lymph node which can be achieved at surgery, radionuclide detection of such structures can be enhanced using intraoperative probes or gamma cameras. In a study of simulated tumors in a torso phantom having uniform background activity, for example, Barber et al. demonstrated that a scintillation probe detected tumors with greater sensitivity than a gamma camera over a wide range of conditions provided the probe was placed within 1 cm of the tumor [8]. Alternatively, tumor or sentinel lymph node detection may also be improved under some circumstances using beta (negatron or positron), rather than gamma, detection [9, 10, 11] because the very short range (typically ~1 mm or less) of such particulate radiations eliminates the contribution of confounding counts from activity other than in the immediate vicinity of the detector. Of course, the short range of particulate radiations also limits the application of beta probes to intraoperative or endoscopic settings with the lesion near the surface of the exposed tissue.
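To put these figures in perspective, the following back-of-the-envelope sketch estimates the count rate a collimated probe might register when placed directly over a small lesion. The administered activity, uptake fraction, probe sensitivity, and attenuation values are illustrative assumptions only, not data from any particular tracer or system.

```python
import math

# Illustrative (assumed) values -- not measurements from any specific system.
administered_MBq = 740.0               # e.g., ~20 mCi of a tumor-avid radiotracer
uptake_fraction_per_g = 0.001          # ~0.1% of administered activity per gram of tumor
tumor_mass_g = 2.0                     # small lesion
probe_sensitivity_cps_per_MBq = 300.0  # assumed order-of-magnitude probe sensitivity at contact
mu_per_cm = 0.15                       # assumed effective attenuation coefficient (cm^-1)
overlying_tissue_cm = 1.0              # tissue between probe tip and lesion

tumor_MBq = administered_MBq * uptake_fraction_per_g * tumor_mass_g
attenuation = math.exp(-mu_per_cm * overlying_tissue_cm)
count_rate_cps = tumor_MBq * probe_sensitivity_cps_per_MBq * attenuation

print(f"Tumor activity: {tumor_MBq:.2f} MBq")
print(f"Expected probe count rate: {count_rate_cps:.0f} cps")
```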

Gamma probes. The most widely used type of intraoperative probe (e.g., in sentinel node detection) is the general-purpose “gamma” probe (Fig. 1), designed for counting of radionuclides emitting x- and/or γ-rays; such “single-photon” (i.e., non-positron) emitters include technetium-99m (99mTc), indium-111 (111In), iodine-123 (123I), and iodine-131 (131I) [12, 13, 14]. Gamma probes generally use inorganic (i.e., nonplastic) scintillation detectors or solid-state (i.e., semiconductor) ionization detectors and lead or tungsten shielding and collimation. Scintillators used in such probes include thallium-doped sodium iodide (NaI(Tl)), thallium- and sodium-doped cesium iodide (CsI(Tl) and CsI(Na), respectively), and cerium-doped lutetium oxyorthosilicate (LSO). Semiconductors used in intraoperative probes include cadmium telluride (CdTe), cadmium zinc telluride (CZT), and mercuric iodide (HgI2). The former offer higher sensitivity, while the latter offer better energy resolution (and therefore better scatter rejection). Clinically, however, scintillation detector- and ionization detector-based probes provide generally comparable performance [12, 13, 14]. Positron emitters such as fluorine-18 (18F) may also be counted with such probes by single-photon (i.e., noncoincidence) counting of the 511-keV annihilation γ-rays. As illustrated in Fig. 2, however, this requires thicker collimation and shielding to prevent significant numbers of such highly energetic γ-rays emitted from outside of the field of view (FOV) (as defined by the collimator aperture) from penetrating to the detector and thereby degrading spatial resolution as well as target-to-background contrast. This has also been demonstrated in preliminary clinical and preclinical studies [15, 16, 17, 18]. GF&E Tech GmbH (Seeheim, Germany) has marketed a novel probe for 511-keV annihilation γ-rays using “electronic collimation” based on the time differences in activation among multiple CsI(Tl) crystals [19].
Fig. 2

Comparative thickness of gamma probe collimation and shielding required for low- to medium-energy x- and γ-rays of single-photon emitters (a), for high-energy (511-keV) annihilation γ-rays of positron emitters (b), and for negatrons and positrons of beta particle emitters (c). Note the much thicker collimation and shielding required for counting of the annihilation γ-rays and the minimal collimation and shielding for counting of beta particles [12] (Courtesy of IntraMedical Imaging, Los Angeles, CA)

Beta probes. Because x- and γ-rays penetrate relatively long distances (circa 10 cm) of soft tissue, a major limitation of the use of gamma probes to specifically identify the target tissue in radioguided surgery is the presence of variable, generally high levels of background activity in normal tissues. Thus, even with a gamma probe centered over a tumor, the contribution of counts originating from activity in normal tissue underlying the tumor and even outside the field of view (due to penetration of the collimation and shielding) may degrade the tumor-to-normal tissue contrast (e.g., reducing tumor-to-normal tissue count ratios to less than 1.5:1 [15, 20, 21]) and thus tumor detectability to the point where lesions may be missed. A potential solution to this limitation of radioguided surgery is the use of so-called “beta” probes, that is, intraoperative probes which specifically count only charged-particle (negatron or positron) radiation. Because such particles have very short ranges in soft tissue (typically of the order of 1 mm or less), beta particles emitted by a tracer source outside the probe’s FOV or underlying the surface tissue do not reach the detector and are not counted; by the same token, minimal if any collimation and shielding is required (Fig. 2c). As a result, the discrimination between higher-activity tumor and lower-activity normal tissues is enhanced (i.e., the tumor-to-normal tissue count ratios are increased). Of course, the short path length of beta particles restricts the use of such probes to surface lesions; beta probes could not be used, for example, for (percutaneous) detection of sentinel lymph nodes.

Beta probes generally utilize either semiconductor or plastic scintillator detectors, since such detectors have lower effective atomic numbers and mass densities than inorganic scintillators such as NaI(Tl) and thus lower intrinsic efficiencies for x- and γ-rays, minimizing the potentially confounding count contribution of tracers emitting both beta (or conversion electron) and gamma radiation (e.g., iodine-131) [9, 10, 11, 21, 22]. For a pure beta particle emitter such as phosphorus-32, this would not be a problem [9, 23]. Daghighian et al. [9] have developed and evaluated a plastic scintillator-based positron probe (Fig. 3). The basic design of this dual-detector probe (Fig. 3a) includes two scintillation detectors, a central solid cylinder detector (designated “Detector 1” in Fig. 3a) and a hollow cylinder detector (designated “Detector 2” in Fig. 3a) in 1-mm-thick stainless steel cladding; the outputs of the two detectors are passed by fiber-optic cabling to separate PMTs (PMT 1 and PMT 2, respectively). The Detector 1 counts result from both positrons and the 511-keV annihilation γ-rays associated with positron emission, while the stainless steel cladding of Detector 2 completely attenuates the positrons and allows only the annihilation γ-rays to enter the detector and generate counts from that detector. Because of the differences in geometry and cladding between Detectors 1 and 2, their sensitivities for the 511-keV γ-rays differ. The Detector 1-to-Detector 2 ratio of the measured sensitivities for 511-keV γ-rays is the weighting factor by which the Detector 2 count rate is multiplied and then subtracted from the Detector 1 count rate to yield the Detector 1 positron-only count rate. This probe was evaluated using the phantom setup shown in Fig. 3b, with a small 18F-containing capsule simulating a tumor and a uniform 18F-filled cylindrical container simulating underlying normal tissue activity. The probe was then scanned across the phantom and the Detector 1 and 2 count rates at lateral positions relative to the “tumor” (i.e., capsule) were recorded; the results, in terms of the measured count rates and the capsule (i.e., tumor)-to-background ratios, are plotted in Fig. 3c and d, respectively. These results (particularly the dramatic improvement – from ~2 to ~10 – in the tumor-to-background ratios (Fig. 3d)) clearly demonstrate the feasibility of this dual-detector design and weighted subtraction algorithm for beta probes in general and positron probes in particular. This has also been demonstrated in preliminary clinical and preclinical studies [17, 18].
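The weighted-subtraction arithmetic just described can be summarized in a few lines; the sketch below is a minimal illustration only (not the authors’ implementation), and the count rates and 511-keV sensitivity ratio used are purely hypothetical.

```python
def positron_only_rate(d1_cps, d2_cps, gamma_sens_ratio):
    """Weighted subtraction for a dual-detector beta (positron) probe.

    d1_cps           -- Detector 1 count rate (positrons + 511-keV annihilation gammas)
    d2_cps           -- Detector 2 count rate (511-keV gammas only; positrons stopped by cladding)
    gamma_sens_ratio -- measured Detector 1 / Detector 2 sensitivity ratio for 511-keV gammas
    """
    return d1_cps - gamma_sens_ratio * d2_cps

# Hypothetical example: probe centered over a "tumor" capsule above a uniform background.
d1 = 1200.0   # cps from Detector 1 (assumed)
d2 = 500.0    # cps from Detector 2 (assumed)
k = 1.1       # assumed 511-keV sensitivity ratio from a calibration measurement

print(f"Positron-only count rate: {positron_only_rate(d1, d2, k):.0f} cps")
```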
Fig. 3

(a) Basic design of a dual-detector beta probe [9]. (b) Experimental phantom setup for evaluation of the performance of the probe shown in (a). The phantom consisted of a small 18F-containing capsule simulating a tumor and a uniform 18F-filled cylindrical source simulating underlying normal tissue activity. The probe was then scanned across the phantom. Note that the capsule (i.e., tumor)-to-background (i.e., normal tissues) activity concentration ratio was 10:1. (c) The measured Detector 1 and 2 count rates and the calculated weighted difference of the Detector 1 and 2 count rates (see text) as a function of the lateral position of the probe relative to the capsule. (d) The capsule (i.e., tumor)-to-background count rate ratios for Detector 1 with and without weighted subtraction of the Detector 2 count rates (see text) (From [9] with permission)

Intraoperative gamma cameras. The sensitivity and specificity of detection of sentinel lymph nodes using current approaches such as preoperative gamma camera imaging, gamma probes, and the “blue dye” technique are quite high. Newman [24], for example, performed a meta-analysis of nearly 70 published studies and found an overall sensitivity of over 90% and a false-negative rate of only 8.4% for detection of such nodes in breast cancer. For preoperative gamma camera imaging, detection rates of 72–85% have been reported [25]. Sentinel lymph nodes were successfully detected using intraoperative gamma probes in 98% of patients in whom such nodes were successfully imaged preoperatively, with a false-negative rate of only 7%. For sentinel nodes not visualized by preoperative lymphoscintigraphy, there was a 90% detection rate intraoperatively. Importantly, however, negative preoperative lymphoscintigraphy often predicted a negative intraoperative probe result, and the foregoing improvement in the detection rate intraoperatively was primarily due to the use of blue dye [25]. The American Society of Breast Surgeons has recommended a sensitivity of at least 85% and a false-negative rate of less than 5% as acceptable for sentinel node detection in breast cancer [26]. There remains a need, therefore, to develop techniques to improve the sensitivity and reduce the false-negative rate of sentinel lymph node detection. Intraoperative gamma camera imaging may provide the improvement required to satisfy the foregoing requirements. Mathelin et al. [27, 28, 29], for example, found that the use of an intraoperative small (5 × 5 cm) FOV gamma camera for detection of sentinel lymph nodes in breast cancer was practical. In a case report [28], intraoperative gamma camera imaging allowed detection of an additional sentinel lymph node (metastatic and with low radiotracer uptake) that was not detected by preoperative imaging or with a gamma probe, suggesting that intraoperative gamma camera imaging may reduce the false-negative rate. In addition to gamma camera and probe technology, the tracer employed (e.g., sulfur colloid versus nanocolloid versus tilmanocept) and the injection site (e.g., intradermal versus subcutaneous versus peri-tumoral) may account for the variation in lymph node detection.

Despite such promising preliminary data, it is not clear that the development and deployment of intraoperative gamma camera technology and the incremental improvement in the sentinel lymph node detection rate that such technology may provide will prove to be cost-effective. Certain considerations, however, lend support to the development of this technology. One such consideration is the variable level of proficiency among surgeons in gamma probe-based detection of sentinel lymph nodes [26]: even with considerable training and experience, not all surgeons achieve a detection rate of 90% or better. In addition, certain sentinel lymph nodes are problematic anatomically or otherwise in terms of detectability. These include nodes which are unusually deep, close to (less than 30 mm from) the injection site or high-activity normal tissues, or have a low (less than 1%) radiotracer uptake [25, 29, 30, 31]. A gamma camera system having a spatial resolution of 3 mm or better at a distance (depth) of the order of 1 cm would likely visualize such problematic nodes intraoperatively. Such an imaging system would offer other practical advantages over probes: the signal is provided in the familiar format of a scintigraphic image rather than a numerical display or variable-frequency tone; the larger FOV of even small gamma cameras (several centimeters) than that of probes (less than 1 cm) allows more rapid interrogation of large areas and/or longer sampling, with collection of more counts and reduction in statistical uncertainty (noise); more straightforward reexamination of the surgical site post-lymph node excision to verify removal of foci of activity; and less reliance on potentially obliterated and otherwise ambiguous preoperative skin markings directing where measurements are to be performed intraoperatively [31]. Intraoperative gamma camera systems thus merit development and evaluation.

A number of small FOV intraoperative gamma camera systems have been developed [12, 32, 33]. The earliest systems were handheld devices having FOVs of only 1.5–2.5 cm in diameter and using conventional NaI(Tl) or CsI(Tl) scintillation detectors. Later units used two-dimensional arrays (mosaics) of scintillation crystals connected to a position-sensitive PMT and, more recently, semiconductors such as CdTe or CdZnTe (CZT). The main problems with these early units were their very small fields of view, the resulting large number of images required to interrogate the surgical field, and the difficulty in holding the device sufficiently still for the duration (up to 1 min) of the image acquisition. More recently, larger field-of-view devices have been developed which are attached to an articulating arm for convenient and stable positioning. These systems are nonetheless fully portable and small enough overall to be accommodated in typical surgical suites.

Abe et al. [34] evaluated a handheld CZT-based semiconductor gamma camera known as the eZ-SCOPE (Anzai Medical, Tokyo, Japan). As illustrated in Fig. 4, the device is light enough (820 g) to hold for a short time (up to ~1 min). The CZT detector has a 3.2 × 3.2-cm FOV and is 5 mm thick, with an efficiency of 87% and energy resolution of 9% for 99mTc γ-rays. Its collimators are easily exchanged. The CZT crystal is divided into a 16 × 16 array of 2 × 2-mm pixels, with integral and differential uniformities of 1.6% and 1.3%, respectively, with low-energy high-resolution (LEHR) collimation. System spatial resolution with the LEHR collimation was 2.3-, 8.0-, and 15-mm full-width half-maximum (FWHM) at source-to-collimator distances of 1, 5, and 10 cm, respectively. As shown in Fig. 4b, c, this camera is able to clearly image sentinel lymph nodes as well as lymphatic vessels. The small 3.2 × 3.2-cm FOV remains limiting, however; as shown in Fig. 4b, c, for example, a single lymph node occupies nearly half of the FOV and searching the surgical field can thus be time-consuming. A pinhole collimator was therefore subsequently incorporated into a newly designed version of the system, the Sentinella 102 (see below). In order to generate a larger effective FOV (30 × 30 cm), a computer program to integrate multiple adjacent images was developed and tested in mouse studies. The exact spacing of the individual images and the occasional low-count pixels at the periphery of the images were problematic, however. Initial experience with the eZ-SCOPE was nonetheless favorable overall.
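The roughly linear degradation of system resolution with source-to-collimator distance reported above follows from the standard parallel-hole collimator relationships. The sketch below uses the textbook formulas with assumed collimator dimensions (not the eZ-SCOPE’s actual specifications) simply to illustrate the trend.

```python
import math

def collimator_resolution_mm(hole_diam_mm, eff_length_mm, distance_mm):
    """Geometric resolution of a parallel-hole collimator (textbook approximation)."""
    return hole_diam_mm * (eff_length_mm + distance_mm) / eff_length_mm

def system_resolution_mm(intrinsic_mm, collimator_mm):
    """Intrinsic and collimator resolutions add approximately in quadrature."""
    return math.sqrt(intrinsic_mm**2 + collimator_mm**2)

# Assumed LEHR-like collimator parameters (illustrative only, not the eZ-SCOPE's).
hole_diam, eff_length, intrinsic = 1.5, 24.0, 2.0   # mm

for dist_cm in (1, 5, 10):
    r_coll = collimator_resolution_mm(hole_diam, eff_length, dist_cm * 10.0)
    print(f"{dist_cm} cm: system FWHM ~ {system_resolution_mm(intrinsic, r_coll):.1f} mm")
```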
Fig. 4

(a) Photograph of the eZ-SCOPE intraoperative gamma camera (Anzai Medical, Tokyo, Japan). (b) Sample eZ-SCOPE image of a lymph node in a patient. (c) Sample eZ-SCOPE images of a lymph node (top left) and lymphatic vessel in a patient (top right) and conventional gamma camera image of the same patient obtained preoperatively (bottom). See text for additional details

A CZT-based semiconductor gamma camera with a larger FOV, 4 × 4 cm, than that of the eZ-SCOPE was developed by General Electric (Haifa, Israel). The 4 × 4-cm pixelated detector consists of a 16 × 16 array of 2.5 × 2.5-mm pixels. Using parallel-hole collimation, the spatial resolution was 5-mm FWHM at a distance of 5 cm with a sensitivity of 100 cps/MBq. Energy resolution for 99mTc was 8.0%, somewhat better than the ~10% value typically quoted for conventional gamma cameras. In one experiment, this system could clearly resolve 99mTc-filled spheres 1 cm in diameter in contact with one another at distances of up to 6 cm; a gamma probe could only distinguish the two sources at a 1-cm depth and only when separated by at least 2 cm. A potential advantage of gamma camera imaging is the ability to resolve sources that may overlap one another in one view by acquiring additional views at different angles, as illustrated by the results of the phantom experiment shown in Fig. 5. This may be helpful, especially in breast cancer, in localizing a sentinel lymph node at a different depth from the injection site and obscured by the injected activity.
Fig. 5

Setup and results of a 99mTc phantom imaging experiment with the intraoperative gamma camera developed by General Electric (Haifa, Israel) [88, 89]. (a) Schematic diagram (side view) of the phantom, with two 1-cm spheres at depths of 2 and 4 cm and a third sphere, 1.4 cm in diameter, at a depth of 2 cm and directly over the 1-cm sphere at a depth of 4 cm. The two small spheres and the large sphere had activity concentrations of 1 and 2.5 μCi/ml, respectively. (b) Images (identified as “1,” “2,” and “3,” respectively) were acquired at angles of −45°, 0°, and +45° relative to an axis perpendicular to the top of the phantom (i.e., an axis in the plane of the diagram). The resulting gamma camera images demonstrate the ability to resolve overlying foci of activity by acquiring views at multiple angles (From [88, 89] with permission)

A handheld camera, known as the POCI (“per-operative compact imager”) and utilizing a CsI(Na) scintillation crystal coupled to a focusing image intensifier tube and position-sensitive diode, was developed in France [35] (Fig. 6a). Its field of view is 4.0 cm in diameter. With high-resolution parallel-hole collimation, its 99mTc sensitivity with scatter is 250 cps/MBq at 1 cm and 125 cps/MBq at 5 cm and its spatial resolution 3.9-, 4.8-, and 7.6-mm FWHM at 1, 2, and 5 cm, respectively. Images are acquired in a matrix of 50 × 50 pixels. The energy resolution of the POCI, 28%, is rather poor, however, and the wide energy windows thus required result in inclusion of substantial amounts of scatter in the image, a particular disadvantage when a lymph node is close to the injection site. Figure 6b illustrates the manual positioning of the POCI camera during intraoperative lymphoscintigraphy. Figure 6c presents a representative intraoperative image, with clear visual discrimination of two adjacent lymph nodes. In a preliminary clinical study, lymph nodes in all three patients were identified with the POCI, including one in whom two deep nodes were missed with a gamma probe (most likely due to depth-related loss of sensitivity and proximity of the nodes to the injection site). The total imaging times depended upon the scan area and varied from 15 s to 3 min.
Fig. 6

(a) Photograph of the POCI (per-operative compact imager) intraoperative gamma camera. (b) Intraoperative lymphoscintigraphy, with the POCI in position for imaging of the patient’s left axilla. (c) POCI image (10 s acquisition time), showing two foci of activity corresponding to two neighboring lymph nodes (From [35] with permission)

Another semiconductor gamma camera, utilizing CdTe, was developed by Tsuchimochi and colleagues in Japan [36, 37, 38]. Their choice of CdTe was based on its superior uniformity (integral uniformity, 4.5%) and energy resolution (7.8%) compared to CZT. The camera, referred to as the “small semiconductor gamma camera (SSGC) ,” uses an array of 32 × 32 5-mm-thick CdTe elements, with a matrix of 1.2 × 1.2-mm pixels and a 4.5 × 4.5-cm FOV. The collimation, comprised of tungsten, had 1.2 × 1.2-mm square apertures to match the pixel arrangement. Spatial resolution without scatter was 3.9-, 6.3-, and 11.2-mm FWHM at 2.5, 5, and 10 cm, respectively. The 99mTc sensitivity at the surface without scatter was 300 cps/MBq, comparable to that of the POCI and better than that of a conventional gamma camera with LEHR collimation (~100 cps/MBq). The results of preliminary phantom and clinical imaging studies with the SSGC were encouraging.

A small FOV gamma camera equipped with pinhole collimation, known as the Sentinella 102, has been developed by General Equipment for Medical Imaging, Spain [39, 40, 41, 42, 43]. It uses a single 4 × 4-cm CsI(Na) scintillation crystal and a position-sensitive PMT, with images acquired in a 300 × 300 matrix. Interchangeable pinhole apertures 1.0, 2.5, and 4.0 mm in diameter are available, yielding an effective FOV of 20 × 20 cm at a distance of 18 cm. The detector assembly weighs 1 kg and is mounted on an articulating arm (Fig. 7a). The 99mTc sensitivity ranged from 200 to 2,000 cps/μCi at 1 cm and 60 to 160 cps/μCi at 10 cm, depending on the pinhole aperture used. The FWHM spatial resolution over the detector face is 5.4–8.2 mm, 7.3–11 mm, and 10–18 mm at 3, 5, and 10 cm, respectively, again depending on the pinhole aperture used. Beyond ~3 cm, therefore, the spatial resolution is poorer than that of cameras with parallel-hole collimation. However, despite the coarser resolution, the advantage of pinhole collimation lies in the larger effective FOV at such distances. Such a system can therefore be used at larger distances to rapidly survey, with coarser resolution, a large area and then examine suspicious areas at smaller distances and finer resolution [39, 40, 44]. In initial clinical studies, acquisition times of 20–60 s per image were required. Because the distortion associated with pinhole collimation varies with position within the FOV (i.e., is worse toward the periphery) as well as with distance, the Sentinella 102 camera is equipped with a laser positioning system, with two intersecting lines being projected onto the surface of the region being imaged (Fig. 7b). This allows positioning of suspicious foci of activity at the center of the field of view, where image quality is best (Fig. 7c). The Sentinella 102 camera is also equipped with a long-lived gadolinium-153 (153Gd) pointer for real-time positioning; the image of the 153Gd pointer source is acquired in a separate energy window from the 99mTc image and is displayed as a small marker superimposed on the 99mTc image.
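The trade-off between effective FOV and resolution with pinhole collimation follows from simple projection geometry. The sketch below assumes a 40-mm detector, a 36-mm pinhole-to-detector distance (a value inferred only from the stated 20 × 20-cm effective FOV at 18 cm), and an assumed intrinsic resolution; it is meant to illustrate the geometry, not to reproduce the manufacturer’s specifications.

```python
import math

def pinhole_fov_mm(det_size_mm, focal_len_mm, distance_mm):
    """Effective field of view at the object plane for a pinhole collimator."""
    return det_size_mm * distance_mm / focal_len_mm

def pinhole_system_resolution_mm(aperture_mm, intrinsic_mm, focal_len_mm, distance_mm):
    """Geometric and (demagnified) intrinsic resolution terms added in quadrature."""
    r_geom = aperture_mm * (distance_mm + focal_len_mm) / focal_len_mm
    r_intr = intrinsic_mm * distance_mm / focal_len_mm   # intrinsic term projected to object plane
    return math.sqrt(r_geom**2 + r_intr**2)

# Assumed values (illustrative): 40-mm detector, 36-mm pinhole-to-detector distance,
# 3-mm intrinsic resolution, 2.5-mm pinhole aperture.
det, focal, intrinsic, aperture = 40.0, 36.0, 3.0, 2.5

for dist_cm in (3, 10, 18):
    d = dist_cm * 10.0
    print(f"{dist_cm} cm: FOV ~ {pinhole_fov_mm(det, focal, d)/10:.0f} cm, "
          f"resolution ~ {pinhole_system_resolution_mm(aperture, intrinsic, focal, d):.1f} mm")
```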
Fig. 7

(a) Photograph of the Sentinella 102 small FOV gamma camera (General Equipment for Medical Imaging, Spain). The detector assembly is shown mounted on the system’s articulating arm. (b) Illustration of the device’s laser positioning system, with two intersecting red lines projected onto posterior surface of the patient’s right knee joint. This patient had malignant melanoma of the right heel, and lymphoscintigraphy was performed to identify the popliteal sentinel lymph node. (c) Posterior gamma camera images of the patient’s right knee joint before (left) and after (right) surgical excision of the popliteal node. The pre-excision image (left) clearly shows the node centered in the field of view, and the post-excision image (right) is notably absent of any such focus of activity, demonstrating complete removal of the node

The Institut Pluridisciplinaire Hubert Curien (Strasbourg, France) developed an intraoperative gamma camera known as the “CarollReS” [27, 28, 29]. This device has a relatively large-area 50 × 50-mm cerium-doped gadolinium oxyorthosilicate (GSO) scintillation crystal and parallel-hole collimation with 2-mm-wide apertures. Its 99mTc spatial resolution was 10-mm FWHM at 5 cm, sensitivity 130 cpm/kBq, and energy resolution 45%. A prototype version of this device with a larger 100 × 100-mm FOV has been fabricated as well. In a preliminary clinical study with the CarollReS camera, Mathelin et al. [29] compared the depth of lymph nodes estimated by imaging to their actual depth measured at surgery and found a generally good correlation, except in instances where only a portion of the sentinel lymph node was in the camera’s FOV. For 7 of 11 nodes whose depth could be estimated, the image-derived depth was correct.

Crystal Photonics GmbH (Berlin, Germany) markets a light (800-g) handheld gamma camera with a 5-mm-thick CZT detector having a 40 × 40-mm field of view, a photon energy range of 40–250 keV, and interchangeable tungsten or lead parallel-hole collimators. The spatial resolution at 3.5 cm ranges from 5.4 mm with the high-resolution collimator to 9.2 mm with the high-sensitivity collimator, with the latter providing ~fourfold higher sensitivity than the former.

SurgicEye GmbH (Munich, Germany) has introduced an intraoperative SPECT imaging technology termed “freehand SPECT” [45, 46]. In contrast to conventional SPECT systems, the Declipse® SPECT does not employ a gantry-mounted gamma camera rotating around the patient, but rather a handheld gamma probe interfaced to an infrared 3D tracking system. A 1- to 2-min scan is performed in anterior-posterior and lateral directions around the patient. The device can be used with low-energy photon emitters such as 99mTc and is compatible with different commercially available gamma probes. The Declipse™ SPECT website reports a reconstructed spatial resolution of 5 mm; details of the data acquisition and processing yielding this resolution were not specified, however.
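Although the vendor does not specify its reconstruction method, measurements from a position-tracked probe can, in general terms, be treated as projections through a voxelized volume and reconstructed iteratively. The toy sketch below illustrates this idea with a one-dimensional “volume,” a Gaussian probe-sensitivity model, and a standard MLEM update; every element of it is an assumption for illustration only and is not the Declipse® algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1D "volume" of 50 voxels with a single hot focus (e.g., a node).
n_vox = 50
truth = np.zeros(n_vox)
truth[30] = 100.0

# Each tracked probe pose contributes one measurement; model the probe's
# sensitivity to each voxel as a Gaussian of its distance from the probe axis.
probe_positions = np.linspace(0, n_vox - 1, 25)
voxels = np.arange(n_vox)
A = np.exp(-0.5 * ((voxels[None, :] - probe_positions[:, None]) / 3.0) ** 2)

counts = rng.poisson(A @ truth)          # simulated probe counts (Poisson noise)

# Standard MLEM update: x <- x / (A^T 1) * A^T (y / (A x))
x = np.ones(n_vox)
sens = A.sum(axis=0)
for _ in range(50):
    proj = A @ x
    x *= (A.T @ (counts / np.maximum(proj, 1e-9))) / np.maximum(sens, 1e-9)

print("Estimated hot-voxel index:", int(np.argmax(x)))   # should be near 30
```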

Overall, small FOV gamma cameras have demonstrated detection rates for sentinel lymph nodes equal to or better than those of non-imaging gamma probes, despite having sensitivities (e.g., expressed in cps/MBq) typically about tenfold lower than those of such probes. The ability of such devices to image a surgical field intraoperatively and thus ensure complete excision of lesions is a potentially useful enhancement of surgical management of cancer. The acquisition times per image are typically well under 1 min, so the overall duration of the surgical procedure should not be significantly prolonged. In addition, the use of pinhole collimation, despite having lower sensitivity than parallel-hole collimation, permits initial imaging at a longer distance to visualize a larger anatomic area of interest followed by imaging at a shorter distance to pinpoint and otherwise characterize suspicious foci of activity. Importantly, the scintigraphic image format is familiar to surgeons, likely facilitating clinical acceptance and integration of intraoperative imaging.

Intraoperative PET scanners . Prescient Imaging LLC is currently developing a portable compact whole-body PET scanner adaptable to intraoperative imaging [47]. It incorporates a 360-degree detector with an axial FOV of 22 cm. The detectors are assembled into three sets, one planar and the other two circular arcs of 90° each connected with a hinge. One of the arcs is fixed while the other can be rotated about the hinge and thereby opened and closed. The planar detector is fixed horizontally such that it can fit under a patient table. By placing the moveable arc in the closed position, the 360° of detector coverage about the patient is achieved. Attenuation correction will be performed using a rotating rod transmission source.

Optical and Near-Infrared (NIR) Imaging

Despite the very limited penetrability of optical and near-infrared (NIR) light in tissue, specialized technologies have led to widespread and very productive use of light – both bioluminescence and fluorescence – for in vivo imaging of rodents [48, 49] and, to a much more limited extent to date, of human subjects; in the case of the latter, this has been restricted to fluorescence imaging [50]. In the most common (i.e., preclinical) optical imaging paradigm, animals are placed in a light-tight imaging enclosure, and the emitted optical or NIR signal is imaged by a charge-coupled detector (CCD). In bioluminescence imaging (Fig. 8a), cells (e.g., tumor cells) which are to be localized or tracked in vivo must first be genetically transduced ex vivo to express a so-called reporter gene, most commonly a luciferase gene (such as the firefly luciferase, or FLuc, gene). After the cells have been implanted, infused, or otherwise administered to the experimental animal, the luciferase substrate (e.g., d-luciferin in the case of firefly luciferase) is systemically administered. Wherever the administered substrate encounters the luciferase-expressing cells, the ensuing reaction (such as the d-luciferin-luciferase reaction) emits light, which is detected and localized by the imaging system. The CCD in bioluminescence imaging is maintained at a very low temperature (of the order of −100 °C), thereby ensuring that any electronic output it produces results from light striking the CCD rather than the background “dark current” (which would be prohibitively high at ambient temperatures). In this way, the otherwise undetectably small signal originating in vivo and escaping from the surface of the animal can produce an image. In fluorescence imaging (Fig. 8b), cells to be imaged may be genetically transduced ex vivo to express a fluorescent molecule (or fluorophore) such as green fluorescent protein (GFP), or a fluorophore probe targeting the cells of interest may be systemically administered. In either case, the animal is then illuminated with light at an appropriate excitation wavelength (obtained by filtration or with a laser) to energize the fluorophore in situ, and the resulting emitted light (which has a slightly different wavelength than the excitation light) is itself filtered and detected by the CCD; the difference in wavelengths between the excitation and emitted light is known as the Stokes shift. The excitation light may be provided by reflectance (or epi-illumination) or by transillumination of the animal. Further, by computer processing, the abundant spontaneous fluorescence of the animal’s tissues as well as of foodstuffs in the gut must be mathematically separated, or “de-convolved,” from the overall fluorescence to yield an image specifically of the fluorophore; this is sometimes known as “spectral unmixing.” In practice, the resulting luminescence or fluorescence image is generally superimposed on a conventional (i.e., white-light) photograph of the animal to provide some orientation as to the anatomic location of the signal(s) in vivo (Fig. 8c).
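The spectral unmixing step amounts to solving a linear mixing model at each pixel. The sketch below illustrates the idea with two made-up emission spectra (a fluorophore and tissue autofluorescence) and an ordinary least-squares solution; real systems use measured reference spectra, operate pixel by pixel, and typically constrain the abundances to be non-negative.

```python
import numpy as np

# Hypothetical emission spectra sampled at 6 wavelengths (arbitrary units).
fluorophore = np.array([0.05, 0.20, 0.60, 1.00, 0.55, 0.15])
autofluor   = np.array([0.80, 0.70, 0.55, 0.40, 0.30, 0.20])
S = np.column_stack([fluorophore, autofluor])   # mixing matrix (wavelengths x components)

# Measured spectrum at one pixel = mixture of the two components plus noise.
true_abundance = np.array([2.0, 5.0])
measured = S @ true_abundance + np.random.default_rng(1).normal(0, 0.02, 6)

# Linear unmixing: least-squares estimate of component abundances, clipped to >= 0.
abundance, *_ = np.linalg.lstsq(S, measured, rcond=None)
abundance = np.clip(abundance, 0, None)
print("Estimated [fluorophore, autofluorescence]:", np.round(abundance, 2))
```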
Fig. 8

(a) Bioluminescence and (b) fluorescence optical or NIR in vivo imaging. For fluorescence imaging, the excitation light source may be a white-light source whose emitted light is passed through conventional glass filters to yield light over a narrow wavelength range centered about the excitation wavelength of the fluorophore being imaged. Alternatively, it may be a laser light source tuned to the appropriate wavelength. In so-called “multispectral” systems, multiple excitation and emission wavelengths and thus multiple fluorophores may be imaged simultaneously. (c, d) These sample images show pseudo-color bioluminescence images superimposed on a gray-scale photograph (c) and on a three-dimensional rendering (d) of a mouse, respectively (From [90] with permission)

Because light emitted at any depth of tissue is scattered and otherwise dispersed as it passes through overlying tissue before emanating from the surface of the animal, the apparent size of the light source (Fig. 8c, d) is considerably larger than its actual size. Despite the excellent spatial resolution of the CCDs themselves, the effective resolution of optical and NIR imaging is generally rather coarse. Further, for planar optical and NIR imaging, the resulting images are only semiquantitative: absorption and scatter of the emitted light as it passes through overlying tissue make the measured signal highly depth dependent. Thus, a focus of cells lying deep within tissue may appear less luminescent or fluorescent than an identical focus of cells at a shallower depth; if excessively deep, such a focus of cells may be undetectable altogether. NIR radiations, however, have a substantially higher penetrability through tissue than blue to green radiations. Importantly, therefore, by using laser transillumination for excitation of administered NIR molecular probes in situ, tomographic fluorescence images can be mathematically reconstructed [49]. The resulting three-dimensional images – in contrast to planar images – are at least semiquantitative: the signal intensity thus reconstructed is related to the local concentration of the fluorophore.

Bioluminescence imaging, because it requires genetic modification of the cells to be imaged, likely has very limited applicability in patients but has proven invaluable in preclinical research. However, with recent advances in adoptive immunotherapy of cancer, bioluminescence imaging conceivably may have some clinical utility in an intraoperative or endoscopic setting to assess tumor targeting of immune effector cells. To date, however, no such studies have been performed. Intraoperative, endoscopic, and even surface fluorescence imaging of patients has been performed and continues to advance (see below).

Fluorescence imaging. As noted, the clinical application of optical imaging to date has utilized fluorescence imaging in endoscopic and intraoperative settings. Fluorescence cystoscopy, for example, is now widely used to identify and localize urinary bladder cancer [51]. The photosensitizer 5-aminolevulinic acid (ALA) is a precursor of the photoreactive (at 375–440 nm) protoporphyrin IX (PpIX). Although the mechanism is not yet well understood, ALA accumulates selectively in cancerous tissue, yielding tumor-to-non-tumor activity concentration ratios of ~20:1 within 2 h of topical administration within the bladder. As illustrated in Fig. 9, fluorescence cystoscopy allows more sensitive and specific visualization of bladder cancer in vivo than conventional (i.e., white-light) imaging.
Fig. 9

Cystoscopic imaging of urinary bladder cancer. Conventional (i.e., white-light) image (left panel) and fluorescence image (right panel) following topical administration of the protoporphyrin IX (PpIX) precursor 5-aminolevulinic acid (ALA). The red coloration of the cancerous tissue makes it far more apparent in the fluorescence image than in the white-light image (From [51] with permission)

Narrowband imaging (NBI) is a notable refinement of non-fluorescence optical cystoscopy that improves the visualization of blood vessels and bladder mucosa and thereby enhances the contrast between cancerous and normal bladder epithelium (given that urothelial lesions are typically hypervascularized due to elevated microvessel density) [52]. NBI exploits these angiogenic features of bladder cancers by filtering white light into two discrete wavelength bands, one blue (415 nm) and one green (540 nm), both of which are absorbed by hemoglobin. The shorter-wavelength 415-nm light penetrates only the superficial layers of the mucosa and yields brownish images of the superficial capillaries. The longer-wavelength 540-nm light penetrates deeper into the bladder wall and produces greenish images. Bladder tumors are thus visually identified by the intensity of brown-green coloration that is characteristic of the elevated vessel density of tumors and distinct from the pink-to-white coloration of the normal mucosa.

In addition to endoscopic fluorescence imaging, large-field planar optical and NIR fluorescence imaging may potentially improve human surgery by providing real-time image guidance to surgeons to identify tissue to be resected (such as tumors) and tissue to be avoided (such as blood vessels and nerves). As illustrated in Fig. 10, the use of ALA has been extended to systemic administration and fluorescence imaging-guided resection of glioblastomas [53]. Based on the overexpression of folate receptor-α, intraoperative fluorescence imaging has also been applied to resection of ovarian cancer using folate conjugated via an ethylene diamine spacer to fluorescein isothiocyanate (FITC) (Fig. 11). To further advance the practical implementation of fluorescence imaging-guided surgery, Dr. John Frangione and colleagues have developed the so-called Fluorescence-Assisted Resection and Exploration (FLARE™) system [54, 55, 56, 57, 58, 59, 60]. Briefly, the FLARE™ system consists of an imaging head mounted on an articulated arm and a cart containing control equipment, computer, and monitors (Fig. 12). The imaging head includes light-emitting diodes (LEDs) as the excitation light source, heat dissipation technology to maintain stability of the LEDs, and complementary metal oxide semiconductor (CMOS) cameras; it can be positioned anywhere in 3D space with six degrees of freedom. A customized software system enables the real-time display of color video and two NIR fluorescence channels at a rate of up to 15 frames per second. The software is capable of displaying the NIR fluorescence signal as a pseudo-colored overlay on the color video, thereby providing anatomic guidance to the surgeon. Among its many applications to date, the FLARE™ system has been applied to sentinel lymph node resection in breast cancer surgery (Fig. 13) [60] and has demonstrated improved visualization of nodes and of tumors [54, 55, 56, 57, 58, 59, 60, 61].
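The pseudo-colored overlay display provided by such systems can be illustrated with a few lines of array arithmetic. The sketch below is a simplified stand-in with arbitrary threshold, tint, and blending choices; it is not the FLARE™ software.

```python
import numpy as np

def overlay_nir(color_frame, nir_frame, threshold=0.2, color=(0, 255, 0), alpha=0.6):
    """Blend a pseudo-colored NIR fluorescence channel onto a color video frame.

    color_frame -- uint8 RGB image, shape (H, W, 3)
    nir_frame   -- float NIR intensity image scaled to [0, 1], shape (H, W)
    """
    out = color_frame.astype(np.float32)
    mask = nir_frame > threshold                   # show only significant fluorescence
    tint = np.array(color, dtype=np.float32)
    weight = (alpha * nir_frame[mask])[:, None]    # brighter signal -> stronger tint
    out[mask] = (1.0 - weight) * out[mask] + weight * tint
    return out.astype(np.uint8)

# Toy frames: uniform gray "surgical field" with a bright fluorescent focus.
color = np.full((240, 320, 3), 120, dtype=np.uint8)
nir = np.zeros((240, 320), dtype=np.float32)
nir[100:140, 150:190] = 0.9
merged = overlay_nir(color, nir)
```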
Fig. 10

Intraoperative white-light image (left panel) and fluorescence image (right panel) following systemic administration of ALA (20 mg/kg 3 h prior to surgery) of the intracranial resection site of a glioblastoma. The red coloration of the cancerous tissue, blue coloration of normal brain, and lack of color of non-perfused, necrotic tissue allow easier discrimination of these respective tissues than does the white-light image. This in turn should facilitate a more complete resection of the tumor and, presumably, better local control (From [53] with permission)

Fig. 11

(a) Photograph of intraoperative use of fluorescence imaging system developed by Ntziachristos and colleagues. (b) Conventional color image (left panel) and fluorescence gray-scale image (right panel) following systemic administration of FITC-conjugated folate (0.3 mg/kg) of intra-abdominal resection site of ovarian cancer. The tumor deposits are more clearly visualized on the fluorescence image than on the color image. (c) Graphical presentation of the results of conventional versus fluorescence imaging-assisted resection of ovarian cancer, demonstrating that intraoperative fluorescence imaging (FLI) visualized significantly more ovarian cancer deposits than conventional imaging (Adapted from [91] with permission)

Fig. 12

Photograph of the FLARE™ system developed by Frangione and colleagues for intraoperative fluorescence imaging (Adapted from [60] with permission)

Fig. 13

Use of the FLARE™ system in intraoperative localization and resection of sentinel lymph nodes (SLNs) in breast cancer surgery. Four peri-tumoral injections of indocyanine green (ICG) conjugated to human serum albumin were performed (10 μg of ICG in 0.2 mL per injection). The conventional color video images in the left panels show the surgical field and the incision in the proximity of four SLNs. The NIR fluorescence video images (100 ms per exposure) show the injection site (top middle panel) and the four ICG-albumin-concentrating SLNs (middle panels). For the images in the lower two middle panels, the injection site was covered with an opaque surgical drape. The right panels show the overlaid, or merged, color and fluorescence images. The SLNs are clearly far more apparent in the fluorescence than in the color images (Adapted from [60] with permission)

In collaboration with Dr. Michelle Bradbury and colleagues, Quest Medical Imaging has developed an enhanced intraoperative NIR imaging device, the ArteMIS™ (now Spectrum™) handheld fluorescence camera system (Fig. 14a) for either minimally invasive laparoscopic or open surgery configurations [62]. This system is capable of true multiplexing, with simultaneous imaging (at a frame rate of 30 frames per second) and real-time spectral unmixing of the signals in two NIR channels (700 and 800 nm, respectively) and a conventional color channel, as illustrated in Fig. 14b [63].
Fig. 14

(a) Photograph of the ArteMIS™ (now Spectrum™) handheld NIR fluorescence camera system. The compact form factor of this system makes it particularly suitable for the intraoperative setting (Courtesy of Quest Medical Imaging). (b) Preclinical sentinel lymph node study in a spontaneous melanoma miniswine model. The miniswine received percutaneous peri-lesional co-injections of ultrasmall (<10 nm) fluorescent Cornell-dot (C-dot) silica nanoparticles surface-functionalized with alpha-melanocyte-stimulating hormone (α-MSH) and with cyclic arginine-glycine-aspartic acid-tyrosine (cRGDY) for targeting the MSH receptors and integrin receptors, respectively, overexpressed on the surface of the melanoma cells (as confirmed by immunohistochemical staining of the excised node); the α-MSH- and the cRGDY-functionalized C dots were loaded with the Cy5.5 and the CW800 fluorophores, respectively. The real-time multiplexing capability of the ArteMIS™ (now Spectrum™) system provided clear visual discrimination of the two types of functionalized nanoparticles in the diseased sentinel lymph node. Such molecularly targeted nanoparticle probes thus are not only capable of highly sensitive visual detection of sentinel lymph nodes but also of phenotypic characterization of such nodes (Adapted from [63] with permission)

Cerenkov imaging. Cerenkov imaging is a new approach to optical imaging based on the emission of a continuum of visible light associated with the decay of certain radionuclides (actually, with the particles emitted as result of the radionuclide decay) [64, 65, 66, 67, 68, 69, 70, 71, 72, 73]. This phenomenon, now known as the “Cerenkov effect,” was first observed in the 1920s and characterized in the 1930s by Pavel Cherenkov [65]. In 1958, Cerenkov shared the Nobel Prize in Physics with colleagues Ilya Frank and Igor Tamm for the discovery and explanation of the effect which now bears his name. Cerenkov radiation is perhaps familiar to some readers as the bluish “glow” observed in the water pools containing spent, but still radioactive, fuel rods at nuclear reactors. It arises when charged particles such as beta particles travel through an optically transparent, insulating medium at a speed greater than that of light in that medium. The Cerenkov effect, often analogized to the sonic boom that occurs at the instant a supersonic plane exceeds the speed of sound in air, occurs as the charged particles dissipate their kinetic energy by polarizing the electrons in the insulating medium (most commonly, water) as they travel through the medium. As these polarized electrons then relax (or re-equilibrate), and if the charged particle is traveling faster than light, constructive interference of the light thus emitted occurs, producing the grossly visible Cerenkov radiation.
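The condition that the particle travel faster than light in the medium (i.e., v > c/n) translates directly into a minimum kinetic energy for the emitting particle. The short sketch below evaluates this standard relativistic expression for electrons or positrons in water (n ≈ 1.33), which has a refractive index similar to that of soft tissue.

```python
import math

M_E_C2_MEV = 0.511           # electron rest energy (MeV)

def cerenkov_threshold_kev(refractive_index):
    """Minimum electron/positron kinetic energy for Cerenkov emission in a medium."""
    beta_threshold = 1.0 / refractive_index          # v/c must exceed 1/n
    gamma = 1.0 / math.sqrt(1.0 - beta_threshold**2)
    return (gamma - 1.0) * M_E_C2_MEV * 1000.0       # convert MeV -> keV

print(f"Water (n = 1.33): ~{cerenkov_threshold_kev(1.33):.0f} keV threshold")
# Many PET positrons and energetic beta particles exceed this threshold over
# part of their track, which is why the associated Cerenkov light is detectable.
```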

The application of Cerenkov radiation to in vivo radionuclide imaging is a recent development [64, 66, 67, 68, 69, 70, 71, 72, 73]. Phantom studies by Ruggiero et al. have demonstrated, using a commercial optical imaging system (Ivis 200, Caliper Life Sciences) equipped with a cryo-cooled CCD, measurable emission of Cerenkov radiation associated with a number of clinically relevant radionuclides, including 18F, copper-64 (64Cu), zirconium-89 (89Zr), iodine-124 (124I), iodine-131 (131I), and actinium-225 (225Ac) [72] (Fig. 15a, b). Importantly, the optical Cerenkov signal is linearly related to activity concentration (Fig. 15c), at least where the effects of attenuation and scatter are minimal. Ruggiero et al. have also produced planar Cerenkov images of the tumor localization of a 89Zr-labeled antibody in prostate tumor xenografts in mice which compare favorably, both qualitatively and quantitatively, with the 89Zr PET images [72] (Fig. 16). Using intradermal tail injections of [18F]FDG, Thorek et al. subsequently performed both PET- and Cerenkov imaging-based lymphography in mice, with both modalities demonstrating excellent visualization of lymph nodes [73] (Fig. 17). Initial clinical trials of Cerenkov imaging are currently underway to delineate its potential clinical utility. As illustrated in Fig. 17, assisting the resection of sentinel lymph nodes appears to be one potential application. In a clinical pilot study, for example, Thorek et al. evaluated the feasibility of Cerenkov imaging of patients (n = 4) undergoing standard diagnostic [18F]FDG scans to detect nodal disease [74]. Two 5-min scans were performed for each patient at ~70 min postinjection of [18F]FDG (12 mCi), one over the site of the [18F]FDG-avid axillary lymph node identified by PET scanning and one over the contralateral side of the patient without detectable nodal disease as a negative control. An intensified CCD camera (Mega10z, Stanford Photonics) was used. The single-photon-counting camera consisted of a dual-microchannel plate-intensifier system behind a photocathode, read out by a Peltier-cooled CCD (XX285, Sony Corp). The optics consisted of a quartz high-ultraviolet-transmission 50-mm f/0.8 lens and a long-pass filter with a cutoff at 605 nm (Chroma Technology). High-frame-rate acquisitions (120 frames per second) were integrated in the Piper Control software (Stanford Photonics), a background baseline was subtracted, and the images were processed and analyzed in ImageJ (National Institutes of Health). The camera with lens and filter was mounted on a standard photography tripod, and the subjects were imaged at a distance of ~8 cm. The patients were imaged with the room lights on for a white-light photograph and then with the lights off, the light-sealed door to the room closed, and an optical drape used to eliminate any remaining ambient light. A representative patient study is shown in Fig. 18. Despite the low yield of Cerenkov light and the suboptimal Cerenkov light spectrum for in vivo imaging (primarily in the highly attenuated blue-green range), based on this and other studies [74, 75], Cerenkov imaging of patients is feasible, though its ultimate clinical utility remains to be determined.
Fig. 15

(a) Phantom for evaluation of Cerenkov imaging of positron-emitting radionuclides, comprised of a circular arrangement of six 1-ml Eppendorf tubes filled with increasing activity concentrations. Left panel: Cerenkov image superimposed on a photograph of the phantom acquired with an Ivis 200 optical imaging system (Caliper Life Sciences). Right panel: PET image of the phantom acquired with a Focus 120 microPET scanner (Concorde Microsystems). (b) Average radiance per unit activity concentration (in photons (p)/second (s)/cm2/steradian (sr) per kBq/μL) for different radionuclides as measured using the phantom arrangement and instrumentation described in (a). (c) Linear correlation (r = 0.98) between average radiance (in p/s/cm2/sr) and activity concentration (in kBq/μL) for 89Zr evaluated again using the phantom arrangement and instrumentation described in (a) (Adapted from [72] with permission)

Fig. 16

(a) PET image (left panel) and Cerenkov image (right panel) of a 89Zr-anti-PSMA (prostate-specific membrane antigen) antibody (J591) in a mouse with bilateral flank LNCaP prostate tumor xenografts at 96 h postinjection. The Cerenkov image is superimposed on a photograph of the mouse. The two xenografts are clearly visible in both images. (b) Linear correlation (r = 0.89) between the Cerenkov image-derived average radiance (in p/s/cm2/sr) and the PET image-derived maximum tissue uptake (in percent of the injected dose per gram, %ID/g) (Adapted from [72] with permission)

Fig. 17

Cerenkov imaging-guided lymph node resection in a normal mouse following intradermal tail injection of [18F]FDG (30 μCi). Left panel: Volume rendering of fused CT image (yellow color table) and PET image (green color table) showing lymphatics and lymph nodes in relation to the skeleton. Right panel: Cerenkov image superimposed on a photograph of the mouse following removal post-sacrifice of the dorsal skin before (left image) and after (right image) resection of a luminescent inguinal lymph node. The resected node ex vivo is shown in the inset image (Adapted from [73] with permission)

Fig. 18

Representative Cerenkov and PET-CT images of a [18F]FDG-positive axillary lymph node. (a, b) Cerenkov scans of patient’s right (R) and left (L) axillae, respectively. (c) Negative Cerenkov scan in light-protected environment of right axilla without [18F]FDG-positive lymph node overlaid with white-light photograph. No significant Cerenkov emission associated with [18F]FDG is seen. (d) White-light photograph from left axilla, overlaid with significant Cerenkov signal (shown in red-hue pseudo-color). (e) Transverse section of [18F]FDG PET-CT scan through the patient’s axillae, demonstrating that the Cerenkov signal co-localized (at least grossly) with the nodal focus of [18F]FDG activity. There was no focus of [18F]FDG activity in the contralateral disease-free axilla, consistent with the corresponding Cerenkov scan (From [74] with permission)

Photoacoustic imaging. In photoacoustic imaging [76, 77, 78], tissues are illuminated with laser light. When radiofrequency (RF) pulses are used, the technology is termed “thermoacoustic imaging.” Some of the delivered energy is absorbed and converted into heat, leading to transient thermoelastic expansion of the illuminated tissue and thus ultrasonic (i.e., MHz-frequency) emissions. The ultrasonic waves thus emitted are then detected by ultrasonic transducers to form images. Image contrast is provided by the differential absorption among tissues of the incident excitation light. In contrast to fluorescence imaging, in which scattering in tissue degrades spatial resolution with increasing depth, photoacoustic imaging provides better spatial resolution (of the order of 100 μm) and deeper imaging depth (of the order of 1 cm or greater) because there is far less absorption and scattering in tissue of the ultrasonic signal than of the emitted light signal in fluorescence imaging. When compared with ultrasound imaging, in which contrast is limited because of the similarity in acoustical properties among tissues, photoacoustic imaging provides better tissue contrast as a result of the wider range of tissue optical properties. The optical absorption in biological tissues can be due to endogenous molecules such as hemoglobin or melanin or to exogenously administered contrast agents. Since blood exhibits orders of magnitude higher light absorption than other tissues, there is sufficient endogenous contrast provided by oxygenated hemoglobin (HbO2) and deoxygenated hemoglobin (Hb) for photoacoustic imaging to visualize blood vessels.
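The initial pressure rise generated by a short laser pulse is commonly written as p0 = Γ·μa·F, where Γ is the dimensionless Grüneisen parameter, μa the optical absorption coefficient, and F the local optical fluence. The sketch below evaluates this expression with assumed, typical-order values simply to illustrate why blood provides strong endogenous contrast; none of the numbers are measurements.

```python
# Initial photoacoustic pressure: p0 = Gamma * mu_a * F
# Gamma -- dimensionless Grueneisen parameter (~0.2 assumed for soft tissue)
# mu_a  -- optical absorption coefficient (1/cm)
# F     -- local optical fluence (mJ/cm^2)

def initial_pressure_kpa(grueneisen, mu_a_per_cm, fluence_mj_per_cm2):
    # Unit check: mu_a [1/cm] * F [mJ/cm^2] = mJ/cm^3, and 1 mJ/cm^3 = 1 kPa.
    return grueneisen * mu_a_per_cm * fluence_mj_per_cm2

# Illustrative comparison of blood versus background tissue at an NIR wavelength
# (assumed absorption coefficients of ~4 and ~0.1 1/cm, respectively).
print("blood :", initial_pressure_kpa(0.2, 4.0, 10.0), "kPa")
print("tissue:", initial_pressure_kpa(0.2, 0.1, 10.0), "kPa")
```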

Most commonly, photoacoustic scanners use either a tomographic geometry (with an array of up to several hundred US transducers partially surrounding the subject) [76] or a planar geometry employing a linear transducer array [79, 80]. The tomographic approach offers a large effective aperture for data collection, but suffers from a low frame rate (>10 min per frame), due to the need for hundreds to thousands of laser pulses per frame. The use of a linear array eliminates the need for scanning, and thus a two-dimensional frame can be acquired with many fewer laser pulses, providing much higher frame rates. In addition, in the tomographic geometry, the surface of the transducers is of the order of 1 cm from the surface of the subject (up to now, rodents) to accommodate an array of transducers encircling the subject. As a result, the mouse or rat must be immersed in water to provide the necessary acoustical coupling to the transducer; in the multispectral optoacoustic tomography (MSOT) system marketed by iThera Medical, the animal is suspended in a very thin membrane and then immersed in the water, thereby keeping the animal completely dry (Fig. 19a). Another preclinical photoacoustic imaging system, employing a linear transducer array, is marketed by VisualSonics.
Fig. 19

(a) Setup for photoacoustic imaging of regional perfusion in a tumor xenograft on the dorsal surface of a nude mouse. Note that the tumor (on the dorsal surface of the animal and therefore not seen) is “immersed” in water. The animal remains dry, however, because of the thin membrane between the animal and the water. (b) Photograph of the 4T1 xenograft on the dorsal surface of the mouse. In (a, b), the dashed line indicates the approximate position of the transverse tissue section being imaged. (c) Images (acquired at 790 nm) before (left panel) and 30 s after (right panel) intravenous injection of ICG (V ventral, D dorsal, L left, R right). Note the increase in image contrast in and around the tumor (arrow) after injection, identifying the more highly perfused portions of the tumor (Adapted from [77] with permission)
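To make the frame-rate comparison above concrete, the following sketch simply divides the number of laser pulses required per frame by the pulse repetition frequency. The repetition frequency, number of wavelengths, slice positions, and averaging factors are illustrative assumptions chosen to reproduce the order of magnitude quoted in the text; they are not specifications of any particular scanner.

```python
# Back-of-the-envelope acquisition-time estimate for photoacoustic scanning.
# All parameter values are illustrative assumptions, not system specifications.

def acquisition_time_s(prf_hz, n_wavelengths, n_slices, n_averages):
    """Total acquisition time = (laser pulses needed) / (pulse repetition frequency)."""
    pulses = n_wavelengths * n_slices * n_averages
    return pulses / prf_hz

# Multispectral, whole-body tomographic scan (assumed settings)
tomo_s = acquisition_time_s(prf_hz=10, n_wavelengths=10, n_slices=60, n_averages=10)
# Single-slice, single-wavelength frame from a linear array (one pulse per 2D frame)
linear_s = acquisition_time_s(prf_hz=10, n_wavelengths=1, n_slices=1, n_averages=1)

print(f"tomographic multispectral scan: ~{tomo_s / 60:.0f} min")      # ~10 min
print(f"linear-array 2D frame:          ~{linear_s * 1000:.0f} ms")   # ~100 ms, i.e., ~10 frames/s
```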

Photoacoustic imaging has been used successfully in preclinical models for monitoring tumor perfusion and angiogenesis (see Fig. 19 [77]), blood oxygenation mapping, functional brain imaging, and melanoma detection, among other applications. The resulting functional images can be superimposed on high-resolution B-mode anatomic images.

While the clinical implementation of MSOT is still progressing and its clinical impact remains to be determined, the ability of this technology to visualize in real time the distribution and concentration of intrinsic and exogenous chromophores (including currently approved agents such as ICG and methylene blue) is such that it merits clinical evaluation. The Acuity™, marketed by iThera, is one of the first clinical MSOT systems [81, 82] and appears adaptable to intraoperative imaging (Fig. 20). It is equipped with a fast-tunable 50-Hz laser and 2D or 3D detector arrays (128–512 elements, 2.5–10 MHz) and provides a cross-sectional in-plane spatial resolution of 80–250 μm.
Fig. 20

Photograph of the iThera Acuity™ clinical MSOT system (Courtesy of iThera)
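The quoted spatial resolution is of the order one would expect from the detector bandwidth alone. A commonly used rule of thumb in photoacoustic tomography approximates the bandwidth-limited resolution as roughly 0.88·c/Δf, where c is the speed of sound in tissue; the sketch below applies this approximation purely for orientation, using an assumed sound speed, and should not be read as a manufacturer specification (the achievable in-plane resolution also depends on array geometry and image reconstruction).

```python
# Rule-of-thumb sketch (not a manufacturer specification): bandwidth-limited
# photoacoustic resolution is often approximated as R ~ 0.88 * c / bandwidth.

C_TISSUE_M_PER_S = 1540.0  # assumed speed of sound in soft tissue

def bandwidth_limited_resolution_um(bandwidth_hz):
    return 0.88 * C_TISSUE_M_PER_S / bandwidth_hz * 1e6

for bw_mhz in (2.5, 5.0, 10.0):
    print(f"{bw_mhz:4.1f} MHz bandwidth -> ~{bandwidth_limited_resolution_um(bw_mhz * 1e6):4.0f} um")
# ~540, ~270, and ~135 um, respectively, i.e., the same order as the quoted
# 80-250 um in-plane resolution of the clinical system.
```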

Diffuse optical tomography. Diffuse optical tomography (DOT) utilizes NIR light to generate quantitative functional images of tissue with a spatial resolution of 1–5 mm at depths up to several centimeters [83, 84]. Propagation of NIR light through tissue is dominated by scattering rather than absorption (absorption path lengths are ~10 cm, whereas scattering path lengths are less than 50 μm) and can be modeled as a diffusion process in which photons behave stochastically, in a manner analogous to particles in random-walk models of diffusion. Quantitative measurements can be obtained by separating light absorption from scattering using spatial or temporal modulation techniques. Absorption measurements yield tissue molecular composition, including the concentrations of oxy- and deoxyhemoglobin, water, lipid, and exogenous probes, while scattering measurements reflect tissue structure. Time-domain systems use picosecond optical pulses and time-gated photon-counting detectors; frequency-domain systems use an RF-modulated light source, PMTs or fast photodiodes, and RF phase detectors. DOT has been applied to breast cancer diagnostics, joint imaging, and blood oximetry (including functional activation studies) in human muscle and brain, as well as to cerebral ischemia and cancer studies in small animals, and it is adaptable to intraoperative imaging. Commercial instruments are now available that yield tomographic and volumetric image sets; these devices are compact, portable, and relatively inexpensive (~$150 K).
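Once absorption has been separated from scattering, chromophore concentrations follow from a linear decomposition of the absorption coefficient measured at two or more wavelengths. The sketch below solves the two-wavelength case for oxy- and deoxyhemoglobin; the extinction coefficients and the “measured” absorption values are illustrative assumptions, not calibration data from any specific DOT instrument.

```python
# Minimal two-wavelength oximetry sketch: after DOT separates absorption from
# scattering, mu_a at each wavelength is modeled as a weighted sum of chromophore
# concentrations, and the 2x2 linear system is inverted for [HbO2] and [Hb].
# All numerical values below are illustrative assumptions.

# Assumed molar extinction coefficients (1/(cm*mM)) at 750 nm and 850 nm
EPS = {
    750: {"HbO2": 0.60, "Hb": 1.60},
    850: {"HbO2": 1.10, "Hb": 0.80},
}

def solve_hb(mu_a_750, mu_a_850):
    """Solve mu_a(lam) = eps_HbO2(lam)*[HbO2] + eps_Hb(lam)*[Hb] for the two unknowns."""
    a, b = EPS[750]["HbO2"], EPS[750]["Hb"]
    c, d = EPS[850]["HbO2"], EPS[850]["Hb"]
    det = a * d - b * c
    hbo2 = (mu_a_750 * d - b * mu_a_850) / det
    hb = (a * mu_a_850 - mu_a_750 * c) / det
    return hbo2, hb

# Example "measured" absorption coefficients (1/cm), purely illustrative
hbo2_mM, hb_mM = solve_hb(mu_a_750=0.090, mu_a_850=0.105)
sto2 = hbo2_mM / (hbo2_mM + hb_mM)
print(f"[HbO2] ~ {hbo2_mM * 1000:.0f} uM, [Hb] ~ {hb_mM * 1000:.0f} uM, StO2 ~ {sto2:.0%}")
```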

Optical coherence tomography. Optical coherence tomography (OCT) is an interferometric technique, typically employing low-coherence NIR light, that produces two-dimensional images of tissue surface layers and structure [85]. The principle of OCT is analogous to that of pulse-echo (i.e., B-mode) ultrasound imaging, except that OCT delineates tissue structure by measuring the reflectance of light rather than of sound waves; it thus achieves far better spatial resolution but less depth penetration. The technique has been described as “an optical biopsy,” since OCT can produce near-histologic images (spatial resolution, 1–15 μm) without excision. Because of photon absorption and scattering, the sampled depth is limited to within several millimeters of the tissue surface. The two-dimensional images can be assembled to construct a volumetric image set. In OCT, the axial resolution is proportional to the square of the center wavelength and inversely proportional to the bandwidth of the light source, and it improves (i.e., becomes finer) with increasing refractive index of the sample. Originally developed for and still most commonly applied in ophthalmology (to obtain detailed images of retinal structure), OCT is also being applied to cancer diagnosis, evaluation of coronary artery disease, and tissue characterization.
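For a light source with a Gaussian spectrum, the standard expression for the coherence-length-limited axial resolution is Δz = (2 ln 2/π)·λ0²/(n·Δλ), where λ0 is the center wavelength, Δλ the source bandwidth, and n the refractive index of the sample. The sketch below evaluates this relation for assumed source parameters (an 840-nm source with 50- or 100-nm bandwidth and a nominal tissue refractive index), chosen only to show the scaling rather than to describe any particular commercial system.

```python
# OCT axial (depth) resolution for a Gaussian-spectrum source (standard textbook
# relation): dz = (2*ln2/pi) * lambda0^2 / (n * d_lambda). Parameter values below
# are illustrative assumptions.

import math

def oct_axial_resolution_um(center_nm, bandwidth_nm, n_tissue=1.38):
    dz_nm = (2.0 * math.log(2.0) / math.pi) * center_nm ** 2 / (n_tissue * bandwidth_nm)
    return dz_nm / 1000.0

# Assumed sources: an 840 nm source with 50 nm and then 100 nm of bandwidth
for lam0_nm, dlam_nm in ((840.0, 50.0), (840.0, 100.0)):
    dz = oct_axial_resolution_um(lam0_nm, dlam_nm)
    print(f"lambda0 = {lam0_nm:.0f} nm, d_lambda = {dlam_nm:.0f} nm -> dz ~ {dz:.1f} um")

# Doubling the source bandwidth halves dz, and a higher sample refractive index
# shortens the wavelength inside the tissue, further improving (reducing) dz.
```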

Raman spectroscopic imaging. When light interacts with matter, most of the light is elastically scattered, retaining its original energy, frequency, and wavelength; this phenomenon is known as Rayleigh scattering (Fig. 21a). However, a small fraction of the light is inelastically scattered, the scattered light having a lower energy and frequency and a longer wavelength than the incident light. The process leading to this inelastic scatter is termed the Raman effect [86], and the difference in wavelength between the incident and scattered light is called the Raman shift. Because photons with optical energies interact with outer-shell, or valence, atomic electrons, which are responsible for the intramolecular chemical bonds among atoms, materials having different molecular compositions inelastically scatter light differently. Every molecule therefore has a distinct Raman spectrum (or “signature”), that is, a distinct Raman shift-dependent intensity of the scattered light; this is the basis of using Raman spectroscopy to identify the molecular constituents of various materials [86]. By illuminating a sample with a highly collimated beam of light while translating either the scattered-light detector or the sample in two dimensions, the Raman spectrum can be spatially indexed and a Raman spectral image created. Though they may appear similar, the Raman effect is distinct from fluorescence: the former is a light-scattering phenomenon, whereas the latter involves light absorption and reemission. Like fluorescence imaging, Raman spectroscopic imaging has been applied to endogenous (or intrinsic) molecules naturally present in tissue and to exogenously administered materials, as in surface-enhanced Raman scattering (SERS) (see below).
Fig. 21

(a) Raman effect, showing a sample being illuminated with incident photons of wavelength λi. Most of the incident photons are scattered elastically (Rayleigh scattering), and the resulting scattered photons have the same wavelength (λs) as the incident photons (i.e., λs = λi). A few photons are inelastically scattered (Raman scattering) at wavelengths longer than that of the incident photons (i.e., λs > λi). The relative proportion of inelastically scattered photons is typically depicted using a Raman spectrum, a plot of scattered-photon intensity versus the Raman shift (i.e., the wave number difference between the incident and scattered photons). Multiple different wavelengths of inelastically scattered light can occur, and a spectrum can therefore include multiple peaks, although a single primary peak, as shown, is also possible. (b) Molecular imaging agent approach, showing SERS nanoparticles, which consist of a metallic core, a Raman-active layer adsorbed onto the metal surface, and a shell coating the entire particle. An array of unique spectral signatures can be obtained by modifying the Raman-active layer of the nanoparticle. These unique Raman nanoparticles can serve as molecular imaging agents for in vitro and in vivo procedures. (c) The intrinsic approach, showing a human tissue specimen being illuminated with a laser. The intrinsic Raman spectral signature of tissue can reveal important information about the phosphate, protein, and lipid content of the cells or tissue of interest (From [86] with permission)
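The Raman shift plotted in such a spectrum is conventionally reported in wavenumbers (cm⁻¹), i.e., the difference between the reciprocal wavelengths of the incident and scattered photons. The short sketch below performs this conversion in both directions; the 785-nm excitation wavelength and the band positions are illustrative assumptions (785 nm is simply a commonly used NIR Raman excitation wavelength).

```python
# Raman shift in wavenumbers: shift (1/cm) = 1/lambda_incident - 1/lambda_scattered,
# with both wavelengths expressed in cm. Wavelength values are illustrative assumptions.

def raman_shift_per_cm(lambda_incident_nm, lambda_scattered_nm):
    return 1.0 / (lambda_incident_nm * 1e-7) - 1.0 / (lambda_scattered_nm * 1e-7)

def stokes_wavelength_nm(lambda_incident_nm, shift_per_cm):
    """Inverse relation: wavelength of the Stokes-scattered photon for a given shift."""
    return 1e7 / (1.0 / (lambda_incident_nm * 1e-7) - shift_per_cm)

# A Stokes photon at 851.5 nm from 785 nm excitation corresponds to a shift of ~995 cm^-1
print(f"shift            : {raman_shift_per_cm(785.0, 851.5):7.1f} cm^-1")
# Conversely, a band at a 1600 cm^-1 shift from 785 nm excitation appears near 898 nm
print(f"Stokes wavelength: {stokes_wavelength_nm(785.0, 1600.0):6.1f} nm")
```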

A drawback of the Raman effect as an analytical tool is that it is a very weak phenomenon, producing only about one inelastically scattered photon for every ten million elastically scattered photons. Technical advances, such as the introduction of lasers and of resonance-based enhancements, have greatly expanded the practical applications of the Raman effect. For years the Raman effect has been used in a variety of analytical applications and, more recently, in various in vitro cell assays and in microscopy. The most commonly used enhancement methods are surface-enhanced Raman scattering (SERS) and coherence anti-Stokes Raman scattering (CARS); both enhance the Raman signal by orders of magnitude. SERS involves adding metal (e.g., gold) nanoparticles, which absorb the optical energy and yield an enhanced Raman signal by virtue of the metal surface transferring energy to nearby molecules (Fig. 21b). The resulting Raman signal provides picomolar sensitivity, which is compatible with the tissue tracer concentrations achievable in vivo. Furthermore, labeling SERS nanoparticles with several different molecular ligands can provide simultaneous assay of multiple molecular components (as illustrated in the unmixing sketch following Fig. 22). CARS involves illumination with photons of different energies, with one photon exciting a molecule of interest from its ground state to an initial excited state and a second photon exciting the molecule from a “relaxed” state (i.e., an energy level reached after releasing energy during the “laser-off” interval following absorption of the first photon) to a different, higher energy level; this second (or higher) tier of vibrational energy is ~fivefold more intense than the Raman signal after the original pulse (Fig. 21c). The CARS technique is often used for high-resolution, three-dimensional microscopy. Its advantages include rapid image acquisition and the lack of any need for administered probes (in contrast to SERS). Several preclinical studies have utilized the SERS or CARS techniques for in vivo molecular imaging of cell receptors (e.g., RGD-carbon nanotubes that bind to αvβ3 integrin-expressing tumors), tumor microvessels, enzyme activity, pH, lipid composition, and myelin composition (Fig. 22).
Fig. 22

Triple-modality detection with nanoparticle probes of brain tumors in mice. Three weeks after orthotopic implantation with U87MG glioblastoma cells, the brain tumor-bearing mouse was injected intravenously with the nanoparticles, which localized in the tumor due to the EPR (enhanced permeability and retention) effect. Photoacoustic (PA), Raman, and MR images of the brain (skin and skull intact) were acquired before injection and at 2, 3, and 4 h postinjection, respectively. Raman imaging was performed using a commercial Raman microscope (inVia, Renishaw) with a computer-controlled x,y-translation stage. (a) Axial MR, PA, and Raman images. The postinjection images of all three modalities demonstrated clear tumor visualization. The PA and Raman images were co-registered with the MR image, demonstrating good concordance of the nanoparticle distribution within the tumor among the three modalities. (b) Volumetric rendering of MR images with the tumor segmented (red, top panel); overlay of the 3D PA image (green) on the MR image (middle panel); and overlay of the tumor-segmented MR and PA images (bottom panel), showing good co-localization of the PA signal within the tumor. (c) Quantification of the imaging signals in the tumor shows a significant increase in the MRI, PA, and Raman signals after versus before the nanoparticle injection (“***” indicates p < 0.001; “**” indicates p < 0.01). Error bars represent the standard error of the mean; AU, arbitrary units (From [87] with permission)
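As noted above, labeling SERS nanoparticles with different Raman reporters permits simultaneous (“multiplexed”) readout because, at the low concentrations used, the spectra of the individual reporters add approximately linearly. The sketch below unmixes a measured spectrum into contributions from three hypothetical reporter signatures by linear least squares; all signatures, peak positions, and weights are synthetic and purely illustrative, not data from the cited studies.

```python
import numpy as np

# Illustrative sketch of multiplexed SERS readout: a measured spectrum is unmixed
# into known reporter signatures by linear least squares. All spectra are synthetic.

shift = np.linspace(400, 1800, 500)          # Raman shift axis (cm^-1)

def band(center_per_cm, width_per_cm=15.0):
    """Gaussian band standing in for a reporter's dominant Raman peak."""
    return np.exp(-0.5 * ((shift - center_per_cm) / width_per_cm) ** 2)

# Assumed reference signatures for three hypothetical SERS "flavors"
signatures = np.stack(
    [band(590) + 0.6 * band(1190),
     band(1000) + 0.4 * band(1600),
     band(1340)],
    axis=1,
)                                            # shape: (n_points, 3)

true_weights = np.array([1.0, 0.3, 0.6])     # relative amount of each flavor
rng = np.random.default_rng(0)
measured = signatures @ true_weights + rng.normal(0.0, 0.01, shift.size)

estimated, *_ = np.linalg.lstsq(signatures, measured, rcond=None)
print("estimated relative concentrations:", np.round(estimated, 2))  # ~[1.0, 0.3, 0.6]
```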

Over the last decade, the biomedical applications of Raman spectroscopic imaging have grown dramatically. Because it is essentially a surface imaging technique (like most optical imaging techniques), Raman spectroscopic imaging has been applied mainly to the examination of skin and of pathological specimens as well as to small animals. For example, Raman mapping has enabled accurate differentiation of malignant lesions from benign lesions and normal tissue in the skin, brain, larynx, parathyroids, breast, and urinary bladder. Raman spectroscopic imaging has also been applied endoscopically in the colon. Recently, Kircher et al. reported a molecular imaging strategy using a novel triple-modality MRI-photoacoustic-Raman nanoparticle probe [87] that is potentially translatable to the intraoperative clinical setting (Fig. 23).
Fig. 23

Preclinical model of Raman-guided surgery. (a) Following orthotopic implantation with GFP-transduced U87MG glioblastoma cells, the brain tumor-bearing mouse underwent craniotomy under general anesthesia. Quarters of the tumor were then sequentially removed (as illustrated in the photographs). (b) Intraoperative Raman imaging was performed after each resection step until, by visual inspection, the entire tumor appeared to have been removed. After gross removal of the tumor, several small foci of Raman signal were found in the resection bed (outlined by the dashed white square). (c) Subsequent immunohistochemical analysis of sections from these foci demonstrated an infiltrative pattern of the tumor in this location, forming fingerlike protrusions extending into the surrounding brain tissue. CD11b (second panel from left) is a widely used microglial immunohistochemical marker. As shown in the Raman microscopy image (right panel), the Raman signal was observed within these protrusions, indicating the selective presence of the MPR (MRI-photoacoustic-Raman) nanoparticles in these protrusions. The white dashed box is not drawn to scale. The Raman signal is displayed using a linear red color table (From [87] with permission)

Concluding Remarks

The last decade has featured remarkable technical advances in intraoperative imaging and in techniques potentially adaptable to the intraoperative setting. While intraoperative nuclear counting using handheld probes is now routine, intraoperative nuclear imaging remains largely investigational. Certain applications of intraoperative and endoscopic optical imaging (such as optical cystoscopy) are likewise now fairly routine, whereas newer forms of optical imaging such as Cerenkov imaging and photoacoustic imaging have only recently been translated to the clinic. Other promising forms of optical imaging, such as Raman spectroscopic imaging, have not yet been clinically translated to any significant extent. Multimodality intraoperative probes capable of simultaneously detecting both nuclear and optical signals are also being developed. The ultimate clinical impact of these new imaging modalities in intraoperative and endoscopic settings remains to be determined.

References

1. Fong Y, Giulianotti P, Lewis J, et al. Imaging and visualization in the modern operating room: a comprehensive guide for physicians. New York: Springer; 2015.
2. Sweet WH. The use of nuclear disintegration in diagnosis and treatment of brain tumors. N Engl J Med. 1951;245:875–8.
3. Cody III HS, editor. Sentinel lymph node biopsy. London: Martin Dunitz; 2002.
4. Mariani G, Giuliano AE, Strauss HW, editors. Radioguided surgery: a comprehensive team approach. New York: Springer; 2008.
5. Povoski SP, Neff RL, Mojzisik CM, et al. A comprehensive overview of radioguided surgery using gamma detection probe technology. World J Surg Oncol. 2009;7:11.
6. Gulec SA, Moffat FL, Carroll RG. The expanding clinical role for intraoperative gamma probes. In: Freeman LM, editor. Nuclear medicine annual 1997. Philadelphia: Lippincott-Raven Publishers; 1997. p. 209–37.
7. Woolfenden JM, Barber HB. Intraoperative probes. In: Wagner HN, Szabo Z, Buchanan JW, editors. Principles of nuclear medicine. 2nd ed. Philadelphia: WB Saunders; 1995. p. 292–7.
8. Barber HB, Barrett HH, Woolfenden JM, et al. Comparison of in vivo scintillation probes and gamma cameras for detection of small, deep tumours. Phys Med Biol. 1989;34:727–39.
9. Daghighian F, Mazziotta JC, Hoffman EJ, et al. Intraoperative beta probe: a device for detecting tissue labeled with positron or electron emitting isotopes during surgery. Med Phys. 1994;21:153–7.
10. Raylman RR, Fisher SJ, Brown RS, et al. Fluorine-18-fluorodeoxyglucose-guided breast cancer surgery with a positron-sensitive probe: validation in preclinical studies. J Nucl Med. 1995;36:1869–74.
11. Raylman RR, Wahl RL. A fiber-optically coupled positron-sensitive surgical probe. J Nucl Med. 1994;35:909–13.
12. Heller S, Zanzonico P. Nuclear probes and intraoperative gamma cameras. Semin Nucl Med. 2011;41:166–81.
13. Zanzonico P. The intraoperative gamma probe: design, safety, and operation. In: Cody III HS, editor. Sentinel lymph node biopsy. London: Martin Dunitz; 2008. p. 45–68.
14. Zanzonico P, Heller S. The intraoperative gamma probe: basic principles and choices available. Semin Nucl Med. 2000;30:33–48.
15. Essner R, Daghighian F, Giuliano AE. Advances in FDG PET probes in surgical oncology. Cancer J. 2002;8:100–8.
16. Essner R, Hsueh EC, Haigh PI, et al. Application of an [18F]fluorodeoxyglucose-sensitive probe for the intraoperative detection of malignancy. J Surg Res. 2001;96:120–6.
17. Strong VE, Galanis CJ, Riedl CC, et al. Portable PET probes are a novel tool for intraoperative localization of tumor deposits. Ann Surg Innov Res. 2009;3:2.
18. Strong VE, Humm J, Russo P, et al. A novel method to localize antibody-targeted cancer deposits intraoperatively using handheld PET beta and gamma probes. Surg Endosc. 2008;22:386–91.
19. Meller B, Sommer K, Gerl J, et al. High energy probe for detecting lymph node metastases with 18F-FDG in patients with head and neck cancer. Nuklearmedizin. 2006;45:153–9.
20. Wasselle J, Becker J, Cruse W, et al. Localization of malignant melanoma using monoclonal antibodies. Arch Surg. 1991;126:481–4.
21. Schneebaum S, Essner R, Even-Sapir E. Positron-sensitive probes. In: Mariani G, Giuliano AE, Strauss HW, editors. Radioguided surgery: a comprehensive team approach. New York: Springer; 2008. p. 23–8.
22. Raylman RR. Performance of a dual, solid-state intraoperative probe system with 18F, 99mTc, and 111In. J Nucl Med. 2001;42:352–60.
23. Reinhardt H, Stula D, Gratzl O. Topographic studies with 32P tumor marker during operations of brain tumors. Eur Surg Res. 1985;17:333–40.
24. Newman LA. Current issues in the surgical management of breast cancer: a review of abstracts from the 2002 San Antonio Breast Cancer Symposium, the 2003 Society of Surgical Oncology annual meeting, and the 2003 American Society of Clinical Oncology meeting. Breast J. 2004;10 Suppl 1:S22–5.
25. Goyal A, Newcombe RG, Mansel RE, et al. Role of routine preoperative lymphoscintigraphy in sentinel node biopsy for breast cancer. Eur J Cancer. 2005;41:238–43.
26. Tafra L, McMasters KM, Whitworth P, et al. Credentialing issues with sentinel lymph node staging for breast cancer. Am J Surg. 2000;180:268–73.
27. Mathelin C, Salvador S, Bekaert V, et al. A new intraoperative gamma camera for the sentinel lymph node procedure in breast cancer. Anticancer Res. 2008;28:2859–64.
28. Mathelin C, Salvador S, Croce S, et al. Optimization of sentinel lymph node biopsy in breast cancer using an operative gamma camera. World J Surg Oncol. 2007;5:132.
29. Mathelin C, Salvador S, Huss D, et al. Precise localization of sentinel lymph nodes and estimation of their depth using a prototype intraoperative mini gamma-camera in patients with breast cancer. J Nucl Med. 2007;48:623–9.
30. Britten AJ. A method to evaluate intra-operative gamma probes for sentinel lymph node localisation. Eur J Nucl Med. 1999;26:76–83.
31. Aarsvold JN, Alazraki NP. Update on detection of sentinel lymph nodes in patients with breast cancer. Semin Nucl Med. 2005;35:116–28.
32. Hoffman EJ, Torni MP, Levin CS. Gamma and beta intra-operative imaging probes. Nucl Instrum Methods Phys Res. 1997;392:324–9.
33. Scopinaro F, Soluri A. Gamma ray imaging probes for radioguided surgery and site-directed biopsy. In: Mariani G, Giuliano AE, Strauss HW, editors. Radioguided surgery: a comprehensive team approach. New York: Springer; 2008. p. 29–36.
34. Abe A, Takahashi N, Lee J, et al. Performance evaluation of a hand-held, semiconductor (CdZnTe)-based gamma camera. Eur J Nucl Med Mol Imaging. 2003;30:805–11.
35. Pitre S, Menard L, Ricard M, et al. A hand-held imaging probe for radio-guided surgery: physical performance and preliminary clinical experience. Eur J Nucl Med Mol Imaging. 2003;30:339–43.
36. Oda T, Hayama K, Tsuchimochi M. Evaluation of small semiconductor gamma camera – simulation of sentinel lymph node biopsy by using a trial product of clinical type gamma camera. Kaku Igaku. 2009;46:1–12.
37. Tsuchimochi M, Hayama K, Oda T, et al. Evaluation of the efficacy of a small CdTe gamma-camera for sentinel lymph node biopsy. J Nucl Med. 2008;49:956–62.
38. Tsuchimochi M, Sakahara H, Hayama K, et al. A prototype small CdTe gamma camera for radioguided surgery and other imaging applications. Eur J Nucl Med Mol Imaging. 2003;30:1605–14.
39. Sanchez F, Benlloch JM, Escat B, et al. Design and tests of a portable mini gamma camera. Med Phys. 2004;31:1384–97.
40. Sanchez F, Fernandez MM, Gimenez M, et al. Performance tests of two portable mini gamma cameras for medical applications. Med Phys. 2006;33:4210–20.
41. Vermeeren L, Meinhardt W, Bex A, et al. Paraaortic sentinel lymph nodes: toward optimal detection and intraoperative localization using SPECT/CT and intraoperative real-time imaging. J Nucl Med. 2010;51:376–82.
42. Vermeeren L, Valdes Olmos RA, Klop WM, et al. A portable gamma-camera for intraoperative detection of sentinel nodes in the head and neck region. J Nucl Med. 2010;51:700–3.
43. Vermeeren L, Valdes Olmos RA, Meinhardt W, et al. Intraoperative imaging for sentinel node identification in prostate carcinoma: its use in combination with other techniques. J Nucl Med. 2011;52:741–4.
44. Ortega J, Ferrer-Rebolleda J, Cassinello N, et al. Potential role of a new hand-held miniature gamma camera in performing minimally invasive parathyroidectomy. Eur J Nucl Med Mol Imaging. 2007;34:165–9.
45. Naji S, Tadros A, Traub J, et al. Case report: improving the speed and accuracy of melanoma sentinel node biopsy with 3D intra-operative imaging. J Plast Reconstr Aesthet Surg. 2011;64:1712–5.
46. Wendler T, Herrmann K, Schnelzer A, et al. First demonstration of 3-D lymphatic mapping in breast cancer using freehand SPECT. Eur J Nucl Med Mol Imaging. 2010;37:1452–61.
47. Daghigian F, Fong Y. Detectors for intraoperative molecular imaging: from probes to scanners. In: Fong Y et al., editors. Imaging and visualization in the modern operating room: a comprehensive guide for physicians. New York: Springer; 2015. p. 55–67.
48. Contag PR, Olomu IN, Stevenson DK, et al. Bioluminescent indicators in living mammals. Nat Med. 1998;4:245–7.
49. Ntziachristos V, Ripoll J, Wang LV, et al. Looking and listening to light: the evolution of whole-body photonic imaging. Nat Biotechnol. 2005;23:313–20.
50. Taruttis A, Ntziachristos V. Translational optical imaging. AJR Am J Roentgenol. 2012;199:263–71.
51. Witjes JA, Douglass J. The role of hexaminolevulinate fluorescence cystoscopy in bladder cancer. Nat Clin Pract Urol. 2007;4:542–9.
52. Herr H. Narrow band cystoscopy. In: Fong Y et al., editors. Imaging and visualization in the modern operating room: a comprehensive guide for physicians. New York: Springer; 2015. p. 257–69.
53. Stummer W, Novotny A, Stepp H, et al. Fluorescence-guided resection of glioblastoma multiforme by using 5-aminolevulinic acid-induced porphyrins: a prospective study in 52 consecutive patients. J Neurosurg. 2000;93:1003–13.
54. Ashitate Y, Stockdale A, Choi HS, et al. Real-time simultaneous near-infrared fluorescence imaging of bile duct and arterial anatomy. J Surg Res. 2012;176:7–13.
55. Ashitate Y, Tanaka E, Stockdale A, et al. Near-infrared fluorescence imaging of thoracic duct anatomy and function in open surgery and video-assisted thoracic surgery. J Thorac Cardiovasc Surg. 2011;142:31–8.e1-2.
56. Frangioni JV. In vivo near-infrared fluorescence imaging. Curr Opin Chem Biol. 2003;7:626–34.
57. Frangioni JV. New technologies for human cancer imaging. J Clin Oncol. 2008;26:4012–21.
58. Hutteman M, Choi HS, Mieog JS, et al. Clinical translation of ex vivo sentinel lymph node mapping for colorectal cancer using invisible near-infrared fluorescence light. Ann Surg Oncol. 2011;18:1006–14.
59. Lee BT, Hutteman M, Gioux S, et al. The FLARE intraoperative near-infrared fluorescence imaging system: a first-in-human clinical trial in perforator flap breast reconstruction. Plast Reconstr Surg. 2010;126:1472–81.
60. Troyan SL, Kianzad V, Gibbs-Strauss SL, et al. The FLARE intraoperative near-infrared fluorescence imaging system: a first-in-human clinical trial in breast cancer sentinel lymph node mapping. Ann Surg Oncol. 2009;16:2943–52.
61. Kosaka N, Mitsunaga M, Longmire MR, et al. Near infrared fluorescence-guided real-time endoscopic detection of peritoneal ovarian cancer nodules using intravenously injected indocyanine green. Int J Cancer. 2011;129:1671–7.
62. Bradbury M, Pauliah M, Wiesner U. Ultrasmall fluorescent silica nanoparticles as intraoperative imaging tools for cancer diagnosis and treatment. In: Fong Y et al., editors. Imaging and visualization in the modern operating room: a comprehensive guide for physicians. New York: Springer; 2015. p. 167–79.
63. Bradbury MS, Pauliah M, Zanzonico P, et al. Intraoperative mapping of sentinel lymph node metastases using a clinically translated ultrasmall silica nanoparticle. Wiley Interdiscip Rev Nanomed Nanobiotechnol. 2015.
64. Beattie BJ, Thorek DL, Schmidtlein CR, et al. Quantitative modeling of Cerenkov light production efficiency from medical radionuclides. PLoS ONE. 2012;7:e31402.
65. Cerenkov PA. Visible emission of clean liquids by action of gamma-radiation. C R Dokl Akad Nauk SSSR. 1934;2:451–4.
66. Dothager RS, Goiffon RJ, Jackson E, et al. Cerenkov radiation energy transfer (CRET) imaging: a novel method for optical imaging of PET isotopes in biological systems. PLoS ONE. 2010;5:e13300.
67. Holland JP, Normand G, Ruggiero A, et al. Intraoperative imaging of positron emission tomographic radiotracers using Cerenkov luminescence emissions. Mol Imaging. 2011;10:177–86.
68. Li C, Mitchell GS, Cherry SR. Cerenkov luminescence tomography for small-animal imaging. Opt Lett. 2010;35:1109–11.
69. Liu H, Ren G, Miao Z, et al. Molecular optical imaging with radioactive probes. PLoS ONE. 2010;5:e9470.
70. Lucignani G. Cerenkov radioactive optical imaging: a promising new strategy. Eur J Nucl Med Mol Imaging. 2011;38:592–5.
71. Robertson R, Germanos MS, Li C, et al. Optical imaging of Cerenkov light generation from positron-emitting radiotracers. Phys Med Biol. 2009;54:N355–65.
72. Ruggiero A, Holland JP, Lewis JS, et al. Cerenkov luminescence imaging of medical isotopes. J Nucl Med. 2010;51:1123–30.
73. Thorek DL, Abou DS, Beattie BJ, et al. Positron lymphography: multimodal, high-resolution, dynamic mapping and resection of lymph nodes after intradermal injection of 18F-FDG. J Nucl Med. 2012;53:1438–45.
74. Thorek DL, Riedl CC, Grimm J. Clinical Cerenkov luminescence imaging of 18F-FDG. J Nucl Med. 2014;55:95–8.
75. Spinelli AE, Ferdeghini M, Cavedon C, et al. First human cerenkography. J Biomed Opt. 2013;18:20502.
76. Xu MH, Wang LHV. Photoacoustic imaging in biomedicine. Rev Sci Instrum. 2006;77:041101.
77. Herzog E, Taruttis A, Beziere N, et al. Optical imaging of cancer heterogeneity with multispectral optoacoustic tomography. Radiology. 2012;263:461–8.
78. Ku G, Fornage BD, Jin X, et al. Thermoacoustic and photoacoustic tomography of thick biological tissues toward breast imaging. Technol Cancer Res Treat. 2005;4:559–66.
79. Kruger RA, Kiser WL, Reinecke DR, et al. Thermoacoustic computed tomography using a conventional linear transducer array. Med Phys. 2003;30:856–60.
80. Zeng Y, Da X, Wang Y, et al. Photoacoustic and ultrasonic coimage with a linear transducer array. Opt Lett. 2004;29:1760–2.
81. McNally LR, Mezera M, Morgan DE, et al. Current and emerging clinical applications of multispectral optoacoustic tomography (MSOT) in oncology. Clin Cancer Res. 2016;22:3432–9.
82. Neuschmelting V, Burton NC, Lockau H, et al. Performance of a multispectral optoacoustic tomography (MSOT) system equipped with 2D vs 3D handheld probes for potential clinical translation. Photoacoustics. 2016;4:1–10.
83. Hielscher AH. Optical tomographic imaging of small animals. Curr Opin Biotechnol. 2005;16:79–88.
84. Jiang H. Diffuse optical tomography: principles and applications. Boca Raton: CRC Press; 2010.
85. Huang D, Swanson EA, Lin CP, et al. Optical coherence tomography. Science. 1991;254:1178–81.
86. Zavaleta CL, Kircher MF, Gambhir SS. Raman’s “effect” on molecular imaging. J Nucl Med. 2011;52:1839–44.
87. Kircher MF, de la Zerda A, Jokerst JV, et al. A brain tumor molecular imaging strategy using a new triple-modality MRI-photoacoustic-Raman nanoparticle. Nat Med. 2012;18:829–34.
88. Kopelman D, Blevis I, Iosilevsky G, et al. Sentinel node detection in an animal study: evaluation of a new portable gamma camera. Int Surg. 2007;92:161–6.
89. Kopelman D, Blevis I, Iosilevsky G, et al. A newly developed intra-operative gamma camera: performance characteristics in a laboratory phantom study. Eur J Nucl Med Mol Imaging. 2005;32:1217–24.
90. Zanzonico P. Noninvasive imaging for supporting basic research. In: Kiessling F, Pichler BJ, editors. Small-animal imaging: basics and practical guide. Heidelberg: Springer; 2011. p. 3–16.
91. van Dam GM, Themelis G, Crane LM, et al. Intraoperative tumor-specific fluorescence imaging in ovarian cancer by folate receptor-alpha targeting: first in-human results. Nat Med. 2011;17:1315–9.

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

1. Department of Medical Physics, Memorial Sloan Kettering Cancer Center, New York, USA
