Journal of Nuclear Cardiology, Volume 20, Issue 6, pp 963–965

Does hybrid imaging have a role in cardiac risk evaluation of the pre-renal transplant patient?

Editorial

DOI: 10.1007/s12350-013-9783-2


In 2010, approximately 17,000 adults with end-stage renal disease (ESRD) underwent kidney transplantation in the United States while approximately 76,000 patients were on the waiting list, a 6% increase from the previous year.1 Given the limited organ availability and ever-growing demand, patients being considered for renal transplantation undergo comprehensive cardiac assessment. Such risk assessment has been widely adopted as helpful in determining transplant candidacy, particularly for identifying patients most likely to survive the procedure (without a cardiac event) and make optimal use of their allograft. The risk of cardiovascular events is significantly high in the first months after kidney transplantation, with the highest rate of mortality in the peritransplantation period.2-4 Furthermore, coronary artery disease (CAD) remains the leading cause of death after renal transplantation in the long term, with approximately one-third of all such deaths due to myocardial infarction (MI).2-4

Several regulatory bodies and scientific councils have issued guidance, including the recent American College of Cardiology (ACC) and American Heart Association (AHA) scientific statement, to provide a framework for the appropriate workup of the ESRD patient awaiting renal transplantation.4 Noninvasive evaluation has been recommended for pre-transplant patients (regardless of symptomatic status) who have diabetes, multiple risk factors, or known CAD. Although none of the guidelines specify which stress imaging modality to use, the general opinion is that the modality with the “best local expertise” should be employed. While the diagnostic performance of stress myocardial perfusion SPECT (MPS) and dobutamine stress echocardiography (DSE) has been of marginal value for identifying luminal stenosis >50%, with invasive angiography as the reference standard, the prognostic utility of both MPS and DSE in risk stratification of the pre-renal transplant patient has been well established.4,5

Integrated imaging modalities (hybrid imaging) offering simultaneous functional and structural assessment have evolved as an attractive frontier in our quest to refine risk prediction. Combining anatomic assessment of coronary atherosclerotic plaque burden with functional assessment of myocardial ischemia has been shown to refine risk stratification among patients at varying risk in the general population. In this issue of JNC, Karohl et al6 present their experience with hybrid imaging [coronary artery calcium score (CACS), epicardial adipose tissue (EAT) volume, and myocardial perfusion imaging] using SPECT-CT (97.5%) or PET-CT (2.5%) for assessing pre-operative cardiovascular risk in 411 ESRD patients (86% on dialysis) awaiting kidney transplantation. Only 10% of patients had perfusion defects, and only 3.4% had reversible defects (ischemia). The median CACS was 48 for the entire population (interquartile range 0-379). Compared to patients with normal scans (median CACS 27.5), patients with abnormal scans had a higher median CACS (412; P < .001). Similarly, EAT volumes were significantly higher in those with abnormal scans (148.5 vs 115.7; P = .019). In multivariate logistic regression analysis, CACS and EAT were each independently associated with abnormal perfusion. On receiver operating characteristic (ROC) curve analysis, the model containing age, diabetes, and CACS was, at best, only modest (C statistic 0.73) for predicting an abnormal perfusion study. Addition of EAT to the model provided very little incremental value (albeit statistically significant) for predicting abnormal perfusion (C statistic 0.76; delta 0.026) and minimal net reclassification improvement.6
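To make the incremental-value comparison concrete, below is a minimal sketch in Python using simulated data; the cohort size is taken from the study, but every distribution, coefficient, and variable name is an illustrative assumption, not the authors' dataset or code. It fits a baseline logistic model of abnormal perfusion on age, diabetes, and CACS, then measures the change in C statistic after adding EAT:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 411  # cohort size from the study; everything else below is simulated

# Illustrative covariates (distributions are assumptions, not study data)
age = rng.normal(55, 10, n)
diabetes = rng.binomial(1, 0.4, n)
cacs = rng.lognormal(3.5, 2.0, n)   # right-skewed, like real CACS
eat = rng.normal(120, 40, n)        # EAT volume

# Simulated outcome: abnormal perfusion, loosely driven by the covariates
logit = -6 + 0.03 * age + 0.5 * diabetes + 0.4 * np.log1p(cacs) + 0.005 * eat
abnormal = rng.binomial(1, 1 / (1 + np.exp(-logit)))

base = np.column_stack([age, diabetes, np.log1p(cacs)])
full = np.column_stack([base, eat])

# The C statistic is the area under the ROC curve of the fitted probabilities
p_base = LogisticRegression(max_iter=1000).fit(base, abnormal).predict_proba(base)[:, 1]
p_full = LogisticRegression(max_iter=1000).fit(full, abnormal).predict_proba(full)[:, 1]

c_base = roc_auc_score(abnormal, p_base)
c_full = roc_auc_score(abnormal, p_full)
print(f"C statistic, age + diabetes + CACS: {c_base:.3f}")
print(f"C statistic, + EAT:                 {c_full:.3f} (delta {c_full - c_base:.3f})")
```

These are in-sample C statistics for illustration only; a formal analysis would cross-validate and also compute the net reclassification improvement, which this sketch omits.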

So, how do we interpret these findings and, more importantly, can we consider them practice changing? The role of hybrid imaging has been validated in the general population but has not been studied in the ESRD population; these findings are hence novel. The major limitation of the study is its use of perfusion defects (the majority fixed) as an endpoint instead of “hard endpoints.” Only 10% had perfusion defects and 3.4% had ischemia.6 This could, in part, be related to the fact that this cohort of patients was already “listed” for transplantation and hence represents a “highly selected” group of relatively healthy ESRD patients, deemed most suitable for transplantation, with the best expected outcomes. While fixed defects also carry independent prognostic information, the incremental value of hybrid imaging protocols for hard clinical endpoints, including death and MI, would be essential to establish in the ESRD population before this approach can be widely accepted.

Schenker et al7 were the first to demonstrate the complementary prognostic value of CACS combined with nuclear perfusion imaging. Among 695 consecutive intermediate-risk patients undergoing CACS and 82Rb perfusion PET, they observed a stepwise increase in death and MI rates with increasing CACS in patients both with and without ischemia. Among patients with normal perfusion, the annualized event rate in those with a CACS of 0 was nearly fivefold lower than in those with a CACS ≥1,000 (2.6% vs 12.3%, respectively). Similarly, in patients with ischemia, the annualized event rate in those with a CACS of 0 was nearly threefold lower than among patients with a CACS ≥1,000 (8.2% vs 22.1%).

There continues to be a difference of opinion regarding the optimal role of CACS in ESRD patients, as these patients have, in addition to intimal atheroma, medial wall calcification secondary to altered mineral metabolism. Patients with ESRD hence have a high burden of coronary calcium compared to a non-CKD population, and its association with ischemia is very weak, as demonstrated previously.4,8 However, a few studies have demonstrated a correlation of CACS with luminal stenosis and long-term outcomes even in the ESRD population.8-10 Although EAT has been associated with atherosclerotic burden and cardiac outcomes, it provided very little (if any) incremental information for predicting abnormal perfusion in this study.6 While both CACS and EAT can be obtained from the same CT dataset, CACS is much easier to perform and, most importantly, has established risk categories. EAT, on the other hand, does not appear very useful (at least at this point), and there are no established cutoff values to guide clinical decision making.

The recent ACC/AHA statement4 has been cautious in recommending CACS in the pre-renal transplant evaluation, assigning it a Class IIb recommendation (usefulness uncertain). Another area of uncertainty (Class IIb) has been the appropriate frequency of repeat testing in patients who await transplantation.4 We have previously demonstrated that the “warranty period” of a normal perfusion study is largely contingent on the level of renal dysfunction, such that patients with normal kidney function have a “warranty period” of at least 2 years whereas those with severe kidney dysfunction have a “warranty period” of a few months.11 In our opinion, these two areas of uncertainty can be best addressed by hybrid imaging. One large study12 in the general population demonstrated that, among subjects with a normal SPECT result, the relative risk of total cardiac events and of all-cause death/MI increased significantly when CACS exceeded 400. Time-point analysis showed separation of the survival curves between the minimal (0-10) and severe (>400) CACS groups at 3 years after initial testing for total cardiac events (P = .02) and at 5 years for death/MI (P = .02). The addition of CACS more precisely defined this “warranty period,” as cardiac event rates significantly increased beginning 3 years after a normal SPECT result when the CACS was severe. MPS hence provides better short-term risk assessment because it identifies the functional significance of more advanced stages of CAD, whereas CACS may better estimate longer term prognosis because of its ability to detect varying degrees of coronary atherosclerosis before the development of stress-induced myocardial ischemia.
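For readers who want to see the shape of such a time-point analysis, here is a minimal sketch using the lifelines Python library with simulated follow-up data; the group labels mirror the CACS strata above, but the hazards, sample sizes, and follow-up duration are invented for illustration:

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(1)
n = 500  # per group; follow-up capped at 8 years (all numbers invented)

# Simulated time-to-event (death/MI) after a normal SPECT, by CACS stratum;
# the hazards are chosen so the curves separate a few years out.
t_minimal = rng.exponential(50, n).clip(max=8)  # CACS 0-10: low hazard
t_severe = rng.exponential(12, n).clip(max=8)   # CACS >400: higher hazard
e_minimal = t_minimal < 8                       # 1 = event, 0 = censored at 8 y
e_severe = t_severe < 8

km_minimal, km_severe = KaplanMeierFitter(), KaplanMeierFitter()
km_minimal.fit(t_minimal, e_minimal, label="CACS 0-10")
km_severe.fit(t_severe, e_severe, label="CACS >400")

# Event-free survival at the 3- and 5-year time points
for yr in (3, 5):
    print(f"{yr} y event-free: {km_minimal.predict(yr):.2f} vs {km_severe.predict(yr):.2f}")

# Log-rank test for overall separation of the two curves
result = logrank_test(t_minimal, t_severe, e_minimal, e_severe)
print(f"log-rank P = {result.p_value:.4f}")
```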

In addition, there is potential to identify a false-negative stress test (balanced ischemia) in high-risk patients with a very high CACS. Conversely, a perfusion defect in a patient with a normal calcium score may reflect a false-positive perfusion study. For this approach to be further validated, large-scale studies with long-term clinical follow-up using hard clinical endpoints (death, MI) are needed to precisely define risk categories based on CACS and perfusion imaging in the ESRD population. Beyond conducting a large randomized study with hard clinical endpoints and cost-effectiveness analyses, an important issue to address is defining the optimal CACS cutoff in ESRD patients. It was recently demonstrated that the optimal cutoff value of CACS was 2.8-fold higher in patients with moderate CKD than in patients without significant CKD.13 Furthermore, a CACS cutoff of 780 was the best discriminator for predicting long-term major adverse cardiac events (MACE).10 The idea of using CACS, and possibly EAT, as filters for downstream testing appears exciting and, intuitively, very cost effective.
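As a sketch of how such a cutoff might be derived, the example below applies one common approach, Youden's J statistic on an ROC curve, to simulated CACS and MACE data in Python; the data, event rate, and resulting cutoff are illustrative assumptions, not a reanalysis of the cited studies:

```python
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(2)
n = 1000  # simulated cohort; scores and event rate are invented

# Simulated CACS, higher on average in patients who go on to have MACE
mace = rng.binomial(1, 0.15, n)
cacs = np.where(mace == 1,
                rng.lognormal(6.0, 1.2, n),   # events: higher scores
                rng.lognormal(4.0, 1.5, n))   # no events: lower scores

fpr, tpr, thresholds = roc_curve(mace, cacs)

# Youden's J (sensitivity + specificity - 1) is one common definition of
# an "optimal" cutoff; the threshold maximizing J is reported.
j = tpr - fpr
best_cutoff = thresholds[np.argmax(j)]
print(f"Optimal CACS cutoff by Youden's J: {best_cutoff:.0f}")
```

In practice, such a cutoff would have to be derived and validated against hard outcomes in an ESRD cohort, which is exactly the gap the editorial identifies.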

Copyright information

© American Society of Nuclear Cardiology 2013

Authors and Affiliations

  1. Division of Cardiology, University of Arkansas for Medical Sciences (UAMS), and Central Arkansas VA Health System, Little Rock, USA
  2. Methodist DeBakey Heart & Vascular Center, Houston, USA
