Sudden cardiac death (SCD) is the number one killer of adults in industrialized countries,1-3 responsible for the death of more than 300,000 Americans annually, a staggering number that surpasses the total annual number of deaths from all cancers. It is established that malignant ventricular arrhythmias, namely ventricular fibrillation, constitute the most common mechanism of SCD.1-3 Survival of sudden cardiac arrest victims depends primarily on the time to defibrillation and return of a hemodynamically perfusing rhythm.4,5 Because of the hurdles hindering prompt defibrillation, survival of sudden cardiac arrest victims to hospital discharge remains very poor.6

Implantable cardioverter-defibrillators (ICDs) are devices implanted in high-risk patients to protect them against SCD. Large randomized trials have demonstrated the benefit of the ICD in reducing total mortality in survivors of sudden cardiac arrest7-9 as well as in patients who are at high risk for SCD.10-21 Identifying the high-risk patients who may benefit from ICD implantation has been the subject of multiple clinical trials over the past two decades. Indices of electrical instability or alternans22,23 and of autonomic tone,24 to mention a few, have been proposed as markers of a high-risk substrate, with varying degrees of accuracy in predicting clinical events. To date, however, only the measure of left ventricular function as assessed by the ejection fraction (EF) continues to be the clinical tool used to determine eligibility for ICD therapy.25 Still, most SCD events occur in patients who are not eligible for ICD therapy based on EF. In addition, the lifetime likelihood of appropriate therapy in ICD recipients is very low. This combination of low sensitivity and low specificity in our current patient selection criteria for ICD therapy constitutes the main impetus behind the search for newer and better risk stratification parameters.

I-123 meta-iodobenzylguanidine (mIBG) is a guanethidine-based false-neurotransmitter analog of norepinephrine (NE) that is taken up by the myocardial sympathetic nerve terminal via the norepinephrine transporter 1 (NET 1). Unlike NE, however, mIBG is not catabolized after its uptake, which makes it suitable for imaging sympathetic nerve terminal function.26 Avid uptake of mIBG by the myocardium indicates intact sympathetic nerve terminal function, whereas a paucity of myocardial uptake, measured as a low heart-to-mediastinum (HM) ratio on planar imaging, indicates sympathetic dysfunction. It is well established that the heart failure state is characterized by a functional impairment of NET 1 and thus poor myocardial uptake of mIBG.27 Changes in the mIBG HM ratio reflect progression or improvement in heart failure severity. In March 2013, the United States Food and Drug Administration approved I-123 mIBG imaging for “the assessment of myocardial sympathetic innervation in the evaluation of patients with NYHA class II-III heart failure and EF <35%”. The approval was based primarily on the ADMIRE-HF (AdreView Myocardial Imaging for Heart Failure) study, which comprised two North American multicenter phase III studies enrolling a total of 961 patients with NYHA class II-III heart failure due to ischemic or non-ischemic cardiomyopathy, EF <35%, and guideline-recommended optimal medical therapy.28 Notably, patients with any prior ventricular tachyarrhythmia treated with defibrillation or ICD implantation within 30 days of enrollment were excluded. After a median follow-up of 17 months, the primary composite end point of NYHA class progression, potentially life-threatening arrhythmic event, or cardiac death occurred more frequently in patients with a late (4-hour) HM ratio <1.6 than in those with a late HM ratio ≥1.6. The 2-year probability of cardiac death was 11.2% vs 1.8% for late HM ratio <1.6 and ≥1.6, respectively, and 0% for late HM ratio >1.8. The early HM ratio, measured on a 15-minute scan, and the washout rate were not independent predictors of outcome. Arrhythmic events (self-limited ventricular tachycardia, resuscitated cardiac arrest, or SCD) were considered alone in a secondary analysis and found to be more common in subjects with an HM ratio <1.6 (10.4% vs 3.5%). While these results indicate powerful prognostic capability, the specific clinical indications for mIBG imaging in the heart failure population remain a matter of debate. An obvious area of interest is improving patient selection for ICD implantation.
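For readers less familiar with how these planar indices are derived, the minimal sketch below illustrates the standard calculations, assuming mean counts per pixel have already been extracted from heart and mediastinal regions of interest on the early and late anterior planar images. The function names, example values, and the uncorrected washout formula are illustrative assumptions, not the ADMIRE-HF core-laboratory protocol.

```python
# Illustrative calculation of planar 123I-mIBG indices from hypothetical ROI values.
# Inputs are mean counts per pixel in the heart and mediastinal regions of interest
# on early (~15-minute) and late (~4-hour) anterior planar images.

def hm_ratio(heart_counts: float, mediastinum_counts: float) -> float:
    """Heart-to-mediastinum ratio: mean heart ROI counts / mean mediastinal ROI counts."""
    return heart_counts / mediastinum_counts

def washout_rate(h_early: float, m_early: float, h_late: float, m_late: float) -> float:
    """Percent washout of mediastinum-corrected heart counts from early to late imaging.
    (Decay correction, often applied in practice, is omitted here for simplicity.)"""
    early = h_early - m_early
    late = h_late - m_late
    return 100.0 * (early - late) / early

# Hypothetical patient with impaired sympathetic innervation:
late_hm = hm_ratio(heart_counts=52.0, mediastinum_counts=35.0)  # ~1.49, below the 1.6 threshold
wr = washout_rate(h_early=60.0, m_early=34.0, h_late=52.0, m_late=35.0)
print(f"Late HM ratio: {late_hm:.2f}, washout rate: {wr:.1f}%")
```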

In this issue of the Journal, Al Badarin et al29 present a re-analysis of the ADMIRE-HF trial focusing specifically on the prediction of potentially life-threatening arrhythmic events, as defined originally in ADMIRE-HF, in the 778 patients who did not have an ICD at the time of enrollment. Multivariable survival regression was used to determine independent predictors and to derive a predictive risk score. The authors found that an HM ratio <1.6 on 123I-mIBG imaging was associated with a 3.5-fold increase in the likelihood of arrhythmic events [hazard ratio (HR) 3.48, 95% CI 1.52-8], independent of other clinical predictors of arrhythmias including left ventricular ejection fraction (LVEF). Other independent predictors were LVEF <25% (HR 1.97, 95% CI 1.28-3.05) and systolic blood pressure <120 mm Hg (HR 1.19, 95% CI 1.03-1.39). Based on the derived risk score, patients in the lowest, intermediate, and highest risk groups had arrhythmic event rates of 2%, 10%, and 16%, respectively.
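The authors' exact scoring algorithm is not reproduced here, but the sketch below illustrates how such a score is commonly assembled from a multivariable model: each dichotomized predictor is assigned integer points proportional to its log hazard ratio, and the points are summed per patient. The point values, scaling factor, and variable names are illustrative assumptions based only on the hazard ratios quoted above, not the published score.

```python
import math

# Illustrative (not the published) point assignment, using the hazard ratios quoted above
# for the three dichotomized predictors from the Al Badarin et al re-analysis.
PREDICTOR_HRS = {
    "hm_ratio_lt_1.6": 3.48,  # late HM ratio < 1.6
    "lvef_lt_25": 1.97,       # LVEF < 25%
    "sbp_lt_120": 1.19,       # systolic BP < 120 mm Hg
}

def points(hr: float, scale: float = 3.0) -> int:
    """Convert a hazard ratio into integer points proportional to its log."""
    return round(scale * math.log(hr))

def risk_score(hm_lt_16: bool, lvef_lt_25: bool, sbp_lt_120: bool) -> int:
    """Sum the points for whichever risk factors are present in a given patient."""
    flags = {"hm_ratio_lt_1.6": hm_lt_16, "lvef_lt_25": lvef_lt_25, "sbp_lt_120": sbp_lt_120}
    return sum(points(PREDICTOR_HRS[name]) for name, present in flags.items() if present)

# Hypothetical patient: HM ratio 1.4, LVEF 20%, systolic BP 110 mm Hg -> highest-risk stratum
print(risk_score(hm_lt_16=True, lvef_lt_25=True, sbp_lt_120=True))
```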

In interpreting the results of this analysis, it is tempting to identify the HM ratio as the “strongest” predictor of arrhythmic events based on its HR. This is, however, incorrect, since the predictors (HM ratio, LVEF, and systolic BP) have different units of measurement. Such an approach could be applied only if the HR estimates came from a regression model with standardized coefficients, in which case the predictors are directly comparable to each other. It is important to note that the relative importance of a predictor variable in any regression model can be interpreted in different ways. The most common is to first determine the individual (univariate) significance of the predictor; the traditional approach to modeling assumes “non-significance” unless proven otherwise. Once all significant predictors are identified, how can one determine which predictor is the most “important”? As pointed out earlier, comparing standardized coefficients is one way to examine the relative importance of a particular predictor variable compared with the other predictor variables in the model. However, even with standardized coefficients, simply looking at the HR or the standardized coefficients of the model may still not be appropriate in some cases, because this approach does not consider the correlation among the predictors. When predictors are highly correlated (i.e., when collinearity exists), the standardized coefficients and the HRs estimated from them may be misleading in magnitude and/or direction. An alternate method of determining the relative importance of predictors is to consider the amount of variability in the response explained by a predictor given all other predictors in the model. This is straightforward to implement in conventional linear regression, but defining the amount of explained variability is more challenging in logistic and survival models.
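A minimal sketch of the two diagnostics discussed above is shown below: refitting a Cox model on z-scored predictors so that the coefficients (and hence the HRs per standard deviation) become directly comparable, and screening for collinearity with variance inflation factors. It assumes the lifelines package and a data frame with hypothetical column names (hm_ratio, lvef, sbp, time, event); the simulated data are placeholders, not ADMIRE-HF data.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical cohort: continuous predictors plus follow-up time and event indicator.
# Column names and distributions are assumptions for illustration only.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "hm_ratio": rng.normal(1.5, 0.2, n),
    "lvef": rng.normal(27, 6, n),
    "sbp": rng.normal(118, 15, n),
    "time": rng.exponential(17, n),      # months of follow-up
    "event": rng.integers(0, 2, n),      # 1 = arrhythmic event
})
predictors = ["hm_ratio", "lvef", "sbp"]

# 1) Standardized coefficients: z-score each predictor so a one-unit change means one
#    standard deviation, making the Cox coefficients comparable across predictors.
df_std = df.copy()
df_std[predictors] = (df[predictors] - df[predictors].mean()) / df[predictors].std()
cph = CoxPHFitter()
cph.fit(df_std, duration_col="time", event_col="event")
print(np.exp(cph.params_))  # hazard ratios per 1-SD change in each predictor

# 2) Collinearity screen: variance inflation factor for each predictor, obtained by
#    regressing it on the remaining predictors (VIF = 1 / (1 - R^2)).
X = df[predictors].to_numpy()
for i, name in enumerate(predictors):
    y = X[:, i]
    others = np.column_stack([np.ones(n), np.delete(X, i, axis=1)])
    beta, *_ = np.linalg.lstsq(others, y, rcond=None)
    r2 = 1 - (y - others @ beta).var() / y.var()
    print(f"VIF({name}) = {1 / (1 - r2):.2f}")
```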

The predictive score developed by Al Badarin et al29 is yet another addition to the growing list of tools that can be used to stratify patients for arrhythmic risk. Unfortunately, however, these tools, which now include many markers of electrical instability such as heart rate variability, heart rate turbulence, and T-wave alternans, to mention a few, as well as imaging tools for the assessment of scar burden such as late gadolinium enhancement MRI, and genetic markers, have all failed thus far to penetrate the clinical guidelines that define appropriateness for ICD therapy. Despite the abundance of data on the value of many of these markers in arrhythmic risk stratification, the EF value remains the single criterion that dictates eligibility for ICD therapy for the primary prevention of SCD. This is in large part due to our reluctance as a medical community, and as a society in general, to accept the tradeoff between sensitivity and specificity, i.e., to accept a few incremental arrhythmic events in patients who are not protected by an ICD in return for higher rates of appropriate device utilization in ICD recipients. In the risk score proposed by Al Badarin et al29, even the lowest risk stratum, defined on the basis of HM ratio, EF, and systolic blood pressure, had a 2% risk of ventricular arrhythmia over a median follow-up of 17 months. For most of us today, this would still be considered too high a risk for ICD therapy to be withheld. The “mood” of the medical community regarding this matter may, however, change, driven by changes in the health care system in general and by the financial pressures it has to face. In a new environment where the cost of devices and their associated invasive procedures has to be weighed more carefully against the expected benefits in patient survival, some of these risk stratification tools, such as 123I-mIBG imaging, may find their way into the published guidelines for ICD therapy. Until then, we keep accumulating newer tools in our toolbox with the promise of someday being able to use them.