Atrial fibrillation, the most common sustained arrhythmia encountered in practice, poses a unique challenge to physicians [1]. While rarely life-threatening, atrial fibrillation (AF) significantly impairs functional capacity and quality of life, with high rates of secondary healthcare utilisation [2]. Moreover, AF is a progressive disease, transitioning from an isolated electrical disorder to a sustained arrhythmic condition through a combination of structural, electrical, and contractile remodelling [3].

Contemporary data derived from multiple observational and randomised clinical trials have definitively demonstrated that catheter ablation is superior to antiarrhythmic drugs (AADs) for maintenance of sinus rhythm, a benefit accompanied by improvements in functional capacity and quality of life and a consequent reduction in healthcare utilisation [4]. Recent studies suggest that these benefits extend to patients with treatment-naive AF: patients treated with an initial cryoballoon catheter ablation experienced a lower recurrence of any atrial tachyarrhythmia and were more likely to be free of symptoms, which translated into a significantly greater improvement in quality of life and lower rates of healthcare utilisation (cardioversion, emergency room visits, and hospitalization) [5]. Three randomised controlled trials that assessed first-line cryoballoon catheter ablation demonstrated that cryoablation was associated with a lower risk of adverse events compared with AADs, suggesting that the cost–benefit of intervention may favour a first-line ablation approach.

In this issue of the journal, Zucchelli et al. present data on healthcare utilisation, arrhythmia burden, and efficacy from the Cryo Global Registry, which enrolled patients from 46 centres, predominantly in Europe and Asia, between 2016 and 2018 [6]. The current sub-analysis examined 1394 patients with either paroxysmal or persistent AF undergoing cryoballoon ablation, of whom 433 (31.1%) underwent first-line intervention (i.e., ablation prior to a trial of antiarrhythmic drugs). Overall, the first-line group demonstrated greater procedural efficacy, significantly greater freedom from recurrent AF, a greater mean reduction in symptoms, and a lower rate of AAD prescription at 12-month follow-up. Despite these differences, there was no difference in quality of life, repeat ablation, or hospitalization between the two groups at 12 months.

Zucchelli and colleagues should be commended for undertaking this important and extensive work; however, there are a few points worth considering.

First, the decision to proceed with first-line ablation was not randomised. As such, it is not surprising that there were significant differences in baseline characteristics between the first- and second-line ablation groups. Specifically, the first-line group was more likely to have paroxysmal AF and had a lower body mass index and a smaller left atrial diameter. Conversely, the second-line, AAD-refractory group was more likely to have a more advanced form of AF, a longer time since diagnosis, and to have undergone device implantation. Together, these differences indicate that the AAD-refractory group was being treated later in its disease trajectory, which would be expected to significantly influence the outcomes evaluated. Indeed, when propensity scores were used to account for these differences, the benefit of first-line ablation on arrhythmia recurrence was lost. It is also important to note that the time from diagnosis to ablation was relatively long in the first-line group (> 2 years), which may account for the reduced benefit relative to the randomised trials and previous observational series [5, 7].
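To make the adjustment step concrete, the sketch below illustrates one common approach, inverse-probability-of-treatment weighting based on a logistic propensity model. It uses synthetic data and hypothetical covariate names, and is not a reproduction of the registry's actual propensity analysis.

```python
# Illustrative sketch of propensity-score weighting on synthetic data.
# The Cryo Global Registry's actual model, covariates, and adjustment
# method are not reproduced here; variable names are hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1394  # approximate size of the sub-analysis cohort

# Hypothetical baseline covariates that differed between groups
df = pd.DataFrame({
    "paroxysmal_af": rng.integers(0, 2, n),
    "bmi": rng.normal(28, 4, n),
    "la_diameter_mm": rng.normal(42, 5, n),
    "years_since_dx": rng.exponential(2.5, n),
})
covariates = ["paroxysmal_af", "bmi", "la_diameter_mm", "years_since_dx"]

# Treatment assignment (first-line ablation) loosely related to covariates
logit = 0.8 * df["paroxysmal_af"] - 0.05 * (df["bmi"] - 28) - 0.2 * df["years_since_dx"]
df["first_line"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# 1. Estimate propensity scores with a logistic model
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["first_line"])
df["ps"] = ps_model.predict_proba(df[covariates])[:, 1]

# 2. Inverse-probability-of-treatment weights
df["iptw"] = np.where(df["first_line"] == 1, 1 / df["ps"], 1 / (1 - df["ps"]))

# 3. Weighted comparison of a binary outcome (e.g., 12-month AF recurrence)
df["recurrence"] = rng.binomial(1, 0.3, n)  # placeholder outcome
for grp, sub in df.groupby("first_line"):
    rate = np.average(sub["recurrence"], weights=sub["iptw"])
    print(f"first_line={grp}: weighted recurrence rate = {rate:.3f}")
```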

Second, ablation efficacy was predominantly determined through symptom reporting, with only a minority of patients receiving arrhythmia surveillance by intermittent rhythm monitoring (54% and 58% of the respective groups had no Holter monitoring performed in the year following ablation). Reliance on symptomatic arrhythmia detection is known to overestimate treatment success by 20% or more, and intermittent monitoring significantly under-detects arrhythmia recurrences [8]. Taken together, the estimates of arrhythmia-free survival reported herein are likely markedly inflated, with the potential for significant misclassification errors that adversely impact the accuracy and precision of the comparative evaluations.
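As a back-of-the-envelope illustration of this point, the snippet below shows how an assumed detection sensitivity of 50% would inflate apparent arrhythmia-free survival by roughly 20 absolute percentage points. Both the true recurrence rate and the sensitivity are hypothetical values chosen for illustration.

```python
# Illustration of how incomplete arrhythmia detection inflates apparent
# freedom from AF. All numbers below are hypothetical.
true_recurrence = 0.40          # assumed true 12-month recurrence rate
detection_sensitivity = 0.50    # assumed fraction of recurrences detected
                                # by symptoms +/- sparse Holter monitoring

detected_recurrence = true_recurrence * detection_sensitivity
apparent_success = 1 - detected_recurrence
true_success = 1 - true_recurrence

print(f"True arrhythmia-free survival:     {true_success:.0%}")      # 60%
print(f"Apparent arrhythmia-free survival: {apparent_success:.0%}")  # 80%
print(f"Absolute overestimate:             {apparent_success - true_success:.0%}")
```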

Third, while first-line ablation patients had a greater reduction in symptoms compared with the AAD-refractory group, no significant difference in quality of life was observed. It is possible that this apparent discrepancy is merely a function of the scale used to evaluate quality of life. Generic quality of life instruments, such as the EuroQOL score, disproportionately focus on general physical health and functioning, which renders them insensitive to AF-specific quality of life. In effect, these generic QOL scores are influenced more by patient demographics and comorbidities than by the disease or intervention itself. A significant difference in quality of life may well have been observed on a disease-specific scale, such as the AFEQT score.

Finally, while there was no difference observed in healthcare utilisation, hospitalization, repeat ablation, or AAD prescription, this may be a function of the overall efficacy of cryoballoon catheter ablation. Specifically, cryoballoon catheter ablation has been associated with low rates of healthcare utilisation when employed as either a first- or second-line intervention in randomised clinical trials [4, 5]. As such, it is possible that the current study was underpowered to detect a difference in this outcome.
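For context, the rough calculation below estimates the statistical power to detect a difference in a low-frequency outcome, such as 12-month hospitalization, between the two groups. The event rates are assumed values for illustration only, not registry results.

```python
# Rough power check for a difference in a low-frequency binary outcome
# between 433 first-line and 961 AAD-refractory patients.
# The event rates below are assumptions, not data from the registry.
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

n_first_line = 433
n_aad_refractory = 1394 - 433

p_first_line = 0.05       # assumed hospitalization rate with first-line ablation
p_aad_refractory = 0.08   # assumed rate in the AAD-refractory group

effect_size = proportion_effectsize(p_first_line, p_aad_refractory)
power = NormalIndPower().solve_power(
    effect_size=effect_size,
    nobs1=n_first_line,
    ratio=n_aad_refractory / n_first_line,
    alpha=0.05,
)
print(f"Estimated power: {power:.2f}")  # below the conventional 0.80 under these assumptions
```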

While these results must be interpreted in the context in which the data were acquired, namely a non-randomised and unblinded registry, this real-world experience supports the notion that an early cryoballoon ablation strategy can improve clinical outcomes in patients with paroxysmal or persistent AF.