Case Report

The effect of cardiac resynchronisation therapy (CRT) in patients with heart failure without left bundle branch block (LBBB) is debated [1]. We present the case of a patient with rate-dependent LBBB.

A 55-year-old woman with dilated cardiomyopathy, a left ventricular (LV) ejection fraction of 32% (blood pool radionuclide scintigraphy) and New York Heart Association class III symptoms despite optimal pharmacological therapy was referred for CRT device implantation. Coronary angiography revealed no abnormalities. A 24-h electrocardiogram recording showed complete LBBB that disappeared at slower heart rates, with a critical rate between 70 and 80 bpm (Fig. 1). Since LBBB was present for the vast majority of the time, the patient was accepted for CRT implantation according to current guidelines [2]. Echocardiography during narrow QRS showed a dilated left ventricle without visual signs of dyssynchrony, a normal interventricular mechanical delay (18 ms) and no LV intraventricular dyssynchrony (septal-to-lateral strain delay 53 ms) or atrioventricular (AV) dyssynchrony.

Fig. 1

12-Lead surface electrocardiogram during normal ventricular conduction and during left bundle branch block. a Narrow QRS (90 ms), heart rate 67 bpm. b Wide QRS (160 ms) with complete LBBB, heart rate 73 bpm

The patient gave written informed consent for a study that was approved by the institutional ethics committee and complies with the Declaration of Helsinki. During implantation, a pressure-sensor-tipped guidewire (PressureWire®5, St. Jude Medical Inc., St. Paul, MN, USA) was placed in the left ventricle to optimise the AV delay by measurement of the maximum rate of LV pressure rise (dP/dtmax). The leads were implanted transvenously in the right ventricular outflow tract, the right atrial appendage, and a coronary sinus tributary on the midposterolateral LV free wall.
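
For illustration, dP/dtmax can be derived from a sampled LV pressure trace as the maximum of its first time derivative. The minimal Python sketch below assumes a uniformly sampled pressure signal in mmHg; the function name and sampling rate are illustrative and do not reflect the acquisition system used in this case.

```python
import numpy as np

def dp_dt_max(lv_pressure_mmhg: np.ndarray, fs_hz: float = 1000.0) -> float:
    """Maximum rate of LV pressure rise (dP/dtmax, mmHg/s).

    lv_pressure_mmhg: uniformly sampled LV pressure over one or more beats.
    fs_hz: sampling frequency of the pressure signal (assumed value).
    """
    dp_dt = np.gradient(lv_pressure_mmhg, 1.0 / fs_hz)  # first derivative, mmHg/s
    return float(np.max(dp_dt))
```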

The pacing leads, the pressure recording, and the 12-lead surface electrocardiogram were connected to an external pacing and data acquisition computer (Flexstim II, Boston Scientific Corp., St. Paul, MN, USA). The custom-made stimulation protocol consisted of cycles of six beats of atrio-biventricular pacing separated by periods of 14 beats of atrial pacing (AAI, baseline). This cycle was repeated with biventricular pacing without atrial pacing (VDD), separated by periods of no pacing (sinus rhythm, baseline). Each cycle with one of four AV delays (20%, 40%, 60%, and 80% of the intrinsic AV conduction time) was repeated four times in random order. The optimal AV delay was defined as the one yielding the highest relative increase in dP/dtmax compared with baseline, as sketched below.
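
As an illustration of the protocol logic only (not the actual Flexstim II software), the sketch below builds the randomised sequence of test cycles and selects the optimal AV delay as the one with the highest mean relative increase in dP/dtmax over baseline; the data structures and function names are hypothetical.

```python
import random
from statistics import mean

AV_FRACTIONS = [0.20, 0.40, 0.60, 0.80]   # tested AV delays as fractions of the intrinsic AV time
REPEATS = 4                               # each setting repeated four times
PACED_BEATS, BASELINE_BEATS = 6, 14       # 6 paced beats separated by 14 baseline beats

def build_protocol(intrinsic_av_ms: float) -> list:
    """Randomised list of test cycles: (AV delay in ms, paced beats, baseline beats)."""
    cycles = [(round(f * intrinsic_av_ms), PACED_BEATS, BASELINE_BEATS)
              for f in AV_FRACTIONS for _ in range(REPEATS)]
    random.shuffle(cycles)
    return cycles

def optimal_av_delay(results: dict, baseline_dpdt: float) -> int:
    """results maps AV delay (ms) -> list of dP/dtmax values measured during pacing.

    Returns the AV delay with the highest mean relative increase over baseline."""
    rel_gain = {av: (mean(vals) - baseline_dpdt) / baseline_dpdt
                for av, vals in results.items()}
    return max(rel_gain, key=rel_gain.get)
```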

Complete LBBB was present during AAI pacing at 85 bpm (QRS 160 ms, interventricular delay 130 ms on intracardiac electrograms). During sinus rhythm (75 bpm) there was no LBBB (QRS 90 ms, interventricular delay 60 ms). LBBB caused a sudden decline in average baseline dP/dtmax from 820 ± 75 to 662 ± 52 mmHg/s (p < 0.001, paired t test; Fig. 2a). CRT during LBBB increased dP/dtmax by 19.9% (Fig. 2b). In the absence of LBBB, optimised CRT did not change dP/dtmax compared with baseline; CRT with short AV delay worsened dP/dtmax (Fig. 2b).

Fig. 2

Acute haemodynamic response to cardiac resynchronisation therapy (CRT) during left bundle branch block (LBBB) and normal ventricular activation. a Absolute LV dP/dtmax values at baseline (no CRT) and during CRT with four different AV delays. b Relative increase in LV dP/dtmax during CRT compared with baseline without CRT. Data are shown as mean ± standard error. Squares: narrow QRS; diamonds: LBBB; AVI: intrinsic atrioventricular interval; AV delay: atrioventricular delay

Discussion

LV dP/dtmax is a measure of systolic function that can reflect acute haemodynamic improvement achieved by CRT [3–5]. In this case, AAI pacing induced LBBB and immediately decreased dP/dtmax. An acute decrease in haemodynamic function upon induction of LBBB was previously shown in a preclinical model [6]. Systematic human research on the effect of CRT under both conditions within the same patient is difficult, since data obtained before the onset of LBBB are often lacking.

In the absence of LBBB, biventricular pacing did not improve dP/dtmax. In a randomised trial, CRT likewise failed to improve quality of life and functional and echocardiographic parameters in heart failure patients with a QRS <130 ms and mechanical dyssynchrony [1]. In this case, CRT with a short AV delay worsened dP/dtmax, which can be explained by impaired diastolic filling: ventricular contraction that starts too early causes premature mitral valve closure and thereby premature termination of atrial contraction (A-wave truncation). During LBBB, biventricular pacing increased dP/dtmax by 20%, thereby almost restoring it to the baseline level observed during normal ventricular conduction.

Programming the device in this patient is a challenge. CRT is only effective during LBBB and may have an adverse effect during normal intrinsic ventricular depolarisation. A rate-adaptive AV delay that is long at slow heart rates and shortens quickly once the heart rate rises to around 75 bpm would promote predominantly intrinsic depolarisation during narrow QRS and predominantly biventricular pacing during LBBB (see the sketch below). However, most currently available CRT-D devices do not offer this programming option. Since the heart rate was above the critical value for the majority of the time, one could argue that the CRT-D could be programmed according to usual practice in patients with permanent LBBB.
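
Conceptually, the desired behaviour amounts to a simple rule. The sketch below is purely illustrative and not a programmable feature of any particular device; the critical rate and the two AV delay values are assumptions, not settings from this case.

```python
def rate_adaptive_av_delay(heart_rate_bpm: float,
                           critical_rate_bpm: float = 75.0,
                           long_av_ms: int = 250,
                           short_av_ms: int = 120) -> int:
    """Illustrative rule only: below the critical rate (narrow QRS) a long AV delay
    lets intrinsic conduction pre-empt pacing; at or above it (rate-dependent LBBB)
    a short AV delay enforces biventricular capture. All numeric values are assumed."""
    return long_av_ms if heart_rate_bpm < critical_rate_bpm else short_av_ms
```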

Conclusions

This case of rate-dependent LBBB demonstrated important basic principles of CRT:

  1. Induction of LBBB had a pronounced negative effect on cardiac function, as reflected by an acute decrease in LV dP/dtmax;

  2. During LBBB, biventricular pacing almost completely restored LV dP/dtmax to the baseline level observed during intrinsic normal ventricular conduction;

  3. Biventricular pacing had no beneficial effect on LV dP/dtmax during normal ventricular conduction.