Key Summary Points

The prevalence of and motivation for programming with directional electrodes in clinical practice are not clear.

Initially, in our practice, directional electrodes were used more frequently in patients with essential tremor (ET) than in patients with Parkinson’s disease (PD), likely because programming is less complex in ET.

In PD we switched to directional electrodes later, when new solutions were sought to reduce worsening symptoms while avoiding side effects.

Over a 36-month period we used directional stimulation to improve efficacy or reduce side effects in 39–68% of patients with PD and 50–72% of patients with ET, which seems to justify our decision to switch to directional leads when they first became available.

Introduction

Directional deep brain stimulation (d-DBS) displaces the volume of tissue activated in the axial plane, towards the intended target and away from neighboring structures, potentially improving the benefit and reducing the side effects of stimulation. A large prospective, randomized, multicenter, crossover study evaluating d-DBS demonstrated a wider therapeutic window (TW) with directional stimulation than with conventional, omnidirectional stimulation in 90% of cases [1]. While a large TW intuitively seems advantageous, it is not clear how often, and for what reasons, directional stimulation is used in clinical practice. Existing reports on directional DBS in clinical practice have shown use of directional electrodes in 50–70% of patients, but some report only on essential tremor (ET), provide only short-term data, or do not separate directional programming from other advanced programming techniques [2,3,4,5]. The aim of the current study was to determine the rate of adoption of d-DBS in our practice over time in patients with Parkinson’s disease (PD) and ET. We evaluated the prevalence of and reasons for programming with directional electrodes from its inception in 2016 until 36 months later.

Methods

This study was performed following approval from Rush University IRB-01 (registration number IRB00000530) with waiver of consent, and in accordance with the Helsinki Declaration of 1964 and its later amendments. In this retrospective study, we included consecutive patients with PD and ET who were implanted between December 2016 and January 2020 with the Abbott Infinity DBS system with directional leads (model 6172) at Rush University Medical Center (Rush). These directional DBS systems were implanted in all Rush DBS patients once they became available in 2016. All DBS surgeries were completed by the same neurosurgeon–neurologist team in awake patients, with physiological confirmation of the target by microelectrode recording and test stimulation. Patients returned to clinic for the initial programming visit approximately 4 weeks after DBS surgery, and every 1–3 months thereafter at the clinician’s discretion. All patients were programmed by one of two clinicians (JAK, LVM) with similar programming styles, workflow, and documentation. The same programmer usually followed a given patient over time, but occasionally cross-coverage was needed. Both clinicians were early adopters of Abbott’s Informity software, which ensured streamlined documentation of initial programming visits. On subsequent programming visits, the reasoning for adjustments to DBS parameters was documented in the clinical history. When programming d-DBS, only single-segment activation (SSA) was used because it activates the smallest arc of the electrode, producing the greatest axial asymmetry. Adding a second segment (co-activation) activates 66% of the circumference, which decreases directionality. SSA therefore provides the greatest change in field shape compared to omnidirectional stimulation.
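As a back-of-the-envelope sketch of this geometry (assuming three equally sized segments per directional level, as on the 6172 lead, and ignoring the insulating gaps between segments), co-activating $k$ segments covers roughly

$$\text{activated arc} \approx \frac{k}{3} \times 360^{\circ},$$

so SSA ($k = 1$) activates about one third of the circumference and two-segment co-activation ($k = 2$) about two thirds, consistent with the 66% figure above.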

The following clinical data were collected from the medical record at the stated time points: baseline—demographic information, Movement Disorder Society Unified Parkinson’s Disease Rating Scale (MDS-UPDRS) Part III, and levodopa equivalent daily dose (LEDD); 3, 12, 24, and 36 months postoperatively—stimulation parameters, including use of directional stimulation and advanced programming techniques (i.e., bipolar; interleaving; or interleave–interlink (IL–IL), a dual-frequency interleaving paradigm used to address both axial and appendicular symptoms in patients with PD or to improve symptom control in patients with ET [6,7,8]), and the reason for use of directional stimulation. The reasons for using directional stimulation as documented in the chart were (a) better symptom control, (b) fewer side effects, (c) a combination of better symptom control and fewer side effects, (d) improvement in battery drain, or (e) a better TW percentage. TW percentage is defined as the percentage by which the current (mA) at therapeutic benefit can be increased before the side effect threshold is reached. For example, if therapeutic benefit is observed at 2.0 mA and side effects at 4.0 mA, the TW percentage is 100%. Patients programmed with directional stimulation were categorized by type of directional programming: (a) monopolar directional, (b) bipolar directional, (c) monopolar IL–IL directional, or (d) bipolar IL–IL directional. In those programmed with a non-directional advanced programming technique, we documented whether directional testing had been completed.
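Written as a formula (our notation; the symbols are ours, not the programming software's):

$$\mathrm{TW}\% = \frac{I_{\mathrm{SE}} - I_{\mathrm{TB}}}{I_{\mathrm{TB}}} \times 100,$$

where $I_{\mathrm{TB}}$ is the current at which therapeutic benefit is observed and $I_{\mathrm{SE}}$ the current at which side effects first appear. For the example above, $(4.0 - 2.0)/2.0 \times 100 = 100\%$.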

Statistical Analysis

All patients who had at least a 3-month follow-up visit were included in the analysis. If patients were lost to follow-up beyond the 3-month time point (e.g., moved away from the clinic, underwent lead revision/explant), or had not yet reached the 12-, 24-, or 36-month time point, their data were included in the analysis up to the last point at which they were seen for programming in clinic.

Descriptive analysis was performed as appropriate. The Kolmogorov–Smirnov test was used to test the distributions for normality, which was confirmed (PD: p = 0.2; ET: p = 0.2). In the patients with PD, a linear mixed model with a quadratic term in time was used to examine the change in LEDD from pre-surgery to post-surgery. Pairwise comparisons with Tukey’s multiple comparison test were also performed to compare LEDD between any two time points. Statistical significance was set at p < 0.05.
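For readers wishing to reproduce this type of analysis, the following is a minimal sketch in Python, not the authors' actual code: the column names and synthetic data are illustrative only, and the Tukey step treats time points as independent groups, a simplification of a true repeated-measures comparison.

import numpy as np
import pandas as pd
import scipy.stats as stats
import statsmodels.formula.api as smf
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Synthetic long-format data: one LEDD value per patient per time point
# (0 = pre-surgery; 3, 12, 24, 36 = months post-surgery). Illustrative only.
rng = np.random.default_rng(0)
patients = np.repeat(np.arange(20), 5)
months = np.tile([0, 3, 12, 24, 36], 20)
ledd = 900.0 - 40.0 * np.log1p(months) + rng.normal(0.0, 80.0, patients.size)
df = pd.DataFrame({"patient_id": patients, "months": months, "ledd": ledd})

# Kolmogorov-Smirnov test against a normal distribution fitted to the sample.
ks_stat, ks_p = stats.kstest(df["ledd"], "norm",
                             args=(df["ledd"].mean(), df["ledd"].std()))
print(f"KS p-value: {ks_p:.2f}")

# Linear mixed model with a quadratic term in time and a random intercept
# per patient, examining the change in LEDD over time.
fit = smf.mixedlm("ledd ~ months + I(months ** 2)",
                  df, groups=df["patient_id"]).fit()
print(fit.summary())

# Pairwise comparisons of LEDD between time points with Tukey's correction.
print(pairwise_tukeyhsd(endog=df["ledd"], groups=df["months"], alpha=0.05))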

Results

Participants

Seventy-four patients (137 DBS leads) were identified (see Table 1). Fifty-six patients had PD (104 DBS leads) and 18 had ET (33 DBS leads). Of the 56 patients with PD, 47 were implanted in the subthalamic nucleus (STN), 3 in the globus pallidus internal segment (GPi), and 6 in the ventral intermediate nucleus (VIM) of the thalamus. The number of leads implanted in each target at each post-surgical time point (3, 12, 24, and 36 months post DBS surgery) is shown in Table 2. Of the patients with PD, four underwent revisions (8 DBS leads), while the remaining 52 patients were newly implanted. Two patients (one PD and one ET) were lost to follow-up after the 12-month time point because they moved out of state.

Table 1 Demographic data
Table 2 Number of DBS leads at different time points after surgery for Parkinson’s disease and essential tremor

Patients with Parkinson’s Disease

The percentage of all DBS leads (STN, GPi, and VIM) programmed with directional stimulation was relatively stable until the 36-month time point, when it increased (Fig. 1 and Table 3). Because patients with PD implanted in the GPi or VIM, who represent a minority of our patients, may differ in phenotype, the STN leads were analyzed separately. The trend in the STN-only leads was similar to that in all DBS leads (Table 3). The reasons for using directional stimulation were (a) better symptom control, (b) reduction in side effects, or (c) a combination of better symptom control and reduction in side effects (Fig. 2a).

Fig. 1

Percentage of patients with Parkinson’s disease and essential tremor programmed with a directional DBS electrode in at least one lead at 3, 12, 24, and 36 months post DBS surgery

Table 3 Characterization of leads in PD and ET using d-DBS at different time points after DBS surgery
Fig. 2

Reasons for use of directional electrodes in a PD and b ET. The reasons DBS leads were programmed with directional electrodes in Parkinson’s disease (PD) and essential tremor (ET) at 3, 12, 24, and 36 months after DBS surgery were (1) side effects, (2) better symptom control, or (3) a combination of side effects and better symptom control. Additional reasons in ET were improved battery drain or therapeutic window (TW) percentage

Of the DBS leads programmed with non-directional stimulation, the proportion on a conventional programming paradigm (i.e., monopolar configuration) was 49% at 3 months, 39% at 12 months, 39% at 24 months, and 35% at 36 months; the proportion on a more advanced paradigm (i.e., bipolar, interleaving, or IL–IL configuration) was 23% at 3 months, 31% at 12 months, 32% at 24 months, and 12% at 36 months. The opposite sequence, switching back from directional to non-directional stimulation, occurred in two leads at 12 months and three leads at 24 months post DBS surgery because of better symptom control.

There was a significant reduction in LEDD from pre-surgery to each post-surgery time point (3, 12, 24, and 36 months post DBS surgery), but no significant change between post-surgery time points (Table 4). The main effect of time was significant, with a decreasing trend in LEDD (estimate = − 48.8, SE = 8.4, p < 0.0001).

Table 4 Reductions in levodopa equivalent daily dose (LEDD) compared to pre-surgery

Patients with Essential Tremor

The percentage of DBS leads programmed with directional stimulation was higher in patients with ET than in patients with PD (Fig. 1, Table 3). The reasons for using directional stimulation were (a) better symptom control, (b) reduction in side effects, (c) a combination of better symptom control and reduction in side effects, or (d) improvement in battery drain or TW percentage (Fig. 2b). The switch from directional back to non-directional stimulation occurred in two leads at 12 months, one lead at 24 months, and two leads at 36 months post DBS surgery because of better symptom control.

Of the DBS leads programmed with non-directional stimulation, the proportion on a conventional programming paradigm (i.e., monopolar configuration) was 27% at 3 months, 23% at 12 months, 14% at 24 months, and 25% at 36 months; the proportion on an advanced paradigm (i.e., bipolar, interleaving, or IL–IL configuration) was 15% at 3 months, 27% at 12 months, 29% at 24 months, and 50% at 36 months.

All Patients

The most common type of directional programming for both PD and ET was a monopolar directional configuration; however, a subset of patients was programmed with a bipolar directional configuration (Table 5). We tested directional electrodes prior to implementing an advanced programming technique in the majority of patients. When we did not test directional electrodes, it was because the optimal contact was a non-segmented electrode or because IL–IL was being used for axial symptoms in patients with PD [6, 8]. The directional and non-directional settings for each patient are shown in Table 1 of the Supplementary Material.

Table 5 Type of directional programming in patients with PD and ET at time points after DBS surgery

Discussion

Over the course of the study, 39–68% of patients with PD and 50–72% of patients with ET had at least one lead programmed directionally in order to improve symptom control or reduce side effects, an option not available with conventional omnidirectional stimulation. In general, we used directional stimulation more frequently in patients with ET than in patients with PD, although by 36 months post DBS surgery a larger percentage of patients with PD were programmed with a directional electrode. Similarly, Zitman et al. reported that a larger percentage of patients with leads implanted in the VIM were programmed with a directional electrode compared to the STN and GPi [5]. In our practice, the main reasons for using directional electrodes in PD and ET were to avoid side effects and to improve symptom control. In patients with ET, additional reasons were to reduce battery consumption and to take advantage of a larger TW percentage. These last two reasons were not based on clinical necessity, which supports the notion that the higher, earlier use of directional configurations in patients with ET was driven not by greater need but by the simpler programming in ET.

Programming for a monosymptomatic disorder such as ET is less complex and more time efficient, and small differences are easier to identify, especially with the help of visuals such as handwriting samples or Archimedes spiral drawings. In contrast, in PD such visual or quantitative documentation of bradykinesia and rigidity is not readily available in most clinics, and subtle differences may therefore be more difficult to ascertain. As a result, when programming patients with PD we did not use directionality at initial programming, but rather switched to directional electrodes later, when new solutions were sought to reduce worsening symptoms while avoiding side effects. While the percentage of patients programmed with directional electrodes increased at the 36-month time point, one needs to be cautious in drawing firm conclusions because of the low number of patients who reached that time point.

Despite having the option of using directional electrodes, we continued to find other advanced programming techniques (i.e., bipolar, interleaving, or IL–IL configuration) beneficial in certain patients. In most of these patients we did test directional electrodes first, unless the optimal contact was non-directional or the primary concern was axial symptoms, in which case we used the dual-frequency programming paradigm IL–IL. Prior to the advent of directional systems, we commonly used IL–IL in our practice to reduce axial symptoms in PD and stimulation-induced side effects in PD and ET. Even after directional systems became available, we continued to find the IL–IL paradigm useful, perhaps because of our inexperience programming directional systems early on, or because of the unique spatial and temporal features of IL–IL that led to better symptom control [8]. Additionally, in a subset of our patients we found the combination of a directional and an IL–IL or bipolar configuration useful.

One limitation of this study is its retrospective nature. Consequently, there was an unequal sample at each time point, and the entries in the medical chart did not follow a standardized survey. The lack of standardization was partially mitigated by using data from patients programmed by two programmers with similar styles of programming and documentation in the medical record. As a result, all elements of interest could be readily identified and extracted from the medical record. Both clinicians were early adopters of the Informity software, which provides automatic calculations of TW percentage and power consumption and further streamlined the documentation process. While the uniformity of programming between the two clinicians is advantageous for summarizing the experience in their practice, it does not allow generalization, and other DBS programmers may have different experiences. However, data from other centers suggest a similar early adoption rate [2,3,4,5, 9]. Over time, directional programming will likely be utilized more, buoyed by programming strategies that rely not on trial-and-error programming but on visualization software or biomarkers (such as beta oscillations) to identify optimal stimulation sites within the target and the corresponding electrode segments.

Finally, while LEDD reduction, a proxy measure for successful STN DBS surgery, exceeded 50% in the PD group, MDS-UPDRS-III scores at each time point are lacking. However, this study did not aim to compare motor outcomes with directional versus non-directional stimulation but rather to assess the acceptance and implementation of the technique; indirectly, this is an indication of its usefulness in optimizing stimulation outcomes. In this context it should be noted that the PROGRESS study did not find differences in UPDRS-III scores, whereas patient and clinician preference significantly favored directional over non-directional DBS [1]. One way to interpret those results is that small but important benefits, for instance on gait, or side effects such as dysarthria may wash out in UPDRS-III summary scores but matter to patients and may motivate the clinician to change stimulation parameters.

Conclusion

Directional leads offer an additional programming option that can potentially improve the benefits of DBS therapy. We used directional electrodes more frequently in ET than in PD, which we hypothesize reflects not a lesser need in PD but differences in workflow and assessments between programming for ET versus PD, differences that were especially noticeable early in our experience with directional programming. In our practice we have now established a routine of performing a traditional omnidirectional monopolar survey during the first postoperative visit and testing the different segments at the optimal contact at the next visit. Dividing the monopolar survey into two sessions is more time efficient and prevents excessive patient fatigue. Over time we saw an increase in the use of directional stimulation in our patients with PD, which may be attributed to changes in workflow, greater experience, or the need to address progressive symptoms. Overall, over a 36-month period we switched to directional stimulation in 39–68% of patients with PD and 50–72% of patients with ET, which seems to justify our decision to adopt directional leads when they first became available.