Dopamine and inhibitory action control: evidence from spontaneous eye blink rates
Colzato, L.S., van den Wildenberg, W.P.M., van Wouwe, N.C. et al. Exp Brain Res (2009) 196: 467. doi:10.1007/s00221-009-1862-x
The inhibitory control of actions has been claimed to rely on dopaminergic pathways. Given that this hypothesis is mainly based on patient and drug studies, some authors have questioned its validity and suggested that beneficial effects of dopaminergic stimulants on response inhibition may be limited to cases of suboptimal inhibitory functioning. We present evidence that, in carefully selected healthy adults, the spontaneous eyeblink rate, a marker of central dopaminergic functioning, reliably predicts the efficiency of inhibiting unwanted action tendencies in a stop-signal task. These findings support the assumption of a modulatory role for dopamine in inhibitory action control.
Keywords: Response inhibition · Dopamine · Spontaneous eyeblink
The ability to stop ongoing actions is an important characteristic of cognitive control and flexibility (Logan 1994). Increasing evidence suggests that it emerges mainly from interactions between the prefrontal cortex (PFC) and the basal ganglia (Aron et al. 2003; Miller and Cohen 2001; van den Wildenberg et al. 2006). For instance, functional imaging studies have indicated the involvement of the PFC in response inhibition by comparing trials on which subjects executed a speeded response to ‘‘go’’ signals with trials on which subjects were required to withhold their response upon a ‘‘no-go’’ (Garavan et al. 1999; Kelly et al. 2004) or a ‘‘stop’’ signal (Rubia et al. 2001, 2003). In the standard stop-signal task (Logan and Cowan 1984), participants are first presented with a stimulus telling them to execute a particular response, which may or may not be followed by a stop signal calling for the immediate abortion of that response. Versions of this task have been used to investigate the efficiency of stopping various sorts of cognitive processes, so performance on it can be taken to diagnose an individual’s efficiency at actively inhibiting his or her “thoughts and actions” (Logan 1994; Logan and Cowan 1984).
In recent years, central serotonin (5-HT) function has been thought to be a critical component of behavioral inhibition and impulse control (Eagle et al. 2007; Evenden 1999; Linnoila et al. 1983; Robbins 2007). However, in recent clinical studies, 5-HT manipulations failed to affect stop-signal reaction time (SSRT), the principal inhibition measure derived from the stop-signal task. Neither 5-HT depletion nor treatment with citalopram, a selective 5-HT reuptake inhibitor (SSRI), significantly affected SSRT in healthy human volunteers (Chamberlain et al. 2006; Clark et al. 2005). Furthermore, the absence of an effect of citalopram on SSRT was reproduced in rats (Eagle et al. 2008), showing that the failure of 5-HT to influence this form of inhibition translates consistently across species.
It is unlikely that any single neuromodulatory mechanism can explain the plethora of experimental factors that are known to modulate behavioral inhibition. However, dopamine (DA) represents a particularly likely candidate, given the recent pattern of results reported by Eagle et al. (2009). The authors tested the effect of central 5-HT depletion in rats on two aspects of behavioral inhibition, SSRT and ‘waiting’, using the stop-signal task. 5-HT depletion had no effect on SSRT or on any other primary measure of the stop-signal task. However, within the same task, 5-HT-depleted rats showed a deficit in ‘waiting’ when they were required to withhold responding in the terminal element of the stop-signal task for an extended period. Interestingly, d-amphetamine had dose-dependent, but not 5-HT-dependent, effects on SSRT. Moreover, the dose that decreased SSRT (0.3 mg/kg) impaired the ability to wait, again independently of the 5-HT manipulation. These findings suggest that SSRT and “waiting” are distinct measures of behavioral inhibition, and that 5-HT may be critical for the “waiting” component, whereas dopamine may be crucial for SSRT.
Along these lines, Enticott et al. (2008), in contrast to Badcock et al. (2002), found that schizophrenia patients, who suffer from dopaminergic imbalance in the basal ganglia, inhibit responses more slowly than control subjects. Another study showed response-inhibition deficits only among undifferentiated, but not paranoid, early-onset schizophrenia patients (Bellgrove et al. 2006). Moreover, a study with Parkinson’s patients, who suffer from loss of dopaminergic neurons in the basal ganglia, showed longer SSRTs (Gauggel et al. 2004) and impaired suppression of conflicting responses (Wylie et al. 2009) compared to matched controls.
Unfortunately, these studies have major confounds, given that the schizophrenia and Parkinson’s patients were taking antipsychotic drugs and l-DOPA, respectively (both of which act on the dopaminergic system). The results obtained in these studies may thus mainly reflect effects of medication use. Ideally, patient studies should test patients “on” and “off” medication, but for obvious ethical reasons such studies are difficult to perform.
Very recently, Colzato et al. (2007a) observed that recreational users of cocaine, who are likely to suffer from a reduction of dopamine D2 receptors in the striatum (Volkow et al. 1999), needed significantly more time to inhibit responses to stop signals than nonusers. The findings of these studies converge on the notion that the basal ganglia play a critical role in the suppression of responses that are incorrect or no longer relevant, and fit with the assumption that dopamine, which innervates these circuits, may play a role in modulating response inhibition (see Mink 1996 for a review). Note, however, that assessing the exact causal relation between inhibitory control functions and cocaine is complicated by the possibility of preexisting neurodevelopmental factors. Recent evidence showed that monkeys with preexisting lowered D2 receptor densities run a higher risk of using cocaine and of becoming addicted (Nader et al. 2006), and that chronic users may suffer preexisting problems in inhibitory control (Bechara 2005).
Purpose of this study
The present experiment was motivated by the suggestion that dopamine may play a crucial role in response inhibition (Mink 1996). Given that this hypothesis is mainly based on patient and drug studies, some authors have questioned its validity and suggested that beneficial effects of dopaminergic stimulants on response inhibition may be limited to subjects whose inhibitory efficiency is suboptimal (e.g., De Wit et al. 2002; Scheres et al. 2003). We therefore were interested to test whether a dopaminergic impact on response inhibition efficiency could be demonstrated in healthy subjects.
Our measure of DA functioning was the spontaneous eyeblink rate (EBR), a well-established clinical indicator (Shukla 1985) thought to index dopamine production in the striatum (Blin et al. 1990; Karson 1983; Taylor et al. 1999). The idea that EBR reflects dopaminergic functioning is first of all supported by clinical observations in patients with DA-related dysfunctions. For example, EBRs are elevated in schizophrenia patients (Freed 1980), who in PET studies demonstrate elevated striatal dopamine uptake both on and off medication (Hietala et al. 1999; Lindström et al. 1999), but EBRs are reduced in recreational cocaine users (Colzato et al. 2008c) and in Parkinson’s patients (Deuschel and Goddemeier 1998), two populations suffering from reduced functioning of D2 receptors and severe losses of nigrostriatal dopaminergic cells, respectively (Dauer and Przedborski 2003; Volkow et al. 1999). Repetitive behavior disorders, related to lower plasma concentrations of the dopamine metabolite homovanillic acid (HVA) (Lewis et al. 1996), are likewise associated with lower EBRs (Bodfish et al. 1995; MacLean et al. 1985). Very recently, Colzato et al. (2009a) showed that the level of psychoticism, which has been associated with dopaminergic activity (Gray et al. 1994), was predicted by EBR: people with higher scores on the psychoticism scale showed higher EBRs.
Second, pharmacological studies in nonhuman primates and humans have shown that DA agonists, such as apomorphine, increase EBRs, whereas DA antagonists decrease them (Blin et al. 1990; Kleven and Koek 1996). Third, a genetic study in humans demonstrated a strong association between EBR and the DRD4/7 genotype, which is related to the control of striatal DA release (Dreisbach et al. 2005).
Further, albeit more indirect, evidence for the idea that EBR reflects dopaminergic activity comes from studies showing that EBR reliably predicts behavioral performance on cognitive tasks that have been associated with dopaminergic functioning (e.g., Dreisbach et al. 2005; Colzato et al. 2007b, 2008b, 2009b). Taken altogether, the available evidence suggests that EBR provides a reliable measure of dopaminergic functioning.1
To ascertain that our subjects were in good mental health, we selected them with the Mini International Neuropsychiatric Interview (MINI; Lecrubier et al. 1997a, b), a well-established brief diagnostic tool in the clinical and stress literature (Sheehan et al. 1998; Elzinga et al. 2007, 2008) that screens for several psychiatric disorders including, among others, schizophrenia, depression, mania, ADHD, and obsessive compulsive disorder.
Even though such a screening procedure is (unfortunately) rather uncommon in research on inhibitory control, it is important because preexisting psychiatric disorders (such as schizophrenia, ADHD, and obsessive compulsive disorder) are known to affect response inhibition (Rosenberg et al. 1997; Schachar and Logan 1990; Thoma et al. 2007). Also of relevance was the age of participants, which we therefore considered in our analyses: while inhibitory control is apparently unrelated to general intelligence (Logan 1994), inhibitory efficiency seems to decline at later stages of the life span (Williams et al. 1999).
Finally, to assess response-inhibition functioning, we employed a standard version of the stop-signal task, in which participants responded to the direction of a green arrow by pressing a button with the left or right index finger. The stop signal was a sudden and unpredictable color change of the arrow to red, calling for a deliberate effort to withhold the prepared response. As pointed out above, we considered SSRT to indicate the efficiency of inhibitory control, with longer SSRTs pointing to less efficient inhibitory processing.
The hypothesis that dopaminergic pathways are crucial in driving inhibitory control clearly predicts a relationship between individual dopaminergic functioning and inhibitory efficiency, which in our design translates into the prediction of a correlation between EBR and SSRT. The direction of this correlation is, however, more difficult to predict. Perhaps the more obvious expectation is a negative sign: greater availability of dopamine (associated with higher EBR) should improve inhibitory processes and thus reduce SSRT. However, the interactions between striatal dopamine supplies and cognitive control functions are complicated.
According to the model proposed by Frank et al. (2007), the basal ganglia support adaptive decision-making by modulating the selection of frontal cortical action plans. In short, two main neuronal populations in the striatum have opposing effects on action selection via output projections through the globus pallidus, thalamus, and back to the cortex. Activity in “Go” neurons facilitates the execution of a cortical response, whereas “NoGo” activity suppresses competing responses. Dopamine bursts and dips that occur during positive and negative outcomes drive Go learning (via D1 receptors) to seek rewarding actions, and NoGo learning (via D2 receptors) to avoid actions that are non-rewarding. Complementing this functionality, the subthalamic nucleus (STN) provides a self-adaptive dynamic control signal that temporarily prevents the execution of any response, depending on decision conflict. According to this model, more dopamine than optimal (as associated with higher EBR) decreases activity in the indirect (NoGo) pathway, a process that would enhance the competition between responses. As a consequence of this enhanced competition, we would expect longer SSRTs, leading to a positive correlation. In any case, however, it is clear that some correlation between EBR and SSRT should be obtained if dopamine is really involved in inhibitory control.
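The qualitative logic of this prediction can be illustrated with a deliberately minimal accumulator sketch (our own toy illustration, not an implementation of Frank et al.’s actual model; all function names and parameter values are assumptions): stopping is treated as evidence accumulation driven by NoGo-pathway activity, and supra-optimal dopamine weakens that activity, lengthening the simulated SSRT.

```python
def simulated_ssrt_ms(dopamine_level, threshold=100.0, base_gain=1.0):
    """Toy accumulator sketch of the indirect (NoGo) pathway.

    dopamine_level = 1.0 denotes a hypothetical optimum; levels above
    it suppress NoGo activity, so the stop accumulator needs longer to
    reach threshold, i.e., the simulated SSRT grows.
    """
    excess = max(0.0, dopamine_level - 1.0)
    nogo_gain = base_gain / (1.0 + excess)  # weaker NoGo drive with excess DA
    return threshold / nogo_gain            # time for stop signal to win
```

On this reading, higher tonic dopamine (indexed here by higher EBR) maps onto longer SSRT, which is the positive correlation the model predicts.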
Twenty-seven young healthy adults (20 women and 7 men, mean age = 23.5 years, SD = 3.7) served as subjects for partial fulfillment of course credit or a financial reward. Participants were recruited via ads posted on community bulletin boards and by word of mouth. Following Colzato et al. (2007a, b, 2008a, b, c), subjects were selected with the MINI (Lecrubier et al. 1997a, b). Exclusion criteria were: any Axis I psychiatric disorder (DSM-IV), including substance abuse; any clinically significant medical disease; and any use of medication. Written informed consent was obtained from all subjects; the protocol was approved by the local ethics committee (Leiden University, Institute for Psychological Research). All participants served in two sessions (held on the same day), one for recording EBRs and the other for conducting the stop-signal task.
Apparatus and stimuli
The experiment was controlled by an ACPI uniprocessor PC with an Intel Celeron 2.8-GHz processor, attached to a Philips 109B6 17-in. LightFrame 3 monitor (96 dpi, 120-Hz refresh rate). Responses were made by pressing the “Z” or “?” key of the QWERTY computer keyboard with the left or right index finger, respectively. Participants were required to react quickly and accurately to the direction of a left- or right-pointing green arrow (go trials) of about 3.5 × 2.0 cm by pressing with the corresponding index finger.
A BioSemi ActiveTwo system (BioSemi Inc., Amsterdam, The Netherlands) was used to record the EBR. Following Colzato et al. (2007a, b), eye movements were recorded with two vertical (one upper, one lower) and two horizontal (one left, one right) Ag–AgCl electrodes, for 6-min eyes-open segments under resting conditions. The vertical electrooculogram (EOG), which recorded the voltage difference between two electrodes placed above and below the left eye, was used to detect eyeblinks. The horizontal EOG, which recorded the voltage difference between electrodes placed lateral to the external canthi, was used to measure horizontal eye movements. Given that spontaneous EBR is stable during daytime but increases in the evening (around 8:30 p.m.; Barbato et al. 2000), data were never collected after 5 p.m. Additionally, we asked participants to avoid alcohol and nicotine consumption and to sleep sufficiently the day before the recording. Participants were comfortably seated in front of a blank poster with a cross in the center, located about 1 m from the participant. The participants were alone in the room and were asked to look at the cross in a relaxed state and not to move their head or activate their facial muscles (factors known to produce artifacts in EOG recordings).
The experiment consisted of a 30-min session in which participants completed a version of the task adopted from van den Wildenberg et al. (2006). Arrows were presented pseudorandomly, with the constraint that they signaled left- and right-hand responses equally often. Arrow presentation was response-terminated. Intervals between subsequent go signals varied randomly, but equiprobably, from 1,250 to 1,750 ms in steps of 125 ms. During these interstimulus intervals, a white fixation point (3 mm in diameter) was presented. The green arrow changed to red on 30% of the trials, upon which the choice response had to be aborted (stop trials). A staircase-tracking procedure dynamically adjusted the delay between the onset of the go signal and the onset of the stop signal to control inhibition probability (Levitt 1971). After a successfully inhibited stop trial, stop-signal delay in the next stop trial increased by 50 ms, whereas the stop-signal delay decreased by 50 ms in the next stop trial when the participant was unable to stop. This algorithm ensured that motor actions were successfully inhibited in about half of the stop trials, which yielded accurate estimates of SSRT (Band et al. 2003; see Fig. 1). It compensated for differences in choice RT between participants. The stop task consisted of five blocks of 104 trials each, the first of which served as a practice block to obtain stable performance.
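The one-up/one-down staircase rule described above can be sketched as follows (a minimal sketch, not the authors’ experiment code; the function name and the delay bounds are our assumptions):

```python
def update_stop_signal_delay(ssd_ms, inhibited, step_ms=50,
                             min_ssd_ms=50, max_ssd_ms=1000):
    """One-up/one-down staircase for the stop-signal delay (SSD).

    After a successful stop, stopping is made harder by lengthening
    the SSD; after a failed stop, it is made easier by shortening it.
    This converges on the delay yielding ~50% successful inhibition.
    """
    if inhibited:
        ssd_ms += step_ms   # successful stop -> harder next time
    else:
        ssd_ms -= step_ms   # failed stop -> easier next time
    return max(min_ssd_ms, min(max_ssd_ms, ssd_ms))
```

Because the rule moves one step per stop trial in opposite directions for successes and failures, the delay settles where successes and failures are equally likely, compensating automatically for individual differences in choice RT.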
Median RT to go signals and SSRT to stop signals were individually assessed to index response execution and response inhibition, respectively. To test our main hypothesis that dopamine modulates response inhibition, we ran a Pearson’s correlation test, which examined the association between EBR and the individually calculated SSRT. We also explored the relationship between EBR and response execution. Given recent studies showing gender differences for EBR and for SSRT (Dreisbach et al. 2005; Li et al. 2006; Müller et al. 2007; Mulvihill et al. 1997), independent-samples t tests were performed to analyze EBR and SSRT differences between men and women. A significance level of P < 0.05 was adopted for all statistical tests.
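Under the race model, the tracking procedure licenses a simple SSRT estimate: with inhibition held near 50%, SSRT is approximately the mean go RT minus the mean stop-signal delay (the “mean method” of Logan and Cowan 1984). The sketch below illustrates this estimate and the Pearson correlation used for the main analysis (illustrative code, not the authors’ analysis scripts; function names are ours):

```python
from statistics import mean

def estimate_ssrt(go_rts_ms, stop_signal_delays_ms):
    """Estimate stop-signal reaction time with the 'mean' method:
    with ~50% successful inhibition, SSRT ~= mean go RT - mean SSD
    (race model; Logan and Cowan 1984)."""
    return mean(go_rts_ms) - mean(stop_signal_delays_ms)

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two samples,
    e.g., per-subject EBR vs. per-subject SSRT."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)
```

The dopaminergic hypothesis is then tested by computing `pearson_r` over the subjects’ EBR and SSRT values.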
Data were examined using Brain Vision Analyzer (Brain Products™ GmbH, Munich, Germany). We defined an eyeblink as a voltage change of at least 100 µV within a time interval of 500 ms. Our sample of subjects had EBRs ranging from 3.8 to 31.4 per min (mean = 14.0, SD = 7.9), which according to our assumptions should represent a sufficiently wide range of tonic dopaminergic functioning.
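The detection criterion can be sketched as a sliding-window scan of the vertical-EOG trace (an illustrative sketch, not the Brain Vision Analyzer algorithm; the sampling-rate handling and the refractory period are our assumptions):

```python
def count_blinks(veog_uv, fs_hz, threshold_uv=100.0, window_s=0.5,
                 refractory_s=0.5):
    """Count eyeblinks as voltage swings >= threshold_uv within a
    window_s-wide window of the vertical EOG (criterion from the text).
    After each detection the scan skips ahead (refractory period, our
    addition) so one blink is not counted twice.
    """
    window = int(window_s * fs_hz)
    refractory = int(refractory_s * fs_hz)
    blinks, i = 0, 0
    while i + window <= len(veog_uv):
        seg = veog_uv[i:i + window]
        if max(seg) - min(seg) >= threshold_uv:
            blinks += 1
            i += window + refractory  # skip past the detected blink
        else:
            i += 1
    return blinks

def blinks_per_minute(n_blinks, duration_s):
    """Convert a blink count over duration_s seconds to a per-minute rate."""
    return n_blinks * 60.0 / duration_s
```

Dividing the count from the 6-min resting segment by its duration then yields the per-minute EBR analyzed above.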
Median RT to go signals (372 ms, SD = 26 ms) and SSRT to stop signals (209 ms, SD = 30 ms) were individually assessed to index response execution and response inhibition, respectively. Overall, participants successfully stopped their responses on about half of the trials on which a stop signal occurred (51%, SD = 2.9%), indicating that the dynamic tracking algorithm worked. The percentage of choice errors to go signals was low (1.3%, SD = 1.4).
Sex differences

No significant sex differences were obtained for EBR, t = 1.13, P = 0.26, or SSRT, t = 0.99, P = 0.32.
Our findings show that in healthy people the spontaneous EBR reliably predicts the efficiency of inhibiting action tendencies in a stop-signal task. As participants were screened for several psychiatric disorders in the current study, we can rule out an account in terms of preexisting psychiatric disorders (such as schizophrenia, ADHD, and obsessive compulsive disorder) that have been associated with dopaminergic abnormalities (Davis et al. 1991; Tripp and Wickens 2007; Pooley et al. 2007). Given that our female and male participants did not significantly differ in EBR or SSRT measurements, we doubt that our results can be attributed to sex differences.
Even though the correlative nature of our findings does not directly speak to the underlying causal relations, the observed pattern does fit with previous demonstrations that schizophrenia patients (Enticott et al. 2008), Parkinson’s patients (van den Wildenberg et al. 2006; Gauggel et al. 2004), and recreational cocaine users (Colzato et al. 2007a, b)—populations suffering from dopaminergic imbalance in the basal ganglia—have more trouble inhibiting their actions in response to stop signals. Taken together, these observations support Mink’s (1996) hypothesis that dopamine plays a crucial role in inhibitory control, not only in populations with suboptimal inhibitory efficiency (De Wit et al. 2002; Scheres et al. 2003) but in healthy people as well. Moreover, the positive sign of the correlation between our marker of striatal dopamine supply (EBR) and inhibitory control (SSRT) is consistent with Frank et al.’s (2007) model, in which more dopamine than optimal (associated with higher EBR) decreases activity in the indirect pathway, thus producing longer SSRTs.
This leaves the question of how dopamine might modulate inhibitory control and why Parkinson’s patients and cocaine users, who have reduced dopamine function, also show longer SSRTs (Gauggel et al. 2004; Colzato et al. 2007a, b). Even though the available evidence may be taken to point to a linear relationship between response inhibition and dopamine level, it may be that, as for other cognitive functions such as working memory (Goldman-Rakic et al. 2000), the relationship actually follows an inverted U-shaped function. According to this idea, an intermediate dopamine level allows for optimal cognitive performance, whereas levels that are too high or too low impair cognitive processes. The assumption of an inverted U-shaped performance function of individual dopamine levels is also consistent with a recent observation by Akbari Chermahini and Hommel (2009), who studied the relationship between creativity and dopamine. Spontaneous EBRs predicted performance in divergent thinking, a subcomponent of creativity that has been associated with enhanced dopaminergic functioning (Ashby et al. 1999; Eysenck 1993). Interestingly for our purposes, the relationship followed an inverted U-shaped function, with average EBRs producing better performance than low or high EBRs. Also of interest, a behavioral genetics study found a link between divergent thinking and the DRD2 TAQ IA polymorphism (Reuter et al. 2006). This polymorphism affects the density of DA D2 receptors, the very receptor family that is impaired in cocaine users who, as pointed out above, perform poorly on inhibition tasks (Colzato et al. 2007a, b).
These encouraging observations aside, the hypothesis of an inverted U-shaped function relating SSRT to DA levels certainly requires more direct investigation using different paradigms, such as psychopharmacological studies, in which it seems essential that individual baseline levels of DA be taken into account. Indeed, as pointed out by Cools et al. (2001) and Akbari Chermahini and Hommel (2009), different individuals are likely to have different baseline levels of DA (be it through genetic variation, drug abuse, or other factors) and may therefore exhibit differential sensitivity to the positive and negative effects of dopaminergic drugs and manipulations.
Even if it is not possible to completely exclude that EBR correlates with the functioning of other neurotransmitter systems, several patient, animal, and drug studies have shown that the “spontaneous” eyeblink rate seems to be modulated by the dopaminergic system, whereas the “conditioned” eyeblink produced by acoustic startle seems to be driven by the serotonergic system. Graham et al. (2002) found that ketanserin, a serotonin (5-HT) receptor antagonist, significantly suppressed prepulse inhibition of the eyeblink response, while haloperidol, a D2 dopamine receptor-blocking antipsychotic drug, had no effect on prepulse inhibition.
We thank Wouter Kool and Kim Ouwehand for their enthusiasm and invaluable assistance in recruiting and testing the participants of this study and in collecting the data.
This article is distributed under the terms of the Creative Commons Attribution Noncommercial License which permits any noncommercial use, distribution, and reproduction in any medium, provided the original author(s) and source are credited.