Abstract
Appropriately attending to threatening environmental stimuli is evolutionarily adaptive and crucial for survival. This study revealed that nonconscious attentional modulation of disgust has different behavioral and event-related potential (ERP) patterns, as compared to fear and anger. To facilitate its evolutionary purpose of avoidance, disgust first diverts rather than attracts attention. Accordingly, the N1 was smaller in a validly than in an invalidly disgust-cued condition. Furthermore, the frontal P3a for disgust, anger, and fear was found to be larger in the valid than in the invalid condition, which was interpreted as an involuntary switching of attention toward threat-related events to mobilize cognitive resources for action or defense. In contrast, the parietal P3b occurred only at the conscious level; the enhanced P3b indicated that more cognitive resources were being allocated toward the task-relevant but previously less attended location, to ensure the effective achievement of task goals. In addition, group comparisons between individuals with low and high disgust sensitivity showed that the ERP differences between the disgust and the anger/fear conditions at the unconscious level may be attributed only to individuals with high disgust sensitivity. These findings, together with previous knowledge of the effects of fear and anger on attention, strengthen our confidence in the two-stage scheme of attentional modulation by threats, which consists of an early stage of bottom-up response scaling of sensory processing (reflected by the P1 and N1) and a later stage of top-down integration and regulation of emotion and behavior (reflected by the P3).
Studies of the attentional modulation of emotion could provide valuable insights into the basic functional architecture of the human mind (Pourtois & Vuilleumier, 2006; Yiend, 2010). It is generally considered that emotional—particularly threatening—stimuli rapidly capture attention (even in patients with attentional deficits), which constitutes one of the most central cognitive abilities controlling behaviors (Cacioppo & Gardner, 1999; Domínguez-Borràs, Saj, Armony, & Vuilleumier, 2012; Pourtois & Vuilleumier, 2006; Pourtois, Schettino, & Vuilleumier, 2013; Taylor & Fragopanagos, 2005).
The dot-probe paradigm (Brosch, Sander, Pourtois, & Scherer, 2008; MacLeod, Mathews, & Tata, 1986) with emotional faces/words/pictures as cues is frequently employed to investigate the attentional orienting/disengagement effect of emotion (MacLeod et al., 1986; Pourtois & Vuilleumier, 2006). However, most of the previous emotional dot-probe studies focused on the attention enhancements of fear and anger; this literature has demonstrated that fearful/angry faces can bias spatial attention toward threat-related locations and can enhance occipital visual activation, as well as the attention-related P1 amplitude (Hillyard, Vogel, & Luck, 1998), in response to a target following valid rather than invalid fearful/angry facial cues (e.g., Phelps, Ling, & Carrasco, 2006; Pourtois, Grandjean, Sander, & Vuilleumier, 2004; Pourtois, Schwartz, Seghier, Lazeyras, & Vuilleumier, 2006).
Disgust, as one of the basic emotions recognized since Darwin (1872), is found across all cultures. From an evolutionary perspective, disgust represents responses to a wide range of stimulus triggers that signal physical or psychological uncleanliness (Chapman & Anderson, 2012; Oaten, Stevenson, & Case, 2009). The adaptive role of disgust is to facilitate the avoidance of contamination (Jones, 2007). So far as we know, only four studies have directly examined the spatial attention avoidance of disgust. Using auditory and visual versions of the dot-probe task, Bertels, Kolinsky, Coucke, and Morais (2013) and Cisler and Olatunji (2010) found that the reaction time to the target was longer after a valid than after an invalid disgusting cue, indicating that disgust elicited attentional avoidance and perhaps oriented attention to the opposite side of disgusting cues. Recently, our group (Liu, Zhang, & Luo, 2015) demonstrated with the event-related potential (ERP) technique that, at the early (sensory) processing stage (100–200 ms), angry faces elicited a larger, whereas disgusted faces elicited a smaller, occipital P1 component for validly than for invalidly cued targets. However, at the later (control) processing stage (300 ms and thereafter), angry faces generated a smaller, whereas disgusted faces generated a larger, P3 for validly than for invalidly cued targets. We then proposed a two-stage attentional modulation of disgust: First, disgust initiates bottom-up attention suppression, to facilitate the evolutionary role of minimizing the exposure to contamination; then, a top-down control mechanism becomes involved and orients (maybe compensatorily) more attention toward the location of disgusted-face cues, so as to ensure a quick response to disgust-related information (Liu et al., 2015).
More recently, Zimmer, Keppel, Poglitsch, and Ischebeck (2015) performed a multisensory dot-probe study, which verified the early attention avoidance effect of disgust and repeated the P3 finding observed by Liu et al. Taken together, these results suggest that disgust can first quickly suppress attention and then redirect that attention toward its location via the top-down control mechanism.
Given the adaptive and survival value of threatening emotions (fear, anger, and disgust), humans have evolved specialized neural systems for rapid responses to these signals without the need for conscious awareness (Tamietto & de Gelder, 2010; Williams et al., 2006). Numerous “attention-by-emotion” studies, which used fearful/angry faces and pictures as their experimental materials, have demonstrated that emotional cues significantly influence attentional orientation, both when they are displayed supraliminally (Cooper & Langton, 2006; Koster, Crombez, Verschuere, & De Houwer, 2004; Mogg & Bradley, 1999; Öhman, Flykt, & Esteves, 2001; Pourtois et al., 2004) and when they are displayed subliminally (Carlson, Fee, & Reinke, 2009; Fox, 2002; Mogg & Bradley, 1999). However, whereas the studies mentioned in the previous paragraph successfully demonstrated how consciously perceived (usually with a presentation duration of 100 ms) disgusting triggers modulate the spatial orientation of attention, behavioral and neural evidence for nonconscious attentional modulation by disgust has hardly been found.
As a follow-up to Liu et al. (2015), in the present study we examined how disgust, as compared with anger and fear, influences spatial attention at the nonconscious level (i.e., emotional cues were presented for 17 ms) using a masked version of the dot-probe paradigm (Carlson, Cha, Harmon-Jones, Mujica-Parodi, & Hajcak, 2014; Carlson, Cha, & Mujica-Parodi, 2013). The first aim of this study was to explore whether the nonconscious attentional orienting of disgust shows different behavioral and ERP patterns than do other threatening emotions (fear and anger), as we had found in our conscious study (Liu et al., 2015). Previous emotional studies have suggested that, whereas the occipital P1 (100–140 ms) usually reflects early attentional modulation at the conscious level (Brosch et al., 2008; Liu et al., 2015; Pollak & Tolley-Schell, 2003; Pourtois et al., 2004), the occipital N1 (150–190 ms) correlates positively with participants’ behavioral index of attentional orienting at the nonconscious level (Carlson & Reinke, 2010; Liddell, Williams, Rathjen, Shevrin, & Gordon, 2004; Williams et al., 2004). Accordingly, we expected that validly presented subliminal disgusting (vs. fearful/angry) cues would evoke a reduced (vs. an enhanced) target-locked N1, when compared with the invalidly cued condition. In addition, we expected that the intensity of attention avoidance caused by disgust would correlate with the self-report measure of disgust sensitivity.
This nonconscious study was inspired by the notion that partially separable neural systems, though both involve the amygdala, underlie the conscious and nonconscious levels of the attentional modulation of emotion. Whereas conscious modulation is driven mainly by a frontoparietal cortical network, nonconscious attentional orienting is largely mediated by a subcortical pathway and the amygdala–anterior cingulate network (Carlson et al., 2013; Mulckhuyse & Theeuwes, 2010; Pourtois et al., 2006; Williams et al., 2006). In addition, studies of patients with unilateral spatial neglect have suggested that whereas the amygdala responds to both consciously and nonconsciously perceived fearful emotions, the anterior insula and somatosensory and motor cortices respond to emotions only at the conscious level (Tamietto et al., 2015). Understanding the neural bases of nonconscious versus conscious attentional modulation of emotion will clarify the integration of cortical and subcortical functions in the human brain (Tamietto & de Gelder, 2010). The second aim of this study was to propose, on the basis of the scalp ERP data, an integrated model regarding the similarities and differences between nonconscious and conscious attentional modulation by threats. We hypothesized that the two-stage pattern observed in Liu et al. (2015) might be generalized to a more universal model, with an amygdala–sensory cortical pathway working at an early stage of bottom-up attention amplification/suppression (the conscious P1 and the nonconscious N1), and the amygdala–frontoparietal cortices (including the anterior cingulate cortex, ACC) working at a later stage of cognitive integration and top-down attention control (the P3 component) (Brosch, Scherer, Grandjean, & Sander, 2013; Carlson et al., 2013; Carlson, Mujica-Parodi, Harmon-Jones, & Hajcak, 2012; Liddell et al., 2004; Taylor & Fragopanagos, 2005).
Furthermore, previous P3 studies have shown that the frontal P3a (related to activity in the ACC) and the parietal P3b (related to temporal–parietal cortical activity) index attentional modulation at different levels (Goldstein, Spencer, & Donchin, 2002; Polich, 2003, 2007). Thus, we expected that distinctive P3 distributions might appear between nonconscious and conscious studies.
Method
Participants
One hundred healthy people (50 males, 50 females; age range = 18 to 23 years) were recruited from Shenzhen University in China as paid participants. They were randomly assigned to two groups: a behavioral experiment (n = 40; 20 males, 20 females) and an ERP experiment (n = 60; 30 males, 30 females). Written informed consent was obtained from every participant prior to the experiment. The experimental protocol was approved by the Ethics Committee of Shenzhen University, and this study was performed strictly in accordance with the approved guidelines.
Stimuli
The face pictures were selected from the Chinese Facial Affective Picture System (Gong, Huang, Wang, & Luo, 2011), with equal numbers of face pictures of males and females. A total of 120 faces (20 disgusted, 20 angry, 20 fearful, and 60 neutral faces) were used. Scrambled-face masks were generated using the 60 neutral faces. Each facial picture was divided into an 8 × 9 matrix of tiles and then randomly rearranged. See the supplementary materials for detailed information.
Procedure
The experiment consisted of three blocks (disgust, anger, and fear), each containing 160 trials. The order of the three blocks was counterbalanced across participants.
The masked version of the dot-probe task was designed according to previous studies (Carlson et al., 2014; Carlson et al., 2013; Lim, Ho, & Mullette-Gillman, 2015; Liu et al., 2015; D. Zhang, Wang, Luo, & Luo, 2012). As is shown in Fig. 1, each trial began with a 17-ms cue that consisted of two faces. We selected 17 ms as the presentation period of the facial expressions because many previous studies had suggested that healthy adults report no awareness of masked faces at this duration (Dannlowski et al., 2006; Liddell et al., 2005; Lim et al., 2015; Williams et al., 2006; D. Zhang et al., 2012). In each block, the cue consisted of one emotional face from that block’s category and one neutral face. Each emotional face was presented eight times, in a random order, in the corresponding block. After the cue, two scrambled faces were randomly selected from the 60 facial masks and appeared for 83 ms. Then a target (one up- or down-pointing triangle) was presented for a duration of 150 ms. In valid (vs. invalid) trials, the target appeared at the location previously occupied by the emotional (vs. the neutral) face. The valid and invalid trials were presented in a random order with equal probabilities (50% each). The left/right sides of the target were counterbalanced across trials. After the target presentation, participants were required to respond quickly regarding the location of the triangle.
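The trial timing just described can be summarized in a small sketch (the event list and helper function are ours, for illustration only; they are not part of the original presentation software):

```python
# Timeline of one masked dot-probe trial, in milliseconds
# (durations taken from the procedure described above).
TRIAL_EVENTS = [
    ("cue: emotional + neutral face", 17),
    ("mask: two scrambled faces", 83),
    ("target: up-/down-pointing triangle", 150),
]

def cue_target_soa(events):
    """Cue-target stimulus-onset asynchrony: total duration of all
    events preceding the target."""
    return sum(ms for name, ms in events if not name.startswith("target"))

print(cue_target_soa(TRIAL_EVENTS))  # 17 + 83 = 100 ms
```

Note that the 17-ms cue plus the 83-ms mask yield a 100-ms cue-target SOA, with the cue itself below the awareness threshold reported in the masking studies cited above.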
Participants were required to respond to only one kind of triangle (the up- OR the down-pointing one) during the experiment. The assignment of the up- or down-pointing triangle as the target was counterbalanced between participants. Similar to Brosch et al. (2008), we included both a behavioral and an ERP experiment in this study; the only difference between the two tasks was the proportion of trials requiring a motor response (50% and 10%, respectively). In the ERP experiment, only 10% of trials required a motor response, so that the average ERPs could be computed from the remaining 90% of trials, which were not contaminated by motor execution. However, the 10% of response trials in the ERP experiment (i.e., only eight trials per condition) could not provide reliable statistics for the behavioral measures. Thus a behavioral experiment was added, in which the proportion of response trials was increased from 10% to 50%, so as to collect enough behavioral responses for statistical analyses.
Behavioral measures
In this study we analyzed the accuracy rates and response times (RTs) recorded in the ERP experiment (n = 60, 10% target trials, eight in each condition) and in the behavioral experiment (n = 40, 50% target trials, 40 in each condition).
EEG recording
Brain electrical activity was recorded referentially against the left mastoid and re-referenced offline to the average of the left and right mastoids, by a 64-channel amplifier with a sampling frequency of 250 Hz (NeuroScan Inc., Herndon, USA). Electroencephalography (EEG) data were collected with electrode impedances kept below 5 kΩ.
Self-report measures
After EEG collection, participants were required to finish two questionnaires assessing their disgust sensitivity and anxiety level. Disgust sensitivity was examined using the Disgust Scale–Revised (DS-R; Haidt, McCauley, & Rozin, 1994; Olatunji et al., 2007). The DS-R is a self-report personality scale that measures individual differences in sensitivity to disgust. This 25-item scale has three subscales: core disgust (including food, animal, and body products), animal-reminder disgust (death and body envelope violations), and contamination disgust (concerns about interpersonal transmission of essences). The DS-R scores range from 0 to 100, with high scores corresponding to high disgust sensitivity.
Anxiety level was examined by using the Trait form of Spielberger’s State–Trait Anxiety Inventory (STAI-T; Shek, 1993; Spielberger, Gorsuch, Lushene, Vagg, & Jacobs, 1983). The STAI-T scores range from 20 to 80, with high scores indicating a high level of trait anxiety.
Routine analysis of ERP data
Ocular artifacts were removed from the EEGs by using a regression procedure implemented with the NeuroScan software (Scan 4.3). Then the EEG data were filtered with a 0.01- to 30-Hz filter. The filtered data were segmented beginning 100 ms prior to the onsets of the targets (i.e., triangles). In this study, we focused on ERPs that were time-locked to the targets (triangles) rather than the facial cues. Therefore, all epochs were baseline-corrected with respect to the mean voltage over the 100 ms preceding the onset of the targets. To prevent the ERP results from being contaminated by movement-related potentials, the average ERPs of the 60 participants were computed on the basis of nonresponse trials (80 × 90% = 72 trials per condition).
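The target-locked segmentation and baseline correction can be sketched as follows (a simplified illustration with our own function and parameter names; the 0.6-s post-target window is an assumption, not a value from the original pipeline):

```python
import numpy as np

def epoch_and_baseline(eeg, target_onsets, sfreq=250.0,
                       tmin=-0.1, tmax=0.6):
    """Segment continuous EEG (channels x samples) around target onsets
    (in samples) and subtract the mean of the 100-ms pre-target
    baseline, as described above."""
    pre = int(round(-tmin * sfreq))      # 25 samples at 250 Hz
    post = int(round(tmax * sfreq))
    epochs = []
    for onset in target_onsets:
        seg = eeg[:, onset - pre:onset + post].astype(float)
        baseline = seg[:, :pre].mean(axis=1, keepdims=True)
        epochs.append(seg - baseline)
    return np.stack(epochs)              # trials x channels x samples

# Minimal check with synthetic data: a constant offset is removed.
eeg = np.ones((2, 1000)) * 5.0
ep = epoch_and_baseline(eeg, target_onsets=[200, 500])
print(ep.shape)   # (2, 2, 175)
```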
In this nonconscious study, the P3 component was found to peak at frontal instead of parietal areas according to the ERP topographies. Therefore, in this study we analyzed the latencies and amplitudes of the occipital N1 and the frontal P3 across different sets of electrodes in accordance with the relevant literature (Carlson & Reinke, 2010; Luck, 2005; Liu et al., 2015; D. Zhang et al., 2013; X. Zhang, Guo, Zhang, Lou, & Ding, 2015). The mean amplitude of the N1 was calculated at the O1, O2, PO3, PO4, PO7, and PO8 electrode sites (time window = 160–200 ms). The mean amplitude of the frontal P3 was calculated at FC1, FCz, FC2, C1, Cz, and C2 (time window = 330–430 ms). The peak latencies of the two components were manually detected on the basis of individual ERP waveforms.
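Computing a component's mean amplitude then reduces to averaging the ERP matrix over the chosen electrodes and time window; a minimal sketch (the function interface, toy channel list, and epoch length are our assumptions):

```python
import numpy as np

def mean_amplitude(erp, ch_names, sfreq, t0, window, channels):
    """Mean amplitude of an ERP (channels x samples) over a time window
    (in seconds, relative to target onset) and an electrode subset.
    t0 is the time of the first sample relative to target onset."""
    idx = [ch_names.index(c) for c in channels]
    s0 = int(round((window[0] - t0) * sfreq))
    s1 = int(round((window[1] - t0) * sfreq))
    return float(erp[idx, s0:s1].mean())

# Example: the N1 window (160-200 ms) at two posterior channels of a
# toy ERP whose first two rows are a flat -3 microvolts.
sfreq, t0 = 250.0, -0.1
erp = np.zeros((4, 175))
erp[0:2, :] = -3.0
names = ["O1", "O2", "Cz", "FCz"]
amp = mean_amplitude(erp, names, sfreq, t0, (0.160, 0.200), ["O1", "O2"])
print(amp)   # -3.0
```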
In addition to ERPs time-locked to the targets, we also analyzed the cue-evoked ERPs across the three emotional blocks (disgust, anger, and fear). Please see the supplementary materials for the analysis parameters and statistical results.
Principal component analysis of ERP data
When analyzing the ERP components of interest (the P3a and P3b), a troublesome problem is their temporal and spatial overlap. This phenomenon may decrease the reliability of the results obtained (Nelson, Patrick, Collins, Lang, & Bernat, 2011). Principal component analysis (PCA) is a useful tool for the statistical decomposition of ERPs and may help to solve this issue (Foti, Weinberg, Dien, & Hajcak, 2011; MacNamara, Ochsner, & Hajcak, 2011; D. Zhang et al., 2013). The ERP PCA Toolkit (EP Toolkit, version 2.50) was employed to conduct a two-step PCA procedure in this study (Dien, 2010a). According to the comparative study of Dien (2010b), Promax rotation and Infomax rotation were used with temporal and spatial PCA, respectively. See the supplementary materials for a more detailed introduction to PCA.
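The first (temporal) step of the two-step PCA can be illustrated with a plain, unrotated SVD decomposition (a sketch under our own naming; the EP Toolkit additionally applies the Promax and Infomax rotations mentioned above, which we omit):

```python
import numpy as np

def temporal_pca(data, n_factors):
    """First (temporal) step of a two-step temporospatial PCA, sketched
    with a plain SVD: rows are waveforms (one per subject x condition x
    electrode), columns are time points. Returns the factor loadings
    over time, the factor scores per waveform, and the proportion of
    variance each retained factor explains. NOTE: unrotated
    simplification of the EP Toolkit's rotated decomposition."""
    X = data - data.mean(axis=0)            # center each time point
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    var = (S ** 2) / np.sum(S ** 2)         # variance proportions
    loadings = Vt[:n_factors]               # factors x time points
    scores = X @ loadings.T                 # waveforms x factors
    return loadings, scores, var[:n_factors]

rng = np.random.default_rng(0)
data = rng.standard_normal((120, 50))       # 120 waveforms, 50 samples
loadings, scores, var = temporal_pca(data, n_factors=7)
print(loadings.shape, scores.shape)         # (7, 50) (120, 7)
```

The spatial step then applies the same decomposition to the factor scores across electrodes, yielding the temporospatial factor combinations reported below.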
Source localization of the ERP components
Standardized low-resolution brain electromagnetic tomography (sLORETA, 04/15/2015 update) was employed to explore the possible neuronal generators of the PCA-derived ERP components (www.uzh.ch/keyinst/loreta.htm; Pascual-Marqui, 2002).
Statistics
Our statistical analysis was performed using SPSS Statistics 20.0. The significance level was set at .05. Descriptive data are presented as the mean ± the standard error. All the behavioral and ERP measures were approximately normally distributed according to the Shapiro–Wilk test. The data were also screened for outliers (defined as data points three standard deviations above or below the mean), and none were detected.
Two-way repeated measures analyses of variance (ANOVAs) were performed on the behavioral and ERP data, with Emotion of the Faces (disgust, anger, or fear) and Cue Validity (valid or invalid) as within-subjects factors. Post-hoc multiple comparisons were conducted using the Bonferroni test.
Degrees of freedom were corrected with the Greenhouse–Geisser procedure whenever appropriate.
Furthermore, two-tailed Pearson’s r correlations were calculated between the ERP measures and the self-report measures of disgust sensitivity and trait anxiety. Correction for multiple comparisons was based on the Holm–Bonferroni method.
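The Holm–Bonferroni procedure can be sketched in a few lines (our own implementation, for illustration):

```python
def holm_bonferroni(p_values, alpha=0.05):
    """Holm-Bonferroni step-down correction. Returns, for each original
    p value, whether it remains significant at the given alpha."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    significant = [False] * m
    for rank, i in enumerate(order):
        if p_values[i] <= alpha / (m - rank):
            significant[i] = True
        else:
            break          # step-down: stop at the first failure
    return significant

# The two disgust correlations (p = .001 and .002) among twelve tests
# (the remaining ten p values here are placeholders):
ps = [.001, .002, .936, .611, .634, .438, .9, .8, .7, .6, .5, .4]
print(holm_bonferroni(ps))
```

With twelve tests, the smallest p value is compared against α/12 and adjusted by the factor 12 (e.g., .001 × 12 = .012), the next against α/11 (.002 × 11 = .022), and so on, which matches the corrected p values reported in the Results.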
Results
This section mainly reports the statistical findings on the effects of emotion and cue validity. Please see the supplementary materials for statistical results on the effects of disgust sensitivity. Cue validity was defined as follows: in a valid trial, the target appeared at the same location as the emotional face of the cue; in an invalid trial, the target appeared at the location opposite the emotional face.
Behavior
Accuracy rate
The accuracy rates in the behavioral and the ERP experiments were 97.9% ± 2.40% and 98.8% ± 4.55%, respectively. No significant difference was found between conditions.
RTs in the nonconscious experiment
In the behavioral experiment, the interaction of emotion with cue validity was significant [F(2, 78) = 5.94, p = .004, ηp² = .132; Fig. 2]. For disgusted facial cues [F(1, 39) = 4.67, p = .037], the RT in the validly cued condition (282 ± 7.71 ms) was longer than that in the invalidly cued condition (277 ± 8.36 ms). In contrast, for angry [F(1, 39) = 4.17, p = .048] and fearful facial cues [F(1, 39) = 4.37, p = .043], the RT in the validly cued condition (anger = 273 ± 8.24 ms, fear = 272 ± 8.18 ms) was shorter than that in the invalidly cued condition (anger = 277 ± 8.04 ms, fear = 276 ± 8.13 ms).
In the ERP experiment, no significant difference was found between conditions.
Revisiting RTs in the conscious experiment
To give an integrated interpretation of the mechanism for attentional modulation by threats, we revisited the RT data collected in the previous conscious study (Liu et al., 2015; see the supplementary materials for a dataset description). The ANOVA was performed with Emotion of the Faces (disgust vs. anger) and Cue Validity (valid vs. invalid) as within-subjects factors.
In the behavioral experiment, the interaction of emotion with cue validity was significant [F(1, 29) = 16.9, p < .001, ηp² = .368]. For disgusted facial cues [F(1, 29) = 7.24, p = .012], the RT was longer in the validly cued condition (230 ± 6.36 ms) than in the invalidly cued condition (224 ± 5.94 ms). In contrast, for angry [F(1, 29) = 9.72, p = .004] facial cues, the RT was shorter in the validly cued condition (226 ± 6.99 ms) than in the invalidly cued condition (232 ± 6.58 ms).
In the ERP experiment, no significant difference was found between conditions.
Routine ERP (nonconscious dataset)
N1
The interaction of emotion with cue validity was significant on the N1 amplitudes [F(2, 118) = 31.6, p < .001, ηp² = .349; Fig. 3]. For disgusted facial cues [F(1, 59) = 14.2, p < .001], the N1 in the validly cued condition (−2.40 ± 0.24 μV) was reduced as compared to that in the invalidly cued condition (−3.15 ± 0.36 μV). In contrast, for angry [F(1, 59) = 25.5, p < .001] and fearful [F(1, 59) = 24.3, p < .001] facial cues, the N1 in the validly cued condition (anger = −2.85 ± 0.27 μV, fear = −3.40 ± 0.37 μV) was larger than that in the invalidly cued condition (anger = −1.98 ± 0.33 μV, fear = −2.55 ± 0.42 μV).
The main effect of cue validity was significant for the N1 amplitudes [F(1, 59) = 7.11, p = .010, ηp² = .108]. Valid cues were associated with larger N1 amplitudes (−2.88 ± 0.18 μV) than were invalid cues (−2.56 ± 0.25 μV).
No significant effect was found on the peak latency of the N1 (average = 178 ± 14.3 ms).
Frontal P3
The main effect of cue validity was significant for P3 amplitudes [F(1, 59) = 23.6, p < .001, ηp² = .286; Fig. 3]. Valid cues were associated with larger P3 amplitudes (10.5 ± 0.22 μV) than were invalid cues (9.62 ± 0.28 μV).
No significant effect was found on the peak latency of the P3 (average = 372 ± 13.1 ms).
Correlation between ERP and self-report measures
The average score for disgust sensitivity (measured by DS-R) in the 100 participants was 52.9 ± 1.22. The average score for trait anxiety (measured by STAI-T) in the 100 participants was 40.1 ± 0.93. No significant correlation was apparent between the two measures (r = −.06, p = .623).
To directly investigate the attentional modulation of emotion in the dot-probe experiment, we employed a difference measure to quantify ERP amplitudes (i.e., the attentional bias score; see also Liu et al., 2015; Lubman, Peters, Mogg, Bradley, & Deakin, 2000; Townshend & Duka, 2001). The attentional bias score was defined as the difference in ERP amplitudes between the validly and invalidly cued conditions. In particular, the attentional bias score for N1 amplitudes was calculated by subtracting the N1 amplitudes in the validly cued condition from the associated N1 amplitudes in the invalidly cued condition. The attentional bias score for P3 amplitudes was calculated by subtracting the frontal-P3 amplitudes in the invalidly cued condition from the associated P3 amplitudes in the validly cued condition.
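These bias-score definitions translate directly into code (a sketch; the function name and array interface are ours):

```python
import numpy as np

def bias_scores(valid, invalid, component):
    """Attentional bias scores as defined above (per participant, in
    microvolts): N1 bias = invalid - valid amplitudes;
    P3 bias = valid - invalid amplitudes."""
    valid = np.asarray(valid, dtype=float)
    invalid = np.asarray(invalid, dtype=float)
    if component == "N1":
        return invalid - valid
    if component == "P3":
        return valid - invalid
    raise ValueError("component must be 'N1' or 'P3'")

# Illustration with the grand-average disgust-condition N1 amplitudes
# reported in the Results: valid = -2.40, invalid = -3.15 microvolts.
n1_bias = bias_scores([-2.40], [-3.15], "N1")
print(n1_bias)
```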
In total, 12 correlations were performed between the scores for disgust sensitivity/trait anxiety and the attentional bias scores for the N1/P3 amplitudes in the disgusted, angry, and fearful conditions. When the cue was presented using disgusted faces, the attentional bias scores for the N1 amplitudes (r = −.41, p = .001, corrected p = .012) and the frontal-P3 amplitudes (r = .39, p = .002, corrected p = .022) correlated significantly with the scores for disgust sensitivity (Fig. 4). However, when the cue was presented using angry and fearful faces, the attentional bias scores of neither the N1 amplitudes (r = −.01, p = .936, for anger; r = −.07, p = .611, for fear) nor the frontal-P3 amplitudes (r = −.06, p = .634, for anger; r = −.10, p = .438, for fear) correlated significantly with the scores for disgust sensitivity.
No significant correlation was found between the scores for trait anxiety and the ERPs.
PCA-based ERP
In this section, the 62-channel ERP data from two datasets—that is, the conscious (Liu et al., 2015; see the supplementary materials for the methods of this dataset; Fig. 5a) versus nonconscious (Fig. 5c) attentional modulation by emotion—were submitted to PCA. To make the PCA results comparable between the two datasets, only the disgust and anger conditions were included for the nonconscious data.
According to scree plots created by the EP Toolkit, we finally extracted 7 temporal factors × 3 spatial factors from each of the two datasets, yielding 21 temporospatial factor combinations.
Revisiting the conscious ERP data
The five factor combinations that accounted for the largest amounts of variance in the original ERP data (total variance = 76.4%) were recognized as closely corresponding to the P3b, slow wave, P3a, and right/left P1 ERP components in terms of their time courses and scalp distributions (Table 1).
The waveforms of the two P3-related factor combinations (PCA-P3a and PCA-P3b) were reconstructed back into voltage space for further statistical analysis. According to Fig. 5b and Table 1, the mean amplitude of the PCA-P3a was calculated at Fz, FC1, FCz, FC2, and Cz (time window = 280–400 ms); the mean amplitude of the PCA-P3b was calculated at CPz, P1, Pz, P2, and POz (time window = 500–680 ms).
The main effect of cue validity was significant for the P3a [F(1, 29) = 8.43, p = .007, ηp² = .225; Fig. 5b]: Valid cues were associated with larger P3a amplitudes (10.2 ± 0.45 μV) than were invalid cues (9.33 ± 0.59 μV).
The interaction of emotion with cue validity was significant for the P3b [F(1, 29) = 21.4, p < .001, ηp² = .425; Fig. 5b]. For disgusted facial cues [F(1, 29) = 11.0, p = .002], the P3b in the validly cued condition (20.0 ± 0.73 μV) was larger than that in the invalidly cued condition (18.4 ± 0.87 μV). In contrast, for angry facial cues [F(1, 29) = 12.0, p = .002], the P3b in the validly cued condition (18.2 ± 0.82 μV) was smaller than that in the invalidly cued condition (19.8 ± 0.91 μV).
Decomposition of the nonconscious ERP data
The five factor combinations that accounted for the largest amounts of variance in the original ERP data (total variance = 72.7%) were recognized as closely corresponding to the P3a, anterior/posterior slow wave, and right/left N1 ERP components in terms of their time courses and scalp distributions (Table 2).
The waveforms of the P3-related factor combination—namely, the PCA-P3a—were reconstructed back into voltage space for further statistical analysis. According to Fig. 5d and Table 2, the mean amplitude of the PCA-P3a was calculated at Fz, FC1, FCz, FC2, and Cz (time window = 280–400 ms).
The main effect of cue validity was significant for the P3a [F(1, 59) = 23.6, p < .001, ηp² = .286; Fig. 5d]: Valid cues were associated with larger P3a amplitudes (10.5 ± 0.22 μV) than were invalid cues (9.62 ± 0.28 μV).
Source localization of PCA-based ERP
Source localization was performed on the conscious dataset using sLORETA (because the conscious data included both the P3a and P3b). The PCA-based components evoked by the target onsets were contrasted against a 100-ms pretarget baseline period. Statistics were calculated using the log-F-ratio statistic with voxel-wise randomization tests (5,000 random permutations). The randomization tests were based on statistical nonparametric mapping tools and corrected for multiple comparisons. The statistically activated regions (p < .05) were then localized using the xjView toolbox (Version 8.14, www.alivelearn.net/xjview) according to the Montreal Neurological Institute (MNI) coordinates. The results suggested that (1) significantly enhanced cortical sources at the PCA-P3a latency were located in the limbic lobe (anterior cingulate cortex, Brodmann’s area [BA] 24, MNI coordinates = [−10, 22, 27]; Fig. 6a); (2) significantly enhanced cortical sources at the PCA-P3b latency were in the parietal lobe (somatosensory association cortex, BA 7, MNI coordinates = [−10, −60, 70]; Fig. 6b).
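The logic of a voxel-wise randomization test with max-statistic correction for multiple comparisons can be sketched as follows (a paired sign-flip design with a simple mean-difference statistic; the actual analysis used sLORETA's log-F-ratio statistic, which we do not reproduce):

```python
import numpy as np

def max_stat_permutation(active, baseline, n_perm=5000, seed=0):
    """Voxel-wise randomization test with max-statistic correction, in
    the spirit of the SnPM-style analysis described above. Paired
    design: condition labels are randomized by flipping the sign of
    each subject's active-baseline difference. Returns one corrected
    p value per voxel."""
    rng = np.random.default_rng(seed)
    diff = np.asarray(active) - np.asarray(baseline)  # subjects x voxels
    observed = np.abs(diff.mean(axis=0))
    max_null = np.empty(n_perm)
    for k in range(n_perm):
        signs = rng.choice([-1.0, 1.0], size=diff.shape[0])[:, None]
        max_null[k] = np.abs((signs * diff).mean(axis=0)).max()
    # Corrected p: fraction of permutations whose maximum statistic
    # equals or exceeds the observed statistic at each voxel.
    return (max_null[None, :] >= observed[:, None]).mean(axis=1)

# Synthetic example: 20 subjects, 5 voxels, a real effect in voxel 0.
rng = np.random.default_rng(1)
active = rng.standard_normal((20, 5)) + np.array([3.0, 0, 0, 0, 0])
baseline = rng.standard_normal((20, 5))
p = max_stat_permutation(active, baseline, n_perm=500)
print(p[0] < 0.05)  # the voxel with a true effect survives correction
```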
Discussion
Nonconscious attentional modulation of disgust, anger, and fear
Appropriately attending to threatening stimuli in the environment is evolutionarily adaptive and crucial for survival (Pourtois & Vuilleumier, 2006). In line with our conscious study (Liu et al., 2015), in the present work we further examined the nonconscious attentional modulation of disgust, as compared with other threatening emotions—fear and anger. Although some researchers (e.g., Carlson et al., 2014; Carlson et al., 2013) had already performed intensive studies on attentional bias to backward-masked threatening (e.g., fearful) faces, to the best of our knowledge this was the first study to assess the spatial orienting of disgust in the absence of awareness.
Through this study we aimed to test the hypothesis that, to facilitate the evolutionary purposes of disgust, this emotion first diverts rather than attracts attention (see the following conscious studies: Bertels et al., 2013; Cisler & Olatunji, 2010; Liu et al., 2015; Zimmer et al., 2015). Our results revealed that when participants were presented with angry and fearful facial cues at the nonconscious level, the occipital N1 component, which has been shown to be sensitive to spatial attention (Hillyard et al., 1998), showed larger amplitudes for validly than for invalidly cued targets (see also Carlson & Reinke, 2010). Conversely, the N1 amplitudes were significantly smaller for validly than for invalidly cued targets in the disgusted condition. This N1 result was well in line with the behavioral data—masked angry/fearful cues facilitated, but masked disgusted cues slowed down, behavioral responses in the validly cued condition. According to Carlson et al. (2013; Carlson & Reinke, 2010), the inhibitory effect of subliminal disgusted faces may be mediated by an amygdala response eliciting suppressed sensory processing in the cortex (reflected by the reduced N1). The clear divergence of the disgust-induced N1 between the validly and invalidly cued conditions implies bottom-up attentional suppression, which offers timely protection from the risk of physical/psychological contamination (Curtis, de Barra, & Aunger, 2011). In addition, our data from the cue-evoked N170 (see the supplementary materials) further demonstrated that the structural encoding of subliminal faces was less activated when the cue contained a disgusted face.
An unexpected finding was that the frontal P3 displayed the same pattern for disgust, anger, and fear, with larger amplitudes in the valid than in the invalid condition. This result seems inconsistent with those from the conscious study (Liu et al., 2015), in which the parietal P3 amplitudes were larger for valid disgusted cues but smaller for valid angry cues. To resolve this issue, we employed the PCA procedure, resulting in a P3a and P3b for the conscious data, but only a P3a for the nonconscious data (Fig. 5). In general, the frontal P3a and the parietal P3b are considered to index attentional allocation at different levels (Polich, 2003, 2007): Whereas the P3a reflects online monitoring of frontal attention mechanisms, the P3b reflects context updating and voluntary attention adjustment. Accordingly, the P3a is related to activity in the ACC, whereas the P3b originates from activation in the temporal–parietal junction and parietal cortical areas (Bledowski et al., 2004; Polich, 2003). In the context of emotional influences on spatial attention, the P3a has been associated with orienting and the involuntary switching of attention toward a novel and significantly threatening stimulus, so as to mobilize cognitive and behavioral resources for action or defense (Friedman, Cycowicz, & Gaeta, 2001; Halgren & Marinkovic, 1995; Johnston, Miller, & Burleson, 1986; Polich, 2003). Since the orienting response is thought to be independent of conscious deliberation (Kenemans, 1992), the increased frontal attention allocation for disgust, anger, and fear (reflected by the enhanced P3a in the validly cued condition) suggests priority for performing a more comprehensive risk assessment of threat-related stimuli (van Hooff, Devue, Vieweg, & Theeuwes, 2013).
The ERP results of this study are consistent with those of Liddell et al. (2004) and Williams et al. (2004), who examined the perception of subliminally and supraliminally presented fearful faces, using only three ERP electrodes (Fz, Cz, and Pz). These authors found that, on the one hand, subliminal fear enhanced the N1 (called the N2 in their studies, due to electrode limitations) and the P3a, which were associated with the initial sensory processing of salient stimuli (N1) and involuntary orienting responses for novel and threatening stimuli (P3a), respectively. On the other hand, the P3b showed differences between supraliminal fear and neutral perception, which was proposed to reflect the conscious registration of threatening events and subsequent update of the stimulus context (Halgren & Marinkovic, 1995).
Furthermore, going beyond the prior work (Liu et al., 2015), in this study we statistically compared the emotional effects on spatial attention between individuals with low and high disgust sensitivity. It is known that responses to disgust elicitors vary considerably among individuals (Borg, de Jong, Renken, & Georgiadis, 2013; Lim et al., 2015; Sherlock, Zietsch, Tybur, & Jern, 2016). Our results revealed that the intensity of attentional modulation by disgust (reflected by attentional bias scores of the N1 and P3a amplitudes) significantly correlated with self-report measures of disgust sensitivity (Fig. 4). Furthermore, group comparisons between those with low and high disgust sensitivity (see the supplementary materials) showed that the enhanced N1 for invalidly versus validly disgusted cues in the unconscious condition may be attributed only to the individuals with high disgust sensitivity. Similarly, the enhanced frontal P3 for validly versus invalidly disgusted cues may also be attributed only to these individuals. These disgust-sensitivity results further confirm that disgust modulates attention in a manner distinct from anger and fear.
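The bias-score correlation reported here can be sketched with synthetic data. All values below are hypothetical; the positive slope merely encodes the reported direction of the association between disgust sensitivity and an ERP attentional-bias score (invalid-minus-valid amplitude for disgusted cues):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical per-participant measures: self-report disgust-
# sensitivity scores and an N1 attentional-bias score (invalid-
# minus-valid amplitude for disgusted cues). The 0.05 slope and
# noise level are arbitrary choices for illustration.
n = 30
sensitivity = rng.uniform(30, 90, n)
n1_bias = 0.05 * sensitivity + rng.normal(0.0, 0.5, n)

# Pearson correlation between sensitivity and the bias score.
r = np.corrcoef(sensitivity, n1_bias)[0, 1]
print(f"Pearson r = {r:.2f}")
```

A real analysis would also report a p value and, as in the paper, follow up with low- versus high-sensitivity group comparisons.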
How threats modulate attention: The ERP-based mechanism
Another aim of the present study was to generalize the results of Liu et al. (2015) and to propose a neural model for conscious and nonconscious attentional modulation by threats.
The Liu et al. (2015) findings, together with the nonconscious results presented in the present study, provide converging evidence for a two-stage scheme of attentional modulation by threatening emotions (Fig. 7). At the early stage (100 to 200 ms), a bottom-up pathway (involving the amygdala and back-projections to sensory cortices) functions as a response scaling of sensory processing (reflected by the P1 and N1), which may magnify the sensory perception of fearful and angry signals but suppress the perception of disgust. At the later stage (250 ms and thereafter), the top-down integration pathway (involving a frontoparietal route to the amygdala) plays an important role in the regulation of emotion and behavior (reflected by the P3). Furthermore, source localization suggested that the P3a was localized to the ACC, which enjoys reciprocal connections with the amygdala and may support quick orienting to signals of biological significance, even in the absence of conscious awareness (Williams et al., 2006). As a result, the P3a showed the same pattern across the fear, anger, and disgust conditions. Unlike the P3a, the P3b was localized to the parietal lobe (somatosensory association cortex, BA 7), which is believed to play a role in visual–motor coordination. At the stage of voluntary control, the task-relevant P3b (Luck, 2005) indicated that the brain may allocate more cognitive resources toward the task-relevant but previously less attended location, to ensure the effective achievement of task goals (Liu et al., 2015). Since the location of disgusted faces received less attention and perceptual processing at the early stage, the voluntary-orienting mechanism reflected by the P3b is likely to provide a compensatory response at the later stage, allocating more attention toward the location previously occupied by a disgusted face.
This proposal is in line with previous studies that have examined the brain systems involved in emotional processing, attention modulation, and their interaction (Brosch et al., 2013; Liddell et al., 2005; Liddell et al., 2004; Pourtois et al., 2013; Tamietto & de Gelder, 2010; Taylor & Fragopanagos, 2005; Williams et al., 2006). In particular, LeDoux (1996) suggested that fear recruits the amygdala via two neural streams: Whereas low-level sensory input is sent directly to the amygdala for rapid and automatic responses, more detailed analysis of fear signals relies on a slower, cortico-amygdala pathway (see also Tamietto & de Gelder, 2010). More recently, de Gelder, Hortensius, and Tamietto (2012) sketched a dual-route model, with a reflexive route mainly supported by the amygdala and a voluntary route based mainly on cortical networks. Similarly, Pourtois et al. (2013) suggested that emotion may enhance processing efficiency through gain control mechanisms in the attention system, and that this process is itself modulated by the amygdala and interconnected prefrontal regions. Furthermore, Carlson et al. (2013; Carlson et al., 2009; Carlson et al., 2012; Carlson & Reinke, 2010; Carlson et al., 2011) proposed that both supraliminal and subliminal emotional cues modulate spatial attention, but likely through partially separate neural systems and attentional subprocesses: Masked fearful faces facilitate spatial attention through a neural network consisting of the amygdala, ACC, and visual cortex, whereas unmasked fearful faces mainly recruit frontoparietal cortical attention networks to modulate attention (see also Brosch et al., 2013).
However, prior to this study, the above-mentioned studies had focused only on the emotional attention biases caused by fear and anger, so they proposed that threat signals always facilitate spatial attention, and that the evolutionarily precoded responses to signals of potential threat are consistently characterized by the early allocation of attention to the locations of threats (Carlson et al., 2014; Carlson et al., 2012). The inclusion of disgust (Liu et al., 2015, and the present study) adds a new perspective on the interaction of emotion and attention, indicating that the initial bottom-up modulation of attention includes not only the magnification (for fear and anger), but also the suppression (for disgust), of sensory responses.
Finally, three limitations should be pointed out regarding the appropriate interpretation of the present results. First, it has been shown that both conscious and nonconscious emotional processes involve several subcortical regions (for reviews, see Mitchell & Greening, 2012; Smith & Lane, 2015). Unfortunately, the EEG/ERP technique only detects neural activity that occurs in superficial layers of the brain—that is, the cortex (Luck, 2005). Thus, a more sophisticated mechanism for attentional modulation by threats should be explored further using other techniques, such as magnetic resonance imaging and positron emission tomography. Second, Carlson et al. (2012) proposed that, whereas masked emotional faces primarily influence spatial attention by modulating orienting (e.g., Carlson & Reinke, 2008, 2010), unmasked emotional faces typically influence the disengagement of attention from the threat location (e.g., Cooper & Langton, 2006; Koster et al., 2004). Our behavioral data nicely support this notion (see the supplementary materials); however, the same conclusion cannot be drawn from our ERP data, since no straightforward linkage has yet been established between attentional orienting/disengagement and ERP measures. Third, in this study we examined the behavioral and ERP differences between low- and high-disgust-sensitivity groups. However, we could not exclude the possibility that the observed group differences were simply due to different perceptual thresholds between the two groups—that is, that individuals with high disgust sensitivity had lower perceptual thresholds for disgusted faces. We suggest a stricter experimental design in future work, which should include a test to measure individual perceptual thresholds for emotional faces.
In summary, the present study demonstrated that nonconscious attentional modulation by disgust shows behavioral and ERP patterns different from those of other threatening emotions (fear and anger). In light of previous findings (Liu et al., 2015), we concluded that disgust, compared with anger and fear, influences spatial attention differently at both the conscious and nonconscious levels. On the basis of these results, we proposed an integrated model regarding the similarities and differences between nonconscious and conscious attentional modulation by threatening emotions.
References
Bertels, J., Kolinsky, R., Coucke, D., & Morais, J. (2013). When a bang makes you run away: Spatial avoidance of threatening environmental sounds. Neuroscience Letters, 535, 78–83.
Bledowski, C., Prvulovic, D., Hoechstetter, K., Scherg, M., Wibral, M., Goebel, R., & Linden, D. E. (2004). Localizing P300 generators in visual target and distracter processing: A combined event-related potential and functional magnetic resonance imaging study. Journal of Neuroscience, 24, 9353–9360.
Borg, C., de Jong, P. J., Renken, R. J., & Georgiadis, J. R. (2013). Disgust trait modulates frontal-posterior coupling as a function of disgust domain. Social Cognitive and Affective Neuroscience, 8, 351–358.
Brosch, T., Sander, D., Pourtois, G., & Scherer, K. R. (2008). Beyond fear: Rapid spatial orienting toward positive emotional stimuli. Psychological Science, 19, 362–370.
Brosch, T., Scherer, K. R., Grandjean, D., & Sander, D. (2013). The impact of emotion on perception, attention, memory, and decision-making. Swiss Medical Weekly, 143, w13786.
Cacioppo, J. T., & Gardner, W. L. (1999). Emotion. Annual Review of Psychology, 50, 191–214. doi:10.1146/annurev.psych.50.1.191
Carlson, J. M., & Reinke, K. S. (2008). Masked fearful faces modulate the orienting of covert spatial attention. Emotion, 8, 522–529.
Carlson, J. M., & Reinke, K. S. (2010). Spatial attention-related modulation of the N170 by backward masked fearful faces. Brain and Cognition, 73, 20–27. doi:10.1016/j.bandc.2010.01.007
Carlson, J. M., Fee, A. L., & Reinke, K. S. (2009). Backward masked snakes and guns modulate spatial attention. Evolutionary Psychology, 7, 527–537.
Carlson, J. M., Reinke, K. S., LaMontagne, P. J., & Habib, R. (2011). Backward masked fearful faces enhance contralateral occipital cortical activity for visual targets within the spotlight of attention. Social Cognitive and Affective Neuroscience, 6, 639–645. doi:10.1093/scan/nsq076
Carlson, J. M., Mujica-Parodi, L. R., Harmon-Jones, E., & Hajcak, G. (2012). The orienting of spatial attention to backward masked fearful faces is associated with variation in the serotonin transporter gene. Emotion, 12, 203–207.
Carlson, J. M., Cha, J., & Mujica-Parodi, L. R. (2013). Functional and structural amygdala–anterior cingulate connectivity correlates with attentional bias to masked fearful faces. Cortex, 49, 2595–2600. doi:10.1016/j.cortex.2013.07.008
Carlson, J. M., Cha, J., Harmon-Jones, E., Mujica-Parodi, L. R., & Hajcak, G. (2014). Influence of the BDNF genotype on amygdalo-prefrontal white matter microstructure is linked to nonconscious attention bias to threat. Cerebral Cortex, 24, 2249–2257. doi:10.1093/cercor/bht089
Chapman, H. A., & Anderson, A. K. (2012). Understanding disgust. Annals of the New York Academy of Sciences, 1251, 62–76.
Cisler, J. M., & Olatunji, B. O. (2010). Components of attentional biases in contamination fear: Evidence for difficulty in disengagement. Behaviour Research and Therapy, 48, 74–78.
Cooper, R. M., & Langton, S. R. (2006). Attentional bias to angry faces using the dot-probe task? It depends when you look for it. Behaviour Research and Therapy, 44, 1321–1329.
Curtis, V., de Barra, M., & Aunger, R. (2011). Disgust as an adaptive system for disease avoidance behaviour. Philosophical Transactions of the Royal Society B: Biological Sciences, 366, 389–401.
Dannlowski, U., Kersting, A., Donges, U. S., Lalee-Mentzel, J., Arolt, V., & Suslow, T. (2006). Masked facial affect priming is associated with therapy response in clinical depression. European Archives of Psychiatry and Clinical Neuroscience, 256, 215–221.
Darwin, C. (1872). The expression of the emotions in man and animals. Chicago, IL: University of Chicago Press.
de Gelder, B., Hortensius, R., & Tamietto, M. (2012). Attention and awareness each influence amygdala activity for dynamic bodily expressions—A short review. Frontiers in Integrative Neuroscience, 6, 54.
Dien, J. (2010a). The ERP PCA Toolkit: An open source program for advanced statistical analysis of event-related potential data. Journal of Neuroscience Methods, 187, 138–145.
Dien, J. (2010b). Evaluating two-step PCA of ERP data with Geomin, Infomax, Oblimin, Promax, and Varimax rotations. Psychophysiology, 47, 170–183. doi:10.1111/j.1469-8986.2009.00885.x
Domínguez-Borràs, J., Saj, A., Armony, J. L., & Vuilleumier, P. (2012). Emotional processing and its impact on unilateral neglect and extinction. Neuropsychologia, 50, 1054–1071.
Foti, D., Weinberg, A., Dien, J., & Hajcak, G. (2011). Event-related potential activity in the basal ganglia differentiates rewards from nonrewards: Temporospatial principal components analysis and source localization of the feedback negativity. Human Brain Mapping, 32, 2207–2216.
Fox, E. (2002). Processing emotional facial expressions: The role of anxiety and awareness. Cognitive, Affective, & Behavioral Neuroscience, 2, 52–63. doi:10.3758/CABN.2.1.52
Friedman, D., Cycowicz, Y. M., & Gaeta, H. (2001). The novelty P3: An event-related brain potential (ERP) sign of the brain’s evaluation of novelty. Neuroscience & Biobehavioral Reviews, 25, 355–373. doi:10.1016/S0149-7634(01)00019-7
Goldstein, A., Spencer, K. M., & Donchin, E. (2002). The influence of stimulus deviance and novelty on the P300 and Novelty P3. Psychophysiology, 39, 781–790.
Gong, X., Huang, Y., Wang, Y., & Luo, Y. (2011). Revision of the Chinese facial affective picture system. Chinese Mental Health Journal, 25, 40–46.
Haidt, J., McCauley, C., & Rozin, P. (1994). Individual differences in sensitivity to disgust: A scale sampling seven domains of disgust elicitors. Personality and Individual Differences, 16, 701–713.
Halgren, E., & Marinkovic, K. (1995). Neurophysiological networks integrating human emotions. In M. S. Gazzaniga (Ed.), The cognitive neurosciences (pp. 1137–1151). Cambridge, MA: MIT Press.
Hillyard, S. A., Vogel, E. K., & Luck, S. J. (1998). Sensory gain control (amplification) as a mechanism of selective attention: Electrophysiological and neuroimaging evidence. Philosophical Transactions of the Royal Society B, 353, 1257–1270. doi:10.1098/rstb.1998.0281
Johnston, V. S., Miller, D. R., & Burleson, M. H. (1986). Multiple P3s to emotional stimuli and their theoretical significance. Psychophysiology, 23, 684–694.
Jones, D. (2007). Moral psychology: The depths of disgust. Nature, 447, 768–771.
Kenemans, J. L. (1992). Visual stimulus change and the orienting reaction: Event-related potential evidence for a two-stage process. Biological Psychology, 33, 97–114.
Koster, E. H. W., Crombez, G., Verschuere, B., & De Houwer, J. (2004). Selective attention to threat in the dot probe paradigm: Differentiating vigilance and difficulty to disengage. Behaviour Research and Therapy, 42, 1183–1192. doi:10.1016/j.brat.2003.08.001
LeDoux, J. E. (1996). The emotional brain. London, UK: Weidenfeld Nicolson.
Liddell, B. J., Williams, L. M., Rathjen, J., Shevrin, H., & Gordon, E. (2004). A temporal dissociation of subliminal versus supraliminal fear perception: An event-related potential study. Journal of Cognitive Neuroscience, 16, 479–486. doi:10.1162/089892904322926809
Liddell, B. J., Brown, K. J., Kemp, A. H., Barton, M. J., Das, P., Peduto, A.,…Williams, L. M. (2005). A direct brainstem–amygdala–cortical “alarm” system for subliminal signals of fear. NeuroImage, 24, 235–243.
Lim, J., Ho, P. M., & Mullette-Gillman, O. A. (2015). Modulation of incentivized dishonesty by disgust facial expressions. Frontiers in Neuroscience, 9, 250.
Liu, Y., Zhang, D., & Luo, Y. (2015). How disgust facilitates avoidance: An ERP study on attention modulation by threats. Social Cognitive and Affective Neuroscience, 10, 598–604.
Lubman, D. I., Peters, L. A., Mogg, K., Bradley, B. P., & Deakin, J. F. (2000). Attentional bias for drug cues in opiate dependence. Psychological Medicine, 30, 169–175.
Luck, S. J. (2005). An introduction to the event-related potential technique. London, UK: MIT Press.
MacLeod, C., Mathews, A., & Tata, P. (1986). Attentional bias in emotional disorders. Journal of Abnormal Psychology, 95, 15–20. doi:10.1037/0021-843X.95.1.15
MacNamara, A., Ochsner, K. N., & Hajcak, G. (2011). Previously reappraised: The lasting effect of description type on picture-elicited electrocortical activity. Social Cognitive and Affective Neuroscience, 6, 348–358.
Mitchell, D. G., & Greening, S. G. (2012). Conscious perception of emotional stimuli: Brain mechanisms. The Neuroscientist, 18, 386–398.
Mogg, K., & Bradley, B. P. (1999). Orienting of attention to threatening facial expressions presented under conditions of restricted awareness. Cognition and Emotion, 13, 713–740.
Mulckhuyse, M., & Theeuwes, J. (2010). Unconscious attentional orienting to exogenous cues: A review of the literature. Acta Psychologica, 134, 299–309.
Nelson, L. D., Patrick, C. J., Collins, P., Lang, A. R., & Bernat, E. M. (2011). Alcohol impairs brain reactivity to explicit loss feedback. Psychopharmacology, 218, 419–428.
Oaten, M., Stevenson, R. J., & Case, T. I. (2009). Disgust as a disease-avoidance mechanism. Psychological Bulletin, 135, 303–322.
Öhman, A., Flykt, A., & Esteves, F. (2001). Emotion drives attention: Detecting the snake in the grass. Journal of Experimental Psychology: General, 130, 466–478. doi:10.1037/0096-3445.130.3.466
Olatunji, B. O., Williams, N. L., Tolin, D. F., Sawchuck, C. N., Abramowitz, J. S., Lohr, J. M., & Elwood, L. S. (2007). The disgust scale: Item analysis, factor structure, and suggestions for refinement. Psychological Assessment, 19, 281–297.
Pascual-Marqui, R. D. (2002). Standardized low-resolution brain electromagnetic tomography (sLORETA): Technical details. Methods and Findings in Experimental and Clinical Pharmacology, 24, 5–12.
Phelps, E. A., Ling, S., & Carrasco, M. (2006). Emotion facilitates perception and potentiates the perceptual benefits of attention. Psychological Science, 17, 292–299. doi:10.1111/j.1467-9280.2006.01701.x
Polich, J. (2003). Theoretical overview of P3a and P3b. In J. Polich (Ed.), Detection of change: Event-related potential and fMRI findings (pp. 83–98). Boston, MA: Kluwer Academic Press.
Polich, J. (2007). Updating P300: An integrative theory of P3a and P3b. Clinical Neurophysiology, 118, 2128–2148.
Pollak, S. D., & Tolley-Schell, S. A. (2003). Selective attention to facial emotion in physically abused children. Journal of Abnormal Psychology, 112, 323–335.
Pourtois, G., & Vuilleumier, P. (2006). Dynamics of emotional effects on spatial attention in the human visual cortex. Progress in Brain Research, 156, 67–91.
Pourtois, G., Grandjean, D., Sander, D., & Vuilleumier, P. (2004). Electrophysiological correlates of rapid spatial orienting towards fearful faces. Cerebral Cortex, 14, 619–633. doi:10.1093/cercor/bhh023
Pourtois, G., Schwartz, S., Seghier, M. L., Lazeyras, F., & Vuilleumier, P. (2006). Neural systems for orienting attention to the location of threat signals: An event-related fMRI study. NeuroImage, 31, 920–933. doi:10.1016/j.neuroimage.2005.12.034
Pourtois, G., Schettino, A., & Vuilleumier, P. (2013). Brain mechanisms for emotional influences on perception and attention: What is magic and what is not. Biological Psychology, 92, 492–512.
Shek, D. T. (1993). The Chinese version of the State–Trait Anxiety Inventory: Its relationship to different measures of psychological well-being. Journal of Clinical Psychology, 49, 349–358.
Sherlock, J. M., Zietsch, B. P., Tybur, J. M., & Jern, P. (2016). The quantitative genetics of disgust sensitivity. Emotion, 16, 43–51.
Smith, R., & Lane, R. D. (2015). The neural basis of one’s own conscious and unconscious emotional states. Neuroscience & Biobehavioral Reviews, 57, 1–29. doi:10.1016/j.neubiorev.2015.08.003
Spielberger, C. D., Gorsuch, R. L., Lushene, R., Vagg, P. R., & Jacobs, G. A. (1983). Manual for the State–Trait Anxiety Inventory. Palo Alto, CA: Consulting Psychologists Press.
Tamietto, M., & de Gelder, B. (2010). Neural bases of the non-conscious perception of emotional signals. Nature Reviews Neuroscience, 11, 697–709.
Tamietto, M., Cauda, F., Celeghin, A., Diano, M., Costa, T., Cossa, F. M.,…de Gelder, B. (2015). Once you feel it, you see it: Insula and sensory–motor contribution to visual awareness for fearful bodies in parietal neglect. Cortex, 62, 56–72.
Taylor, J. G., & Fragopanagos, N. F. (2005). The interaction of attention and emotion. Neural Networks, 18, 353–369.
Townshend, J., & Duka, T. (2001). Attentional bias associated with alcohol cues: Differences between heavy and occasional social drinkers. Psychopharmacology, 157, 67–74.
van Hooff, J. C., Devue, C., Vieweg, P. E., & Theeuwes, J. (2013). Disgust- and not fear-evoking images hold our attention. Acta Psychologica, 143, 1–6. doi:10.1016/j.actpsy.2013.02.001
Williams, L. M., Liddell, B. J., Rathjen, J., Brown, K. J., Gray, J., Phillips, M.,…Gordon, E. (2004). Mapping the time course of nonconscious and conscious perception of fear: An integration of central and peripheral measures. Human Brain Mapping, 21, 64–74.
Williams, L. M., Liddell, B. J., Kemp, A. H., Bryant, R. A., Meares, R. A., Peduto, A. S., & Gordon, E. (2006). Amygdala–prefrontal dissociation of subliminal and supraliminal fear. Human Brain Mapping, 27, 652–661.
Yiend, J. (2010). The effects of emotion on attention: A review of attentional processing of emotional information. Cognition and Emotion, 24, 3–47. doi:10.1080/02699930903205698
Zhang, D., Wang, L., Luo, Y., & Luo, Y. J. (2012). Individual differences in detecting rapidly presented fearful faces. PLoS ONE, 7, e49517. doi:10.1371/journal.pone.0049517
Zhang, D., Gu, R., Wu, T., Broster, L. S., Luo, Y., Jiang, Y., & Luo, Y. J. (2013). An electrophysiological index of changes in risk decision-making strategies. Neuropsychologia, 51, 1397–1407.
Zhang, X., Guo, Q., Zhang, Y., Lou, L., & Ding, D. (2015). Different timing features in brain processing of core and moral disgust pictures: An event-related potentials study. PLoS ONE, 10, e0128531. doi:10.1371/journal.pone.0128531
Zimmer, U., Keppel, M. T., Poglitsch, C., & Ischebeck, A. (2015). ERP evidence for spatial attention being directed away from disgusting locations. Psychophysiology, 52, 1317–1327.
Author note
This study was funded by the National Natural Science Foundation of China (Grant Nos. 31571120, 31300867, and 31300869) and the National Key Basic Research Program of China (973 Program, Grant No. 2014CB744600).
Zhang, D., Liu, Y., Wang, L. et al. Mechanisms for attentional modulation by threatening emotions of fear, anger, and disgust. Cogn Affect Behav Neurosci 17, 198–210 (2017). https://doi.org/10.3758/s13415-016-0473-9