
The dot-probe task to measure emotional attention: A suitable measure in comparative studies?

Abstract

For social animals, attending to and recognizing the emotional expressions of other individuals is of crucial importance for their survival and likely has a deep evolutionary origin. Gaining insight into how emotional expressions evolved as adaptations over the course of evolution can be achieved by making direct cross-species comparisons. To that end, experimental paradigms that are suitable for investigating emotional processing across species need to be developed and evaluated. The emotional dot-probe task, which measures attention allocation toward emotional stimuli, has this potential. The task is implicit, and subjects need minimal training to perform it successfully. Findings in nonhuman primates, although scarce, show that they, like humans, have an attentional bias toward emotional stimuli. However, the wide literature on human studies has shown that different factors can have important moderating effects on the results. Due to the large heterogeneity of this literature, these moderating effects often remain unnoticed. We here review this literature and show that subject characteristics and differences in experimental designs affect the results of the dot-probe task. We conclude with specific recommendations regarding these issues that are particularly relevant to take into consideration when applying this paradigm to study animals.

For social animals, primates included, the fast and accurate recognition of emotion signals of other individuals is of crucial importance for the maintenance of social bonds, group cohesion and ultimately for group survival. For example, emotion signals play a significant role in warning against predators, getting help or support in difficult situations and in partner choice. The origin and evolution of these emotion signals is a topic of extensive research (e.g., Darwin, 1872/1965; Micheletta, Whitehouse, Parr, & Waller, 2015; Parr & Waller, 2006). Examples of questions that are frequently addressed in the literature are: What are the functions of emotion signals? Are emotion signals of humans and other primates comparable? Do they serve similar functions in different species? Are they perceived in similar ways by conspecifics? In contrast to humans, who can explicitly report on their emotions, no option for such a direct measurement is available in nonhuman primates. Nonetheless, observation research in primates has beautifully demonstrated that they can efficiently respond to others’ emotions and that emotion regulatory behaviors such as reconciliation, consolation or empathic responses foster close, long-term bonds with group members (de Waal, 2008; de Waal & van Roosmalen, 1979; Palagi, Dall’Olio, Demuru, & Stanyon, 2014; Spoor & Kelly, 2004). However, although all primates thus display observable emotional behaviors at least to some extent, the neurocognitive mechanisms underlying these behaviors are not directly observable and therefore remain largely unexplored. Previous research in humans and some nonhuman primates has shown that one component of individuals’ sensitivity to others’ emotions is heightened immediate attention to their affective states (Phelps, Ling, & Carrasco, 2006; Schupp, Junghofer, Weike, & Hamm, 2003; Vuilleumier, 2005).
To gain insight into the above-posed questions, we will focus this review on emotional attention and specifically, on the usability of a task that can measure this implicitly. The dot-probe task is a paradigm that is often used in psychology and has a lot of potential for testing emotional attention across primate species, primarily because this test is implicit and because evidence, although still scarce, is accumulating that it does not require verbal instruction and that subjects need no or minimal training to perform the test successfully (King, Kurdziel, Meyer, & Lacreuse, 2012; Kret, Jaasma, Bionda, & Wijnen, 2016; Parr, Modi, Siebert, & Young, 2013).

The aim of this article is threefold. First, we support the view that emotion signals and their perception are adaptations. There are different ways to support this view. We provide a functional analysis (do emotion signals and their perception result in an increase in survival chances?), neuroscientific support (are these signals associated with identifiable brain regions or neural circuits?), cross-cultural support (are these signals universal traits?), cross-species support (are these signals present in species with which humans share a relatively recent ancestor?), and attentional support (do emotion signals attract more attention than neutral signals?). In this way, we provide a nomological network of evidence around the hypothesis that emotion signals and their perception are adaptations (Schmitt & Pilcher, 2004).

Second, we address the question whether the dot-probe task is a relevant paradigm to investigate the perception of emotions across species, in order to find further support for the evolutionary view on emotion signals and their recognition. As this task is often used in psychology research, the goal is to examine whether this task can provide a solution to testing emotional processing comparatively. Emotion perception can also be investigated through observation research, but experimental tasks can tap into more unconscious processes not highlighted by observation research. At the same time, the dot-probe task has shortcomings of its own, which will be discussed in this article. We will nonetheless examine the added value of this task for emotion perception research.

Because of the limited number of studies with the dot-probe task in nonhuman species, we decided to include a review of human studies with the dot-probe task, to get a better understanding of the task and its strengths and limitations. Thus, the third aim of this article is to present a review of human studies with the dot-probe task, with the specific goal in mind to learn from these studies on how to develop a version of this task that can be useful for comparisons with nonhuman primates.

Emotion signals as adaptations

Evolutionary psychologists argue that the expression and recognition of emotional events or signals are adaptations that evolved in social animals. They suggest that emotion expression and the perception of emotions have a specific function, are heritable, and lead to an increase in fitness. Our ancestors evolved in an environment where encounters with resources and danger were unpredictable. Chances of survival were therefore dependent on the ability of an individual to efficiently locate these events. Food and mating partners had to be located, whereas dangers had to be quickly noted and then avoided (Öhman, Flykt, & Esteves, 2001).

Because it is often hard to find conclusive evidence for the existence of a psychological adaptation, Schmitt and Pilcher (2004) proposed to develop a nomological network of evidence concerning the proposed adaptation. An example of such a network is presented in Fig. 1. First, one has to start with a good theory or a functional analysis of the adaptation. In addition, data can be gathered from different disciplines, including psychology, medicine, neuroscience, genetics, ethology, and anthropology. When the data are consistent with the hypothesis that the proposed psychological phenomenon is an adaptation, then the hypothesis is considered to be well-supported.

Fig. 1

The nomological network of evidence for evaluating a psychological adaptation. Evidence from various scientific disciplines is combined to assess the evidentiary breadth and depth of the hypothesis that emotion expression is an adaptation. (Adapted from Schmitt & Pilcher, 2004, and Ploeger & van der Hoort, 2015)

Here we will discuss a functional analysis (“theoretical” evidence), neuroscientific studies (physiological evidence), cross-cultural studies, cross-species studies (phylogenetic evidence), and studies on attentional processes (psychological evidence).

Functional analysis

Emotional expressions can heighten the chances of survival in two ways. First, emotional expressions can have a direct benefit to the expresser (Darwin, 1872/1965; Lee, Susskind, & Anderson, 2013). Fearful expressions enlarge the visual field by widening the eyes, such that threatening events can be detected faster. In contrast, disgusted expressions narrow the visual field, which reduces the amount of aversive perceptual input (Susskind et al., 2008). A surprised expression elicits similar changes in facial musculature as a fearful expression, whereas anger resembles a disgusted expression (Susskind & Anderson, 2008). Second, emotional expressions have a communicative function. Emotional expressions convey important information about the presence of positive and negative events in the environment. A fearful or threatening facial expression of another individual might signal danger in the environment. A sad facial expression may be a request for help, and a disgusted facial expression can be an important signal of noxious stimuli (Seidel, Habel, Kirschner, Gur, & Derntl, 2010).

For emotional expressions to be adaptive, the perception of these expressions should have an effect on expressers’ and/or observers’ behavior. Expressions of fear and anger should elicit escape behavior to efficiently avoid potentially dangerous situations. In contrast, positive expressions should elicit approach behavior as these expressions signal desirable situations such as the presence of food (Lang, Bradley & Cuthbert, 1998). This line of reasoning is supported by the finding that different emotions are accompanied by different action tendencies. A total of 18 action readiness modes have been proposed, including approach, avoidance, attending, and submission (Frijda, 1987).

To initiate an action tendency in response to the perception of an emotional stimulus, the perceptual system should be effectively linked to the motor system. This link then results in the unconscious predisposition to either approach or avoid a situation (Frijda, Kuipers, & ter Schure, 1989; Öhman & Mineka, 2001). The link between the perceptual and motor systems has been investigated experimentally by presenting subjects with positive and negative pictures (Marsh, Ambady, & Kleck, 2005). In Marsh et al.’s study, half of the subjects had to push a lever away from their body in response to negative pictures and pull the lever toward themselves in response to positive images. The other half of the subjects had to respond to the pictures in the opposite directions. The results showed that subjects were much faster to react to the stimuli in the congruent situation (positive/pull and negative/push) than in the incongruent situation (positive/push and negative/pull) (for similar results, see Roelofs, Elzinga, & Rotteveel, 2005; Seidel et al., 2010; but see also Wilkowski & Meier, 2010). This suggests that positive stimuli evoke an unconscious approach tendency, whereas negative stimuli evoke an avoidance tendency. In the study by Seidel et al., subjects showed an unconscious avoidance tendency following angry faces and an approach tendency following happy, fearful and sad faces. Faces with a disgusted expression did not evoke a specific action tendency, which might be due to the inclusion of two different kinds of disgusted expressions—that is, a nonsocial food-offense disgust, in addition to an individual-related disgust. In addition, these action tendencies not only apply to hand movements, but generalize to whole body movements (Stins et al., 2011). Specifically, these researchers showed that subjects stepped faster toward a smiling than toward an angry face, yet no significant difference was found for stepping away from these faces. 
Overall, these studies support the view that unconscious action tendencies play an important role in the processing of an emotional stimulus.

Neuroscientific studies

On the neuronal level, emotional expressions, like other visual stimuli, are processed via the neocortex, mainly by visual areas constituting the what- and where-pathways (Kret, Pichon, Grezes, & de Gelder, 2011; for reviews, see Kragel, Knodt, Hariri, & LaBar, 2016; Vuilleumier & Pourtois, 2007). Additionally, a subcortical route might also exist that travels from the relay nuclei of the thalamus directly to the amygdala, superior colliculus, and pulvinar (de Gelder, Frissen, Barton, & Hadjikhani, 2003; Garvert, Friston, Dolan, & Garrido, 2014; LeDoux, 1996). This route supports fast and unconscious processing of emotional stimuli, which in turn modulates processes in cortical areas (Adolphs, 2002; Johnson, 2005). The subcortical route is most commonly activated by fearful faces, but also by smiling faces (for a review, see Zald, 2003) or emotional body language (de Gelder, Van den Stock, Meeren, Sinke, Kret & Tamietto, 2010). The existence of a subcortical route is further supported by the finding that the amygdala is activated in response to masked fearful faces, even in the absence of conscious perception (e.g., Whalen et al., 1998). In addition, patients with lesions in visual cortical areas can discriminate the valence of emotional expressions (e.g., de Gelder, Vroomen, Pourtois, & Weiskrantz, 1999), which is also accompanied by activation of the right amygdala (Pegna, Khateb, Lazeyras, & Seghier, 2005).

The perception of a facial expression induces three consecutive processes (Phillips, Drevets, Rauch, & Lane, 2003). First, the significance of the stimulus is identified, which is followed by the induction of an affective state by autonomic, neuroendocrinic, and behavioral responses. Finally, the first two processes are modulated such that the produced behavior is contextually appropriate. These processes seem to be supported by two neural systems: A ventral system subserves the more automatic processes such as the identification of the stimulus and autonomic responses, and a dorsal system regulates the integration of the emotional input and cognitive processes such as selective attention (Phillips et al., 2003).

Cross-cultural studies

Cross-cultural comparisons suggest that most emotional expressions are universal over different cultures. In a classic study, Ekman and colleagues (1969) discovered that a tribe in New Guinea expressed and interpreted facial expressions similarly to people in the West, even though this tribe had never been in contact with Western culture before. This study was extended to 21 countries around the world, which all showed similar results (Ekman, 1973). Even though facial expressions appear to be universal, there are slight differences in expressions across cultures. These differences can be seen as “emotional dialects,” since cultural groups who live in proximity to each other are faster to recognize each other’s facial expressions than are more distant groups (for a review, see Elfenbein, 2013). However, the universality of emotional expressions is questioned by others (Crivelli, Russell, Jarillo, & Fernández-Dols, 2016; Gendron, Roberson, van der Vyver, & Barrett, 2014). A recent study indicated that adolescents of a tribe living in visual isolation from the Western world interpreted a gasping face (seen by Western people as conveying fear) as conveying anger and threat (Crivelli et al., 2016). Thus, the interpretation of facial expressions can vary over cultures. Yet, in all cultures, facial expressions are still a sign to convey information about the environment or the emotions of the individual.

Cross-species studies

Cross-species comparisons suggest that emotional expressions are also universal over several species. Darwin (1872/1965) already mentioned that nonhuman primates and humans show similar emotional expressions. These emotional expressions are similar in both their morphological structure and their social function (Preuschoft & van Hooff, 1995; Parr & Waller, 2006; Parr, Waller, Vick, & Bard, 2007). For example, chimpanzees (de Waal, 2003) and bonobos (Palagi, 2008) express the “relaxed open mouth display” (ROM) during play, which is comparable to the human laugh (for more on play faces, see Palagi & Mancini, 2011). When the ROM was bidirectionally expressed during play, play lasted longer (Waller & Dunbar, 2005), and laughing by one individual induced laughing in the other individual (Davila-Ross, Allcock, Thomas, & Bard, 2011). Moreover, the intensity and complexity of facial play displays in orangutans increased when recipient attention was directed toward the sender (Waller, Caeiro, & Davila-Ross, 2015). This suggests that the ROM signals social bonding and affection, similar to the function of the human laugh. In addition, the acoustic properties of tickling-induced laughter are homologous in great apes and humans, which indicates phylogenetic continuity from nonhuman displays to human emotional expressions (Preuschoft & van Hooff, 1997; Ross, Owren, & Zimmermann, 2009).

Nonhuman primates are not merely able to express emotions, but they also successfully distinguish emotional expressions. Chimpanzees are capable of discriminating an emotional facial expression made by two different individuals from that of a neutral expression of a third individual. Successful discrimination of different emotional expressions was dependent on the amount of shared features between the two expressions (Parr, Hopkins, & de Waal, 1998). Rhesus monkeys are also capable of discriminating emotional expressions from neutral expressions, yet they showed more difficulties with discriminating two distinct emotional expressions (Parr & Heintz, 2009). Moreover, macaque cardiac physiology is sensitive to the valence of passively viewed sensory stimuli (Bliss-Moreau, Machado, & Amaral, 2013).

Nonhuman primates and humans seem to process emotion signals in a similar way. Chimpanzees are capable of recognizing the valence of an emotional expression as they can successfully match emotional videos (e.g., showing favorite food and veterinarian procedures) to videos of emotional expressions with the same emotional meaning (Parr, 2001). Enhanced recognition memory for emotional stimuli, as in humans, was demonstrated as well (Kano, Tanaka, & Tomonaga, 2008). In addition, chimpanzees show cortical asymmetries in physiological responses when watching videos of conspecifics expressing a certain emotion, similar to responses observed in humans (Parr & Hopkins, 2000; for a review, see Lindell, 2013). Monkeys seem to integrate facial expressions into higher processes as well. Facial expressions affected the gaze-following behavior in Barbary macaques (Teufel, Gutmann, Pirow, & Fischer, 2010) and long-tailed macaques (Goossens, Dekleva, Reader, Sterck, & Bolhuis, 2008). Moreover, positive and negative expressions influenced object preference in capuchin monkeys (Morimoto & Fujita, 2012) and great apes (Buttelmann, Call, & Tomasello, 2009).

Similar emotional processing in humans and other primates is further supported by the discovery of similar (emotional) face processing circuits (e.g., Pinsk et al., 2009; Tsao, Moeller, & Freiwald, 2008) and affective picture processing (Hirata et al., 2013). In both humans and macaque monkeys the superior temporal sulcus (STS) is activated by the observation of emotional faces, which is lateralized to the right in humans but not in monkeys (de Winter et al., 2015). Moreover, macaques show STS and amygdala activation in response to faces, although this amygdala activation is dependent on the dimensions of the faces (Hoffman, Gothard, Schmid, & Logothetis, 2007). Also in chimpanzees, the STS and orbitofrontal cortex were activated during face processing, comparable to face processing in humans (Parr, Hecht, Barks, Preuss, & Votaw, 2009). However, differences in (emotional) face processing between humans and nonhuman primates are also reported (Polosecki et al., 2013; Zhu et al., 2013). Studies with similar techniques and experimental designs across species need to be conducted to identify the similarities and differences in primate emotion processing (for a review, see Yovel & Freiwald, 2013).

Studies on attentional processes

For emotion signals to be beneficial in threatening situations it is essential that these signals are rapidly detected in the environment. Over the course of evolution, the stimuli relevant for survival became automatic triggers of attention. For example, in visual search tasks, pictures of snakes elicited faster response times than pictures of flowers in both humans (Öhman, Flykt, & Esteves, 2001) and macaque monkeys (Shibasaki & Kawai, 2009). In addition, faces with an angry expression are detected faster in a visual search task than happy or neutral faces (e.g., Öhman, Lundqvist, & Esteves, 2001; Pinkham, Griffin, Baron, Sasson & Gur, 2010; for a review, see Frischen, Eastwood & Smilek, 2008; but see Becker, Anderson, Mortensen, Neufeld, & Neel, 2011) and search times for emotional faces are not affected by array size (e.g., Eastwood, Smilek & Merikle, 2001; Fox et al., 2000; Juth, Lundqvist, Karlsson, & Öhman, 2005). The faster detection of emotional faces is probably most influenced by the heightened arousal induced by these faces (Lundqvist, Bruce, & Öhman, 2015).

Instead of driving attention toward its location, a negatively valued stimulus might also drive attention away from its location (Mogg et al., 2000). In some situations, it might be beneficial to maintain attention on current goals without being distracted by task-irrelevant negative cues. In addition, driving attention away from negative information might serve to reduce anxiety and maintain positive mood states. The process of driving attention away from negative stimuli seems mainly applicable to mildly aversive information (Koster, Verschuere, Crombez, & van Damme, 2005; Wilson & MacLeod, 2003).

Several concepts are related to the interaction between emotion and attention (Yiend, 2010). One of these concepts is “selection,” a process that selects that part of the total visual input that is relevant for further processing. This selection process depends on different aspects, for instance the properties of the situation at hand and the goals and expectations of the individual (Yantis, 1996). An important step in the selection process is the allocation of attention to a certain location. Stimuli that are simultaneously present compete for processing space (Desimone & Duncan, 1995). The stimuli present in the attended location become amplified and are therefore more likely to be selected for further processing. Attention is allocated more rapidly to a location containing emotional stimuli than to a location containing neutral stimuli. Moreover, the disengagement from a location containing emotional stimuli is more difficult. As a result, emotional stimuli are selected for further processing more often than neutral stimuli (e.g., Bradley et al., 1997; Yiend, 2010). The prioritized selection of emotional stimuli is even more enhanced when arousal increases (Lee, Sakaki, Cheng, Velasco, & Mather, 2014; Mather & Sutherland, 2011). The question remains whether nonhuman primates and humans process emotional stimuli similarly and whether the same attentional processes underlie their behavior.

The dot-probe task

An experimental paradigm that is potentially suitable for comparing the attentional processes involved in emotion perception in humans and nonhuman animals is the dot-probe task, first described by MacLeod, Mathews and Tata (1986). This task is implicit, does not require instruction, and participants need no or minimal training to perform the test successfully (Parr et al., 2013). Nonhuman primates, even those unfamiliar with a computer screen, are capable of performing this task after just a few weeks of training (Kret et al., 2016). Hence, the dot-probe task is potentially suitable for testing children, patients with a mental disorder, nonhuman primates and several species of other animals. Comparing the performances of all these subgroups gives insight into the developmental course of emotion perception and enables cross-species comparisons. Eventually, such comparisons might provide essential insight into how emotion signals evolved as adaptations.

In the dot-probe task two stimuli are simultaneously displayed, each one on a different side of the screen. One or both of these stimuli have emotional value, for example, one represents a face with an angry expression and the other a face with a neutral expression. The presentation of these two stimuli is followed by an emotionally neutral task, which involves the detection or discrimination of a probe in the location of one of the stimuli. For example, a dot emerges either on the location of the angry face (the congruent condition) or on the location of the neutral face (the incongruent condition). The participants have to indicate by a button press whether the dot appeared on the right or the left side of the screen. If the participants’ attention was automatically directed to one side of the screen by one of the stimuli, in this case probably the angry face, then reaction times for detecting the dot at this location will be faster than reaction times for detecting the dot in the unattended location. Thus, the reaction times are a function of the interfering emotional expressions. To determine which stimulus attracted more attention, the average reaction time of congruent trials is most commonly subtracted from the average reaction time of incongruent trials, which provides the attentional bias score (MacLeod et al., 1986). A positive value indicates an attentional bias toward the emotional face, in this example an angry face. In contrast, a negative value indicates an attentional bias away from the emotional face as the reaction times were faster when the dot appeared at the location of the neutral face. This suggests avoidance of the emotional stimulus. A value of zero indicates that both stimuli received a similar amount of attention.
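The bias-score computation described above can be sketched in a few lines. This is a minimal illustration with hypothetical reaction times, not data from any cited study:

```python
def attention_bias_score(congruent_rts, incongruent_rts):
    """Attentional bias score (MacLeod et al., 1986): mean reaction time
    on incongruent trials (probe replaces the neutral stimulus) minus
    mean reaction time on congruent trials (probe replaces the emotional
    stimulus). A positive value indicates a bias toward the emotional
    stimulus; a negative value indicates avoidance; zero indicates
    equal attention to both stimuli."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(incongruent_rts) - mean(congruent_rts)

# Hypothetical reaction times in milliseconds
congruent = [412, 398, 430, 405]    # probe replaced the angry face
incongruent = [455, 440, 468, 450]  # probe replaced the neutral face

bias = attention_bias_score(congruent, incongruent)
print(bias)  # positive here: attention was drawn toward the angry face
```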

An attentional bias toward emotional stimuli might result from either faster orientation toward the emotional stimulus or difficulties with disengaging from the emotional stimulus. These two processes are both components of the attentional system (Posner & Petersen, 1990). Faster orientation toward emotional stimuli is called vigilance (Koster, Crombez, Verschuere, & de Houwer, 2004), which is often observed in individuals with high anxiety levels or individuals who suffer from hypersensitivity (Williams, Watts, MacLeod, & Mathews, 1997). Vigilance results in heightened sensitivity for negative information in clinical patients, which is associated with less efficient processing of information important for the ongoing behavior (Eysenck, 1992).

Recent research has suggested that the attentional bias toward emotional stimuli might not be related to vigilance, but is rather associated with difficulties in disengagement from the emotional stimuli (Koster et al., 2004). In this study, participants were not only presented with trials consisting of a neutral and emotional stimulus, but also with trials consisting of two neutral stimuli. The authors argued that it would be possible to disentangle the processes of vigilance and disengagement by comparing the reaction times on threat-neutral trials with neutral-neutral trials. Vigilance for threat should result in faster reaction times for congruent threat-neutral trials than for neutral-neutral trials. In contrast, difficulties with disengagement should result in slower reaction times for incongruent threat-neutral trials compared with neutral-neutral trials. Results showed that participants reacted slower on both congruent and incongruent threat-neutral trials than on neutral-neutral trials. This finding supports the view that an attentional bias toward threat is not necessarily a result of faster orientation toward emotional stimuli, but might also be related to difficulties disengaging from these stimuli. Future research might aim to investigate the exact attentional processes at play while performing the dot-probe task.
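Koster et al.’s logic for separating the two processes can be expressed as two simple contrasts against the neutral-neutral baseline. The sketch below uses hypothetical condition means, purely to make the direction of each contrast explicit:

```python
def vigilance_index(rt_congruent_threat_neutral, rt_neutral_neutral):
    """Vigilance: faster responding on congruent threat-neutral trials
    than on neutral-neutral baseline trials.
    Positive value -> faster orienting toward threat."""
    return rt_neutral_neutral - rt_congruent_threat_neutral

def disengagement_index(rt_incongruent_threat_neutral, rt_neutral_neutral):
    """Difficulty disengaging: slower responding on incongruent
    threat-neutral trials than on neutral-neutral baseline trials.
    Positive value -> slower disengagement from threat."""
    return rt_incongruent_threat_neutral - rt_neutral_neutral

# Hypothetical condition means (ms). If both indices are positive, both
# faster orienting and slower disengagement contribute to the bias.
print(vigilance_index(430, 440))      # 10: faster when probe replaced threat
print(disengagement_index(465, 440))  # 25: slower when probe replaced neutral
```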

Although the dot-probe task has been used frequently by different researchers and is considered to be a valid task to measure attentional biases, the results of dot-probe studies show inconsistencies and the underlying mechanisms are not well understood. The large variability in the experimental procedure and the populations used for testing make it difficult to compare the results and conduct a reliable meta-analysis. To date, analyses indicated that the intensity of the threatening stimuli determines the magnitude of the attentional bias (Frewen, Dozois, Joanisse, & Neufeld, 2008) and that the attentional bias seems to be dependent on stimulus presentation time (Bar-Haim, Lamy, Pergamin, Bakermans-Kranenburg, & van IJzendoorn, 2007). Because many research articles had to be left out of the previous meta-analyses due to the large methodological variability in the literature, we will here provide a full review of the literature on dot-probe tasks and will discuss related methodological issues.

Review of dot-probe studies

The main goal of this review is to examine whether the dot-probe task is suitable to be used in cross-species research, which would give more insight into the process of emotion perception and the interaction between attentional processes and emotion signals in different species. We found three studies in which the dot-probe task was performed with nonhuman primates (King et al., 2012; Kret et al., 2016; Parr et al., 2013). These studies showed that both rhesus monkeys (King et al., 2012) and bonobos (Kret et al., 2016) were significantly faster in reacting to a dot replacing an emotional picture than to a dot replacing a neutral picture. The study by Kret et al. (2016) also described an enhanced effect for the photographs that the keepers rated as particularly emotionally intense. In addition, oxytocin administration reduced rhesus monkeys’ attention to negative facial expressions (Parr et al., 2013) in similar ways as in humans (Kim et al., 2014). However, humans showed an initial bias toward negative facial expressions, which turned into avoidance after the administration of oxytocin. Rhesus monkeys, on the other hand, did not show an initial bias toward negative facial expressions but started to avoid these expressions after administration of oxytocin.

The performance of humans and rhesus monkeys on the dot-probe task was directly compared by Lacreuse and colleagues (2013). Humans showed an attentional bias toward negative human faces, yet they avoided negatively valenced objects. Monkeys also showed an attentional bias toward negative monkey faces, but did not show a bias for objects. Thus, for social stimuli humans and monkeys show a similar pattern in attentional biases. These findings support the hypothesis that over evolution facial expressions became important emotion signals. One limitation of this study is that humans performed the dot-probe task by reacting to the probe with two keys on a keyboard, whereas the monkeys reacted directly to the probe by touching it on a touch screen. It is advisable to use the exact same device when comparing two groups. Responses via touch screens can differ from those via button boxes because two processes are at play. First, heightened vigilance for the location where a threatening stimulus was presented may lead to faster touching of this location. In addition, the threatening stimulus evokes an action tendency to push the stimulus away. This action tendency might interfere with the movement of the arm to touch the screen, a process that is not at play when pressing a button. Therefore, measures with touch screens and button boxes are not directly comparable. Yet, a recent study indicated that attentional biases in the dot-probe task are found regardless of which response device is used (Kret, Muramatsu, & Matsuzawa, 2016).

Another limitation is the small sample size in nonhuman primate studies. Small samples need not be prohibitive, but it is wise to modify the procedure in order to obtain statistically reliable results. For example, it is recommended to increase the number of trials per individual, preferably spread out over different sessions to avoid habituation or fatigue, and to also analyze the data within subjects (is the effect found in all subjects, or is it driven by one?). A specific recommendation for the application of the dot-probe task in samples with a small N is to step away from the common procedure of calculating a bias score by subtracting conditions, and instead not average any data points but nest all trials within each subject via a multilevel statistical procedure. That way, reaction times can be analyzed in two separate analyses: (1) as a function of the picture that appeared before on the location of the probe, and (2) as a function of the picture that appeared before on the other location, the location opposite the probe.
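The difference between the classic bias score and a per-subject, trial-level check can be illustrated with a minimal sketch. All records, subject labels, and reaction times below are hypothetical, and the column names are our own invention; the point is only the structure of the computation.

```python
import statistics

# Hypothetical per-trial records for a small-N design: each trial stores
# the subject id, which cue type the probe replaced, and the reaction time (ms).
trials = [
    {"subject": "A", "probe_replaces": "emotional", "rt": 300},
    {"subject": "A", "probe_replaces": "emotional", "rt": 310},
    {"subject": "A", "probe_replaces": "neutral",   "rt": 340},
    {"subject": "A", "probe_replaces": "neutral",   "rt": 350},
    {"subject": "B", "probe_replaces": "emotional", "rt": 420},
    {"subject": "B", "probe_replaces": "emotional", "rt": 400},
    {"subject": "B", "probe_replaces": "neutral",   "rt": 430},
    {"subject": "B", "probe_replaces": "neutral",   "rt": 450},
]

def bias_score(trials, subject):
    """Classic difference score: mean RT(probe replaced neutral picture)
    minus mean RT(probe replaced emotional picture). Positive values mean
    faster responses on emotional-congruent trials, i.e., an attentional
    bias toward the emotional stimuli."""
    def rts(cond):
        return [t["rt"] for t in trials
                if t["subject"] == subject and t["probe_replaces"] == cond]
    return statistics.mean(rts("neutral")) - statistics.mean(rts("emotional"))

# Per-subject check recommended for small samples: is the effect present
# in every individual, or driven by a single animal?
per_subject = {s: bias_score(trials, s)
               for s in sorted({t["subject"] for t in trials})}
print(per_subject)
```

With more data, one would typically skip the averaging step altogether and fit a multilevel model on the raw trials (for instance, reaction time predicted by cue condition with random intercepts per subject, as offered by statsmodels’ MixedLM), which is what the nesting recommendation above amounts to.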

Because of the limited number of dot-probe studies with nonhuman primates and their limitations, we decided to include a review of studies with humans, to get a better understanding of the task and to evaluate whether it is a suitable measure for comparative studies. Because nonhuman primates are nonverbal, the studies discussed in this section are studies that conducted the visual dot-probe task, using images as stimuli instead of words.

Dot-probe studies were collected by searching PubMed, using the terms “dot-probe,” “attention” and “emotion” as search criteria. The search was limited to studies published in English from January 1986 through January 2016. Appendix Table 1 provides a table of the 71 dot-probe studies reviewed, giving an overview of the parameters used and the results obtained with the visual dot-probe task. To enable good comparisons, studies were only included when attentional bias scores were calculated and analyzed. Moreover, studies that only reported group comparisons were excluded, as these give no information about the presence of attentional biases within the groups. In the remainder of this section, we will discuss several topics that should be taken into consideration when performing dot-probe studies.

Subjects

It has been suggested that attentional biases are highly influenced by personal characteristics (e.g., Pérez-Edgar, Bar-Haim, McDermott, Gorodetsky, et al., 2010; Wilson & MacLeod, 2003). Specific emotional and personal variables may determine whether an individual shows an attentional bias toward or away from emotional stimuli. It is still unclear how these personal characteristics influence performance on the dot-probe task (Mogg, Bradley, et al., 2000). In the next sections, we will discuss how age, gender, level of anxiety and genotype might influence performance on the dot-probe task.

Age

It has been hypothesized that attention to emotional stimuli changes with age (Mather & Carstensen, 2003). A meta-analysis of emotion processing in younger and older adults showed only a limited effect of age on emotional processing (Murphy & Isaacowitz, 2008). Yet, this meta-analysis comprised several experimental procedures, not only dot-probe studies. A study using the dot-probe task showed that older people show higher vigilance for happy faces than younger people (Lindstrom et al., 2009). As reviewed by Nashiro and colleagues (2011), this shift in attention from negative to positive information with age is accompanied by reduced amygdala activation and enhanced prefrontal cortex activation in response to aversive stimuli.

Mather and Carstensen (2003) conducted the dot-probe task with a younger age group (mean age of 25.4 years) and an older age group (mean age of 71.5 years). In their study, happy, sad and angry faces were used as the emotional stimuli, each paired with a neutral face. The younger group did not show differences in reaction times for neutral and emotional faces. In contrast, the older group showed an attentional bias toward happy faces and an attentional bias away from negative faces. This indicates that in older adults attention is typically drawn toward positive information, whereas negative information is avoided. This avoidance of negative information in the older age group was found again when the authors replicated their study, yet in this second experiment the participants did not show an attentional bias toward happy faces.

A study by Isaacowitz and colleagues (2006) did not find an attentional bias away from negative faces in an older age group, yet a study by Orgeta (2011) did. A more recent study found a bias toward negative faces when they were paired with neutral faces, yet this bias disappeared when the negative faces were paired with positive faces (Tomaszczyk & Fernandes, 2014). There are several possible explanations for the discrepancy in the results. First, all three studies that revealed an attentional bias used photographs as stimulus material, whereas Isaacowitz and colleagues used schematic faces. Second, Isaacowitz and colleagues used sad faces as negative stimuli, whereas the other studies used a combination of sad and angry faces (Mather & Carstensen, 2003) or just angry faces (Orgeta, 2011; Tomaszczyk & Fernandes, 2014). Lastly, the stimulus presentation time differed between the studies. In Isaacowitz’s study the face pair was presented for 2,000 ms, as compared to 1,000 ms in the other studies. The longer presentation time might have caused slower and more variable reaction times for pressing the button, as attention might drift easily and multiple saccades can be made during this 2,000-ms time period. Isaacowitz et al. (2006) only reported bias scores, not the underlying reaction times, so it is not possible to verify this presumption. The possible influences of these different parameters on results of the dot-probe task will be discussed in later sections.

Performance on the dot-probe task has also been investigated in children and adolescents. In these younger populations, attentional biases for emotional stimuli are usually not found (Heim-Dreger, Kohlmann, Eschenbeck, & Burkhardt, 2006; Pérez-Edgar et al., 2011; Roy et al., 2008; Susa, Pitică, Benga, & Miclea, 2012; Waters, Kokkoris, Mogg, Bradley, & Pine, 2010; Waters, Mogg, Bradley, & Pine, 2008), although a recent study found avoidance of negative faces and vigilance for positive faces in 8-year-old children (Brown et al., 2013). Brown et al. (2013) used a combination of facial expressions as negative stimuli whereas the other studies only used angry expressions, which might explain the different results. Moreover, in the studies that showed no bias, the children had to press a button the moment they perceived the probe, whereas in Brown’s study the children had to determine whether the probe was a triangle or a square. The second method requires more cognitive load and might therefore measure a different underlying process.

On the basis of the studies named here, it seems that, in general, children do not yet show an attentional bias. However, whether children show an attentional bias seems to be highly dependent on the environment they grow up in. For instance, maternal emotional disorder influences attentional biases, especially in girls (Kujawa et al., 2011; Montagner et al., 2016). In addition, children who grew up in institutional care tend to show an attentional bias toward angry faces (Troller-Renfree, McDermott, Nelson, Zeanah, & Fox, 2015). This bias is not found in children who were placed in foster care. Moreover, these children in foster care do show an attentional bias toward happy faces, which was not found in institutional care children. Last, neighborhood crime is related to attentional biases in children. McCoy and colleagues (2016) showed that an increase in neighborhood crime was associated with a faster response to negative stimuli.

From an evolutionary point of view, it is most adaptive to attend to relevant stimuli in the environment, whether these are threats that one can fight or flee from, or a friendly smile from a potential caregiver. It may be far-fetched, and more research is surely needed to support this hypothesis, but given that the results seem to indicate that children and the elderly have at least no strong bias toward threats but, in contrast, are more attuned to positive cues, it is possible that attending to positive cues has greater survival benefits for the more vulnerable among us who are dependent on care.

Gender

Most dot-probe studies were conducted with psychology students, which results in a subject pool of primarily young women. However, it is possible that men and women perform differently on the dot-probe task as a result of differences in emotional processing and attention. For instance, men and women might show subtle differences in brain activation patterns in response to the observation of emotional stimuli (Killgore & Yurgelun-Todd, 2001; Kret et al., 2011; for a review of sex differences in processing emotions, see Kret & de Gelder, 2012).

Dot-probe studies with adults indicate that gender might play a role in attentional biases. In a study performed by Tran and colleagues (2013) women showed an attentional bias toward angry faces, yet men did not express such a bias. Moreover, women showed vigilance for happy faces, whereas men showed avoidance of happy faces. In another dot-probe study a gender difference was not found on the behavioral level, yet women expressed an enhanced P1 amplitude in response to the probe as compared to men, in particular for happy faces (Pfabigan, Lamplmayr-Kragl, Pintzinger, Sailer, & Tran, 2014).

In dot-probe studies with children, gender effects are also observed (Hankin, Gibb, Abela, & Flory, 2010; Waters, Lipp, & Spence, 2004). The role of the mother seems to be of particular interest in this gender difference. Daughters of depressed mothers showed high vigilance for sad faces, yet boys did not express such a bias (Kujawa et al., 2011). Moreover, daughters of mothers with an emotional disorder showed increased attention toward threat, relative to daughters of healthy mothers. For boys, heightened attention toward threat was only found if their mother had a non-comorbid mood disorder specifically (Montagner et al., 2016). These studies indicate that caution is needed when interpreting data from subject groups with both genders and demonstrate that groups should at least be carefully matched.

Testosterone level

Testosterone level in both men and women influences selective attention to angry faces (van Honk et al., 1999; Wirth & Schultheiss, 2007). King and colleagues (2012) conducted a dot-probe task to investigate the effects of testosterone administration on attentional biases in rhesus monkeys. The rhesus monkeys were first trained on the dot-probe paradigm by making use of a touch screen procedure. The monkeys had to touch the dot on the screen to receive a food reward. During the actual task, pictures of objects with an emotional value and facial expressions of rhesus monkeys were shown. At baseline, monkeys were significantly faster in reacting to a dot replacing a negative face than to a dot replacing a neutral face. Differences in reaction times for positive faces and objects were not significant. When treated with testosterone, the monkeys showed an overall decrease in reaction times. Furthermore, they showed an attentional bias away from negative objects and an attentional bias toward positive faces. Thus, the administration of testosterone resulted in opposite attentional biases for threat as compared to the baseline. However, monkeys treated with a placebo showed the same changes in attentional biases. Therefore, it may be that repeatedly conducting the test—that is, 4 days a week for 4 months—caused these changes in attentional biases and not the administration of testosterone.

Anxiety and depression

Anxious and nonanxious individuals show differences in brain activation in response to emotional stimuli. EEG research indicated that anxious individuals show increased brain responses for emotional faces, irrespective of the emotional expression (Rossignol, Campanella, Bissot, & Philippot, 2013). Moreover, anxious individuals show abnormal brain activity in hippocampal areas during the disengagement from threat (Price et al., 2014). These differences in brain activation might co-occur with differences in performance on the dot-probe task.

It has been hypothesized that healthy people have an attentional threshold for emotional stimuli (MacLeod et al., 1986). Only highly threatening stimuli attract attention, whereas mildly threatening stimuli are ignored. People with an anxiety disorder might lack this threshold, such that mildly threatening stimuli are also attended to. A dot-probe study indicated that both anxious and nonanxious individuals avoid mildly threatening stimuli and show vigilance for highly threatening stimuli. Yet, anxious individuals shift earlier from avoidance to vigilance than nonanxious individuals as threat increases (Wilson & MacLeod, 2003). Thus, it is important to take the threat level of the stimuli into account when conducting a dot-probe task with a mix of anxious and nonanxious individuals. Epidemiological studies have revealed that mental disorders, including anxiety and depression, are highly comorbid; almost half of the people with a mental disorder also meet the criteria for another disorder. Emotion-processing deficits have been reported in different disorders and result in difficulties with regulating emotions and, at the perceptual level, in attentional biases and impaired recognition of emotional expressions. Disrupted emotion processing has therefore recently been proposed as a liability spectrum that underlies many different mental disorders (Kret & Ploeger, 2015). It is hard to disentangle effects on the dot-probe task that specifically relate to anxiety from those that could also be caused by accompanying depression. Indeed, a meta-analysis examining emotional-Stroop and dot-probe results supports the existence of biased attention to negative information in depression, as well (Peckham, McHugh, & Otto, 2010). In order to investigate disorder-specific effects, studies should include multiple patient groups and compare these with one another, and should also focus on individual differences in the healthy population (Kret & Ploeger, 2015).

In nonhuman primates, it is hard to determine whether individuals differ in their anxiety levels, or in any other personal traits, yet there are some possible ways to assess anxiety in nonhuman primates (for a review, see Coleman & Pierre, 2014). Examples are provoked-response tests, associative conditioning and startle-response tests, and cognitive bias tests. It is possible to perform one of these tests to assess anxiety in nonhuman subjects, but this is not preferable, as it might induce unnecessary stress. Instead, the level of anxiety can be derived from observations in the animals’ normal group setting. Keepers can also estimate which individuals might have higher levels of anxiety, and the results of these animals should receive extra attention during analysis, to check whether they deviate significantly from the results of the other individuals.

Serotonin transporter gene

Research has also looked into a potential genetic basis for individual differences in attentional biases, with a focus on a polymorphism in the promoter region of the serotonin transporter gene (5-HTTLPR). Serotonin has an important function in the brain and is associated with depression and anxiety disorders (Hariri & Holmes, 2006). Individuals who carry a short allele of 5-HTTLPR show higher amygdala activity when observing threatening faces than individuals with two long alleles (for a review, see Hariri & Holmes, 2006). This suggests that these individuals might also show differences in performance on the dot-probe task. Dot-probe studies showed vigilance for negative stimuli in short allele carriers (Carlson, Mujica-Parodi, Harmon-Jones, & Hajcak, 2012), whereas homozygous long allele carriers showed avoidance of negative stimuli (Carlson et al., 2012; Fox, Ridgewell, & Ashwin, 2009). Moreover, homozygous long allele carriers allocate attention toward positive stimuli (Fox et al., 2009). Overall, there seems to be a linear relationship between 5-HTTLPR genotype and attentional bias for emotional faces. Vigilance for angry faces decreases as a function of the number of long alleles, whereas vigilance for happy faces increases (Pérez-Edgar, Bar-Haim, McDermott, Gorodetsky, et al., 2010).

In addition, the interaction between genotype and stimulus presentation time seems an important moderator of attentional biases. For example, short allele carriers only showed vigilance for spider pictures at long stimulus presentation (Osinsky et al., 2008). Short allele carriers attended toward fearful faces only at short stimulus presentation whereas homozygous long allele carriers expressed a bias toward angry faces only at long stimulus presentation (Thomason et al., 2010).

Types of stimuli

The dot-probe task can be conducted with many different kinds of stimuli. Often, human faces are used, yet other studies used pictures of objects or scenes instead of faces. Furthermore, the valence of the used facial expressions varied between studies. Sometimes fearful expressions were used, whereas other studies focused on anger or sadness. Do these differences in stimuli have an effect on the performance on the dot-probe task?

Emotional intensity and valence

In the section about anxiety, we already mentioned that the threat level of the stimuli is important to take into consideration. A dot-probe study showed that, overall, reaction times slowed down linearly as the amount of threat increased (Koster et al., 2004). Thus, threatening information seems to exert a task-interference effect. At the same time, the attentional bias toward threat increases with increasing threat level for both threatening faces (Wilson & MacLeod, 2003) and threatening scenes (Koster, Crombez, Verschuere, & de Houwer, 2006; Koster et al., 2005; Mogg, McNamara, et al., 2000). Thus, participants’ overall reaction times slow down, but their bias toward threatening stimuli increases.

It is therefore important to take emotional intensity into account when making group comparisons. In comparative studies, the stimuli used should have similar intensity levels and evoke similar levels of arousal. In studies with humans, the participants can rate the stimuli on valence and arousal, such that studies with similarly rated stimuli can be compared. In addition, arousal in humans can be measured via psychophysiological measures such as skin conductance or heart rate. Applying most of these methods in animals first requires training such that they accept the invasive measurement apparatus. Other, noninvasive, methods may provide better alternatives. Keepers can be asked to judge how emotional or intense they think their animals would perceive the stimuli to be, as was done in the recent study in bonobos, in which the emotional intensity of the stimuli was assessed by the keepers (Kret et al., 2016). The researchers found a positive correlation between the keepers’ arousal ratings and the attentional bias of the bonobos. In other words, pictures that the keepers indicated would evoke arousal in bonobos indeed attracted the bonobos’ attention more than pictures that the keepers rated low on arousal. Yet, what needs to be taken into account in these implicit tasks is that animal subjects may not perceive the images in the same way humans do. How keepers rate the images might not reliably reflect how the animals would rate the images if they were able to, although a correlation between ratings and behavior suggests that the ratings are trustworthy. A more direct noninvasive option is to determine the level of arousal in humans and nonhuman primates through psychophysiological measures. In humans and monkeys alike, decreases in skin temperature indicate negative arousal (Parr, 2001). A relatively new method allows the noninvasive measurement of facial skin temperature with thermal cameras.
This technique has already been successfully applied in emotion perception studies with both humans (e.g., Nhan & Chau, 2010) and macaque monkeys (Kuraoka & Nakamura, 2011), and recently with chimpanzees as well (Kano, Hirata, Deschner, Behringer, & Call, 2016). It is a promising technique from which even heart rate can be derived, yet it is expensive and sensitive to general activity such as walking and eating, even though this is to some extent controllable in an experimental setting (Kano et al., 2016).
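The keeper-rating validation described above amounts to correlating per-stimulus arousal ratings with the attentional bias each stimulus evoked. A minimal sketch follows; the rating scale, numbers, and variable names are hypothetical and for illustration only, not the data of Kret et al. (2016).

```python
import math

# Hypothetical per-stimulus data: keeper arousal ratings (1-5 scale) and the
# attentional bias each picture evoked (ms; positive = attention drawn to it).
ratings = [1.0, 2.0, 3.0, 4.0, 5.0]
bias    = [2.0, 8.0, 11.0, 16.0, 23.0]

def pearson_r(x, y):
    """Plain Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# A strongly positive r would indicate that pictures rated as more arousing
# by the keepers also attracted the animals' attention more.
r = pearson_r(ratings, bias)
print(round(r, 3))
```

With real data one would also report a significance test and, given the small stimulus sets typical of animal work, inspect the scatter for single influential stimuli rather than rely on r alone.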

In addition, measuring pupil dilation can be a noninvasive technique to determine arousal. Pupillary changes following a light reflex are larger when viewing emotionally arousing pictures. These changes covaried with changes in skin conductance, which indicates that the sympathetic nervous system plays a role in these pupillary changes (Bradley, Miccoli, Escrig, & Lang, 2008). The measurement of pupil dilation has already been successfully used in noncommunicative humans (e.g., Al-Omar, Al-Wabil, & Fawzi, 2013) and in nonhuman primate research (e.g., Kret, Tomonaga, & Matsuzawa, 2014; Machado, Bliss-Moreau, Platt, & Amaral, 2011). For instance, in rhesus macaques, subject-directed social cues and nonsocial nature documentary footage generated larger pupil diameters than videos of conspecifics engaging in naturalistic social interactions (Machado et al., 2011). The larger pupil diameter indicates heightened sympathetic arousal. Because the measurement of pupil dilation and thermal imaging are both applicable to humans and nonhuman primates, they might be suitable as more direct measures of the arousal evoked by the stimuli.

To behaviorally check whether subjects are aware of the valence of the stimuli, an additional matching-to-sample task can be administered, independent of the dot-probe task. In this way, researchers can check whether the subjects match the stimuli on valence as the researchers expect. Yet, not all animals can perform such a task. With these animals, an independent, additional passive-viewing task can be performed next to the dot-probe task, while noninvasively measuring arousal. In this way, it can be checked which stimuli rate high on arousal and valence, as indicated by increases in heart rate and pupil size.

Differences in stimuli

Stimuli of different conditions might differ not only in their emotional valence, but also at other, lower levels. The colors present in the stimuli might for instance play a role. The color red captures and holds attention in both positive and negative nonhuman stimuli, but not in neutral stimuli. In one study, this resulted in motor responses to the target stimuli being affected by attention lingering over the position where a red cue was flashed (Kuniecki, Pilarczyk, & Wichary, 2015). The same process might influence studies with facial stimuli. In happy faces the (white) teeth are more visible than in most negative facial expressions. The white color of the teeth against a darker background might capture attention. The same holds for the visible eye whites in expressions of fear or surprise. It is therefore important to build in several control conditions, such as including bodily expressions (de Valk, Wijnen, & Kret, 2015) or scrambled or inverted faces (Fox, 2002), and to use a large stimulus set.

In addition, stimuli within a condition might differ in the response they evoke. Several dot-probe studies used a combination of facial expressions as negative stimuli, such as sadness and anger (Mather & Carstensen, 2003) or sadness, anger, fear and disgust (Brown et al., 2013; Mansell, Clark, Ehlers, & Chen, 1999). No distinction was made between these different facial expressions in the analysis. However, responses to these expressions in the dot-probe task might differ if these expressions evoke different action tendencies.

Previous studies looking into approach tendencies using joystick paradigms indicated that people are faster in pushing angry faces away than in pulling the faces toward them (Marsh et al., 2005; Roelofs et al., 2005; Seidel et al., 2010), whereas the opposite effect was found for fearful faces (Marsh et al., 2005). Moreover, in a touchscreen task, people are faster to touch angry stimuli (both faces and bodies) than neutral or fearful stimuli (de Valk et al., 2015). Fearful and angry expressions therefore seem to evoke different action tendencies, which might also affect people’s behavior on the dot-probe task. A study by Cooper et al. (2011) indicated that adding angry faces to a stimulus set of neutral, happy and fearful faces resulted in opposite attentional biases. Caution is therefore needed when using both fearful and angry faces as negative stimuli. If behavior in reaction to these faces is pooled, this might result in averaging out effects.

Additionally, gaze direction might be a complicating factor in the combination of fearful and angry facial expressions. The findings of action tendencies are based on faces with direct gaze. Direct gaze is an important signal in angry faces, because the anger is then directed to the observer. Yet, averted gaze might be more important in fearful faces because this indicates threat in the environment (Adams, Gordon, Baird, Ambady, & Kleck, 2003). This is supported by the finding that participants were faster to categorize angry faces with a direct gaze than to categorize angry faces with an averted gaze. In contrast, fearful faces were categorized more rapidly with an averted gaze (Adams & Kleck, 2003). Thus, when angry and fearful faces are both used as negative stimuli it might be useful to include angry faces with a direct gaze and fearful faces with an averted gaze to make the task ecologically more valid.

Human versus nonhuman

Threat is not only signaled by facial expressions. The human body as a whole is also an important messenger of the emotional state of a person (de Gelder, Van den Stock, Meeren, Sinke, Kret & Tamietto, 2010). For instance, increased levels of arousal were detected when anger was expressed by both the body and the face simultaneously (Kret, Roelofs, Stekelenburg, & de Gelder, 2013). Furthermore, emotion recognition is supported by both facial and bodily expressions as reaction times for categorizing emotional expressions increased when face and body expressed different emotions instead of the same emotion (Kret, Stekelenburg, Roelofs, & de Gelder, 2013; Shields, Engelhardt, & Ietswaart, 2012).

Facial and bodily emotional expressions also seem to have similar effects on attentional processes. Angry and fearful stimuli attract more attention than happy stimuli for both facial and bodily expressions (Kret, Stekelenburg, et al., 2013). In addition, facial and bodily expressions elicit similar approach biases (de Valk et al., 2015). Therefore, it would be interesting to examine whether facial and bodily expressions elicit similar attentional biases in the dot-probe task. Including bodily expressions as stimuli might add ecological validity to the task, as in real life faces are always encountered in the presence of a body. To our knowledge, no such study has been performed yet. Especially when testing nonhuman primates, who have less experience with observing photographs or symbols, it might be worthwhile to keep the stimulus material as naturalistic as possible.

Apart from human stimuli, nonhuman stimuli like predators or weapons might also attract attention (Carlson, Fee, & Reinke, 2009). For instance, in a search task, fear-relevant pictures of snakes and spiders were detected more rapidly among fear-irrelevant pictures of flowers and mushrooms than the other way around (Öhman, Flykt, & Esteves, 2001). A similar result was found for angry and neutral faces (Öhman, Lundqvist, & Esteves, 2001), which suggests that threatening facial stimuli and threatening nonfacial stimuli elicit similar attentional biases.

In anxious individuals, a general vigilance for threatening stimuli was found, irrespective of the type of stimuli. Studies that used facial stimuli found that anxious people show an attentional bias toward emotional faces (e.g., Bradley, Mogg, Falla, & Hamilton, 1998; Ioannou, Mogg, & Bradley, 2004; Mogg, Philippot, & Bradley, 2004). Similar results were found in studies that used spider stimuli (e.g., Mogg & Bradley, 2006). The results of dot-probe studies conducted with nonanxious individuals are less consistent. Studies with nonfacial stimuli mostly did not find a bias for threat stimuli (e.g., Mogg & Bradley, 2006; Mogg, Bradley, et al., 2004; Lees et al., 2005). In contrast, some studies with facial stimuli found a bias toward threat (e.g., Tomaszczyk & Fernandes, 2014), some studies found a bias away from threat (e.g., Bradley et al., 1997) and yet other studies found no bias (e.g., Bradley et al., 1998). This inconsistency in results could be related to one or more parameters discussed in this review.

A complicating factor in comparing studies with human and nonhuman stimuli is the high individual variability in attentional biases for threatening stimuli. A dot-probe study with a stimulus set of pictures of angry faces, attacking dogs, attacking snakes, pointed weapons and violent scenes showed that 34% of the participants showed a general bias toward threat and 20.8% showed a general bias away from threat. However, 34% of the participants showed a bias toward some categories and away from other categories (Zvielli, Bernstein, & Koster, 2014). Whether the same pattern in individual differences can be found for biases for different negative faces needs to be investigated.

In sum, it is difficult to ascertain whether threatening human and threatening nonhuman stimuli evoke similar attentional processes. Future studies with both human and nonhuman stimuli might provide a clearer view on the attentional biases evoked by these stimuli. To our knowledge, only one study included both types of stimuli. This study showed that subjects displayed an attentional bias toward threatening faces and an attentional bias away from negative nonfacial stimuli (Lacreuse et al., 2013). This result suggests that facial and nonfacial stimuli have opposite effects on attentional biases. In order to gain evolutionary insights, it might also be worthwhile to compare non-human biological threats such as snakes with non-human non-biological threats such as guns. More studies should be performed to draw firmer conclusions.

Procedure

Not only do stimulus types differ across dot-probe studies, there are also many differences in experimental procedures. The possible influences of several of these differences will be discussed in the following section.

Stimulus presentation duration

The majority of dot-probe studies have used a stimulus presentation duration of 500 ms. A problem with a stimulus presentation duration of 500 ms might be that attention has already shifted between the stimuli during this time frame (as a saccade can be made within 200 ms). Thus, responses given after a stimulus presentation of 500 ms do not necessarily provide information about initial orientation (Bradley, Mogg, & Millar, 2000). To avoid this, some dot-probe studies used a shorter or even subliminal stimulus presentation. A meta-analysis showed that the effect size of subliminal presentation was twice as large as that of supraliminal presentation in data of anxious individuals (Bar-Haim et al., 2007). A more recent meta-analysis of 28 masked visual probe experiments (Hedger, Gray, Garner, & Adams, 2016) only found a small effect of threat bias, comparable to the effect of supraliminal presentation in the review of Bar-Haim et al. (2007). They found that effects were substantially larger when the SOA between target and mask was >30 ms, a timing at which the majority of subjects can detect the stimulus (Pessoa, Japee, Sturman, & Ungerleider, 2006).

Direct comparisons between subliminal and supraliminal stimulus presentations revealed that, for fearful facial expressions, presentation duration does not affect attentional biases. Fox (2002) compared a stimulus presentation of 500 ms with one of 17 ms, after which the stimuli were masked with a scrambled face. Participants showed similar attentional biases for fear, irrespective of presentation time. Another study found no main effect of presentation time on responses to happy and sad facial expressions, but presentation time did affect attentional biases for angry faces in an older age group (Orgeta, 2011). Older subjects showed an attentional bias away from angry faces at a stimulus presentation of 1,000 ms, whereas presentations of 17 ms and 500 ms revealed no attentional bias; this effect was absent in the younger age group. Thus, subliminal presentation and a presentation of 500 ms do not seem to result in different attentional biases for facial stimuli. For threatening nonfacial stimuli, the results are less consistent. Some studies found no difference in attentional biases between presentations of 100 ms and 500 ms (Koster et al., 2005; Mingtian, Xiongzhao, Jinyao, Shuqiao, & Atchley, 2011). Another study found a bias toward threat at a presentation time of 500 ms, yet no bias for subliminal presentation of the stimulus material (Putman, 2011).

Presentation times longer than 500 ms have also been used in dot-probe studies. It is hypothesized that people, and anxious individuals in particular, initially attend to a threatening stimulus but then reallocate their attention to avoid the threat (e.g., Mathews, 1990). This hypothesis was confirmed by several dot-probe studies that directly compared participants’ attentional biases when stimuli were presented for either 500 ms or a longer period (i.e., 1,250, 1,500, or 2,000 ms). Attention allocation in nonanxious individuals is not affected by stimulus presentation time for either threatening facial stimuli (e.g., Bradley et al., 1998; Ioannou et al., 2004; Mogg, Bradley, et al., 2004; Pérez-Edgar, Bar-Haim, McDermott, Gorodetsky, et al., 2010) or threatening nonfacial stimuli (Lees et al., 2005; Mogg & Bradley, 2006; Mogg, Philippot, & Bradley, 2004). Anxious individuals initially show an attentional bias toward threat, yet when stimulus presentation time increased this bias disappeared or turned into avoidance of threat (Ioannou et al., 2004; Koster et al., 2005; Lees et al., 2005; Mogg, Bradley, et al., 2004; Mogg, Philippot, & Bradley, 2004).

However, an alternative explanation for these findings might be that individuals make more eye movements with longer stimulus presentation durations (Stevens, Rist, & Gerlach, 2011). Research indicates that eye movements can influence the findings of a dot-probe task (Petrova, Wentura, & Bermeitinger, 2013). In that study, attentional biases were calculated separately for trials in which subjects did and did not make an eye movement. An attentional bias toward angry faces was found for trials without eye movements, yet not for trials with eye movements or across all trials. An attentional bias toward angry faces across all trials was found only when participants were instructed not to make eye movements. Whether the observation that anxious individuals show vigilance for threat at short stimulus presentations and avoidance of threat at longer presentations is due to eye movements should be investigated further.

It would therefore be useful to limit the stimulus presentation to a duration within which saccades are usually not yet made. Petrova et al. (2013) suggested that a stimulus onset asynchrony (the time between the onset of the stimulus and the onset of the probe) of less than 300 ms is sufficient to prevent eye movements. A stimulus presentation of 200–300 ms would thus be ideal for comparative studies.
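These timing constraints can be made concrete in a small sketch. This is a minimal illustration with hypothetical parameter names (not taken from any published implementation); only the 300-ms SOA criterion comes from Petrova et al. (2013).

```python
from dataclasses import dataclass

@dataclass
class DotProbeTiming:
    """Timing parameters (in ms) for one dot-probe trial; names are illustrative."""
    fixation_ms: int = 500    # fixation cross shown before the stimulus pair
    stimulus_ms: int = 250    # duration of the stimulus-pair presentation
    probe_delay_ms: int = 0   # gap between stimulus offset and probe onset

    @property
    def soa_ms(self) -> int:
        # Stimulus onset asynchrony: from stimulus onset to probe onset.
        return self.stimulus_ms + self.probe_delay_ms

    def prevents_saccades(self) -> bool:
        # Petrova et al. (2013): an SOA below 300 ms should prevent eye movements.
        return self.soa_ms < 300

# A 250-ms presentation with an immediate probe satisfies the criterion,
# whereas the common 500-ms presentation does not.
timing = DotProbeTiming(stimulus_ms=250)
assert timing.prevents_saccades()
```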

Probe detection versus probe discrimination

Most dot-probe studies measure probe detection times, meaning that participants have to respond to the appearance of the probe as quickly as possible. Other studies instead measure probe discrimination times, meaning that participants respond to the identity of the probe, which can be based on, for instance, its shape or color. An advantage of probe discrimination is that participants are encouraged to monitor both sides of the screen. Yet probe discrimination imposes a higher cognitive load than probe detection because of the arbitrary relationship between stimulus and response. The findings of the two versions of the dot-probe task might therefore differ.

Both versions were directly compared in two studies that differed only in the required response to the probe. These studies measured attentional biases for threatening and happy facial expressions. In the first study, subjects had to indicate whether the probe appeared on the left or the right side of the screen (Mogg & Bradley, 1999). In the other study, subjects had to indicate whether the probe consisted of two vertically oriented dots (:) or two horizontally oriented dots (..) (Bradley et al., 1998). Both studies found an attentional bias toward threatening faces in anxious individuals and no bias in nonanxious individuals. The attentional biases for happy faces were less consistent. Nonetheless, the two versions seem to have similar sensitivity in measuring attentional biases for threatening facial expressions. However, response times in the discrimination task were longer and more variable, and error rates were higher. This may be a particular disadvantage when testing clinical groups, children, and nonhuman animals.

These findings indicate that the two versions of the dot-probe task are only partially comparable and might therefore measure different attentional processes. The detection task might tap low-level attentional processes, whereas the discrimination task measures a construct in which higher cognitive processes play a role. For research aiming at automatic emotion perception, the probe detection task might be the best version, as it prevents higher cognitive processes from intervening.

Emotional–neutral pairs versus emotional–emotional pairs

In dot-probe studies it is common to present an emotionally valenced stimulus next to a neutral stimulus, so that it can be examined whether the emotionally valenced stimulus is attended to or avoided. However, very few studies have paired two emotional stimuli to investigate whether one type of emotional expression is attended to or avoided more than another.

To our knowledge, three studies have included angry–happy face pairs in the task and analyzed the response times on these trials. Two of these studies found no attentional bias for angry–neutral, happy–neutral, or angry–happy face pairs (Heim-Dreger et al., 2006; Pineles & Mineka, 2005): participants responded to the probes equally fast, irrespective of the presented face pair. In contrast, the third study revealed an attentional bias toward angry faces when they were paired with either a neutral or a happy face (Tomaszczyk & Fernandes, 2014), indicating vigilance for angry faces irrespective of the other face in the pair.

Unfortunately, to date no dot-probe study has investigated attentional biases for face pairs consisting of two negatively valenced faces, for instance an angry facial expression paired with a fearful one. A study by de Valk and colleagues (2015) suggested that these negative emotions can indeed evoke different responses. In that study, participants’ movements toward emotional stimuli were investigated: participants responded faster to angry stimuli than to neutral stimuli, yet movements to fearful stimuli did not differ from those to neutral stimuli. This held for both facial and bodily stimuli.

In addition to pairs of faces with the same valence, it might also be recommended to include pairs of faces with the same emotional expression. Neutral–neutral pairs are often used as a baseline, but same-emotion pairs can serve as a baseline as well. In combination with same-valence pairs, this can provide interesting information on whether stimuli evoke vigilance or, rather, difficulty in disengaging attention.

Context

Several studies have suggested that context is an important parameter in assessing attentional biases. Subjects with an anxiety disorder in particular seem sensitive to the context in which the experiment takes place: when the experiment is conducted under stressful circumstances, anxious participants might alter their behavior. It is hypothesized that the added stress causes a shift in processing priorities such that attention for the task is reduced. For example, the Stroop interference effect of threatening words disappeared when snake-fearful participants were exposed to a snake during the test (Mathews & Sebastian, 1993). In a similar vein, a study by Kret and de Gelder (2013) showed that violent offenders attended more to task-irrelevant aggressive information than healthy controls did.

Dot-probe studies have also shown that context can affect attentional biases (e.g., Everaert, Spruyt, & de Houwer, 2013; Judah, Grant, Lechner, & Mills, 2013; Mills, Grant, Judah, & White, 2014). More importantly, context can have opposite effects in anxious and nonanxious participants. In one study, the dot-probe task was adjusted such that a prime word was shown before an angry–neutral face pair was presented (Helfinstein, White, Bar-Haim, & Fox, 2008). This prime word was either a neutral or a social-threat word, which set the context of the following trial. Participants with high social anxiety scores showed an attentional bias toward angry faces when the trial was preceded by a neutral prime word, but not when it was preceded by a threatening prime word. Participants with low social anxiety scores showed the opposite pattern: an attentional bias toward angry faces after a threatening prime word, yet no bias after a neutral prime word.

Such opposite effects were also found when social stress was induced before the dot-probe task took place (Mansell et al., 1999). High-anxious participants avoided both positive and negative faces in the social-threat condition, whereas low-anxious participants showed no attentional bias. No attentional biases were found in the nonthreat condition, irrespective of anxiety level. Similar effects were found when social stress was induced between two sessions of the dot-probe task (Mills et al., 2014): high-anxious participants showed different attentional biases for internal stimuli (i.e., heart-rate images) after stress anticipation, whereas low-anxious participants did not differ in biases between the two time points.

These findings suggest that context can affect performance on the dot-probe task at the individual level. Caution is therefore needed when testing different clinical groups: anxious participants can feel stressed by the procedure even when this was not the intention of the paradigm, which might bias the performance of some groups of individuals. This is also relevant when testing animals. In most primate studies, individuals are separated from their group, confined in a small enclosure, or even restrained in a monkey chair. All these factors are likely to increase stress and arousal, which might affect the results of the dot-probe task. The recent study by Kret et al. (2016) in bonobos has shown that these invasive procedures are not always necessary to collect good-quality data. With training, it is possible to position one individual behind the screen while involving the rest of the group in another activity, without separating them physically. That way, emotional responses are investigated while subjects remain in their regular social-group setting.

Reliability

The discussion of all these studies has made clear that the dot-probe task has led to somewhat inconsistent results. As we have shown, a varied set of parameters can explain this inconsistency. An alternative explanation might be that the dot-probe task itself has insufficient reliability. Several studies have indeed indicated that individual bias scores are not very stable over time (Brown et al., 2014; Cooper et al., 2011; Enock, Hofmann, & McNally, 2014; Kappenman, Farrens, Luck, & Proudfit, 2014; Schmukle, 2005; Staugaard, 2009; van Bockstaele et al., 2011; Waechter, Nelson, Wright, Hyatt, & Oakman, 2014; Waechter & Stolz, 2015), rendering the method not very suitable for investigating intra-individual differences. It has been suggested that the low reliability might, at least in part, be explained by the high variability in response times between individuals (Waechter et al., 2014). Splitting participants into low- and high-anxious groups does not result in higher reliability scores (Kappenman et al., 2014; Waechter et al., 2014; Waechter & Stolz, 2015).

Despite the inconsistency at the individual level, the task might still be a useful measure at the group level for between-group comparisons. Average bias scores toward emotional faces over a whole group of participants are consistent both within and between sessions (Staugaard, 2009). Thus, at the group level, the dot-probe task is still informative about attentional biases toward emotional information. Moreover, bias scores seem to become more reliable after daily repetition of the task over several weeks (Enock et al., 2014). As nonhuman primates are first trained on the task, they perform it several times a week over a certain period. The eventual data used in the analysis might therefore have an acceptable reliability.
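The reliability estimates discussed here are typically obtained with a split-half approach. As a minimal sketch (the odd/even split and the Spearman–Brown correction are standard psychometric techniques, not a procedure specific to the cited studies, and the data layout is hypothetical):

```python
from statistics import mean

def pearson_r(x, y):
    """Pearson correlation between two equal-length lists of scores."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def split_half_reliability(bias_per_trial):
    """Estimate the reliability of dot-probe bias scores.

    bias_per_trial: one list of trial-level bias values per subject.
    The trials are split into odd and even halves, the half scores are
    correlated across subjects, and the Spearman-Brown formula corrects
    for the halved test length.
    """
    odd = [mean(trials[0::2]) for trials in bias_per_trial]
    even = [mean(trials[1::2]) for trials in bias_per_trial]
    r = pearson_r(odd, even)
    return 2 * r / (1 + r)
```

With perfectly consistent subjects the estimate approaches 1; the low values reported in the literature reflect how much the two halves disagree within the same session.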

Nonetheless, it is important to keep in mind that the dot-probe task has some pitfalls. To prevent researchers from producing inconsistent and irreproducible results, several issues should be considered. Researchers are encouraged to routinely report dot-probe reliability measures, as is the norm in, for example, intelligence research. Furthermore, how the reliability of dot-probe measures can be improved must be investigated more thoroughly. For instance, adaptive response windows that provoke higher error rates may yield more reliable performance measures (e.g., Schmukle & Egloff, 2006), and alternative response time measures with higher reliability could be used (e.g., Zvielli, Bernstein, & Koster, 2015).

Difference scores

The studies discussed in this review all calculated and reported a difference score between reaction times on congruent trials and reaction times on incongruent trials. We chose to include only studies that calculated difference scores because this is the most common way in which the results of a dot-probe task are reported; comparison between studies was therefore best when this measure was taken as the reference point. However, this procedure is not necessarily the best approach, especially in studies with small sample sizes.

The main downside of calculating a difference score is that half of the data are lost, as two data points are collapsed into one. This can be a problem particularly in studies with small sample sizes, for instance studies with nonhuman primates. Moreover, with difference scores it becomes impossible to say whether observed effects are driven by the congruent or the incongruent picture (i.e., the congruent or incongruent emotion). By examining the response times for the congruent and incongruent pictures separately, deeper insight into the underlying mechanisms can be gained. Instead of computing a difference score, reaction times to the dot can simply be analyzed as a function of the picture category that was shown at that location beforehand, either irrespective of, or in relation to, the other, potentially distracting picture. In the statistical analyses, trials can be nested within individuals, no data need to be averaged, and “target picture” and “distracting picture,” as well as their interaction, can be included as factors in the regression model. It is important to keep this in mind when analyzing dot-probe data and to choose the method best able to answer the research question at hand.
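The contrast between the two analysis approaches can be sketched as follows. The trial records and field names are hypothetical, and the regression step itself is omitted; the sketch only shows how the classic difference score collapses information that a trial-level layout preserves.

```python
from statistics import mean

# Hypothetical trial records for one subject. "target" is the picture
# category at the probe location; "distractor" is the other picture.
trials = [
    {"rt": 320, "congruent": True,  "target": "angry",   "distractor": "neutral"},
    {"rt": 355, "congruent": False, "target": "neutral", "distractor": "angry"},
    {"rt": 310, "congruent": True,  "target": "angry",   "distractor": "neutral"},
    {"rt": 345, "congruent": False, "target": "neutral", "distractor": "angry"},
]

# Classic approach: one difference score per subject.
# A positive value indicates a bias toward the emotional stimulus.
rt_congruent = mean(t["rt"] for t in trials if t["congruent"])
rt_incongruent = mean(t["rt"] for t in trials if not t["congruent"])
bias_score = rt_incongruent - rt_congruent  # 350 - 315 = 35 ms here

# Alternative: keep trial-level data, so "target" and "distractor"
# (and their interaction) can enter a regression model directly,
# with trials nested within individuals.
by_target = {}
for t in trials:
    by_target.setdefault(t["target"], []).append(t["rt"])
rt_per_target = {cat: mean(rts) for cat, rts in by_target.items()}
```

In the trial-level layout it remains visible whether an effect comes from faster responses at the emotional location, slower responses at the neutral location, or both, which the single difference score cannot distinguish.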

Conclusion

The dot-probe task is widely used to assess attentional biases, as the task is implicit, does not require verbal instruction, and subjects need no or minimal training to perform it successfully. In this review, we discussed whether the dot-probe task is a suitable measure of attentional biases in comparative studies. Research has shown inconsistent results, but the studies are difficult to compare because of the variability in parameters and tested populations.

The personal characteristics of the participants influence the results of the dot-probe task. Participants’ anxiety level in particular is a confounding factor: anxious individuals showed vigilance for both mildly and highly threatening stimuli, whereas nonanxious individuals showed vigilance only for highly threatening stimuli (e.g., Wilson & MacLeod, 2003). In addition, the gender and age of participants might influence attentional biases; for instance, older individuals seem to avoid negative information more than younger individuals do (e.g., Mather & Carstensen, 2003). Moreover, several parameters such as context and variations in stimulus presentation influence the performance of anxious and nonanxious individuals in different ways (e.g., Helfinstein et al., 2008; Mogg, Bradley, et al., 2004; Mogg, Philippot, & Bradley, 2004). It should be kept in mind that, in comparative studies, it is hard to select primates on age and gender because of small sample sizes. It is therefore suggested to look at the responses of each individual independently, in addition to the group level, to check for age or gender differences in the tested sample.

The type of stimuli presented during the task also influences the results. Emotional intensity is an important factor, as mild and high threat evoke different attentional biases, at least in nonanxious individuals (e.g., Koster et al., 2005). In comparative studies, it is advisable to check the valence and evoked arousal of the stimuli with noninvasive physiological measures, such as pupil dilation and thermal imaging, or with evaluations by the keepers. Furthermore, responses to different negative facial expressions should not be merged for analysis: angry and fearful facial expressions might initiate opposite action tendencies (Marsh et al., 2005) and might therefore also evoke different attentional biases. Additionally, human and nonhuman stimuli possibly have opposite effects on attentional biases. Lacreuse et al. (2013) showed that subjects had an attentional bias toward threatening facial stimuli, whereas negative nonhuman stimuli were avoided. However, no other studies have compared human and nonhuman stimuli directly, which makes it difficult to compare the two types of stimuli.

Lastly, differences in the experimental procedure of the dot-probe task might lead to different results. Most studies used a stimulus presentation of 500 ms. Shorter, even subliminal, presentation does not seem to influence the results (Fox, 2002; Orgeta, 2011). Longer stimulus presentation does seem to affect the performance of anxious individuals, who shifted from vigilance to avoidance of threat with longer presentations, whereas nonanxious individuals did not show different attentional biases (e.g., Ioannou et al., 2004; Lees et al., 2005). In addition, whether the dot-probe task uses probe detection or probe discrimination does not seem to affect results significantly (Bradley et al., 1998; Mogg & Bradley, 1999). However, probe discrimination imposes a higher cognitive load and might therefore be less suitable for testing children and nonhuman primates. A last point of importance is the context in which the dot-probe task is conducted. For example, the induction of an affective state alters performance (Everaert et al., 2013; Helfinstein et al., 2008). Comparing the results of dot-probe studies is only possible when the attentional biases were measured in a similar context. This raises concerns about the possible effects of restraining or coercing animals while they perform the dot-probe task.

Overall, the described studies indicate that emotional expressions are processed effectively in the normal population. Highly threatening stimuli in particular seem to capture attention; the rapid allocation of attention toward threatening stimuli initiates action tendencies to cope effectively with threat. Yet more insight into the process of emotion perception can be provided by cross-species research. A direction for further research might therefore be to conduct dot-probe studies with nonhuman primates and other animals. The comparison of attentional biases across several related species might provide essential information about the evolution of emotional expressions. If individuals of several species display similar attentional biases toward emotional stimuli, then the underlying emotional circuits are probably also similar, indicating an evolutionary basis for the perception of emotion signals. The dot-probe task can be used to investigate which stimuli are relevant and attract species-specific attention. From that information, one can gain insight into the environmental pressures that have shaped and fine-tuned attentional mechanisms over evolutionary time. For example, Kret et al. (2016) showed that bonobos’ attention is drawn mostly to positive social-emotional pictures showing scenes of sex or grooming. Bonobos, as compared with chimpanzees and humans, evolved in a relatively safe and food-rich environment, without many predators or competition from rivaling groups. Possibly, for this relatively nonaggressive species, it is more relevant to keep track of positive social behaviors because these occur more frequently than threats or serious fights. It will be very interesting to see results from other primate species, such as the orangutan (the sole semi-solitary-living great ape), and from completely different species, such as birds, in particular species in the highly social crow (corvid) and parrot (psittacine) families.

The dot-probe task is particularly suitable for comparative studies, as it can be conducted with nonhuman primates (King et al., 2012; Kret et al., 2016; Parr et al., 2013) and potentially with other animals as well. Moreover, rhesus monkeys perform similarly to humans when facial stimuli are used (Lacreuse et al., 2013). A good comparison between the performance of humans and nonhuman primates on the dot-probe task is possible when certain methodological points, as discussed in this review, are taken into account. The groups should be matched on gender and relative age as closely as possible. In addition, the stimuli used should evoke similar levels of arousal in both humans and nonhuman primates. Lastly, the task setup should be the same, including stimulus presentation time, probe detection instead of probe discrimination, and the use of the same device for responding to the probe. Conducting studies that enable reliable comparisons of emotional processing in human and nonhuman primates might eventually provide essential information about the evolution of processing conspecifics’ emotion signals.

References

  • Adams, R. B., Gordon, H. L., Baird, A. A., Ambady, N., & Kleck, R. E. (2003). Effects of gaze on amygdala sensitivity to anger and fear faces. Science, 300, 1536.

  • Adams, R. B., & Kleck, R. E. (2003). Perceived gaze direction and the processing of facial displays of emotion. Psychological Science, 14, 644–647.

  • Adolphs, R. (2002). Neural systems for recognizing emotion. Current Opinion in Neurobiology, 12, 169–177.

  • Al-Omar, D., Al-Wabil, A., & Fawzi, M. (2013). Using pupil size variation during visual emotional stimulation in measuring affective states of non communicative individuals. In International Conference on Universal Access in Human–Computer Interaction (pp. 253–258). Berlin: Springer.

  • Arndt, J. E., & Fujiwara, E. (2012). Attentional bias towards angry faces in trait-reappraisal. Personality and Individual Differences, 52, 61–66.

  • Bar-Haim, Y., Lamy, D., Pergamin, L., Bakermans-Kranenburg, M. J., & Van IJzendoorn, M. H. (2007). Threat-related attentional bias in anxious and nonanxious individuals: A meta-analytic study. Psychological Bulletin, 133, 1–24.

  • Becker, D. V., Anderson, U. S., Mortensen, C. R., Neufeld, S. L., & Neel, R. (2011). The face in the crowd effect unconfounded: Happy faces, not angry faces, are more efficiently detected in single-and multiple-target visual search tasks. Journal of Experimental Psychology: General, 140, 637–659.

  • Bliss-Moreau, E., Machado, C. J., & Amaral, D. G. (2013). Macaque cardiac physiology is sensitive to the valence of passively viewed sensory stimuli. PLoS ONE, 8, e71170.

  • Bradley, M. M., Miccoli, L., Escrig, M. A., & Lang, P. J. (2008). The pupil as a measure of emotional arousal and autonomic activation. Psychophysiology, 45, 602–607.

  • Bradley, B. P., Mogg, K., Falla, S. J., & Hamilton, L. R. (1998). Attentional bias for threatening facial expressions in anxiety: Manipulation of stimulus duration. Cognition and Emotion, 12, 737–753.

  • Bradley, B. P., Mogg, K., Millar, N., Bonham-Carter, C., Fergusson, E., & Jenkins, J. (1997). Attentional biases for emotional faces. Cognition and Emotion, 11, 25–42.

  • Bradley, B. P., Mogg, K., & Millar, N. H. (2000). Covert and overt orienting of attention to emotional faces in anxiety. Cognition and Emotion, 14, 789–808.

  • Brown, H. M., Eley, T. C., Broeren, S., MacLeod, C., Rinck, M. H. J. A., Hadwin, J. A., & Lester, K. J. (2014). Psychometric properties of reaction time based experimental paradigms measuring anxiety-related information-processing biases in children. Journal of Anxiety Disorders, 28, 97–107.

  • Brown, H. M., McAdams, T. A., Lester, K. J., Goodman, R., Clark, D. M., & Eley, T. C. (2013). Attentional threat avoidance and familial risk are independently associated with childhood anxiety disorders. Journal of Child Psychology and Psychiatry, 54, 678–685.

  • Buttelmann, D., Call, J., & Tomasello, M. (2009). Do great apes use emotional expressions to infer desires? Developmental Science, 12, 688–698.

  • Carlson, J. M., Fee, A. L., & Reinke, K. S. (2009). Backward masked snakes and guns modulate spatial attention. Evolutionary Psychology, 7, 534–544.

  • Carlson, J. M., Mujica-Parodi, L. R., Harmon-Jones, E., & Hajcak, G. (2012). The orienting of spatial attention to backward masked fearful faces is associated with variation in the serotonin transporter gene. Emotion, 12, 203–207.

  • Coleman, K., & Pierre, P. J. (2014). Assessing anxiety in nonhuman primates. Ilar Journal, 55, 333–346.

  • Cooper, R. M., Bailey, J. E., Diaper, A., Stirland, R., Renton, L. E., Benton, C. P., & Munafò, M. R. (2011). Effects of 7.5% CO2 inhalation on allocation of spatial attention to facial cues of emotional expression. Cognition and Emotion, 25, 626–638.

  • Cooper, R. M., & Langton, S. R. (2006). Attentional bias to angry faces using the dot-probe task? It depends when you look for it. Behaviour Research and Therapy, 44, 1321–1329.

  • Crivelli, C., Russell, J. A., Jarillo, S., & Fernández-Dols, J. M. (2016). The fear gasping face as a threat display in a Melanesian society. Proceedings of the National Academy of Sciences, 201611622.

  • de Gelder, B., Frissen, I., Barton, J., & Hadjikhani, N. (2003). A modulatory role for facial expressions in prosopagnosia. Proceedings of the National Academy of Sciences, 100, 13105–13110.

  • de Gelder, B., Van den Stock, J., Meeren, H. K. M., Sinke, C. B. A., Kret, M. E., & Tamietto, M. (2010). Standing up for the body: Recent progress in uncovering the networks involved in the perception of bodies and bodily expressions. Neuroscience & Biobehavioral Reviews, 34, 513–527.

  • de Gelder, B., Vroomen, J., Pourtois, G., & Weiskrantz, L. (1999). Non‐conscious recognition of affect in the absence of striate cortex. NeuroReport, 10, 3759–3763.

  • de Valk, J. M., Wijnen, J. G., & Kret, M. E. (2015). Anger fosters action: Fast responses in a motor task involving approach movements toward angry faces and bodies. Frontiers in Psychology, 6, 1240.

  • de Waal, F. B. M. (2003). Darwin’s legacy and the study of primate visual communication. Annals of the New York Academy of Sciences, 1000, 7–31.

  • de Waal, F. B. M. (2008). Putting the altruism back into altruism: The evolution of empathy. Annual Review of Psychology, 59, 279–300.

  • de Waal, F. B. M., & van Roosmalen, A. (1979). Reconciliation and consolation among chimpanzees. Behavioral Ecology and Sociobiology, 5, 55–66.

  • de Winter, F. L., Zhu, Q., van den Stock, J., Nelissen, K., Peeters, R., de Gelder, B., & Vandenbulcke, M. (2015). Lateralization for dynamic facial expressions in human superior temporal sulcus. NeuroImage, 106, 340–352.

  • Darwin, C. (1965). Expression of the emotions in man and animals. New York, NY: Philosophical Library. (Original work published in 1872).

  • Davila-Ross, M., Allcock, B., Thomas, C., & Bard, K. A. (2011). Aping expressions? Chimpanzees produce distinct laugh types when responding to laughter of others. Emotion, 11, 1013.

  • Davis, J. S., Fani, N., Ressler, K., Jovanovic, T., Tone, E. B., & Bradley, B. (2014). Attachment anxiety moderates the relationship between childhood maltreatment and attention bias for emotion in adults. Psychiatry Research, 217, 79–85.

  • Desimone, R., & Duncan, J. (1995). Neural mechanisms of selective visual-attention. Annual Review of Neuroscience, 18, 193–222.

  • Eastwood, J. D., Smilek, D., & Merikle, P. M. (2001). Differential attentional guidance by unattended faces expressing positive and negative emotion. Perception & Psychophysics, 63, 1004–1013.

  • Ekman, P. (1973). Cross-cultural studies of facial expression. In P. Ekman (Ed.), Darwin and facial expression: A century of research in review (pp. 169–222). Cambridge: Malor Books.

  • Ekman, P., Sorenson, E. R., & Friesen, W. V. (1969). Pan-cultural elements in facial displays of emotion. Science, 164, 86–88.

  • Eldar, S., Yankelevitch, R., Lamy, D., & Bar-Haim, Y. (2010). Enhanced neural reactivity and selective attention to threat in anxiety. Biological Psychology, 85, 252–257.

  • Elfenbein, H. A. (2013). Nonverbal dialects and accents in facial expressions of emotion. Emotion Review, 5, 90–96.

  • Enock, P. M., Hofmann, S. G., & McNally, R. J. (2014). Attention bias modification training via smartphone to reduce social anxiety: A randomized, controlled multi-session experiment. Cognitive Therapy and Research, 38, 200–216.

  • Everaert, T., Spruyt, A., & De Houwer, J. (2013). On the malleability of automatic attentional biases: Effects of feature-specific attention allocation. Cognition and Emotion, 27, 385–400.

  • Eysenck, M. W. (1992). Anxiety: The cognitive perspective. East Sussex: Erlbaum.

  • Fox, E. (2002). Processing emotional facial expressions: The role of anxiety and awareness. Cognitive, Affective, & Behavioral Neuroscience, 2, 52–63.

  • Fox, E., Lester, V., Russo, R., Bowles, R. J., Pichler, A., & Dutton, K. (2000). Facial expressions of emotion: Are angry faces detected more efficiently? Cognition and Emotion, 14, 61–92.

  • Fox, E., Ridgewell, A., & Ashwin, C. (2009). Looking on the bright side: Biased attention and the human serotonin transporter gene. Proceedings of the Royal Society B, 276, 1747–1751.

  • Frewen, P. A., Dozois, D. J., Joanisse, M. F., & Neufeld, R. W. (2008). Selective attention to threat versus reward: Meta-analysis and neural-network modeling of the dot-probe task. Clinical Psychology Review, 28, 307–337.

  • Frijda, N. H. (1987). Emotion, cognitive structure, and action tendency. Cognition and Emotion, 1, 115–143.

  • Frijda, N. H., Kuipers, P., & ter Schure, E. (1989). Relations among emotion, appraisal, and emotional action readiness. Journal of Personality and Social Psychology, 57, 212–228.

  • Frischen, A., Eastwood, J. D., & Smilek, D. (2008). Visual search for faces with emotional expressions. Psychological Bulletin, 134, 662–676.

  • Fritzsche, A., Watz, H., Magnussen, H., Tuinmann, G., Löwe, B., & von Leupoldt, A. (2013). Cognitive biases in patients with chronic obstructive pulmonary disease and depression—A pilot study. British Journal of Health Psychology, 18, 827–843.

  • Garvert, M. M., Friston, K. J., Dolan, R. J., & Garrido, M. I. (2014). Subcortical amygdala pathways enable rapid face processing. NeuroImage, 102, 309–316.

  • Gendron, M., Roberson, D., van der Vyver, J. M., & Barrett, L. F. (2014). Perceptions of emotion from facial expressions are not culturally universal: Evidence from a remote culture. Emotion, 14, 251–262.

  • Gibb, B. E., Schofield, C. A., & Coles, M. E. (2009). Reported history of childhood abuse and young adults’ information processing biases for facial displays of emotion. Child Maltreatment, 14, 148–156.

  • Goossens, B. M., Dekleva, M., Reader, S. M., Sterck, E. H., & Bolhuis, J. J. (2008). Gaze following in monkeys is modulated by observed facial expressions. Animal Behaviour, 75, 1673–1681.

  • Gotlib, I. H., Krasnoperova, E., Yue, D. N., & Joormann, J. (2004). Attentional biases for negative interpersonal stimuli in clinical depression. Journal of Abnormal Psychology, 113, 127–135.

  • Hankin, B. L., Gibb, B. E., Abela, J. R., & Flory, K. (2010). Selective attention to affective stimuli and clinical depression among youths: Role of anxiety and specificity of emotion. Journal of Abnormal Psychology, 119, 491–501. 

  • Hariri, A. R., & Holmes, A. (2006). Genetics of emotional regulation: The role of the serotonin transporter in neural function. Trends in Cognitive Sciences, 10, 182–191.

  • Hedger, N., Gray, K. L., Garner, M., & Adams, W. J. (2016). Are visual threats prioritized without awareness? A critical review and meta-analysis involving 3 behavioral paradigms and 2696 observers. Psychological Bulletin, 142, 934–968. 

  • Heim-Dreger, U., Kohlmann, C. W., Eschenbeck, H., & Burkhardt, U. (2006). Attentional biases for threatening faces in children: Vigilant and avoidant processes. Emotion, 6, 320–325.

  • Helfinstein, S. M., White, L. K., Bar-Haim, Y., & Fox, N. A. (2008). Affective primes suppress attention bias to threat in socially-anxious individuals. Behaviour Research and Therapy, 46, 799–810.

  • Hirata, S., Matsuda, G., Ueno, A., Fukushima, H., Fuwa, K., Sugama, K., & Hasegawa, T. (2013). Brain response to affective pictures in the chimpanzee. Scientific Reports, 3, 1342. 

  • Hoffman, K. L., Gothard, K. M., Schmid, M. C., & Logothetis, N. K. (2007). Facial-expression and gaze-selective responses in the monkey amygdala. Current Biology, 17, 766–772.

  • Hommer, R. E., Meyer, A., Stoddard, J., Connolly, M. E., Mogg, K., Bradley, B. P., & Brotman, M. A. (2014). Attention bias to threat faces in severe mood dysregulation. Depression and Anxiety, 31, 559–565.

  • Ioannou, M. C., Mogg, K., & Bradley, B. P. (2004). Vigilance for threat: Effects of anxiety and defensiveness. Personality and Individual Differences, 36, 1879–1891.

  • Isaacowitz, D. M., Wadlinger, H. A., Goren, D., & Wilson, H. R. (2006). Is there an age-related positivity effect in visual attention? A comparison of two methodologies. Emotion, 6, 511–516.

  • Johnson, M. H. (2005). Subcortical face processing. Nature Reviews Neuroscience, 6, 766–774.

  • Johnson, A. L., Gibb, B. E., & McGeary, J. (2010). Reports of childhood physical abuse, 5-HTTLPR genotype, and women’s attentional biases for angry faces. Cognitive Therapy and Research, 34, 380–387.

  • Joormann, J., & Gotlib, I. H. (2007). Selective attention to emotional faces following recovery from depression. Journal of Abnormal Psychology, 116, 80–85.

  • Joormann, J., Talbot, L., & Gotlib, I. H. (2007). Biased processing of emotional information in girls at risk for depression. Journal of Abnormal Psychology, 116, 135–143.

  • Judah, M. R., Grant, D. M., Lechner, W. V., & Mills, A. C. (2013). Working memory load moderates late attentional bias in social anxiety. Cognition and Emotion, 27, 502–511.

  • Juth, P., Lundqvist, D., Karlsson, A., & Öhman, A. (2005). Looking for foes and friends: Perceptual and emotional factors when finding a face in the crowd. Emotion, 5, 379–395.

  • Kano, F., Hirata, S., Deschner, T., Behringer, V., & Call, J. (2016). Nasal temperature drop in response to a playback of conspecific fights in chimpanzees: A thermo-imaging study. Physiology and Behavior, 155, 83–94.

  • Kano, F., Tanaka, M., & Tomonaga, M. (2008). Enhanced recognition of emotional stimuli in the chimpanzee (Pan troglodytes). Animal Cognition, 11, 517–524.

  • Kappenman, E. S., Farrens, J. L., Luck, S. J., & Proudfit, G. H. (2014). Behavioral and ERP measures of attentional bias to threat in the dot-probe task: Poor reliability and lack of correlation with anxiety. Frontiers in Psychology, 5, 1368.

  • Killgore, W. D., & Yurgelun-Todd, D. A. (2001). Sex differences in amygdala activation during the perception of facial affect. NeuroReport, 12, 2543–2547.

  • Kim, Y. R., Oh, S. M., Corfield, F., Jeong, D. W., Jang, E. Y., & Treasure, J. (2014). Intranasal oxytocin lessens the attentional bias to adult negative faces: A double blind within-subject experiment. Psychiatry Investigation, 11, 160–166.

  • King, H. M., Kurdziel, L. B., Meyer, J. S., & Lacreuse, A. (2012). Effects of testosterone on attention and memory for emotional stimuli in male rhesus monkeys. Psychoneuroendocrinology, 37, 396–409.

  • Koster, E. H. W., Crombez, G., Verschuere, B., & de Houwer, J. (2004). Selective attention to threat in the dot probe paradigm: Differentiating vigilance and difficulty to disengage. Behaviour Research and Therapy, 42, 1183–1192.

  • Koster, E. H., Crombez, G., Verschuere, B., & de Houwer, J. (2006). Attention to threat in anxiety-prone individuals: Mechanisms underlying attentional bias. Cognitive Therapy and Research, 30, 635–643.

  • Koster, E. H. W., Verschuere, B., Crombez, G., & van Damme, S. (2005). Time-course of attention for threatening pictures in high and low trait anxiety. Behaviour Research and Therapy, 43, 1087–1098.

  • Kragel, P. A., Knodt, A. R., Hariri, A. R., & LaBar, K. S. (2016). Decoding spontaneous emotional states in the human brain. PLoS Biology, 14, e2000106.

  • Kret, M. E., & De Gelder, B. (2012). A review on sex differences in processing emotion signals. Neuropsychologia, 50, 1211–1221.

  • Kret, M. E., & de Gelder, B. (2013). When a smile becomes a fist: The perception of facial and bodily expressions of emotion in violent offenders. Experimental Brain Research, 228, 399–410.

  • Kret, M. E., Jaasma, L., Bionda, T., & Wijnen, J. G. (2016). Bonobos (Pan paniscus) show an attentional bias towards conspecifics’ emotions. Proceedings of the National Academy of Sciences, 113, 3761–3766. 

  • Kret, M. E., & Ploeger, A. (2015). Emotion processing deficits: A liability spectrum providing insight into comorbidity of mental disorders. Neuroscience & Biobehavioral Reviews, 52, 153–171.

  • Kret, M. E., Pichon, S., Grezes, J., & de Gelder, B. (2011). Similarities and differences in perceiving threat from dynamic faces and bodies. An fMRI study. NeuroImage, 54, 1755–1762.

  • Kret, M. E., Roelofs, K., Stekelenburg, J. J., & de Gelder, B. (2013). Emotion signals from faces, bodies and scenes influence observers’ face expressions, fixations and pupil-size. Frontiers in Human Neuroscience, 7(810), 1–9. 

  • Kret, M. E., Stekelenburg, J. J., Roelofs, K., & de Gelder, B. (2013). Perception of face and body expressions using electromyography, pupillometry and gaze measures. Frontiers in Psychology, 4(28), 1–12. 

  • Kret, M. E., Tomonaga, M., & Matsuzawa, T. (2014). Chimpanzees and humans mimic pupil-size of conspecifics. PLoS ONE, 9, e104886.

  • Kujawa, A. J., Torpey, D., Kim, J., Hajcak, G., Rose, S., Gotlib, I. H., & Klein, D. N. (2011). Attentional biases for emotional faces in young children of mothers with chronic or recurrent depression. Journal of Abnormal Child Psychology, 39, 125–135.

  • Kuniecki, M., Pilarczyk, J., & Wichary, S. (2015). The color red attracts attention in an emotional context: An ERP study. Frontiers in Human Neuroscience, 9, 212.

  • Kuraoka, K., & Nakamura, K. (2011). The use of nasal skin temperature measurements in studying emotion in macaque monkeys. Physiology and Behavior, 102, 347–355.

  • Lacreuse, A., Schatz, K., Strazzullo, S., King, H. M., & Ready, R. (2013). Attentional biases and memory for emotional stimuli in men and male rhesus monkeys. Animal Cognition, 16, 861–871.

  • Lang, P. J., Bradley, M. M., & Cuthbert, B. N. (1998). Emotion, motivation, and anxiety: Brain mechanisms and psychophysiology. Biological Psychiatry, 44, 1248–1263.

  • LeDoux, J. (1996). The emotional brain. New York: Simon & Schuster.

  • Lee, T. H., Sakaki, M., Cheng, R., Velasco, R., & Mather, M. (2014). Emotional arousal amplifies the effects of biased competition in the brain. Social Cognitive and Affective Neuroscience, 9, 2067–2077.

  • Lee, D. H., Susskind, J. M., & Anderson, A. K. (2013). Social transmission of the sensory benefits of eye widening in fear expressions. Psychological Science, 24, 957–965.

  • Lees, A., Mogg, K., & Bradley, B. P. (2005). Health anxiety, anxiety sensitivity, and attentional biases for pictorial and linguistic health-threat cues. Cognition and Emotion, 19, 453–462.

  • Lindell, A. K. (2013). Continuities in emotion lateralization in human and non-human primates. Frontiers in Human Neuroscience, 7(464), 1–9. 

  • Lindstrom, K. M., Guyer, A. E., Mogg, K., Bradley, B. P., Fox, N. A., Ernst, M., & Bar-Haim, Y. (2009). Normative data on development of neural and behavioral mechanisms underlying attention orienting toward social–emotional stimuli: An exploratory study. Brain Research, 1292, 61–70.

  • Lundqvist, D., Bruce, N., & Öhman, A. (2015). Finding an emotional face in a crowd: Emotional and perceptual stimulus factors influence visual search efficiency. Cognition and Emotion, 29, 621–633.

  • Machado, C. J., Bliss-Moreau, E., Platt, M. L., & Amaral, D. G. (2011). Social and nonsocial content differentially modulates visual attention and autonomic arousal in rhesus macaques. PLoS ONE, 6, e26598.

  • MacLeod, C., Mathews, A., & Tata, P. (1986). Attentional bias in emotional disorders. Journal of Abnormal Psychology, 95, 15–20.

  • Mansell, W., Clark, D. M., Ehlers, A., & Chen, Y. (1999). Social anxiety and attention away from emotional faces. Cognition and Emotion, 13, 673–690.

  • Marsh, A. A., Ambady, N., & Kleck, R. E. (2005). The effects of fear and anger facial expressions on approach- and avoidance-related behaviors. Emotion, 5, 119–124.

  • Mather, M., & Carstensen, L. L. (2003). Aging and attentional biases for emotional faces. Psychological Science, 14, 409–415.

  • Mather, M., & Sutherland, M. R. (2011). Arousal-biased competition in perception and memory. Perspectives on Psychological Science, 6, 114–133.

  • Mathews, A. (1990). Why worry? The cognitive function of anxiety. Behaviour Research and Therapy, 28, 455–468.

  • Mathews, A., & Sebastian, S. (1993). Suppression of emotional Stroop effects by fear-arousal. Cognition and Emotion, 7, 517–530.

  • McCoy, D. C., Roy, A. L., & Raver, C. C. (2016). Neighborhood crime as a predictor of individual differences in emotional processing and regulation. Developmental Science, 19, 164–174.

  • Micheletta, J., Whitehouse, J., Parr, L. A., & Waller, B. M. (2015). Facial expression recognition in crested macaques (Macaca nigra). Animal Cognition, 18, 985–990.

  • Mills, A. C., Grant, D. M., Judah, M. R., & White, E. J. (2014). The influence of anticipatory processing on attentional biases in social anxiety. Behavior Therapy, 45, 720–729.

  • Mingtian, Z., Xiongzhao, Z., Jinyao, Y., Shuqiao, Y., & Atchley, R. A. (2011). Do the early attentional components of ERPs reflect attentional bias in depression? It depends on the stimulus presentation time. Clinical Neurophysiology, 122, 1371–1381.

  • Miskovic, V., & Schmidt, L. A. (2012). Early information processing biases in social anxiety. Cognition and Emotion, 26, 176–185.

  • Mogg, K., & Bradley, B. P. (1999). Some methodological issues in assessing attentional biases for threatening faces in anxiety: A replication study using a modified version of the probe detection task. Behaviour Research and Therapy, 37, 595–604.

  • Mogg, K., & Bradley, B. P. (2006). Time course of attentional bias for fear-relevant pictures in spider-fearful individuals. Behaviour Research and Therapy, 44, 1241–1250.

  • Mogg, K., Bradley, B. P., Dixon, C., Fisher, S., Twelftree, H., & McWilliams, A. (2000). Trait anxiety, defensiveness and selective processing of threat: An investigation using two measures of attentional bias. Personality and Individual Differences, 28, 1063–1077.

  • Mogg, K., Bradley, B., Miles, F., & Dixon, R. (2004). Time course of attentional bias for threat scenes: Testing the vigilance–avoidance hypothesis. Cognition and Emotion, 18, 689–700.

  • Mogg, K., McNamara, J., Powys, M., Rawlinson, H., Seiffer, A., & Bradley, B. P. (2000). Selective attention to threat: A test of two cognitive models of anxiety. Cognition and Emotion, 14, 375–399.

  • Mogg, K., Philippot, P., & Bradley, B. P. (2004). Selective attention to angry faces in clinical social phobia. Journal of Abnormal Psychology, 113, 160–165.

  • Montagner, R., Mogg, K., Bradley, B. P., Pine, D. S., Czykiel, M. S., Miguel, E. C., & Salum, G. A. (2016). Attentional bias to threat in children at-risk for emotional disorders: Role of gender and type of maternal emotional disorder. European Child and Adolescent Psychiatry, 25, 735–742.

  • Morimoto, Y., & Fujita, K. (2012). Capuchin monkeys (Cebus apella) use conspecifics’ emotional expressions to evaluate emotional valence of objects. Animal Cognition, 15, 341–347.

  • Murphy, N. A., & Isaacowitz, D. M. (2008). Preferences for emotional information in older and younger adults: a meta-analysis of memory and attention tasks. Psychology and Aging, 23, 263–286.

  • Nashiro, K., Sakaki, M., & Mather, M. (2011). Age differences in brain activity during emotion processing: Reflections of age-related decline or increased emotion regulation. Gerontology, 58, 156–163.

  • Nhan, B. R., & Chau, T. (2010). Classifying affective states using thermal infrared imaging of the human face. IEEE Transactions on Biomedical Engineering, 57, 979–987.

  • Orgeta, V. (2011). Avoiding threat in late adulthood: Testing two life span theories of emotion. Experimental Aging Research, 37, 449–472.

  • Osinsky, R., Reuter, M., Küpper, Y., Schmitz, A., Kozyra, E., Alexander, N., & Hennig, J. (2008). Variation in the serotonin transporter gene modulates selective attention to threat. Emotion, 8, 584–588.

  • Öhman, A., Flykt, A., & Esteves, F. (2001). Emotion drives attention: Detecting the snake in the grass. Journal of Experimental Psychology: General, 130, 466–478.

  • Öhman, A., Lundqvist, D., & Esteves, F. (2001). The face in the crowd revisited: A threat advantage with schematic stimuli. Journal of Personality and Social Psychology, 80, 381–396.

  • Öhman, A., & Mineka, S. (2001). Fears, phobias, and preparedness: Toward an evolved module of fear and fear learning. Psychological Review, 108, 483–522.

  • Palagi, E. (2008). Sharing the motivation to play: The use of signals in adult bonobos. Animal Behaviour, 75, 887–896.

  • Palagi, E., Dall’Olio, S., Demuru, E., & Stanyon, R. (2014). Exploring the evolutionary foundations of empathy: Consolation in monkeys. Evolution and Human Behavior, 35, 341–349.

  • Palagi, E., & Mancini, G. (2011). Playing with the face: Playful facial “chattering” and signal modulation in a monkey species (Theropithecus gelada). Journal of Comparative Psychology, 125, 11–21. 

  • Parr, L. A. (2001). Cognitive and physiological markers of emotional awareness in chimpanzees (Pan troglodytes). Animal Cognition, 4, 223–229.

  • Parr, L. A., Hecht, E., Barks, S. K., Preuss, T. M., & Votaw, J. R. (2009). Face processing in the chimpanzee brain. Current Biology, 19, 50–53.

  • Parr, L. A., & Heintz, M. (2009). Facial expression recognition in rhesus monkeys, Macaca mulatta. Animal Behaviour, 77, 1507–1513.

  • Parr, L. A., & Hopkins, W. D. (2000). Brain temperature asymmetries and emotional perception in chimpanzees, Pan troglodytes. Physiology and Behavior, 71, 363–371.

  • Parr, L. A., Hopkins, W. D., & de Waal, F. (1998). The perception of facial expressions by chimpanzees, Pan troglodytes. Evolution of Communication, 2, 1–23.

  • Parr, L. A., Modi, M., Siebert, E., & Young, L. J. (2013). Intranasal oxytocin selectively attenuates rhesus monkeys’ attention to negative facial expressions. Psychoneuroendocrinology, 38, 1748–1756.

  • Parr, L. A., & Waller, B. M. (2006). Understanding chimpanzee facial expression: Insights into the evolution of communication. Social Cognitive and Affective Neuroscience, 1, 221–228.

  • Parr, L. A., Waller, B. M., Vick, S. J., & Bard, K. A. (2007). Classifying chimpanzee facial expressions using muscle action. Emotion, 7, 172–181.

  • Peckham, A. D., Johnson, S. L., & Gotlib, I. H. (2015). Attentional bias in euthymic bipolar I disorder. Cognition and Emotion, 30, 472–487.

  • Peckham, A. D., McHugh, R. K., & Otto, M. W. (2010). A meta-analysis of the magnitude of biased attention in depression. Depression and Anxiety, 27, 1135–1142.

  • Pegna, A. J., Khateb, A., Lazeyras, F., & Seghier, M. L. (2005). Discriminating emotional faces without primary visual cortices involves the right amygdala. Nature Neuroscience, 8, 24–25.

  • Pérez-Edgar, K., Bar-Haim, Y., McDermott, J. M., Chronis-Tuscano, A., Pine, D. S., & Fox, N. A. (2010). Attention biases to threat and behavioral inhibition in early childhood shape adolescent social withdrawal. Emotion, 10, 349–357.

  • Pérez-Edgar, K., Bar-Haim, Y., McDermott, J. M., Gorodetsky, E., Hodgkinson, C. A., Goldman, D., & Fox, N. A. (2010). Variations in the serotonin-transporter gene are associated with attention bias patterns to positive and negative emotion faces. Biological Psychology, 83, 269–271.

  • Pérez-Edgar, K., Kujawa, A., Nelson, S. K., Cole, C., & Zapp, D. J. (2013). The relation between electroencephalogram asymmetry and attention biases to threat at baseline and under stress. Brain and Cognition, 82, 337–343.

  • Pérez-Edgar, K., Reeb-Sutherland, B. C., McDermott, J. M., White, L. K., Henderson, H. A., Degnan, K. A., & Fox, N. A. (2011). Attention biases to threat link behavioral inhibition to social withdrawal over time in very young children. Journal of Abnormal Child Psychology, 39, 885–895.

  • Pessoa, L., Japee, S., Sturman, D., & Ungerleider, L. G. (2006). Target visibility and visual awareness modulate amygdala responses to fearful faces. Cerebral Cortex, 16, 366–375.

  • Petrova, K., Wentura, D., & Bermeitinger, C. (2013). What happens during the stimulus onset asynchrony in the dot-probe task? Exploring the role of eye movements in the assessment of attentional biases. PLoS ONE, 8, e76335. 

  • Pfabigan, D. M., Lamplmayr-Kragl, E., Pintzinger, N. M., Sailer, U., & Tran, U. S. (2014). Sex differences in event-related potentials and attentional biases to emotional facial stimuli. Frontiers in Psychology, 5, 1477.

  • Phelps, E. A., Ling, S., & Carrasco, M. (2006). Emotion facilitates perception and potentiates the perceptual benefits of attention. Psychological Science, 17, 292–299.

  • Phillips, M. L., Drevets, W. C., Rauch, S. L., & Lane, R. (2003). Neurobiology of emotion perception I: The neural basis of normal emotion perception. Biological Psychiatry, 54, 504–514.

  • Pineles, S. L., & Mineka, S. (2005). Attentional biases to internal and external sources of potential threat in social anxiety. Journal of Abnormal Psychology, 114, 314–318.

  • Pinkham, A. E., Griffin, M., Baron, R., Sasson, N. J., & Gur, R. C. (2010). The face in the crowd effect: Anger superiority when using real faces and multiple identities. Emotion, 10, 141–146.

  • Pinsk, M. A., Arcaro, M., Weiner, K. S., Kalkus, J. F., Inati, S. J., Gross, C. G., & Kastner, S. (2009). Neural representations of faces and body parts in macaque and human cortex: A comparative FMRI study. Journal of Neurophysiology, 101, 2581–2600.

  • Ploeger, A., & van der Hoort, B. (2015). Evolutionary psychology as a metatheory for the social sciences: How to gather interdisciplinary evidence for a psychological adaptation. Review of General Psychology, 19, 381–392.

  • Polosecki, P., Moeller, S., Schweers, N., Romanski, L. M., Tsao, D. Y., & Freiwald, W. A. (2013). Faces in motion: Selectivity of macaque and human face processing areas for dynamic stimuli. Journal of Neuroscience, 33, 11768–11773.

  • Posner, M. I., & Petersen, S. E. (1990). The attention system of the human brain. Annual Review of Neuroscience, 13, 25–42.

  • Preuschoft, S., & van Hooff, J. A. R. A. M. (1995). Homologizing primate facial displays: A critical review of methods. Folia Primatologica, 65, 121–137.

  • Preuschoft, S., & van Hooff, J. A. R. A. M. (1997). Variation in primate affiliative displays: An exercise in behavior phylogeny. In S. Preuschoft & J. A. R. A. M. van Hooff (Eds.), "Laughter" and "smiling" in macaques: An evolutionary perspective (pp. 195–211).

  • Price, R. B., Siegle, G. J., Silk, J. S., Ladouceur, C. D., McFarland, A., Dahl, R. E., & Ryan, N. D. (2014). Looking under the hood of the dot-probe task: An fMRI study in anxious youth. Depression and Anxiety, 31, 178–187.

  • Putman, P. (2011). Resting state EEG delta–beta coherence in relation to anxiety, behavioral inhibition, and selective attentional processing of threatening stimuli. International Journal of Psychophysiology, 80, 63–68.

  • Roelofs, K., Elzinga, B. M., & Rotteveel, M. (2005). The effects of stress-induced cortisol responses on approach–avoidance behavior. Psychoneuroendocrinology, 30, 665–677.

  • Ross, M. D., Owren, M. J., & Zimmermann, E. (2009). Reconstructing the evolution of laughter in great apes and humans. Current Biology, 19, 1106–1111.

  • Rossignol, M., Campanella, S., Bissot, C., & Philippot, P. (2013). Fear of negative evaluation and attentional bias for facial expressions: An event-related study. Brain and Cognition, 82, 344–352.

  • Roy, A. K., Vasa, R. A., Bruck, M., Mogg, K., Bradley, B. P., Sweeney, M., & Pine, D. S. (2008). Attention bias toward threat in pediatric anxiety disorders. Journal of the American Academy of Child and Adolescent Psychiatry, 47, 1189–1196.

  • Schmitt, D. P., & Pilcher, J. J. (2004). Evaluating evidence for psychological adaptation: How do we know one when we see one? Psychological Science, 15, 643–649.

  • Schmukle, S. C. (2005). Unreliability of the dot probe task. European Journal of Personality, 19, 595–605.

  • Schmukle, S. C., & Egloff, B. (2006). Assessing anxiety with extrinsic Simon tasks. Experimental Psychology, 53, 149–160.

  • Schupp, H. T., Junghofer, M., Weike, A. I., & Hamm, A. O. (2003). Attention and emotion: An ERP analysis of facilitated emotional stimulus processing. NeuroReport, 14, 1107–1110.

  • Seidel, E. M., Habel, U., Kirschner, M., Gur, R. C., & Derntl, B. (2010). The impact of facial emotional expressions on behavioral tendencies in women and men. Journal of Experimental Psychology: Human Perception and Performance, 36, 500–507.

  • Shane, M. S., & Peterson, J. B. (2007). An evaluation of early and late stage attentional processing of positive and negative information in dysphoria. Cognition and Emotion, 21, 789–815.

  • Shibasaki, M., & Kawai, N. (2009). Rapid detection of snakes by Japanese monkeys (Macaca fuscata): An evolutionarily predisposed visual system. Journal of Comparative Psychology, 123, 131–135.

  • Shields, K., Engelhardt, P. E., & Ietswaart, M. (2012). Processing emotion information from both the face and body: An eye-movement study. Cognition and Emotion, 26, 699–709.

  • Spoor, J. R., & Kelly, J. R. (2004). The evolutionary significance of affect in groups: Communication and group bonding. Group Processes & Intergroup Relations, 7, 398–412.

  • Staugaard, S. R. (2009). Reliability of two versions of the dot-probe task using photographic faces. Psychology Science Quarterly, 51, 339–350.

  • Stevens, S., Rist, F., & Gerlach, A. L. (2011). Eye movement assessment in individuals with social phobia: Differential usefulness for varying presentation times? Journal of Behavior Therapy and Experimental Psychiatry, 42, 219–224.

  • Stins, J. F., Roelofs, K., Villan, J., Kooijman, K., Hagenaars, M. A., & Beek, P. J. (2011). Walk to me when I smile, step back when I’m angry: Emotional faces modulate whole-body approach–avoidance behaviors. Experimental Brain Research, 212, 603–611.

  • Susa, G., Pitică, I., Benga, O., & Miclea, M. (2012). The self regulatory effect of attentional control in modulating the relationship between attentional biases toward threat and anxiety symptoms in children. Cognition and Emotion, 26, 1069–1083.

  • Susskind, J. M., & Anderson, A. K. (2008). Facial expression form and function. Communicative and Integrative Biology, 1, 148–149.

  • Susskind, J. M., Lee, D. H., Cusi, A., Feiman, R., Grabski, W., & Anderson, A. K. (2008). Expressing fear enhances sensory acquisition. Nature Neuroscience, 11, 843–850.

  • Teufel, C., Gutmann, A., Pirow, R., & Fischer, J. (2010). Facial expressions modulate the ontogenetic trajectory of gaze‐following among monkeys. Developmental Science, 13, 913–922.

  • Thomason, M. E., Henry, M. L., Hamilton, J. P., Joormann, J., Pine, D. S., Ernst, M., & Gotlib, I. H. (2010). Neural and behavioral responses to threatening emotion faces in children as a function of the short allele of the serotonin transporter gene. Biological Psychology, 85, 38–44.

  • Tomaszczyk, J. C., & Fernandes, M. A. (2014). Age-related differences in attentional bias for emotional faces. Aging Neuropsychology and Cognition, 21, 544–559.

  • Tran, U. S., Lamplmayr, E., Pintzinger, N. M., & Pfabigan, D. M. (2013). Happy and angry faces: Subclinical levels of anxiety are differentially related to attentional biases in men and women. Journal of Research in Personality, 47, 390–397.

  • Troller‐Renfree, S., McDermott, J. M., Nelson, C. A., Zeanah, C. H., & Fox, N. A. (2015). The effects of early foster care intervention on attention biases in previously institutionalized children in Romania. Developmental Science, 18, 713–722.

  • Tsao, D. Y., Moeller, S., & Freiwald, W. A. (2008). Comparing face patch systems in macaques and humans. Proceedings of the National Academy of Sciences, 105, 19514–19519.

  • van Bockstaele, B., Verschuere, B., Koster, E. H. W., Tibboel, H., de Houwer, J., & Crombez, G. (2011). Differential predictive power of self report and implicit measures on behavioural and physiological fear responses to spiders. International Journal of Psychophysiology, 79, 166–174.

  • van Honk, J., Tuiten, A., Verbaten, R., van den Hout, M., Koppeschaar, H., Thijssen, J., & de Haan, E. (1999). Correlations among salivary testosterone, mood, and selective attention to threat in humans. Hormones and Behavior, 36, 17–24.

  • Vuilleumier, P. (2005). How brains beware: Neural mechanisms of emotional attention. Trends in Cognitive Sciences, 9, 585–594.

  • Vuilleumier, P., & Pourtois, G. (2007). Distributed and interactive brain mechanisms during emotion face perception: evidence from functional neuroimaging. Neuropsychologia, 45, 174–194.

  • Waechter, S., Nelson, A. L., Wright, C., Hyatt, A., & Oakman, J. (2014). Measuring attentional bias to threat: Reliability of dot probe and eye movement indices. Cognitive Therapy and Research, 38, 313–333.

  • Waechter, S., & Stolz, J. A. (2015). Trait anxiety, state anxiety, and attentional bias to threat: Assessing the psychometric properties of response time measures. Cognitive Therapy and Research, 39, 441–458.

  • Waller, B. M., Caeiro, C. C., & Davila-Ross, M. (2015). Orangutans modify facial displays depending on recipient attention. PeerJ, 3, e827.

  • Waller, B., & Dunbar, R. I. M. (2005). Differential behavioral effects of silent bared teeth display and relaxed open mouth display in chimpanzees (Pan troglodytes). Ethology, 111, 129–142.

  • Waters, A. M., Bradley, B. P., & Mogg, K. (2014). Biased attention to threat in paediatric anxiety disorders (generalized anxiety disorder, social phobia, specific phobia, separation anxiety disorder) as a function of “distress” versus “fear” diagnostic categorization. Psychological Medicine, 44, 607–616.

  • Waters, A. M., Henry, J., Mogg, K., Bradley, B. P., & Pine, D. S. (2010). Attentional bias towards angry faces in childhood anxiety disorders. Journal of Behavior Therapy and Experimental Psychiatry, 41, 158–164.

  • Waters, A. M., Kokkoris, L. L., Mogg, K., Bradley, B. P., & Pine, D. S. (2010). The time course of attentional bias for emotional faces in anxious children. Cognition and Emotion, 24, 1173–1181.

  • Waters, A. M., Lipp, O. V., & Spence, S. H. (2004). Attentional bias toward fear-related stimuli: An investigation with nonselected children and adults and children with anxiety disorders. Journal of Experimental Child Psychology, 89, 320–337.

  • Waters, A. M., Mogg, K., Bradley, B. P., & Pine, D. S. (2008). Attentional bias for emotional faces in children with generalized anxiety disorder. Journal of the American Academy of Child and Adolescent Psychiatry, 47, 435–442.

  • Whalen, P. J., Rauch, S. L., Etcoff, N. L., McInerney, S. C., Lee, M. B., & Jenike, M. A. (1998). Masked presentations of emotional facial expressions modulate amygdala activity without explicit knowledge. Journal of Neuroscience, 18, 411–418.

  • Wilkowski, B. M., & Meier, B. P. (2010). Bring it on: Angry facial expressions potentiate approach-motivated motor behavior. Journal of Personality and Social Psychology, 98, 201–210.

  • Williams, J. M. G., Watts, F. N., MacLeod, C., & Mathews, A. (1997). Cognitive psychology and the emotional disorders (2nd ed.). Chichester: Wiley.

  • Wilson, E., & MacLeod, C. (2003). Contrasting two accounts of anxiety-linked attentional bias: Selective attention to varying levels of stimulus threat intensity. Journal of Abnormal Psychology, 112, 212–218.

  • Wirth, M. M., & Schultheiss, O. C. (2007). Basal testosterone moderates responses to anger faces in humans. Physiology and Behavior, 90, 496–505.

  • Yantis, S. (1996). Attentional capture in vision. In A. F. Kramer, M. G. H. Coles, & G. D. Logan (Eds.), Converging operations in the study of visual selective attention (pp. 45–76). Washington, DC: American Psychological Association.

  • Yiend, J. (2010). The effects of emotion on attention: A review of attentional processing of emotional information. Cognition and Emotion, 24, 3–47.

  • Yovel, G., & Freiwald, W. A. (2013). Face recognition systems in monkey and human: Are they the same thing? F1000Prime Reports, 5, 10.

  • Zald, D. H. (2003). The human amygdala and the emotional evaluation of sensory stimuli. Brain Research Reviews, 41, 88–123.

  • Zhu, Q., Nelissen, K., Van den Stock, J., De Winter, F. L., Pauwels, K., de Gelder, B., & Vandenbulcke, M. (2013). Dissimilar processing of emotional facial expressions in human and monkey temporal cortex. NeuroImage, 66, 402–411.

  • Zvielli, A., Bernstein, A., & Koster, E. H. W. (2014). Dynamics of attentional bias to threat in anxious adults: Bias towards and/or away? PLoS ONE, 9, e104025.

  • Zvielli, A., Bernstein, A., & Koster, E. H. W. (2015). Temporal dynamics of attentional bias. Clinical Psychological Science, 3, 772–788. 

Author note

Preparation of this work was supported by the Netherlands Science Foundation (VENI # 016-155-082), The Royal Netherlands Academy of Arts and Sciences (KNAW) Dobberke Foundation for Comparative Psychology (# UPS/BP/4387 2014-3) and the Leids Universiteits Fonds (#6511/21-6-16/Elise Mathilde Fonds to M.E.K.).

Author information

Corresponding author

Correspondence to Mariska E. Kret.

Appendix

Table 1 Results of dot-probe studies in human subjects with pictorial stimuli

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

About this article

Cite this article

van Rooijen, R., Ploeger, A., & Kret, M. E. (2017). The dot-probe task to measure emotional attention: A suitable measure in comparative studies? Psychonomic Bulletin & Review, 24, 1686–1717. https://doi.org/10.3758/s13423-016-1224-1

Keywords

  • Emotion
  • Attention
  • Dot-probe task
  • Cross-species
  • Comparative