Introduction

This chapter provides an overview of facial electromyography (EMG) as a method of investigating emotions and affect, including examples of application and methods for analysis. It begins with a short introduction to emotion theory, followed by an operationalisation of facial emotional expressions as an underlying requirement for their study using facial EMG, and ends with practical information on the use of facial EMG.

Theory: From Emotional States to Their Expression

Darwin (1872/1965) studied emotions and their expression across species and argued that emotion phenomena were the products of natural selection. According to this evolutionary perspective, emotions constitute an interrelated suite of physiological and behavioural systems that have guided adaptive action over evolutionary time. According to Tomkins (1962), specific response patterns related to emotion experience are elicited automatically by certain events. For example, a threat or danger in the perceived environment should elicit fear. Emotion responses are characterised by coordinated patterns of activity that can include physiological changes, signalling behaviours in the voice and face, subjective experience, and relevant action. For example, a fearful response includes changes in brain activity in the amygdala (a region in the brain associated with emotion processing in general but specifically with fear) (Janak & Tye, 2015). Changes in physiology during a fearful episode can manifest as an associated facial expression (i.e. widely opened eyes, eyebrows pulled upwards and drawn together, and the corners of the mouth pulled outwards), the face turning pale, sweating, and a vocal expression (e.g. a fear scream). Fear can direct our attention to the dangerous situation and facilitate adaptive action, such as fleeing. Emotions allow us to navigate life’s challenges, and each emotion is governed by its own adaptive logic.

Several theories consider emotions as distinct entities and as biologically innate (e.g. Ekman et al., 1982; Izard, 1977; Plutchik, 1980; Tomkins, 1984). A very prominent theory is the ‘basic emotion theory’ (Ekman, 1992a, b), according to which some emotions are considered universal, meaning they occur in humans across all cultures. Most theorists agree on at least six basic emotion categories: anger, disgust, fear, sadness, surprise, and happiness (Ortony & Turner, 1990). According to Tomkins (1962), each emotion has its unique affect programme, such as the example outlined above in the case of fear. Ultimately, these categories map onto distinct patterns of activity shaped by evolutionary processes to solve different kinds of adaptive problems faced by our highly social hominin ancestors. Research has provided evidence for distinct physiological patterns on the basis of heart rate, temperature, and electrodermal activity for the six basic emotions, and these varying physiological patterns can be linked to functions of emotions on a behavioural level (as proposed by Darwin). In a state of anger, the body prepares for fighting by increasing blood flow to the hands (Levenson et al., 1990). In a state of fear, blood flow to large skeletal muscles increases, which prepares the body for a flight reaction (Levenson et al., 1990). A state of disgust leads to a rejection of the eliciting stimulus by restricting airflow to olfactory receptors and triggering a gag reflex (Koerner & Antony, 2010). A state of sadness results in a loss of muscle tone (Oberman et al., 2007), slowing us down and allowing us to focus on the issue that induced the sadness (Wolpert, 2008). A state of happiness increases the energy available to the organism through the release of the respective transmitters (Uvnäs-Moberg, 1998). A surprised state results in air being quickly inhaled, which increases the ability to react fast (Ekman & Friesen, 1975), as it interrupts ongoing processes (Tomkins, 1962). Even participants’ subjective understanding (i.e. conceptualisation) of emotion reflects distinct patterns for each of the six basic emotions. When participants were asked to colour in the body parts they perceived as affected by an increase or decrease in sensation for each of the six basic emotions, the results were in line with the associated physiological changes outlined above (see Nummenmaa et al., 2014). Neuroscientific research has shown that the distinctiveness of emotions is also evident in brain activity patterns. Vytal and Hamann (2010) conducted a neuroimaging meta-analysis and found distinct patterns of neural correlates for anger, disgust, fear, happiness, and sadness. The evidence presented here supports the assumption that there are distinct emotion response patterns, at least for the basic emotions.

One alternative view is that emotions can be characterised as the integration of at least two fundamental dimensions: valence and arousal (Russell, 1980). Russell (1994) views the dimensions of valence and arousal as universal to emotions but questions the universality of distinct emotion categories. The valence dimension spans from negative (i.e. unpleasant) to positive (i.e. pleasant). The arousal dimension ranges from low (i.e. deactivated) to high (i.e. activated). Any affective state can be represented as a combination of these two dimensions. Multidimensional scaling thus reveals similarities and dissimilarities between affective states. For example, sadness is an emotion considered negative in valence and low in arousal, whereas anger is also considered negative in valence but high in arousal. As such, the dimensional conceptualisation of affect and the categorisation of emotions are not mutually exclusive and can actually complement each other (see Harmon-Jones et al., 2017). However, it should be noted that not all affective states are emotions, whereas all emotions are affective states. For example, longer-lasting affective states are called ‘moods’, whereas emotions are rather short-lasting; other affective states, such as confusion and boredom, overlap with cognitive states.

Facial Emotional Expressions

The changes occurring throughout the body in an emotional state, such as a face turning pale in a state of fear, are visible to an observer and provide information about the affective state. Moreover, some physiological changes during the experience of emotion result in movement. For example, the activation of facial muscles leads to facial movement manifesting as facial expressions. Unlike skeletal muscles in the human body, which are generally attached to bones, facial muscles also attach to each other or to the skin of the face. This anatomical set-up allows even slight contractions of facial muscles to pull the facial skin and create a facial expression visible to others. Humans generally have 43 facial muscles, although this number can vary between people (Waller et al., 2008). This large concentration of muscles in a narrowly defined space (i.e. the face) allows for the execution of many different facial movements and results in various expressions. The Facial Action Coding System (FACS; Ekman & Friesen, 1978; new edition: Ekman et al., 2002) is an anatomical catalogue describing all movement-related facial actions (i.e. action units (AUs)) possible in humans. As a result, FACS has become a widely used tool in facial emotion research.

For emotional facial expressions to send an interpretable signal and serve as a means of communication, the emotion needs to be expressed in a certain way for it to be clearly attributable to a specific emotion. Ekman et al. (2002) provided suggestions for AU combinations that align with basic emotional expressions. For example, the activations of AU 9 (nose wrinkle), AU 10 (upper lip raise), and AU 25 (lips parted) together result in a facial expression displaying disgust. Since facial actions as outlined by the AUs are the result of facial muscle activations, facial muscles can be linked to specific AUs. Sticking with the example of disgust, the activation of the levator labii muscle leads to a wrinkling of the nose and a raised upper lip. The connection between facial actions and muscles thus also provides the link to specific emotions. Table 17.1 shows the six basic emotions with associated AUs and facial muscles. The facial expressions resulting from AU activations per emotion category are considered prototypical and align with the universality assumption of basic emotions as proposed by Ekman (Ekman & Friesen, 1971). When participants are shown images (or videos) displaying these prototypes, attributions of the respective emotion label are generally high. Most facial emotion recognition research utilises prototypes of basic facial emotional expressions, and many stimulus sets including these prototypes have been developed for these purposes (e.g. Ekman & Friesen, 1976; Krumhuber et al., 2013; Matsumoto & Ekman, 1988; Tottenham et al., 2009; Van Der Schalk et al., 2011; Wingenbach et al., 2016; Young et al., 2002).

Table 17.1 Basic emotions with associated AUs and facial muscles

As mentioned above, there are inter-individual differences in humans regarding their number of facial muscles. This variability raises the question of how prototypical displays of facial emotion are possible. Would it not require a standard set of facial muscles to produce expressions specific to basic emotions (as presented in Table 17.1)? To address this question, Waller et al. (2008) investigated whether the facial muscles underlying facial movements associated with facial emotional expressions of basic emotions are affected by inter-individual variability. These researchers dissected recent human cadavers and documented whether specific facial muscles were absent or present and whether this was the case for both sides of the face. The facial muscles investigated were the frontalis, orbicularis oculi, zygomaticus major, depressor anguli oris, orbicularis oris, procerus, corrugator supercilii, zygomaticus minor, buccinator, mentalis, depressor labii inferioris, risorius, levator labii superioris, levator labii superioris alaeque nasi, nasalis, and depressor septi. The first five facial muscles of this list were considered essential for the production of facial emotional expressions associated with the expression of basic emotions by Waller et al. (2008). Their results showed that the facial muscles assumed to be necessary to produce basic facial emotional expressions were present, mostly bilaterally, in all of the dissected cadavers. In addition, muscles commonly associated with the expression of basic emotions (as outlined in Table 17.1) were, although not always bilaterally, present in all cadavers, i.e. the corrugator, mentalis, depressor labii inferioris, and both levator labii muscles. The other facial muscles investigated were not present in all cadavers, and many were only present unilaterally. These findings support the universality assumption of basic emotions, at least in terms of facial expressions.

Investigating Facial Emotional Expressions Using Facial EMG

In some instances, participants’ facial expressions are video-recorded while they are undergoing an experiment, and the recorded facial expressions are subjected to analyses. The FACS (Ekman & Friesen, 1978) can be used to code the presence of specific facial AUs, and a combination of certain facial AUs can be indicative of the presence of a specific facial emotion. For example, the co-presence of AU 6 (raising the cheek) and AU 12 (pulling lip corners outwards) would indicate the presence of a facial expression of happiness. Applying this method requires FACS training and is subject to inter-individual perceptual differences. For these reasons, automated facial action coding software has been developed based on FACS (e.g. FaceReader). When using the FaceReader software, video recordings of faces can be imported, and the software output provides coded AUs as well as timings for the six basic emotions, valence, and arousal values. However, good video quality and clearly visible faces are necessary for automatic detection of AUs/emotions, and thus, trained human decoders can outperform the software.
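To make this coding logic concrete, the following minimal Python sketch matches coded AUs against prototype AU combinations. It is an illustration only: the mapping contains just the two combinations mentioned in this chapter, whereas a full implementation would cover all prototypes in Ekman et al. (2002).

```python
# Minimal sketch: mapping coded AUs to basic emotion prototypes.
# Only the two AU combinations mentioned in this chapter are included.
PROTOTYPES = {
    "happiness": {6, 12},    # AU 6 (cheek raise) + AU 12 (lip corners pulled outwards)
    "disgust": {9, 10, 25},  # AU 9 (nose wrinkle) + AU 10 (upper lip raise) + AU 25 (lips parted)
}

def match_emotions(coded_aus):
    """Return all prototype emotions whose AU combination is fully present
    in the set of AUs coded for a given frame or image."""
    return [emotion for emotion, aus in PROTOTYPES.items() if aus <= coded_aus]

print(match_emotions({6, 12, 25}))  # ['happiness']
```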

Whether AUs are coded by humans or by software, visible movements are required for an AU to be coded. An alternative method for investigating facial emotional expressions is facial electromyography (EMG). A great advantage of facial EMG is that it is a highly sensitive method able to ascertain the slightest contractions in facial muscles. Since fatty tissue and skin cover the muscles in the face, very slight muscle contractions are not necessarily visible to the naked eye but do occur nonetheless during the processing of emotion-related stimuli or the presence of emotion. It should be noted that emotional states are not always expressed, as their expression often has a communicative or signalling function (Fridlund, 1994) that does not apply to all emotion-inducing situations. However, facial muscle contractions non-visible to observers are measurable using facial EMG (Cacioppo et al., 1986). Consequently, facial EMG can also detect facial muscle activity congruent with the affective state even when participants are instructed to suppress their emotional expression (Cacioppo et al., 1992).

So, how does facial EMG work? Whenever muscles are contracted, electricity is generated through the combined action potentials of the active motor units, measured in either millivolts (mV) or microvolts (μV). These action potentials are the result of depolarisation and repolarisation at the muscle fibre membrane. When a motor nerve is excited, transmitters are released in the motor endplates, and a potential is formed in the muscle fibre (Nazmi et al., 2016). Even during a resting state when muscles are not contracted, a muscle tonus is present which can be measured with EMG. The presence of this muscle tonus is the reason why baseline measures often need to be taken, i.e. to be able to evaluate the reaction to a stimulus relative to the baseline activity; the fast nature of facial expressions makes using a prestimulus baseline necessary. Two detecting electrodes are needed to assess the electricity in one muscle: one negative electrode (VIN-) and one positive electrode (VIN+). An additional electrode is used as a reference point, i.e. a ground electrode. There are two different kinds of electrodes for EMG. Needle electrodes are more commonly used within medical settings, and surface electrodes (which are non-invasive) are generally used in psychological studies. This is because surface electrodes do not require medical training, carry no risk of infection, and cause less discomfort. It should be noted, though, that surface electrodes are not necessarily muscle-specific, as they can pick up muscle activity from a greater area than the confined area around a needle insertion point. Thus, it is advised to speak of facial muscle sites instead of specific muscles when measuring facial EMG. Guidelines on using facial EMG were published by Fridlund and Cacioppo (1986), which are still considered the gold standard today.
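As an illustration of the prestimulus baseline logic, the following Python sketch expresses post-stimulus EMG activity relative to the mean activity in a window immediately preceding stimulus onset. The window lengths, sampling rate, and fabricated data are assumptions for demonstration, not prescribed values.

```python
import numpy as np

def baseline_corrected_response(emg, fs, stim_onset_s, baseline_s=0.5, response_s=1.0):
    """Express rectified EMG after stimulus onset relative to the mean
    activity (muscle tonus) in a prestimulus baseline window.

    emg          : 1-D array of rectified EMG from one bipolar channel (microvolts)
    fs           : sampling rate in Hz
    stim_onset_s : stimulus onset in seconds from recording start
    """
    onset = int(stim_onset_s * fs)
    baseline = emg[onset - int(baseline_s * fs):onset].mean()  # resting tonus estimate
    response = emg[onset:onset + int(response_s * fs)]
    return response - baseline  # positive values = increase over resting tonus

# Usage with fabricated data: 2 s of rectified EMG sampled at 1000 Hz
rng = np.random.default_rng(0)
emg = np.abs(rng.normal(2.0, 0.5, 2000))  # ~2 microvolt resting tonus, no response
corrected = baseline_corrected_response(emg, fs=1000, stim_onset_s=1.0)
print(round(corrected.mean(), 3))  # close to 0: no activity change relative to baseline
```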

Investigating Affect and Emotion Using Facial EMG

Affective states are associated with physiological responses across the body as described earlier, so one obvious use of facial EMG within emotion research is to investigate the presence of these affective states. Physiological measures such as the electrocardiogram and galvanic skin response have long been applied when examining affect or specifically affective arousal (Alexander & Adlerstein, 1958; Block, 1957; Dimascio et al., 1957; Goldstein et al., 1965; Kaiser & Roessler, 1970; Oken, 1962; Vogel et al., 1958). Whereas most physiological measures are useful tools to measure affective arousal, they do not allow one to easily identify the valence of the experienced affective state. But in the 1970s, researchers started to use facial EMG and demonstrated its usefulness for differentiating affective states based on valence. For example, Schwartz et al. (1976) instructed participants to imagine happy, sad, and angry situations. The researchers distinguished between sad and happy states based on measurements from the corrugator and zygomaticus muscle sites. Cacioppo et al. (1986) demonstrated that, based on measurements of the corrugator and zygomaticus facial muscle sites, mildly and moderately experienced affect can be differentiated according to its valence and also its intensity. It should be noted that the resulting facial muscle activity in Cacioppo et al. (1986) was mainly covert (i.e. not visible), again highlighting the sensitivity of facial EMG. Such research findings underpin the association of corrugator muscle site activity with negative affect (i.e. frowning) and of zygomaticus site activity with positive affect (i.e. smiling).

Published research thus far has most often investigated the corrugator and zygomaticus facial muscle sites, despite there being at least five muscles that are considered essential for the facial expression of basic emotions (see Waller et al., 2008). A reason for this preference could be that a rudimentary differentiation of stimuli as either positive or negative is considered the first process to occur when faced with affective stimuli (Zajonc, 1980); focusing on these two sites allows investigations to include a variety of stimuli of positive or negative valence without having to sort the stimuli into distinct emotion categories. The categorisation and interpretation of an affective stimulus in terms of specific emotion categories is often difficult. For example, a visual stimulus such as a static picture or a movie scene is often complex and can elicit a range of emotions. For instance, a scene of a bully physically attacking a person (from the film My Bodyguard) can elicit disgust and contempt for the bully and anger (and/or sadness) about the situation (see Gross & Levenson, 1995). The general responsiveness of the corrugator and zygomaticus muscle sites to negative and positive valence stimuli, respectively, overcomes this difficulty and makes them the standard choice within facial EMG research related to affect and emotion.

Another potential reason for not generally including multiple facial muscle sites in facial EMG research is the issue of ‘crosstalk’. That is, when neighbouring facial muscle sites are investigated, electrode pairs are necessarily placed close to one another. It is possible that an electrode pair over a non-activated muscle site records some of the activity from an adjacent activated muscle site, thus confounding results (Farina et al., 2004). Challenges like this might be one reason researchers generally measure only a few facial muscle sites that are not in close proximity to one another. The corrugator and zygomaticus facial muscle sites are sufficiently distant from one another not to create crosstalk and also do not tend to activate simultaneously. Technological advances, however, have led to the recent development of smaller electrodes (i.e. with an outer diameter of <1 cm) which, when placed carefully, can increase the number of electrode pairs used while still minimising possible crosstalk.

Corrugator and zygomaticus muscle sites are standard in facial EMG research, but many studies have included more facial muscle sites. For example, Vrana (1993) investigated multiple facial muscle sites to discriminate varying emotion experiences based on facial EMG. This researcher employed an imagery technique to have participants experience disgust, anger, pleasure, and joy while facial muscle activity was measured from the levator labii, corrugator, and zygomaticus sites. Results showed (1) higher activity in the levator site during disgust imagery than during anger imagery, (2) greater corrugator site activity during disgust and anger imagery compared to pleasure and joy imagery, and (3) increased zygomaticus site activity during joy imagery compared to anger, disgust, and pleasure imagery. This approach of comparing various emotion categories to each other based on the facial EMG activity at one muscle site is very common in facial EMG research. The approach is based on the assumption that specific facial action activation is indicative of a specific emotion, such as a wrinkled nose resulting from levator labii activation during the expression of disgust. However, facial emotional expressions generally include more than one facial feature activation, and some emotion categories share facial features. For example, corrugator activation is associated with facial expressions of anger, sadness, and fear (see Table 17.1) based on the overlapping facial feature of eyebrows pulled together. Such overlaps can make it difficult to draw precise conclusions about specific emotions based on individual muscle sites.

An alternative to investigating one facial muscle site per emotion category is to examine co-activations across several facial muscle sites for each emotion category. According to basic emotion theory, patterns of facial muscle activity should distinguish well between emotion categories. Fridlund et al. (1984) instructed participants to imagine situations related to feeling happiness, fear, anger, and sadness but also to pose the respective expressions while facial muscle activity was measured using EMG from the zygomaticus, corrugator, orbicularis oris, and orbicularis oculi sites. Their results showed that these emotion categories were differentiated from each other in valence based on facial EMG patterns across muscles for some, but not all, participants. But multiple emotions can be experienced during imagery, and there is significant inter-individual variability in displaying posed emotional expressions, both of which pose important limitations for this methodological approach.

Studies presented thus far involve participants imagining emotional situations and measuring aspects of their resultant emotional experience. However, facial reactions can also be measured as a participant’s affective response to visual or auditory affective stimuli. For example, Larsen et al. (2003) presented participants with pictures, sounds, and words of positive and negative affective content and measured the zygomaticus and corrugator facial muscle sites while participants reported their affective states. A relationship was found between self-reported valence ratings and facial EMG activity. Positive valence ratings were associated with activity in the zygomaticus muscle site and negative valence ratings with corrugator site activity. Facial reactions to emotional facial expressions can likewise be assessed using EMG. Dimberg (1988) presented happy and angry facial expressions to participants and measured corrugator and zygomaticus site activity as well as heart rate. Increased corrugator site activity, heart rate deceleration, and more subjective experiences of fear were found in response to angry stimuli compared to happy stimuli. Conversely, increased zygomaticus site activity and more subjective experiences of happiness were found in response to happy stimuli. A wide range of stimulus types with varying intensities can be used in research on the experience and expression of affect and emotion, with responses measured using facial EMG.

Investigating Emotion-Related Processes Using Facial EMG

The sensitivity of facial EMG in detecting facial muscle activity is of particular importance when examining phenomena that are difficult to observe with other approaches. For example, consider the investigation of covert facial mimicry. When we see a facial emotional expression, it is very likely that the muscles in our own face will become subtly activated in a manner that matches the observed expression. This phenomenon is commonly termed ‘facial mimicry’ and was first reported by Dimberg (1982), who investigated facial EMG from the zygomaticus and corrugator muscle sites while participants observed pictures of facial emotional expressions of anger and happiness. The results showed greater zygomaticus site activity in response to happiness than anger expressions and greater corrugator site activity in response to anger than happiness expressions. This phenomenon has since been replicated numerous times for the zygomaticus and corrugator muscle sites (for a review, see Hess & Fischer, 2013). These authors also list facial EMG studies in which additional muscles were investigated in facial mimicry. For example, the levator labii muscle site has been reported to respond to observing facial expressions of disgust (Lundqvist, 1995; Lundqvist & Dimberg, 1995; Murata et al., 2016; Oberman et al., 2007; Rymarczyk et al., 2016) and the lateral frontalis muscle site to expressions of fear (Lundqvist, 1995; Rymarczyk et al., 2016) and surprise (Lundqvist, 1995; Lundqvist & Dimberg, 1995; Murata et al., 2016). Nonetheless, the evidence is rather limited for matched facial muscle activation in observers for muscle sites other than the zygomaticus or corrugator.

The facial mimicry studies listed above generally investigated emotion-specific facial muscle activation in individual facial muscle sites for multiple emotion categories. As described earlier, some facial muscles are involved in the expression of various emotions (see Table 17.1). The corrugator muscle constitutes a prime example, as it is involved in many expressions of negative affect and emotion. Thus, a different approach to showing differential facial muscle activation related to facial mimicry is to investigate facial EMG across several muscles and consider the emerging activation patterns per emotion category, similar to the approach taken by Fridlund et al. (1984). Wingenbach et al. (2020) measured facial EMG from the corrugator, zygomaticus, depressor, levator, and frontalis facial muscle sites while participants watched dynamic facial expressions of the six basic emotions as well as the more complex emotions of contempt, pride, and embarrassment, and neutral facial expressions (i.e. blank stares). The expected activation per muscle site, based on previous work on facial emotional expressions, was prespecified (as contrast coefficients) for each emotion category and treated as a pattern (see Table 1 in Wingenbach et al., 2020: https://www.nature.com/articles/s41598-020-61563-5/tables/1). The measured EMG data across facial muscle sites per emotion category were compared to the theory-based expected patterns to investigate facial mimicry per emotion category. To test for distinctiveness, the measured EMG pattern of each emotion category was also contrasted with the expected patterns of the other emotion categories of the same valence category (positive, neutral, or negative). Results showed that the measured EMG data matched the expected patterns for most tested emotions. Additionally, the measured EMG patterns for individual emotion categories were distinct within their own valence category for most tested emotions (see Figure 3 in Wingenbach et al., 2020). That is, the measured EMG data better fit the expected patterns of the target emotions than the expected patterns of non-target emotions of the same valence. These findings suggest that facial mimicry is a categorical mirroring of the observed facial emotional expression.
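The logic of this pattern-based analysis can be sketched in a few lines of Python. The contrast weights below are illustrative placeholders, not the coefficients used by Wingenbach et al. (2020), and the simple correlation-based fit score stands in for the authors' actual statistical contrasts.

```python
import numpy as np

# Illustrative contrast weights per muscle site, ordered as:
# corrugator, zygomaticus, levator, depressor, frontalis.
# Positive = expected increase relative to baseline; negative = expected decrease.
EXPECTED = {
    "happiness": np.array([-2.0, 3.0, -1.0, 1.0, -1.0]),
    "anger":     np.array([3.0, -1.0, -1.0, 0.0, -1.0]),
    "disgust":   np.array([1.0, -1.0, 3.0, 0.0, -1.0]),
}

def pattern_fit(measured, weights):
    """Correlate measured baseline-corrected EMG means across muscle sites
    with an expected contrast pattern; higher values = better pattern match."""
    return np.corrcoef(measured, weights)[0, 1]

# Fabricated mean EMG changes per site while participants viewed happy faces
measured_happy = np.array([-0.04, 0.08, -0.01, 0.02, -0.02])

fits = {emotion: pattern_fit(measured_happy, w) for emotion, w in EXPECTED.items()}
print(max(fits, key=fits.get))  # best-fitting pattern, here 'happiness'
```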

As many studies have now demonstrated, facial EMG can be a useful tool for emotion-specific investigations. Moreover, facial EMG can also be used to investigate variations in facial expressions within an emotion category. For example, research has shown that subtle variations in kinds of smiles are mimicked by observers (Korb et al., 2014; Krumhuber et al., 2014). These researchers recorded facial muscle activity from the corrugator, orbicularis oculi, and zygomaticus sites while participants viewed dynamic displays of various smiles operationalised as variations of AU combinations. These variations are possible because facial expressions of emotion can be posed volitionally, and such posed expressions often differ from spontaneous felt expressions in terms of included AUs. Moreover, judges can reliably discriminate between posed and felt facial expressions (e.g. McLellan et al., 2010). Results from Korb et al. (2014) showed that the recorded EMG activity corresponded with the AUs displayed in the stimuli, demonstrating feature-specific mimicry, similar to the results by Wingenbach et al. (2020). Such findings of specificity in facial muscle activation, in line with the observed stimulus, hint at facial mimicry being a mirroring of the stimulus content rather than an affective reaction to the stimulus, although more research is needed to examine this issue.

Facial EMG can also be used to differentiate between participants’ felt and posed facial expressions of emotion. This differentiation is based on divergent temporal characteristics in posed and spontaneous facial expressions (Ekman & Friesen, 1982). For example, spontaneous smiles have a longer duration than posed smiles (Schmidt et al., 2006). Hess et al. (1988) instructed participants to pose or feel happiness and measured facial muscle activation across the zygomaticus, depressor anguli oris, corrugator, and masseter muscle sites. Temporal aspects of the facial EMG measurements (i.e. time mean, time variance, time skewness, and time kurtosis; Cacioppo et al., 1983) distinguished between posed and felt smiles. Such research findings demonstrate that facial EMG is a useful tool in assessing not only participants’ different expressions across elicitation conditions but also their defining characteristics.
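One plausible way to formalise such temporal descriptors, under the assumption that the rectified EMG envelope can be treated as a distribution over time, is to compute its temporal moments, as in the Python sketch below. This follows the general logic rather than the exact formulation of Cacioppo et al. (1983).

```python
import numpy as np

def temporal_moments(envelope, fs):
    """Treat a non-negative EMG amplitude envelope as a distribution over time
    and return its temporal mean, variance, skewness, and kurtosis."""
    t = np.arange(envelope.size) / fs  # time axis in seconds
    p = envelope / envelope.sum()      # normalise the envelope to unit mass
    mean = np.sum(t * p)
    var = np.sum((t - mean) ** 2 * p)
    skew = np.sum((t - mean) ** 3 * p) / var ** 1.5
    kurt = np.sum((t - mean) ** 4 * p) / var ** 2
    return mean, var, skew, kurt

# Fabricated zygomaticus envelopes: a smooth, long-lasting 'felt' smile versus
# a 'posed' smile with abrupt onset and offset
fs = 100
t = np.arange(0, 4, 1 / fs)
felt = np.exp(-((t - 1.5) ** 2) / 0.8)
posed = np.where((t > 0.5) & (t < 1.5), 1.0, 0.01)
print(temporal_moments(felt, fs))   # differs from the posed smile across the moments
print(temporal_moments(posed, fs))
```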

Based on EMG’s high temporal resolution, it is further possible to identify the onset and offset of an expression and to illustrate the development of an expression (e.g. identifying the peak). Achaibou et al. (2008) segmented the recorded signal of facial muscle activity in the zygomaticus and the corrugator in response to observing expressions of happiness and anger (i.e. a facial mimicry paradigm) into 100 ms epochs. Facial muscle response onsets were defined by comparing the mean facial muscle activity per epoch in response to happy and angry facial emotional expressions to one another (per muscle). The onset of corrugator activity in response to observing angry facial expressions was found at 200 ms after stimulus onset, whereas the onset of zygomaticus activity in response to happy facial expressions was found at 500 ms after stimulus onset. These findings suggest that the corrugator is activated more quickly. Angry expressions might be processed more rapidly than happy expressions, which could serve an evolutionarily adaptive function. It is further possible that the corrugator is involved in the (stimulus-unspecific) orienting response (Dimberg, 1982) preceding the mimicry response. Moreover, since this study used morphed dynamic stimuli, which create artificial facial movements, it remains to be seen whether these timing differences also occur when participants view video-recorded facial emotional expressions that retain the natural temporal characteristics of facial emotional expressions. Investigations of the onsets of facial muscle activity when participants observe static facial emotional expressions have not shown differing onsets in the EMG signal in response to the stimuli (Dimberg & Thunberg, 1998).
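A simplified sketch of this epoching logic is shown below. The onset criterion used here, the first epoch in which one condition's mean exceeds the other's, is a deliberately naive assumption; Achaibou et al. (2008) used their own statistical comparison per epoch.

```python
import numpy as np

def epoch_means(trial, fs, epoch_s=0.1):
    """Average rectified EMG within consecutive 100 ms epochs of one trial."""
    samples = int(epoch_s * fs)
    n_epochs = trial.size // samples
    return trial[:n_epochs * samples].reshape(n_epochs, samples).mean(axis=1)

def response_onset(cond_a, cond_b, fs, epoch_s=0.1):
    """Return the time (in seconds) of the first epoch in which the mean
    activity in condition A exceeds that in condition B.

    cond_a, cond_b : arrays of shape (n_trials, n_samples), rectified EMG
    """
    mean_a = np.mean([epoch_means(tr, fs, epoch_s) for tr in cond_a], axis=0)
    mean_b = np.mean([epoch_means(tr, fs, epoch_s) for tr in cond_b], axis=0)
    above = np.nonzero(mean_a - mean_b > 0)[0]
    if above.size == 0:
        return None  # condition A never exceeds condition B
    return above[0] * epoch_s

# e.g. response_onset(corrugator_angry_trials, corrugator_happy_trials, fs=1000)
# might return 0.2 for a corrugator onset 200 ms after stimulus onset
```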

We have now seen application possibilities of facial EMG to assess the experience of affect and emotion, posed expressions of emotion, and responses related to the processing of stimuli of emotional content (e.g. facial expressions, words, sounds). Another application possibility is using facial EMG as a manipulation check. Some investigations include the manipulation of facial muscle activation in participants, and facial EMG can demonstrate the success of the manipulation. Examples of the manipulation of facial muscle activation are biting on a pen or holding a pen with the lips (e.g. Oberman et al., 2007; Wingenbach et al., 2018) or imitating observed facial expressions (e.g. Wingenbach et al., 2018). In Wingenbach et al. (2018), participants solved a facial emotion recognition task across two conditions with manipulated facial muscle activation (i.e. explicit imitation and holding a pen in the mouth), alongside a control condition with no manipulation, while five different facial muscle sites were measured across the face. Participants showed increased activity (compared to the control condition) in all five facial muscle sites in the explicit imitation condition. The pen-holding condition showed the highest activity in the electrodes placed below the left mouth corner (see Figure 2 in Wingenbach et al., 2018). The measured facial muscle activity thus showed the pattern intended by the manipulations, and the facial EMG results served to verify the method. The study further showed, based on accuracy rates, that an incongruence between visual input (the facial emotional expression in the stimuli) and motor action (activity induced under the mouth corner by pen-holding) hampered the recognition of facial emotional expressions with feature saliency in the lower part of the face/mouth region (here, disgust, happiness, embarrassment, contempt, and pride). Emotional expressions with feature saliency in the lower part of the face all include lip movement either outwards or upwards, which is inhibited by the pressing of the lips induced by the pen-holding. Participants’ lowered recognition rates might be due to a conflict between the facial muscle movement observed in the stimuli and muscular feedback to the brain, which might also be part of a representation of the observed emotion (for more information on the embodiment of emotion, see Niedenthal, 2007). Thus, not only can facial EMG serve as a means to verify applied facial muscle manipulations, but its results can also inform the interpretation of obtained behavioural results (e.g. recognition rates), from which new theoretical insights might be gained.

Challenges of Using Facial EMG and How to Overcome Them

So far, this chapter has highlighted the many strengths of facial EMG and some possible applications in research. Its most notable advantages are (1) increased objectivity relative to self-reports, (2) high sensitivity in detecting small muscle activations, and (3) high temporal resolution allowing for the assessment of rapidly changing activations characteristic of facial expressions. Nonetheless, facial EMG also comes with challenges. It is well-known that awareness of the purpose of a measure or the hypotheses of a study can alter participants’ behaviour. To avoid potential influences on the obtained EMG data, it is customary to keep participants blind to the true purpose of the electrodes. This can be achieved by using a cover story in the instructions provided to participants, such as telling them that the electrodes measure the temperature of various parts of the face. It is also possible that participants alter their natural facial behaviour simply because they have electrodes attached to their face. Some participants report during attachment that they are afraid the electrodes will come off, and others report that they feel restricted in their movements. These challenges can be overcome by ensuring proper electrode attachment (e.g. thorough cleaning of the skin) and asking participants to make grimaces to demonstrate that the electrodes are securely attached. Generally, participants habituate to the electrodes quickly and soon no longer actively notice them. Acceptance of having electrodes attached to the face is generally high, as participants do not perceive the electrodes as disturbing or restricting (Wingenbach, 2010).

Facial muscles are rather small, and the guidelines for electrode placement must thus be carefully followed. Misplacing an electrode by just 1 cm makes it likely that non-targeted muscles are being recorded. While assessing facial muscle activity from multiple facial sites has numerous advantages, one should be aware of the potential for crosstalk between EMG sites. Researchers should make sure to have sufficient distance between electrode pairs; the smaller the electrodes, the better. Since facial EMG electrodes measure electricity, they are affected by ambient electromagnetic fields, which create noise in the data; this noise can be minimised by collecting data within a Faraday cage. It is further recommended to use shielded electrodes, keep electrical devices in the laboratory to a bare minimum, and apply a notch filter to the recorded signal. Moreover, further filtering of the EMG data is necessary (e.g. high pass, low pass, moving average), and spike artefacts should be eliminated (e.g. see Wingenbach et al., 2020). Movement artefacts, for example from sneezing, coughing, scratching, and yawning, are common during EMG recordings. Since these artefacts cannot easily be separated from the rest of the signal based on visual inspection, it is recommended to observe participants via camera, take notes including exact timings, and exclude those segments from data analysis. Every face is anatomically different, which includes variations in fatty tissue and muscle size. As a consequence, the recorded strength of the EMG signal shows high inter-individual variability, in addition to variability in responsiveness per se. To tackle this challenge, normalisation of the EMG data per participant is recommended (e.g. see Wingenbach et al., 2020).
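As an illustration of such a preprocessing chain, the Python sketch below applies a band-pass filter, a mains notch filter, full-wave rectification, and a moving-average envelope using SciPy. The cut-off values are common defaults adopted here as assumptions, not the settings of any particular study.

```python
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch

def preprocess_emg(raw, fs, band=(20.0, 400.0), mains=50.0, smooth_s=0.02):
    """Typical facial EMG preprocessing: band-pass filtering, notch filtering
    at the mains frequency, full-wave rectification, and moving-average
    smoothing. Cut-offs are common defaults, not prescriptions."""
    # Band-pass to remove slow drifts (e.g. movement) and high-frequency noise
    b, a = butter(4, band, btype="bandpass", fs=fs)
    x = filtfilt(b, a, raw)
    # Notch filter against 50 Hz mains interference (60 Hz in some countries)
    b, a = iirnotch(mains, Q=30.0, fs=fs)
    x = filtfilt(b, a, x)
    # Full-wave rectification followed by a moving-average envelope
    x = np.abs(x)
    win = max(1, int(smooth_s * fs))
    return np.convolve(x, np.ones(win) / win, mode="same")

# e.g. envelope = preprocess_emg(raw_corrugator_signal, fs=1000)
```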

Many investigations using facial EMG opt to z-standardise each participant’s data before entering it into analyses. This is done for each measured facial muscle site across all experimental conditions but individually per participant. While this is indeed a legitimate approach to make the data comparable between participants, researchers are urged to consider the implications of z-standardisation for their results and whether the posed research question can be answered with z-standardised data. For example, should researchers wish to investigate whether there was an increase in facial muscle activity in response to a stimulus, then z-standardisation should not be used. Z-standardisation scales the mean activity of one channel (i.e. facial muscle site) across all trials to zero. Resulting positive z-values are thus to be interpreted as higher-than-average activity in response to a specific stimulus and negative z-values as lower-than-average activity. Care must thus be taken when interpreting these kinds of results. This problem is exemplified in a recent study by Wingenbach et al. (2020). The corrugator facial muscle site did not show an increase in activity in response to anger facial expression stimuli after a prestimulus baseline correction based on the non-standardised data. However, after z-standardisation of the corrugator site, positive z-values were obtained in response to anger facial expression stimuli (compare the third and fourth columns in Fig. 17.1). That is, the corrugator site showed higher-than-average activity in response to anger facial expressions compared to the other stimulus categories included in the task. But the resulting positive and negative z-values did not represent an increase or decrease in activity, respectively, as was demonstrated by the non-standardised data, which in fact showed a decrease in activity in response to anger facial expressions. An alternative to z-standardisation is range correction, which does not alter the interpretation of the results: after prestimulus baseline correction, positive values represent an increase in activity in response to a stimulus, and negative values represent a decrease.
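The interpretive difference between the two normalisations can be made explicit in code. The range-correction variant below, which divides each participant's baseline-corrected values by that participant's observed response range per muscle site, is one plausible implementation rather than necessarily the exact procedure of Wingenbach et al. (2020); the response values are fabricated.

```python
import numpy as np

# Baseline-corrected corrugator responses of one participant across trials;
# fabricated so that every response is a *decrease* relative to baseline
responses = np.array([-0.08, -0.02, -0.05, -0.01, -0.06])

# Z-standardisation: the mean across trials becomes 0, so mild decreases turn
# into positive values ('above this participant's average')
z = (responses - responses.mean()) / responses.std(ddof=1)
print(z)   # the -0.01 trial becomes positive despite being a decrease

# Range correction: dividing by the participant's response range rescales
# amplitudes but preserves the sign of the baseline-corrected values
rc = responses / (responses.max() - responses.min())
print(rc)  # all values remain negative: activity decreased on every trial
```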

Fig. 17.1 Facial muscle responses to facial expressions of anger

Facial muscle site   Expected response   Normalised mean   Z-standardised mean
Zygomaticus          −2                  −0.02             −0.32
Depressor            −1                  −0.02             −0.07
Levator               2                   0.04              0.09
Corrugator            2                  −0.04              0.29
Frontalis            −1                  −0.03             −0.18

Note. This figure is a composite of Figures 2 and 4 in Wingenbach et al. (2020). The first column shows the five measured facial muscle sites and the second column the expected facial muscle responses when participants viewed angry facial expressions (positive values indicate an expected increase relative to a prestimulus baseline; negative values an expected decrease). The third column shows the measured facial muscle responses to angry faces; the EMG data were range-corrected, and no increase in activity occurred in the corrugator. The fourth column shows the z-standardised means, with positive z-values for the corrugator that are in fact based on a decrease in corrugator activity in response to angry faces.

Conclusion

Facial EMG is a sophisticated measurement tool that allows researchers to uncover subtle emotional components and thus deepen our understanding of emotion-related phenomena that occur in face-to-face social interaction (e.g. facial mimicry). It can also add to our knowledge of the experience and expression of emotions through faces, such as the fine-grained temporal characteristics of facial emotional signalling. Based on sociocultural norms, people sometimes suppress their emotional feelings and experiences, which can include suppressing the associated facial expression. In such cases, facial EMG can provide information about the presence of a suppressed emotion that would otherwise not be easily observable. This provides researchers with a useful alternative to self-reports, which are subjective in nature, require introspective abilities that vary across individuals, and are subject to a host of biases and normative constraints. Facial EMG further allows us to differentiate authentically felt emotion from posed affect/emotion and can uncover phenomena that we would not otherwise be aware of. Overall, facial EMG is a valuable tool that is expanding our current knowledge of the phenomena and processes associated with and underlying affect and emotion.