1 Introduction

1.1 Neural Correlates of Facial Emotion Processing

As human beings, the ability to recognize and appropriately respond to facial expressions of emotion is crucial for survival. Over the past few decades, neuroscientists have strived to better understand the neural mechanisms that underlie emotion recognition and processing [1]. Recent advances in neuroimaging techniques have made it possible to examine the neural correlates of emotion processing more closely and with improved accuracy. In this vein, facial emotions have been the focus of a significant body of research, with potential applications ranging from the clinical domain to human-computer applications [2–6].

Facial stimuli depicting emotions have been linked to activation in brain regions such as the amygdala, the basal ganglia, and occipito-temporal regions of the cortex [7, 8]. The prefrontal cortex (PFC), in particular, has been implicated as an important region that subserves facial emotion processing, perhaps in a regulatory capacity [9]. Accordingly, researchers have suggested that various sub-regions within the PFC may play distinct roles in recognizing and regulating emotions [10, 11].

1.2 Functional Near-Infrared Spectroscopy

To date, neuroimaging techniques such as functional magnetic resonance imaging (fMRI) and positron emission tomography (PET), which assess localized brain activation by monitoring task-related hemodynamic responses, have been widely used. These traditional neuroimaging techniques, however, can be prohibitively costly. Furthermore, they require individuals to remain immobile and are thus less feasible for real-world applications or more ecologically valid investigations. It is therefore necessary to explore whether novel neuroimaging techniques can be validated as practical and cost-efficient alternatives. Functional near-infrared spectroscopy (fNIRS), an optical neuroimaging technique, is becoming increasingly popular as an economically feasible and more versatile alternative to traditional neuroimaging techniques such as fMRI and PET.

fNIRS is a cost-effective and portable neuroimaging method that uses light to measure evoked relative changes in oxygenated (oxy-Hb) and deoxygenated (deoxy-Hb) hemoglobin in the cerebral cortex [12, 13]. More specifically, fNIRS introduces near-infrared light between 700 nm and 900 nm at the surface of the scalp. The backscattered light is then monitored to estimate relative changes in the concentrations of the main chromophores in the cortical tissue, namely oxy-Hb and deoxy-Hb [14]. This method of optical neuroimaging has become increasingly popular among researchers and has been successfully deployed in various research contexts to examine neural activation reliably and at minimal cost [13, 15–17]. A main advantage of fNIRS over traditional neuroimaging techniques such as fMRI and PET is its high portability. This makes fNIRS more versatile in its applications and highly accessible, allowing it to be deployed successfully in clinical applications and in settings that promise higher levels of ecological validity [12, 18]. Previous studies have highlighted the potential utility of fNIRS for examining the neural correlates of emotion perception [3, 19, 20], particularly within the PFC [21, 22]. Furthermore, a more recent study has demonstrated the multimodal use of fNIRS together with facial expression analysis [23].
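As a rough illustration of how such measurements are derived, the sketch below converts backscattered light intensities at two wavelengths into relative oxy-Hb and deoxy-Hb changes using the modified Beer-Lambert law; the extinction coefficients, source-detector separation, and differential pathlength factors are illustrative placeholders rather than values from any particular instrument or from the present study.

```python
import numpy as np

# Illustrative constants (placeholders, not instrument-specific values).
# Rows: two near-infrared wavelengths; columns: [eps_oxy, eps_deoxy].
EPSILON = np.array([[0.15, 0.37],   # ~730 nm
                    [0.27, 0.18]])  # ~850 nm
D = 2.5                      # source-detector separation (cm), assumed
DPF = np.array([6.0, 6.0])   # differential pathlength factors, assumed

def hemoglobin_changes(intensity, baseline):
    """Estimate [delta oxy-Hb, delta deoxy-Hb] from light intensities
    at two wavelengths, relative to a resting baseline."""
    # Change in optical density at each wavelength.
    delta_od = -np.log10(intensity / baseline)
    # Modified Beer-Lambert law: delta_od = EPSILON @ delta_c * D * DPF,
    # so solve the 2x2 linear system for the concentration changes.
    return np.linalg.solve(EPSILON, delta_od / (D * DPF))

# Example: intensities slightly below baseline at both wavelengths.
print(hemoglobin_changes(np.array([0.98, 0.97]), np.array([1.0, 1.0])))
```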

1.3 Present Study

Gaining a clearer understanding of the neural correlates of facial emotion perception can have important implications for a variety of Neuroergonomic applications and brain-computer interface settings [18, 24–26]. Such applications range from product usability studies at the design phase to adaptive systems for complex man-machine interfaces. Further development in these areas is only possible if practical and real-time monitoring of localized brain activity can be achieved. Furthermore, because fNIRS can be miniaturized, battery operated, wireless, and ultra-portable, it is an ideal candidate for potential in-home use [27]. Therefore, it is important to further validate the utility of this novel technology, especially with regard to its sensitivity to the neural correlates of facial emotion perception.

In understanding the neural correlates of facial emotion perception, various levels of processing can be examined. At the lowest level of processing, exploring the incidental encoding of emotional faces can be an important step in identifying neural correlates of facial emotion processing. Specifically, it would be important to determine if a given neuroimaging technique, such as fNIRS, is adequately sensitive to detect changes that occur at this lowest level of facial emotion processing.

The present study sought to evaluate whether fNIRS can be used to reliably assess neural responses during the incidental encoding of faces. More specifically, the neural correlates of the incidental encoding of neutral and fearful faces within the PFC were explored.

2 Method

2.1 Sample

Thirty-nine healthy adults, recruited through the undergraduate research participant pool at the University of Toronto Scarborough and from the Greater Toronto Area, provided written informed consent to participate in the present study. The sample consisted of right-handed participants (35 female) who were on average 28.46 years old (SD = 12.09). Upon completing informed consent, participants also completed a screening that was used to rule out serious medical conditions (e.g., severe head trauma) or psychological illness (e.g., major depressive disorder).

2.2 Incidental Facial Emotion Encoding Task

The incidental facial emotion encoding task consisted of a widely spaced event-related design, with 32 trials in total. The face stimuli consisted of 16 neutral and 16 fearful faces that were obtained from the 2D facial emotional stimuli database of the University of Pennsylvania’s Brain Behavior Laboratory [28]. The faces were of diverse ethnic origins and were arranged in a quasi-random order. Half of the face stimuli in each emotion category consisted of female faces. During each trial, a face was presented to the participant for 250 ms on a black background, and was followed by a white cross-hair fixation. Each trial had a duration of 20 s. The participants were asked to press a button with their right index finger if the face was male, and with the right middle finger if the face was female.
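Purely for illustration, the sketch below constructs a trial list with the structure described above (32 trials, 16 neutral and 16 fearful faces, half female within each emotion category, quasi-random order, a 250 ms stimulus, and a 20 s trial duration); the stimulus identifiers are hypothetical and do not correspond to the database used in the study.

```python
import random

STIM_DURATION_S = 0.250   # face presentation time
TRIAL_DURATION_S = 20.0   # total trial length (stimulus + fixation)

def build_trial_list(seed=0):
    """Build a quasi-random list of 32 trials: 16 neutral and 16 fearful
    faces, half female within each emotion category."""
    trials = [
        {"emotion": emotion, "face_sex": sex, "face_id": f"{emotion}_{sex}_{i}"}
        for emotion in ("neutral", "fearful")
        for sex in ("female", "male")
        for i in range(8)          # 8 faces per emotion-by-sex cell
    ]
    random.Random(seed).shuffle(trials)
    for trial in trials:
        trial["stim_s"] = STIM_DURATION_S
        trial["fixation_s"] = TRIAL_DURATION_S - STIM_DURATION_S
    return trials

trials = build_trial_list()
print(len(trials), trials[0])
```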

2.3 Neuroimaging Procedure

The participants were seated in a dimly lit room in front of a computer monitor and a keyboard. A fNIR Imager 1000® (fNIR Devices, Potomac, MD), a 16-channel continuous-wave fNIRS system, was used in the present study to examine neural activation. Measurements of raw light intensities were obtained at 500 ms intervals, with 1.25 cm of penetration into the cortical tissue within the PFC at 16 measurement locations [29]. Specifically, the imaging probe was positioned on the forehead, aligned with electrode positions F7, FP1, FP2, and F8 of the international 10–20 EEG system [30]. Thus, the fNIRS probe provided coverage over Brodmann areas 9, 10, 45, and 46. The raw light intensities obtained during the experiment were first visually inspected to exclude problematic channels affected by noticeable motion artifacts, and were then subjected to signal processing algorithms. Specifically, high-frequency noise as well as physiological and motion artifacts were systematically removed using low-pass finite impulse response filtering and sliding-window motion artifact rejection in the fNIRSoft® Software Package [13, 15, 31]. Activation segments of interest were demarcated by time synchronization markers. The study focused on oxy-Hb as the primary parameter of interest, and relative changes in oxy-Hb were estimated for each channel.
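For illustration only, a minimal sketch of the kind of low-pass FIR filtering used in such pre-processing pipelines is shown below; the cutoff frequency and filter order are assumptions for demonstration, and the sketch does not reproduce the fNIRSoft® routines used in the study.

```python
import numpy as np
from scipy.signal import firwin, filtfilt

FS_HZ = 2.0        # raw intensities sampled every 500 ms
CUTOFF_HZ = 0.1    # assumed low-pass cutoff (demonstration value)
NUM_TAPS = 65      # assumed FIR filter order (demonstration value)

def lowpass_fir(signal, fs=FS_HZ, cutoff=CUTOFF_HZ, numtaps=NUM_TAPS):
    """Zero-phase low-pass FIR filtering of a single fNIRS channel."""
    taps = firwin(numtaps, cutoff, fs=fs)
    return filtfilt(taps, [1.0], signal)

# Simulated channel: a slow hemodynamic drift plus high-frequency noise,
# spanning several 20-s trials' worth of samples.
t = np.arange(0, 300, 1.0 / FS_HZ)
raw = 0.5 * np.sin(2 * np.pi * 0.02 * t) + 0.1 * np.random.randn(t.size)
filtered = lowpass_fir(raw)
print(raw.shape, filtered.shape)
```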

2.4 Statistical Analyses

Multilevel models, which nested the time-series of observations within participants, were estimated for each channel [32]. These models accounted for the possibility that the number of observations was unbalanced across participants. All models were conservatively estimated using an unstructured covariance matrix and the Satterthwaite method of estimating degrees of freedom [33]. The two emotion conditions were compared using contrast codes that examined the difference between the fearful and neutral conditions (fearful coded as ½, and neutral coded as −½). Type I error was controlled using False Discovery Rate corrections for multiple comparisons [34, 35]. Data analyses were conducted using IBM SPSS Statistics Version 20.0 (IBM Corp., Armonk, NY). Additionally, time-series graphs for each channel were created for both conditions, with 95 % confidence intervals (CI) mapped for the purpose of further examining the differences in neural responses.
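The sketch below outlines one way a comparable per-channel analysis could be approximated in Python, with a mixed-effects model per channel, the ½ vs. −½ contrast code, and Benjamini-Hochberg False Discovery Rate correction across the 16 channels; the column names are hypothetical, and the default random-intercept model shown does not reproduce the unstructured covariance matrix or Satterthwaite degrees of freedom available in SPSS.

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.multitest import multipletests

def channel_contrasts(df):
    """Fit a mixed model per channel and FDR-correct the condition effects.

    Expects columns 'participant', 'channel', 'oxy_hb', and 'condition'
    ('fearful' or 'neutral'); these names are assumptions for illustration.
    """
    # Contrast code: fearful = +0.5, neutral = -0.5.
    df = df.assign(cond_code=df["condition"].map({"fearful": 0.5, "neutral": -0.5}))
    results = []
    for channel, sub in df.groupby("channel"):
        # Observations nested within participants via a random intercept.
        model = smf.mixedlm("oxy_hb ~ cond_code", sub, groups=sub["participant"])
        fit = model.fit()
        results.append({"channel": channel,
                        "b": fit.params["cond_code"],
                        "p": fit.pvalues["cond_code"]})
    out = pd.DataFrame(results)
    # Benjamini-Hochberg False Discovery Rate correction across channels.
    out["significant"], out["p_fdr"], _, _ = multipletests(out["p"], method="fdr_bh")
    return out
```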

3 Results

3.1 Behavioral Results

Participants demonstrated equivalent reaction times during the fearful condition (M = 710.17 ms, SD = 216.73 ms) and the neutral condition (M = 700.83 ms, SD = 216.79 ms; t(76) = −.13, p = .90, d = .04). Participants also demonstrated similar accuracy across the fearful (M = 85.2 %, SD = 10 %) and neutral (M = 89.9 %, SD = 14.04 %) conditions (t(76) = 1.85, p = .07, d = .39).
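As a worked check of how the reported effect sizes relate to the summary statistics above, the sketch below recovers Cohen's d from the condition means and standard deviations using a pooled standard deviation; it is an illustrative calculation, not the original trial-level analysis.

```python
import math

def cohens_d(m1, sd1, m2, sd2):
    """Cohen's d from two group means and standard deviations
    (equal-n pooled standard deviation)."""
    pooled_sd = math.sqrt((sd1 ** 2 + sd2 ** 2) / 2)
    return (m1 - m2) / pooled_sd

# Reaction time (ms): fearful vs. neutral -> approximately 0.04.
print(round(cohens_d(710.17, 216.73, 700.83, 216.79), 2))
# Accuracy (%): neutral vs. fearful -> approximately 0.39.
print(round(cohens_d(89.9, 14.04, 85.2, 10.0), 2))
```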

3.2 Fearful vs. Neutral Activation

A significant main effect of emotion condition was found in 10 of the 16 channels. More specifically, higher activation was found in channel 12 for the fearful condition as compared to the neutral condition (b = .01, SE = .00, p < 0.01), corresponding to a region within the right medial PFC. The fearful condition, on the other hand, was associated with lower activation than the neutral condition in channels 1 to 9 (b's < −.01, SE's = .00, p's < 0.01), corresponding to a region encompassing the left medial and lateral PFC. These findings are depicted on a standard MRI template of the brain in Fig. 1, where blue and purple indicate lower activation (left PFC) and red and yellow indicate higher activation (right PFC).

Fig. 1. Activation map depicting statistically significant activation differences across the PFC for the fearful vs. neutral activation contrast. (Color figure online)

3.3 Examining the Time-Series

The neural activation patterns observed across the duration of the trial for the neutral condition and the fearful condition are depicted in Fig. 2.

Fig. 2. Activation patterns observed at 5-s intervals following neutral and fearful faces, visualized on a standard MRI template of the brain. (Color figure online)

The neutral condition appeared to be linked to a pattern of activation within the PFC that was more lateral at first and subsequently became more medial. More specifically, activation was initially seen in the more lateral aspects of the right and left PFC. Towards the end of the trial, however, this lateral activation appeared to subside and was followed by increases in activation within the left medial PFC. In contrast, the fearful condition appeared to elicit a pattern of activation within the PFC that was more medial at first and subsequently became more laterally localized. More specifically, immediately following the face stimulus, activation was observed bilaterally in the PFC, in regions that were more medial than in the neutral condition. As the trial progressed, this pattern was replaced with persistent bilateral activation that appeared to be more laterally situated than the initial response.

When comparing the time-series of activation for the neutral and fearful conditions (Fig. 3), it appeared that the neutral condition might be linked to a pattern of activation focused on the left medial and lateral PFC. Specifically, in these regions, the neutral stimuli appeared to be followed by increases in activation, whereas the fearful stimuli appeared to show the opposite trend. A potential trend towards increased activation in the fearful condition was apparent only within the right medial region of the PFC.

Fig. 3. The 20-s oxy-Hb time-series across all 16 channels for the neutral and fearful conditions, with 95 % confidence intervals (CI) shown as shaded areas around the grand mean. (Color figure online)

4 Discussion

4.1 Summary of Findings

The present study sought to explore, using fNIRS, the neural responses observed within the PFC during the incidental encoding of face stimuli. When overall activation changes were compared between the fearful and neutral conditions, higher activation was observed within the right medial PFC following fearful faces, whereas higher activation was observed within the left medial and lateral PFC following neutral faces. Examining the time-series of activation revealed that the neutral condition appeared to be characterized by an initial lateral increase in activation followed by a later medial increase. The fearful condition, on the other hand, appeared to be characterized by an initial, more medial increase in activation, followed by increases in more lateral regions of the PFC.

4.2 General Discussion

These findings appear to be in line with previous neuroimaging research. In a comprehensive meta-analysis of 105 fMRI studies, it was demonstrated that neutral face processing was linked to neural activation within an area encompassing the left medial PFC, while fearful face processing was linked to neural activation in an area encompassing the right medial PFC [8]. Results from the present study are consistent with these findings and show somewhat similar activation foci in response to neutral and fearful faces.

Results from the present study may also be consistent with the notion that the medial PFC may be an important region that subserves emotion regulation processes [10, 11], specifically in the context of perceiving threat. More specifically, it was demonstrated that the incidental encoding of fearful faces was immediately followed by a PFC response that was more medial, as compared to the neutral condition.

Importantly, the present study demonstrates that fNIRS can be deployed as an adequately sensitive neuroimaging tool with the potential to reliably examine emotion-related processes within the prefrontal cortex. This can have numerous important implications for clinical applications, brain-computer interfacing, and Neuroergonomics research overall. For example, the patterns of activation observed during the encoding of different facial emotions could potentially be used as biomarkers for studying failures of emotion perception and regulation in clinical populations. Furthermore, assessment of emotion-related processes directly from the brain can be utilized in usability studies during the design of products or adaptive systems, especially while users are engaged with them in in-home settings. Overall, being able to accurately and reliably identify neural responses to emotional stimuli can greatly aid in refining brain-computer interfaces and user experience.

4.3 Limitations

It should be noted that the present study focused primarily on the incidental encoding of static face stimuli, which may be associated with neural response patterns that differ from those elicited by the dynamic facial emotions encountered in real life [36]. It is also important to consider the impact of variables such as gender, race, and age on the neural correlates of facial emotion processing. Although the present study attempted to control for these variables by including a quasi-random selection of stimuli, future research should aim to address these questions systematically with adequately powered studies.

4.4 Future Directions

The present study aimed to examine the feasibility of fNIRS in detecting emotion related processes within the PFC. Future research can further validate these findings by replicating these results in a larger sample, across more emotion categories and other types of emotion stimuli. Given the versatility of fNIRS, future studies should also examine the feasibility of using fNIRS in more ecologically valid, complex emotion processing tasks.