Specific EEG/ERP Responses to Dynamic Facial Expressions in Virtual Reality Environments
Visual event-related potentials to facial expressions (FEs) have usually been studied with static stimuli presented after a nonspecific black screen serving as baseline. However, when studying social events, the low ecological validity of the environment and stimuli can introduce bias. Virtual reality offers a way to improve ecological validity while preserving stimulus control.
We propose a new approach to study responses to FEs. A human avatar in a virtual environment (a plaza) performs the six universal FEs over time. The setup consisted of a 3D projection system coupled with a precision position tracker. Subjects (N=6, mean age=25.6 y) wore a 32-channel EEG cap together with 3D glasses and two infrared emitters for position tracking. The environment adapted in real time to the subjects' position, producing a feeling of immersion.
Each animation consisted of an instantaneous morph to the FE, which was maintained for one second before 'unmorphing' back to the neutral expression over another second. The inter-trial interval was set to three seconds, with the neutral facial expression kept as baseline for one second before the morphing of the next facial expression.
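The timing described above implies a fixed five-second trial period (1 s expression hold + 1 s unmorph + 3 s inter-trial interval, whose final second serves as baseline). A minimal sketch of the resulting event schedule, with illustrative function names and an assumed first onset, could look like:

```python
# Sketch of the trial timing described in the text; names and the choice
# of first onset are illustrative, not taken from the original study.
MORPH_HOLD_S = 1.0   # FE maintained after the instantaneous morph
UNMORPH_S = 1.0      # transition back to the neutral expression
ITI_S = 3.0          # neutral expression between trials (last 1 s = baseline)

def morph_onsets(n_trials, first_onset=ITI_S):
    """Return morph-onset times (seconds) for n_trials at a fixed period."""
    period = MORPH_HOLD_S + UNMORPH_S + ITI_S  # 5 s per trial
    return [first_onset + i * period for i in range(n_trials)]

print(morph_onsets(3))  # → [3.0, 8.0, 13.0]
```

Such a fixed-period schedule makes it straightforward to align EEG epochs to the morph onsets for ERP averaging.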
Over the occipito-temporal region, we found a right-lateralized negativity 150-350 ms after stimulus onset. Time-frequency analysis showed a significant difference in the beta frequency band (20-25 Hz) around 350 ms over the temporal lobe for the processing of the different facial expressions.
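As a rough illustration of how beta-band (20-25 Hz) power over time can be extracted from an epoched EEG channel, here is a minimal sketch using a short-time spectrogram; the sampling rate, window parameters, and simulated data are assumptions for demonstration, not the study's actual pipeline:

```python
import numpy as np
from scipy.signal import spectrogram

# Assumed parameters (illustrative only, not from the original study).
FS = 500  # sampling rate in Hz
rng = np.random.default_rng(0)
epoch = rng.standard_normal(FS)  # 1 s of simulated single-channel EEG

# Short-time spectrogram: f = frequencies, t = time bins, Sxx = power.
f, t, Sxx = spectrogram(epoch, fs=FS, nperseg=128, noverlap=96)

# Average power within the beta band of interest (20-25 Hz) per time bin.
beta = (f >= 20) & (f <= 25)
beta_power = Sxx[beta].mean(axis=0)
print(beta_power.shape)  # one mean beta-power value per time bin
```

In practice, a study like this would compare such band-power time courses across expression conditions and channels, typically with baseline normalization and multiple-comparison correction.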
This result suggests an important role of the temporal lobe in discriminating facial expressions. Furthermore, this study provides a proof of concept that a complex virtual reality setup can be coupled with an EEG system to study dynamic, ecologically valid social stimuli.
Keywords: Dynamic Facial Expressions · EEG · ERP · Right Hemispheric Lateralization · Virtual Reality