Abstract
Visual event-related potentials elicited by facial expressions (FEs) have typically been studied using static stimuli presented after a nonspecific black screen serving as baseline. However, when studying social events, the low ecological validity of the environment and stimuli can introduce bias. Virtual reality offers a way to improve ecological validity while preserving stimulus control.
We propose a new approach to studying responses to FEs. A human avatar in a virtual environment (a plaza) performs the six universal FEs over time. The setup consisted of a 3D projection system coupled with a precision position tracker. Subjects (N=6, mean age=25.6y) wore a 32-channel EEG cap together with 3D glasses and two infrared emitters for position tracking. The environment adapted in real time to the subjects' position, creating a sense of immersion.
Each animation consisted of an instantaneous morph into the FE, which was maintained for one second before 'unmorphing' back to the neutral expression over another second. The inter-trial interval was set to three seconds, with the neutral facial expression kept as baseline for one second before the morph of the next facial expression.
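The trial structure above can be sketched as a sequence of timed event markers. The following is an illustrative sketch only; the function and label names are hypothetical and not taken from the authors' stimulus-presentation code.

```python
def trial_events(trial_start):
    """Return (time_s, label) markers for one trial.

    Assumed timeline from the abstract: instantaneous morph at onset,
    expression held for 1 s, unmorph back to neutral over 1 s, then a
    3 s inter-trial interval whose neutral face serves as baseline.
    """
    return [
        (trial_start + 0.0, "morph_onset"),    # FE appears instantly
        (trial_start + 1.0, "unmorph_start"),  # expression held 1 s
        (trial_start + 2.0, "neutral"),        # unmorph takes 1 s
        (trial_start + 5.0, "next_trial"),     # 3 s inter-trial interval
    ]

events = trial_events(0.0)
print(events[-1][0])  # → 5.0
```

Under these assumptions each trial spans five seconds from morph onset to the start of the next trial.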
In the occipito-temporal region, we found a right-lateralized negativity 150-350 ms after stimulus onset. Time-frequency analysis showed a significant difference in the beta frequency band (20-25 Hz) around 350 ms in the temporal lobe for the processing of the different facial expressions.
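A beta-band power estimate of the kind underlying such a time-frequency analysis can be obtained with Morlet wavelet convolution. The sketch below is a minimal NumPy-only illustration; the parameters (sampling rate, number of cycles) are assumptions and do not reflect the authors' actual analysis pipeline.

```python
import numpy as np

def band_power(signal, fs, freqs=(20.0, 25.0), n_cycles=7):
    """Mean instantaneous power across a frequency band via Morlet wavelets.

    Illustrative sketch for a beta-band (20-25 Hz) analysis; `fs` and
    `n_cycles` are hypothetical choices, not the paper's settings.
    """
    powers = []
    for f in np.arange(freqs[0], freqs[1] + 1):
        sigma = n_cycles / (2 * np.pi * f)            # wavelet width (s)
        wt = np.arange(-3 * sigma, 3 * sigma, 1 / fs)
        wavelet = (np.exp(2j * np.pi * f * wt)
                   * np.exp(-wt**2 / (2 * sigma**2)))  # complex Morlet
        conv = np.convolve(signal, wavelet, mode="same")
        powers.append(np.abs(conv) ** 2)               # power at frequency f
    return np.mean(powers, axis=0)                     # average over the band

# Synthetic example: a 22 Hz oscillation yields high beta-band power.
fs = 250.0
t = np.arange(0, 1, 1 / fs)
beta_power = band_power(np.sin(2 * np.pi * 22 * t), fs)
```

Condition differences would then be assessed statistically on such per-trial power estimates within the reported time window.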
This result suggests that the temporal lobe plays an important role in discriminating facial expressions. Furthermore, this study provides a proof of concept for coupling a complex virtual reality setup with an EEG system to study dynamic, ecologically valid social stimuli.
Keywords
- Dynamic Facial Expressions
- EEG
- ERP
- Right Hemispheric Lateralization
- Virtual Reality
Copyright information
© 2014 Springer International Publishing Switzerland
Cite this paper
Simões, M., Amaral, C., Carvalho, P., Castelo-Branco, M. (2014). Specific EEG/ERP Responses to Dynamic Facial Expressions in Virtual Reality Environments. In: Zhang, YT. (eds) The International Conference on Health Informatics. IFMBE Proceedings, vol 42. Springer, Cham. https://doi.org/10.1007/978-3-319-03005-0_84
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-03004-3
Online ISBN: 978-3-319-03005-0