Abstract
Recent psychology and neuroscience studies have applied tactile stimuli to patients, concluding that touch is a sense tightly linked to emotions. In parallel, a new way of watching films, 4D cinema, has added new stimuli, including tactile vibration, to the traditional audiovisual channel. In this work, we studied the brain activity of an audience viewing an emotionally charged scene, filmed and directed by us, under two different conditions: 1) image + sound, and 2) image + sound + vibrotactile stimulation. We designed a glove in which pulse trains are generated in coin motors at specific moments, and we recorded 35 viewers' electroencephalograms (EEGs) to evaluate the impact of the vibrotactile stimulation during the film projection. Hotelling's T-squared results show higher brain activity intensity when the tactile stimulus is delivered during the viewing than when no tactile stimulus is injected. Under Condition 1, participants showed activation in left and right orbitofrontal areas, whereas under Condition 2 they also showed activity in right superior frontal and right medial frontal areas. We conclude that adding a vibrotactile stimulus increases brain activity in areas linked with attentional processes, while producing higher intensity in those related to emotional processes.
1 Introduction
Traditional films are designed to be perceived by two senses: sight and hearing [1, 2]. Technological developments in recent years have allowed the transformation of the audiovisual experience into a multisensory one, named 4D cinema in some works (Cardini, 2011) [3,4,5, 6, 76]. Several experiments have used smell, taste, or touch stimuli in films or video games [7, 8, 9]. For instance, Lemmens et al. [10] designed a suit that provided vibrotactile stimulation to boost viewers' emotions during the projection of audiovisual works. None of the above studies used brain activity measurements, basing their results solely on the subjective impressions of the participants.
To evaluate the impact of viewing films on humans, several neuroscience studies have used electroencephalogram (EEG) technology to investigate viewers' brain activity [11, 4, 12,13,14,15,16, 92]. Some of them have focused on emotions, showing that temporal and orbitofrontal areas are the most relevant to the processing of emotional content.
Regarding hemispheric lateralization of brain activity, several competing hypotheses remain under debate. One of them suggests a greater involvement of the right hemisphere in emotion perception, regardless of valence, i.e., whether the emotion is positive or negative [17]. Alternatively, a different participation of each hemisphere as a function of emotional valence has been proposed: the left hemisphere would be more closely related to the valence of emotions, and the right one to their intensity [17]. A further proposal links positive emotions with activation of the left hemisphere, while negative emotions would activate the right one [18,19,20,21,22], though further studies are needed to reinforce these valence differentiations. Nevertheless, at least one consensus can be found in the literature: the temporal lobes and orbitofrontal areas are the brain regions activated in any emotional processing [23, 24]. Although the number of basic emotions is still a controversial topic, there is consensus on at least the following six: anger, fear, sadness, happiness, disgust, and surprise [25,26,27, 77]. Additionally, emotions can be analyzed according to their valence (negative/unpleasant to positive/pleasant) and arousal (low to high) [28, 96].
Regarding the relation between emotions and brainwaves, Jung et al. showed that explicit violent content in films increases EEG activity mainly in the delta, theta, and alpha bands [97], while other studies widen the range of frequencies related to emotional reactions [29, 98], also showing the preponderance of the theta band [72]. Some EEG-based studies support the hypothesis that negative emotions produce higher brain activity intensity than positive ones [30, 31]. The origin of this activity could lie in cortical-limbic structures and anterior orbitofrontal and temporal areas [32, 89].
Espenhahn et al. analyzed brain activity in young people viewing films, using somatosensory evoked potentials measured with an EEG [78]. Their results, however, did not focus on evaluating emotional responses under different ways of watching the movies, as the work presented here does. Closer to our work, Raheel et al. [33, 99] analyzed brain activity while participants watched a film with image and sound alone, and with additional stimulation such as heating or ventilation. They concluded that those stimuli increased the emotional intensity induced by the movie.
Furthermore, other studies have designed tactile devices, such as vests to enhance people's affective communication through multimedia [34], or to evoke specific emotions through vibration and heat [35]. Additional works have focused on enhancing the movie-watching experience through tactile stimulation [10, 36, 37], such as Lemmens et al.'s [10] multisensory jacket, built with 16 segments of 4 motors each, capable of generating vibrotactile stimuli synchronized with moments in a film. Haptic vests have also been designed for video games and augmented reality [38]: in that vibrotactile feedback jacket, fourteen vibration actuators were integrated to create an immersive experience by activating different vibration patterns and intensity levels according to the game scenes.
Other recent studies have used vibrotactile stimuli to boost the attentional or emotional response in people with disabilities [39, 40], or even in non-disabled people [14, 41]. Recently, a study recorded the cortical activity of hearing-impaired individuals using EEG [42]. While they viewed a video of a neutral landscape, two different soundtracks were played, each evoking a clearly different emotion. In one mode, subtitles were projected indicating the type of music, its title, and its authorship. In the other mode, the same video was projected along with synchronized vibrotactile stimulation through a haptic glove instead of subtitles. The hearing-impaired participants' response was shown to be highly attentional in the first case (frontal lobe activation), but highly emotional in the second one (temporal lobe activity).
Nevertheless, to our knowledge, no relevant studies have examined brain activity in people without disabilities when tactile stimulation alone is added while viewing films. The goal of this work is to compare the brain activity elicited by a traditional film with sound and image against that elicited when tactile stimulation is added on the hands, focusing on the differences in order to understand how the brain processes this new stimulus. This will pave the way to separating different neural processes, and to distinguishing the exact role of the involved brain areas as a foundation for future analyses with people with disabilities.
Despite the existing studies on the analysis of viewers' brain activity and emotions to date, there are no EEG studies that assess the differences between traditional viewing of audiovisual works and multisensory works, such as those that involve tactile stimulation. We have decided to conduct a study that provides results regarding these differences, considering that the perception of movies, video games, and other audiovisual content will increasingly be influenced by multisensory stimuli in the near future.
2 Materials and Methods
Thirty-five participants, aged between 18 and 75, made up our sample, in accordance with other similar EEG-based studies [12, 43, 44]. Prior to the experiments, subjects were asked about phobias, mental disorders, psychiatric and neurological pathologies, and/or use of psychotropic substances. None of the participants reported any of the above conditions. This study is part of a line of research whose clinical trials were approved by the Clinical Research Ethics Committee (CEIC) of the San Carlos Clinical Hospital, Madrid (Spain), on April 4th, 2019.
2.1 Audiovisual material
Following the conclusions of studies that used audiovisual stimulation to analyze emotions with EEGs [4, 45], and especially that of Pereira et al. [46], which showed that videos longer than one minute are convenient when trying to detect emotions through EEG, we used a film sequence lasting 5 min. It was shot by us with a small film crew: two professional actors (an actress and an actor) played a tender scene, kissing and caressing on the bed of a room, in a well-lit but intimate atmosphere. The director's aim was to create an environment that established the personal relationship between two lovers.
The video was produced in a hotel room in the central area of Madrid. It involved a director, a camera operator/director of photography, a sound technician, and a production assistant. The technical equipment used included a Sony PXW-FS5 camcorder, LED lighting, and a Zoom H5 sound recorder with Rode NTG-4 microphones. Recording files were stored in MXF format with the XAVC codec, at a resolution of 1920 × 1080 in progressive mode, 25 frames per second, and a bitrate of 50 Mbps. Subsequently, these files were converted to the Apple ProRes 422 codec so they could be edited in Final Cut Pro X. The final export of this process was an MP4 file with the H.264 codec, at a resolution of 1920 × 1080 and a bitrate of 8 Mbps.
To check its validity, 110 students from the Faculty of Information Sciences of the Complutense University of Madrid were surveyed to assess the kind of emotion transmitted by the film. A questionnaire was applied using a model of emotions that defines a space of limited, discrete, basic emotions, as well as some complex emotions [28, 47] (Zhao and Ge, 2018). The video was watched by the students in a classroom at the Faculty of Information Sciences in three different sessions. After the screening, the attending students were instructed to complete a questionnaire posted on the Virtual Campus using their computers or mobile devices.
In this preliminary study, students rated the film as 'pleasant' rather than 'unpleasant', with an average score of 4.1 (5 being 'pleasant' and 1 being 'unpleasant'). In the questionnaire, we also asked them to rate the perceived intensity of specific emotions on a scale of 1 to 5. The emotions were: happiness, relaxation, satisfaction, and surprise. The students associated the video more with feelings of relaxation (3.56) and satisfaction (3.01) than with happiness (1.72) and surprise (1.06).
After the preliminary study, we projected the same film to 35 participants (different from the previous 110) in two different conditions: 1) image and sound, and 2) image, sound, and tactile stimulation (Fig. 2). Thus, in Condition 1 viewers watched and heard the video in a traditional way, whereas in Condition 2 the same video was synchronized with tactile stimuli.
The director of the film chose the exact times at which the tactile stimuli had to be triggered (Fig. 1), mainly during images where the action included explicit physical contact between the actors, such as kisses and caresses. Once these moments were determined in milliseconds, they were included in the protocol so that, through the "EEG Control" software, they could be launched in complete synchronization with the video. Before conducting the experiment, the director viewed the audiovisual work with tactile stimulation to ensure that all stimuli were correctly placed. It is worth noting that the exact placement and duration of the stimuli were established as a purely creative film decision, just as the lighting, framing, or setup could have been [48, 79].
We employed a protocol encompassing all commands for automatic synchronization with the video through our proprietary software ("EEG Control"). The software serves as a protocol hub that unifies communication across multiple systems. The protocol includes commands with timestamps in milliseconds, such as "play video," "glove vibration," or "EEG marker" (Fig. 2). This ensured precise synchronization between the video and the tactile stimuli or EEG markers (markers in the brain activity recording).
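The timestamped command list can be pictured as a simple dispatch table. The following Python sketch is purely illustrative: the command names and data layout are our assumptions, not the actual "EEG Control" file format.

```python
# Hypothetical sketch of a millisecond-timestamped command protocol; the
# action names ("play_video", "glove_vibration", "eeg_marker") mimic the
# commands described in the text but the structure is assumed.
from dataclasses import dataclass

@dataclass
class Command:
    t_ms: int     # timestamp relative to protocol start, in milliseconds
    action: str   # what to trigger at that instant

protocol = [
    Command(0, "play_video"),
    Command(12500, "glove_vibration"),
    Command(12500, "eeg_marker"),   # marker and vibration fire together
]

def due(protocol, now_ms, window_ms=1):
    """Return the commands whose timestamp falls in [now_ms, now_ms + window_ms)."""
    return [c for c in protocol if now_ms <= c.t_ms < now_ms + window_ms]
```

A dispatcher polling `due()` every millisecond would launch the vibration and write the EEG marker in the same tick, which is the synchronization property the protocol is meant to guarantee.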
We must note, however, that the order of the two conditions was counterbalanced across viewers, thus avoiding any memory effect in the brain responses: half of the participants viewed Condition 1 first and then Condition 2, while the other half did the opposite (participant 1 watched Condition 1 first, participant 2 watched Condition 2 first, and so on).
2.2 Tactile stimulation and protocol
Two haptic gloves were used for the vibrotactile stimulation. Each glove was an Inesis Golf Glove 100 fitted with 6 Uxcell 1030 micromotors, one on each finger pad and one on the palm. These micromotors provided haptic stimulation by vibration (Fig. 3). The motors operated at 3 V DC and consumed up to 70 mA each, with a maximum of 630 mA per glove. All other main electronic elements were placed on a printed circuit board (PCB), or Arduino shield, with three L293D motor drivers (to control the 12 motors of the 2 gloves) and all the appropriate connections to the gloves and power supply. To provide enough power and to avoid damaging the Arduino board through excessive current demand by the haptic stimulation devices, we designed a drive circuit to switch each motor. This circuit was made up of a power bank, another L293D motor driver (which controlled up to 4 motors and provided up to 0.6 A per channel), and a flyback diode to protect against motor flyback currents.
Each haptic stimulation lasts 1.6 s. The haptic stimuli are modulated as a square PWM signal with a variable duty cycle for intensity control, in bursts at 1 kHz. The stimulation is applied finger by finger, with the following order of cumulative activations: palm, thumb, index, middle, ring, and little finger, followed by the reverse deactivation back towards the palm, in a full 1.6 s sequence. This pattern is produced in both hands simultaneously. To generate the stimuli, an Arduino UNO rev3 was used, triggered in turn by a control PC and synchronized with the viewing of the film. Exact timing, as well as a snug fit of the glove to the user's hand, was mandatory to ensure correct haptic stimulation through the motors.
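As an illustration of the cumulative activation pattern described above, the following Python sketch generates the on/off event schedule for one hand. The uniform spacing of the steps over the 1.6 s sequence is our assumption; the text fixes only the order and the total duration.

```python
# Sketch of the palm -> little-finger cumulative activation followed by
# the reverse deactivation back towards the palm. Uniform step timing over
# the 1.6 s sequence is assumed, not stated in the paper.
MOTORS = ["palm", "thumb", "index", "middle", "ring", "little"]

def activation_schedule(total_ms=1600):
    """Return (time_ms, event, motor) tuples for one full stimulation sequence."""
    steps = []
    for m in MOTORS:                 # cumulative activation: one more motor on
        steps.append(("on", m))
    for m in reversed(MOTORS):       # reverse deactivation towards the palm
        steps.append(("off", m))
    step_ms = total_ms / len(steps)  # spread the 12 events uniformly
    return [(round(k * step_ms), ev, m) for k, (ev, m) in enumerate(steps)]
```

The same schedule would be issued to both gloves simultaneously, as the text specifies.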
At every moment at which the director of the film decided to insert an emotional tactile stimulus, a marker was generated in the EEG in order to analyze the following 200 ms (Fig. 4). We selected this analysis window because, according to several studies, it is suitable for analyzing the emotional activity of viewers during the viewing of a visual or audiovisual work [42, 49,50,51,52].
Each participant underwent a control test to verify that the vibration itself did not produce brain activity beyond tactile detection. This vibration consisted of a signal with a constant frequency of 2 Hz and a duty cycle of 10%, with the 6 motors of each hand on for 50 ms and off for 450 ms. The duration of this test was 3 min. The reason for using such a low duty cycle was twofold: not to cause discomfort due to its long duration, and to maintain the tactile stimulus constant.
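The control-test parameters above are internally consistent, as a quick check shows (illustrative Python, not part of the experimental software):

```python
# Verify that 50 ms on / 450 ms off yields the stated 2 Hz, 10% duty cycle,
# and count the pulses delivered over the 3-minute control test.
def vibration_params(on_ms, off_ms):
    """Return (duty-cycle fraction, frequency in Hz) of an on/off pulse train."""
    period_ms = on_ms + off_ms
    return on_ms / period_ms, 1000 / period_ms

duty, freq = vibration_params(50, 450)   # 50 ms on, 450 ms off per cycle
pulses = int(180 * freq)                 # pulses over the 3 min (180 s) test
```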
All the stimulations were tracked in this way to evaluate the differences in brain activity among viewers. Once the EEG record was obtained, a Self-Assessment Manikin (SAM) questionnaire was administered to evaluate the emotional experience of the viewers regarding valence, arousal, and dominance [28, 47, 53]. Each participant individually reported their emotional experience for each condition of the film on a 5-point scale. For valence they chose between pleasant (5 points), pleased (4 points), neutral (3 points), unsatisfied (2 points), and unpleasant (1 point). For arousal they selected between excited (5 points), wide-awake (4 points), neutral (3 points), dull (2 points), and calm (1 point). Finally, for dominance the participants chose between dependent (5 points), powerlessness (4 points), neutral (3 points), powerful (2 points), and independent (1 point). The duration of the whole experiment, including EEG instrumentation, glove setup and initialization, and the questionnaires, was around 60 min per participant. Participants were not paid for this study.
2.3 EEG recording method
A 64-channel Neuroscan Quik-Cap was used to acquire the EEG recordings, with ATI Pentatek © software (from Advantel SRL). Prior to obtaining each EEG recording, the impedance was checked to be under 5 kΩ. Reference electrodes were placed on both mastoids. The sampling frequency was 1 kHz. The data were averaged using a mean reference. Each recording was visually inspected to remove artifacts due to eye or muscle movements, and noisy channels were replaced by a linear interpolation of adjacent channels. Additionally, channels whose squared magnitude was more than four standard deviations above the mean power were replaced by the mean of the adjacent channels [54].
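The power-based channel rejection rule can be sketched in a few lines of NumPy. This is our assumed implementation of the rule as stated, not the authors' actual pipeline; the neighbour map would come from the cap montage.

```python
# Hedged sketch: channels whose mean squared magnitude exceeds the
# across-channel mean power by more than four standard deviations are
# replaced by the mean of their neighbouring channels.
import numpy as np

def reject_noisy_channels(eeg, neighbours):
    """eeg: (n_channels, n_samples) array; neighbours: dict ch -> list of chs."""
    power = (eeg ** 2).mean(axis=1)                       # per-channel mean power
    bad = np.where(power > power.mean() + 4 * power.std())[0]
    cleaned = eeg.copy()
    for ch in bad:                                        # replace by neighbour mean
        cleaned[ch] = eeg[neighbours[ch]].mean(axis=0)
    return cleaned, bad
```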
2.4 Brain sources localization
To locate the origin of the neural activity, the EEG inverse problem was solved using the Low Resolution Electromagnetic Tomography (LORETA) method [55]. The solution for each model was restricted to one specific anatomical structure, or a combination of several. These restrictions derived from segmenting the Montreal Neurological Institute (MNI) average brain atlas [56] into 90 regions, following the automated anatomical labeling (AAL) atlas [57].
LORETA was applied through the Neuronic software [49, 58, 59, 90] in 50 to 70 artifact-free 200 ms windows, for each participant and mode. It produced a series of bioelectrical activation maps revealing the maximum activation areas for each group.
Once those areas were located, statistical parametric maps (SPMs) were computed through Hotelling's T-squared test against zero, voxel by voxel, to determine statistically significant sources. Applied to independent groups [60], this yielded probability maps thresholded at a false discovery rate (FDR) of q = 0.05 [61], depicted as 3D activation images overlaid on an MNI brain model. From these probability maps, anatomical structures larger than 10 voxels and above the threshold (according to the AAL atlas) were identified and highlighted [61]. Subsequently, local maxima were located using the MNI XYZ coordinate system.
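The FDR thresholding step can be illustrated with the Benjamini-Hochberg step-up rule commonly used for q = 0.05 control. This hedged Python sketch shows the rule itself, not the Neuronic implementation used in the study.

```python
# Benjamini-Hochberg step-up procedure: sort the voxel-wise p-values and
# find the largest p[k] with p[k] <= q * (k+1) / m; every p-value at or
# below that one is declared significant.
import numpy as np

def fdr_threshold(p_values, q=0.05):
    """Return the largest p-value declared significant, or None if none are."""
    p = np.sort(np.asarray(p_values, dtype=float))
    m = p.size
    below = p <= q * np.arange(1, m + 1) / m
    if not below.any():
        return None
    return p[np.nonzero(below)[0].max()]
```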
3 Results
LORETA results reveal maximum activity in both the left orbitofrontal area (X = 2, Y = 51, Z = –5, T2 = 4.07) and the right orbitofrontal area (X = 3, Y = 52, Z = –4, T2 = 4.21), with a maximum intensity of 4.21 in Condition 1 (image + sound). In Condition 2 (image + sound + touch), the average brain activity was located in the right superior frontal area (X = 27, Y = 62, Z = 6, T2 = 26.40) and right medial frontal area (X = 40, Y = 53, Z = 8, T2 = 22.99), as well as the left orbitofrontal (X = –5, Y = 62, Z = –5, T2 = 18.054) and right orbitofrontal (X = –4, Y = 62, Z = –43, T2 = 19.718) areas, with a maximum intensity of 26.40. These results are shown in the brain activity maps in Fig. 5.
The questionnaires administered after the EEG experiment show differences in the emotional response of the viewers between Conditions 1 and 2. While in Condition 1 the viewers scored the emotional valence at 4.2 (Pleasant), that of Condition 2 was slightly higher, at 4.3 (Pleasant). Regarding arousal, the differences were larger: the average score in Condition 1 was 3 (Neutral), whereas the multisensory Condition 2 reached 4.1 (Wide-awake). Moreover, participants scored dominance at 4.1 (Powerlessness) in Condition 1, and 2.1 (Powerful) in Condition 2.
4 Discussion
Our results show that tactile stimulation produces higher activity in frontal and orbitofrontal areas during the viewing of a film with multisensory stimuli. According to several studies [59, 62], emotions are processed in orbitofrontal brain areas, while attentional processes arise from frontal areas.
Interestingly, we find a remarkable increase in the brain activity of Condition 2 (image + sound + touch) with respect to Condition 1 (image + sound) in superior frontal areas. Several studies have related superior frontal areas to attentional processes [63, 64]. Therefore, the observed increment in those areas may reflect the generation of attentional processes in viewers when a new stimulus is added to the traditional viewing of a film, since they are not used to perceiving audiovisual works in this way, as some authors state [65, 66, 83]. Furthermore, although there is a large difference in brain activity between the two conditions (4.21 in Condition 1 compared to 26.4 in Condition 2), the results are consistent with studies [67] suggesting that, when we are accustomed to a stimulus, the introduction of a new one has a greater impact on brain activity. In this experiment, participants were accustomed to viewing movies with sound and images, but not with a tactile stimulus.
This conclusion can easily be linked with research on habituation psychology [68, 69, 70, 76]. In a cinematographic environment, the incorporation of a new stimulus could be compared to the situation produced in the early stages of cinema, at the beginning of the 20th century, when viewers saw the first frames on a screen. Some members of the audience could not perceive persons but "flying heads", because the whole bodies of the actors were not visible on the screen [71]. This was a new kind of stimulation for which the brain needed a certain training, hence requiring a volitional attentional process.
On the other hand, it is remarkable that the results show a stronger lateralization of brain activity towards the right hemisphere in Condition 2 (image + sound + touch) compared to Condition 1. According to Pralus et al. [17], right hemisphere activation is related to a higher intensity in the perception of an audiovisual work, while left hemisphere activation is more related to the valence of the emotion. Our EEG results seem to reinforce these conclusions, because they match the viewers' answers in the questionnaires [28, 96]. Although the participants gave a similar valence score in both conditions (pleasant), they differed remarkably in arousal and dominance. While arousal was neutral in Condition 1, viewers felt more awake in Condition 2. Indeed, they rated the dominance of the tactile stimulation as "powerful", whereas Condition 1 was rated "powerlessness".
Finally, all these conclusions agree with the increase in frontal area activity in Condition 2, reported by several authors to be linked with attentional processes [63].
5 Conclusions
We conclude that tactile stimulation increases the intensity of viewers' brain activity while they watch a film with emotional content. Viewers perceive a higher emotional intensity and develop more cognitive attention focused on the projected film.
We would like to note some limitations of the present study. First, the tactile stimuli were applied only to the hands; further research is needed to analyze brain activity with additional olfactory or gustatory stimuli, and/or with tactile stimuli on other parts of the body. Second, in future research a comparison should be made between two groups: one stimulated with a series similar to that of our Condition 2, and another that had previously been trained on the sensory stimuli. In this way, we could assess the real effect of the tactile stimulation on viewers perceiving works with multisensory stimuli.
Data availability
The datasets generated during and/or analysed during the current study are available from the corresponding author on reasonable request.
References
Akiba HT, Costa MF, Gomes JS, Oda E, Simurro PB, Dias AM (2019) Neural Correlates of Preference: A Transmodal Validation Study. Front Hum Neurosci 13:73. https://doi.org/10.3389/fnhum.2019.00073
Huang RS, Chen CF, Sereno MI (2018) Spatiotemporal integration of looming visual and tactile stimuli near the face. Human Brain Mapp 39(5):1256–2176. https://doi.org/10.1002/hbm.23995
Biazon LC, Martinazzo AAG, Jose MA, Ficheman IK, Zuffo MK, Lopes RD (2016) Developing an interactive 4D cinema platform with open source tools. 2016 IEEE International Symposium on Consumer Electronics (ISCE). https://doi.org/10.1109/isce.2016.7797404
Kang D, Kim J, Jang D-P, Cho YS, Kim S-P (2015) Investigation of engagement of viewers in movie trailers using electroencephalography. Brain-Computer Interfaces 2:193–201. https://doi.org/10.1080/2326263X.2015.1103591
Terlutter R, Diehl S, Koinig I, Waiguny MKJ (2016) Positive or Negative Effects of Technology Enhancement for Brand Placements? Memory of Brand Placements in 2D, 3D, and 4D Movies. Media Psychol 19(4):505–533. https://doi.org/10.1080/15213269.2016.1142377
Zhou Y, Tapaswi M, Fidler S (2018) Now you shake me: Towards automatic 4D cinema. 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp 7425–7434. https://openaccess.thecvf.com/content_cvpr_2018/html/Zhou_Now_You_Shake_CVPR_2018_paper.html
Genna C, Oddo C, Fanciullacci C, Chisari C, Micera S, Artoni F (2018) Bilateral cortical representation of tactile roughness. Brain Res 1699:79–88. https://doi.org/10.1016/j.brainres.2018.06.014
Choi B, Lee E-S, Yoon K (2011) Streaming media with sensory effect. 2011 International Conference on Information Science and Applications. https://doi.org/10.1109/icisa.2011.5772390
Rangel ML, Souza L, Rodrigues EC, Oliveira JM, Miranda MF, Galves A, Vargas CD (2021) Predicting upcoming events occurring in the space surrounding the Hand. Neural Plast 2021:6649135
Lemmens P, Crompvoets F, Brokken D, van den Eerenbeemd J, de Vries G-J (2009) A body-conforming tactile jacket to enrich movie viewing. World Haptics 2009 - Third Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems. https://doi.org/10.1109/whc.2009.4810832
Simons RF, Detenber BH, Cuthbert BN, Schwartz DD, Reiss JE (2003) Attention to television: Alpha power and its relationship to image motion and emotional content. Media Psychol 5(3):283–301. https://doi.org/10.1207/S1532785XMEP0503_03
Jalilifard A, Rastegarnia A, Pizzolato EB, Islam MK (2020) Classification of emotions induced by horror and relaxing movies using single-channel EEG recordings. Int J Electr Comput Eng (IJECE) 10(4):3826
Kauttonen J, Hlushchuk Y, Jääskeläinen IP, Tikka P (2018) Brain mechanisms underlying cue-based memorizing during free viewing of movie Memento. Neuroimage 172:313–325. https://doi.org/10.1016/j.neuroimage.2018.01.068
Masood N, Farooq H (2021) Comparing Neural Correlates of Human Emotions across Multiple Stimulus Presentation Paradigms. Brain Sci 11(6):696. https://doi.org/10.3390/brainsci11060696
Raz G, Hendler T (2014) Forking Cinematic Paths to the Self Neurocinematically Informed Model of Empathy in Motion Pictures. Projections 8–2, https://doi.org/10.3167/proj.2014.080206
Zhu L, Wu Y (2021) Love your country: EEG evidence of actor preferences of audiences in patriotic movies. Front Psychol 12:717025
Pralus A, Belfi A, Hirel C, Lévêque Y, Fornoni L, Bigand E, Jung J, Tranel D, Nighoghossian N, Tillmann B, Caclin A (2020) Recognition of musical emotions and their perceived intensity after unilateral brain damage. Cortex. https://doi.org/10.1016/j.cortex.2020.05.015
Brookshire G, Casasanto D (2018) Approach motivation in human cerebral cortex. Philos Trans B 373. https://doi.org/10.1098/rstb.2017.0141
Comer C, DeVore B, Harrison P, Harrison D (2017) Cerebral laterality, emotion, and cardiopulmonary functions: An investigation of left and right CVA patients. Acta Neuropsychologica 15:32–55. https://doi.org/10.5604/01.3001.0010.6093
Davidson R (1993) Cerebral asymetry and emotion: conceptual and methodological conundrums. Cog Emotion 7:115–138
Gainotti G (2019) A historical review of investigations on laterality of emotions in the human brain. J Hist Neurosci 11:217–233. https://doi.org/10.1080/0964704X.2018.1524683
Wyczesany M, Capotosto P, Zappasodi F, Prete G (2018) Hemispheric asymmetries and emotions: Evidence from effective connectivity. Neuropsychologia 121:98–105
Kortelainen J, Väyrynen E, Seppänen T (2015) High-frequency electroencephalographic activity in left temporal area is associated with pleasant emotion induced by video clips. Comput Intell Neurosci 2015:1–14. https://doi.org/10.1155/2015/762769
Meletti S, Tassi L, Mai R, Fini N, Tassinari CA, Russo GL (2006) Emotions induced by intracerebral electrical stimulation of the temporal lobe. Epilepsia 47(5):47–51. https://doi.org/10.1111/j.1528-1167.2006.00877
Barrett LF (2011) Was Darwin wrong about emotional expressions? Curr Dir Psychol Sci 20:400–406. https://doi.org/10.1177/0963721411429125
Ortony A, Turner TJ (1990) What’s basic about basic emotions? Psychol Rev 97:315–331. https://doi.org/10.1037/0033-295X.97.3.315
Panksepp J (2010) Affective consciousness in animals: perspectives on dimensional and primary process emotion approaches. Proc Biol Sci 277:2905–2907. https://doi.org/10.1098/rspb.2010.1017
Barrett LF, Mesquita B, Ochsner KN, Gross JJ (2007) The experience of emotion. Annu Rev Psychol 58:373–403. https://doi.org/10.1146/annurev.psych.58.110405.085709
Zheng W-L, Zhu J-Y, Lu B-L (2019) Identifying stable patterns over time for emotion recognition from EEG. IEEE Trans Affect Comput 10(3):417–429. https://doi.org/10.1109/taffc.2017.2712143
Arjmand H-A, Hohagen J, Paton B, Rickard NS (2017) Emotional Responses to Music: Shifts in Frontal Brain Asymmetry Mark Periods of Musical Change. Front Psychol 8:2044. https://doi.org/10.3389/fpsyg.2017.02044
Pan F, Zhang L, Ou Y, Zhang X (2019) The audio-visual integration effect on music emotion: Behavioral and physiological evidence. PLoS ONE 14(5):1–21. https://doi.org/10.1371/journal.pone.0217040
Linhartová P, Látalová A, Kóša B, Kašpárek T, Schmahl C, Paret C (2019) fMRI neurofeedback in emotion regulation: A literature review. Neuroimage. https://doi.org/10.1016/j.neuroimage.2019.03.011
Raheel A, Majid M, Alnowami M, Anwar SM (2020) Physiological Sensors Based Emotion Recognition While Experiencing Tactile Enhanced Multimedia. Sensors 20(14):4037. https://doi.org/10.3390/s20144037
Zhang L, Saboune J, El Saddik A (2015) Development of a haptic video chat system. Multimedia Tools Appl 74(15):5489–5512. https://doi.org/10.1007/s11042-014-1865-x
Arafsha F, Alam KM, El Saddik A (2015) Design and development of a user centric affective haptic jacket. Multimed Tools Appl 74(9):3035–3052. https://doi.org/10.1007/s11042-013-1767-3
Ablart D, Velasco C, Obrist M (2017) Integrating mid-air haptics into movie experiences. In: Proceedings of the 2017 ACM International Conference on Interactive Experiences for TV, pp 77–84. https://doi.org/10.1145/3077548.3077551
Mazzoni A, Bryan-Kinns N (2016) Moody: Haptic sensations to enhance mood in film music. In: Proceedings of the 2016 ACM Conference Companion Publication on Designing Interactive Systems. https://doi.org/10.1145/2908805.2908811
Zhu L, Cao Q, Cai Y (2020) Development of augmented reality serious games with a vibrotactile feedback jacket. Virtual Reality & Intelligent Hardware 2(5):454–470. https://doi.org/10.1016/j.vrih.2020.05.005
Piccardi ES, Begum Ali J, Jones EJH, Mason L, Charman T, … Gliga T (2021) Behavioural and neural markers of tactile sensory processing in infants at elevated likelihood of autism spectrum disorder and/or attention deficit hyperactivity disorder. J Neurodev Disord 13(1). https://doi.org/10.1186/s11689-020-09334-1
Portnova GV, McGlone FP, Tankina OA, Skorokhodov IV, Shpitsberg IL, Varlamov AA (2019) EEG correlates of tactile perception abnormalities in children with autism spectrum disorder. Sovremennye Tekhnologii v Meditsine 11(1):169. https://doi.org/10.17691/stm2019.11.1.20
Zhuang N, Zeng Y, Yang K, Zhang C, Tong L, Yan B (2018) Investigating Patterns for Self-Induced Emotion Recognition from EEG Signals. Sensors 18(3):841. https://doi.org/10.3390/s18030841
Mulas MJ, Revuelta P, Garcia A, Ruiz B, Vergaz R, Cerdan V, Ortiz T (2020) Vibrotactile captioning of musical effects in audio-visual media as an alternative for deaf and hard of hearing people: An EEG study. IEEE Access: Pract Innovations, Open Solutions 8:190873–190881. https://doi.org/10.1109/access.2020.3032229
Lee Y-Y, Hsieh S (2014) Classifying different emotional states by means of EEG-based functional connectivity patterns. PLoS ONE 9(4):e95415. https://doi.org/10.1371/journal.pone.0095415
Pradhapan P, Velazquez ER, Witteveen JA, Tonoyan Y, Mihajlović V (2020) The Role of Features Types and Personalized Assessment in Detecting Affective State Using Dry Electrode EEG. Sensors 20:6810. https://doi.org/10.3390/s20236810
Revuelta P, Ortiz T, Lucía MJ, Ruiz B, Sánchez-Pena JM (2020) Limitations of Standard Accessible Captioning of Sounds and Music for Deaf and Hard of Hearing People: An EEG Study. Front Integr Neurosci 14:1. https://doi.org/10.3389/fnint.2020.00001
Pereira ET, Gomes HM, Veloso LR, Mota MRA (2021) Empirical evidence relating EEG signal duration to emotion classification performance. IEEE Trans Affect Comput 12(1):154–164. https://doi.org/10.1109/taffc.2018.2854168
Geethanjali B, Adalarasu K, Hemapraba A, Kumar SP, Rajasekeran R (2017) Emotion analysis using SAM (Self-Assessment Manikin) scale. Biomedical Research. https://acortar.link/OV4Xdn
Bordwell D, Thompson K (1988) Film Art: An Introduction, 2nd edn. Random House
Ponz A, Montant M, Liegeois-Chauvel C, Silva C, Braun M, Jacobs AM, Ziegler JC (2014) Emotion processing in words: a test of the neural re-use hypothesis using surface and intracranial EEG. Social Cognitive and Affective Neuroscience 9(5):619–627. https://doi.org/10.1093/scan/nst034
Xie W, McCormick SA, Westerlund A, Bowman LC, Nelson CA (2019) Neural correlates of facial emotion processing in infancy. Dev Sci 22(3):e12758. https://doi.org/10.1111/desc.12758
Yang K, Tong L, Shu J, Zhuang N, Yan B, Zeng Y (2020) High Gamma Band EEG Closely Related to Emotion: Evidence From Functional Network. Front Hum Neurosci 14. https://doi.org/10.3389/fnhum.2020.00089
Yang T, Di Bernardi Luft C, Sun P, Bhattacharya J, Banissy MJ (2020) Investigating age-related neural compensation during emotion perception using electroencephalography. Brain Sci 10(2):61. https://doi.org/10.3390/brainsci10020061
Lang PJ (1985) The Cognitive Psychophysiology of Emotion: Anxiety and the Anxiety Disorders. Lawrence Erlbaum, Hillsdale, NJ
Dmochowski JP, Bezdek MA, Abelson BP, Johnson JS, Schumacher EH, Parra LC (2014) Audience preferences are predicted by temporal reliability of neural processing. Nat Commun 5:1–9. https://doi.org/10.1038/ncomms5567
Pascual-Marqui RD, Michel CM, Lehmann D (1994) Low resolution electromagnetic tomography: a new method for localizing electrical activity of the brain. Int J Psychophysiol 18:49–65. https://doi.org/10.1016/0167-8760(84)90014-X
Evans AC, Collins DL, Mills SR, Brown ED, Kelly RL, Peters TM (1993) 3D statistical neuroanatomical models from 305 MRI volumes. Proc IEEE Nucl Sci Symp Med Imaging Conf 95:1813–1817 (https://bit.ly/2FbDzDw)
Tzourio-Mazoyer N, Landeau B, Papathanassiou D, Crivello F, Etard O, Delcroix N, Mazoyer B, Joliot M (2002) Automated anatomical labeling of activations in SPM using a macroscopic anatomical parcellation of the MNI MRI single-subject brain. Neuroimage 15(1):273–289
Al-Nafjan A, Hosny M, Al-Ohali Y, Al-Wabil A (2017) Review and classification of emotion recognition based on EEG brain-computer interface system research: A systematic review. Appl Sci 7(12):1239. https://doi.org/10.3390/app7121239
Goshvarpour A, Goshvarpour A (2019) EEG spectral powers and source localization in depressing, sad, and fun music videos focusing on gender differences. Cogn Neurodyn 13:161–173. https://doi.org/10.1007/s11571-018-9516-y
Carbonell F, Galan L, Valdes P, Worsley K, Biscay RJ, Diaz-Comas L (2004) Random field-union intersection tests for EEG/MEG imaging. Neuroimage 22:268–276. https://doi.org/10.1016/j.neuroimage.2004.01.020
Lage-Castellanos A, Martínez-Montes E, Hernández-Cabrera JA, Galán L (2010) False discovery rate and permutation test: an evaluation in ERP data analysis. Stat Med 29:63–74. https://doi.org/10.1002/sim.3784
Cela-Conde C, Ayala F, Munar E, Maestú F, Nadal M, Capó M, Del Río D, López-Ibor J, Ortiz T, Mirasso C, Marty G (2009) Sex-related similarities and differences in the neural correlates of beauty. PNAS 106(10):3847–3852. https://doi.org/10.1073/pnas.0900304106
Hales JB, Brewer JB (2011) The timing of associative memory formation: frontal lobe and anterior medial temporal lobe activity at associative binding predicts memory. J Neurophysiol 105(4):1454–1463. https://doi.org/10.1152/jn.00902.2010
Vaidya AR, Fellows LK (2019) Ventromedial frontal lobe damage affects interpretation, not exploration, of emotional facial expressions. Cortex 113:312–328. https://doi.org/10.1016/j.cortex.2018.12.013
Geller JD (2020) Introduction: Psychotherapy through the lens of cinema. J Clin Psychol. https://doi.org/10.1002/jclp.22995
Pace-Schott EF, Shepherd E, Spencer RMC, Marcello M, Tucker M, Propper RE, Stickgold R (2011) Napping promotes inter-session habituation to emotional stimuli. Neurobiol Learn Mem 95(1):24–36. https://doi.org/10.1016/j.nlm.2010.10.006
Schweinberger SR, Neumann MF (2016) Repetition effects in human ERPs to faces. Cortex 80:141–153. https://doi.org/10.1016/j.cortex.2015.11.001
Sokolov YN (1963) Higher nervous functions: The orienting reflex. Annu Rev Physiol 25:545–580. https://doi.org/10.1146/annurev.ph.25.030163.002553
Wood DC (1988) Habituation in Stentor produced by mechanoreceptor channel modification. J Neurosci 8(7):2254–2258. https://doi.org/10.1523/JNEUROSCI.08-07-02254.1988
Rankin CH, Abrams T, Barry RJ, Bhatnagar S, Clayton DF, Colombo J, Thompson RF (2009) Habituation revisited: An updated and revised description of the behavioral characteristics of habituation. Neurobiol Learn Mem 92(2):135–138. https://doi.org/10.1016/j.nlm.2008.09.012
Zatsepin V (2017) Acting for the silent screen: Film actors and aspiration between the wars. Hist J Film Radio Telev 37(4):760–762. https://doi.org/10.1080/01439685.2017.1345133
Balconi M, Lucchiari C (2006) EEG correlates (event-related desynchronization) of emotional face elaboration: A temporal analysis. Neurosci Lett 392(1–2):118–123. https://doi.org/10.1016/j.neulet.2005.09.004
Breiter H, Etcoff N, Whalen P, Kennedy W, Rauch S, Buckner R, Strauss M, Hyman S, Rosen B (1996) Response and Habituation of the Human Amygdala during Visual Processing of Facial Expression. Neuron 17(5):875–887. https://doi.org/10.1016/S0896-6273(00)80219-6
Cardini F, Costantini M, Galati G, Romani GL, Làdavas E, Serino A (2011) Viewing One’s Own Face Being Touched Modulates Tactile Perception: An fMRI Study. J Cogn Neurosci 23(3):503–513. https://doi.org/10.1162/jocn.2010.21484
Cela-Conde C, Marty G, Maestú F, Ortiz T, Munar E, Fernández A, Roca M, Rosselló J, Quesne F (2004) Activation of the prefrontal cortex in the human visual aesthetic perception. Proc Natl Acad Sci 101(16):6321–6325. https://doi.org/10.1073/pnas.0401427101
Christoforou C, Papadopoulos TC, Constantinidou F, Theodorou M (2017) Your Brain on the Movies: A Computational Approach for Predicting Box-office Performance from Viewer’s Brain Responses to Movie Trailers. Front Neuroinform 11:72. https://doi.org/10.3389/fninf.2017.00072
Ekman P, Cordaro D (2011) What is meant by calling emotions basic. Emot Rev 3:364–370. https://doi.org/10.1177/1754073911410740
Espenhahn S, Yan T, Beltrano W, Kaur S, Godfrey K, Cortese F, Bray S, Harris AD (2020) The effect of movie-watching on electroencephalographic responses to tactile stimulation. NeuroImage. https://doi.org/10.1016/j.neuroimage.2020.117130
Giménez Sarmiento Á, Cerdán Martínez V (2022) Propuesta metodológica para el análisis de la forma documental: De la cuantificación a la cualificación [Methodological proposal for the analysis of documentary form: From quantification to qualification]. Comunicación Y Métodos 4(1):9–25. https://doi.org/10.35951/v4i1.151
Harmon-Jones E (2003) Clarifying the emotive functions of asymmetrical frontal cortical activity. Psychophysiology 40(6):838–848. https://doi.org/10.1111/1469-8986.00121
Gainotti G (2018) Emotions and the Right Hemisphere: Can New Data Clarify Old Models? Neuroscientist. https://doi.org/10.1177/1073858418785342
Grimshaw GM, Carmel D (2014) An asymmetric inhibition model of hemispheric differences in emotional processing. Front Psychol 5:489
Kaneko T, Tomonaga M (2008) Utility of Habituation-Dishabituation Procedure for Comparative Cognitive Studies of Callithrix Jacchus and Aotus spp.: Preliminary Assessments. Perceptual and Motor Skills 106(3):830–832. https://doi.org/10.2466/pms.106.3.830-832
Knyazev GG, Slobodskoj-Plusnin JY, Bocharov AV (2009) Event-related delta and theta synchronization during explicit and implicit emotion processing. Neuroscience 164(4):1588–1600
Khan P, Ranjan P, Kumar S (2021) AT2GRU: A human emotion recognition model with mitigated device heterogeneity. IEEE Trans Affect Comput. https://doi.org/10.1109/taffc.2021.3114123
Leeuwis N, Pistone D, Flick N, van Bommel T (2021) A sound prediction: EEG-based neural synchrony predicts online music streams. Front Psychol 12:672980
Major S, Carpenter K, Beyer L, Kwak H, Dawson G, Murias M (2020) The influence of background auditory noise on P50 and N100 suppression elicited by the Paired-Click Paradigm. J Psychophysiol 34(3):171–178
Masood N, Farooq H (2019) Investigating EEG Patterns for Dual-Stimuli Induced Human Fear Emotional State. Sensors 19:522. https://doi.org/10.3390/s19030522
Miller JG, Xia G, Hastings PD (2019) Resting Heart Rate Variability is Negatively Associated with Mirror Neuron and Limbic Response to Emotional Faces. Biol Psychol 107717. https://doi.org/10.1016/j.biopsycho.2019.107717
Ortiz Alonso T, Santos JM, Ortiz Terán L, Borrego Hernández M, Poch Broto J, de Erausquin GA (2015) Differences in early stages of tactile ERP temporal sequence (P100) in cortical organization during passive tactile stimulation in children with blindness and controls. PLoS ONE 10(7):e0124527. https://doi.org/10.1371/journal.pone.0124527
Schrammen E, Grimshaw GM, Berlijn AM, Ocklenburg S, Peterburs J (2020) Response inhibition to emotional faces is modulated by functional hemispheric asymmetries linked to handedness. Brain Cogn 145(105629):105629. https://doi.org/10.1016/j.bandc.2020.105629
Smith ME, Gevins A (2004) Attention and brain activity while watching television: Components of viewer engagement. Media Psychol 6(3):285–305. https://doi.org/10.1207/s1532785xmep0603_3
Smith EE, Reznik SJ, Stewart JL, Allen JJB (2017) Assessing and conceptualizing frontal EEG asymmetry: An updated primer on recording, processing, analyzing, and interpreting frontal alpha asymmetry. Int J Psychophysiol 111:98–114. https://doi.org/10.1016/j.ijpsycho.2016.11.005
Tanaka K, Yasuda S, Kuriki S, Uchikawa Y (2016) The influence of visual induction of positive-negative emotions on the somatosensory cortex. IEEE Trans Electron Inf Syst 136(9):1298–1304. https://doi.org/10.1541/ieejeiss.136.1298
Ushida T, Ikemoto T, Taniguchi S, Ishida K, Murata Y, Ueda W, Tanaka S, Ushida A, Tani T (2005) Virtual pain stimulation of allodynia patients activates cortical representation of pain and emotions: a functional MRI study. Brain Topogr 18(1):27–35. https://doi.org/10.1007/s10548-005-7898-8
Zhao G, Zhang Y, Ge Y (2018) Frontal EEG Asymmetry and Middle Line Power Difference in Discrete Emotions. Front Behav Neurosci 12:225–235. https://doi.org/10.3389/fnbeh.2018.00225
Jung C-W, Lee H-E, Wi H-W, Choi N-S, Park P-W (2016) Study on the characteristics of EEG in resting state on visuo-spatial working memory performance. Journal of the Korea Academia-Industrial cooperation Society 17(4):351–360. https://doi.org/10.5762/kais.2016.17.4.351
Masood N, Farooq H (2018) Multimodal paradigm for emotion recognition based on EEG signals. In: Human-Computer Interaction. Theories, Methods, and Human Issues. Springer International Publishing, pp 419–428
Raheel A, Anwar SM, Majid M, Khan B, Ehatisham-ul-Haq (2016) Real time text speller based on eye movement classification using wearable EEG sensors. In: 2016 SAI Computing Conference (SAI). https://doi.org/10.1109/SAI.2016.7555977
Funding
Open Access funding provided thanks to the CRUE-CSIC agreement with Springer Nature.
Ethics declarations
Informed consent
All participants in the experiment signed an informed consent document.
Conflicts of interest
The authors declare no conflicts of interest.
Research involving Human Participants and/or Animals
This study is part of a line of research whose clinical trials were approved by the Clinical Research Ethics Committee (CEIC) of the San Carlos Clinical Hospital, Madrid (Spain), on April 4, 2019.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Cite this article
Cerdán-Martínez, V., García-López, Á., Revuelta-Sanz, P. et al. Haptic stimulation during the viewing of a film: an EEG-based study. Multimed Tools Appl 83, 67673–67686 (2024). https://doi.org/10.1007/s11042-024-18218-8