Total Immersion: Designing for Affective Symbiosis in a Virtual Reality Game with Haptics, Biosensors, and Emotive Agents
Affective symbiosis in human–computer interaction refers to the dynamic relationship between the user and affective virtual agents. To facilitate a truly immersive experience, we believe it is necessary to adapt the presentation of an affective agent to the user's affective state. Investigating the experience, behavior, and physiological correlates of affective events, such as winning and losing during a competitive game, might therefore be used to adapt the agent's emotional states and system events. We designed an experimental virtual reality game environment as a stepping stone toward a system that demonstrates affective symbiosis. Users were invited to play a game of air hockey against affective agents in virtual reality. We collected electrocardiography, electrodermal activity, and postural data, as well as self-reports, to investigate how emotional events affected physiology, behavior, and experience. Users were found to be strongly engaged in the competition while paying only limited attention to their adversaries' emotional expressions. We discuss how game events are much stronger elicitors of affective responses, with the physiological effects of winning and losing becoming more pronounced as the game progresses, and how an affective, symbiotic system could combine game events with dynamic, affective agents to create a system for total immersion.
Keywords: Affective symbiosis · Virtual reality · Immersion · Haptics · 3D game · Physiology · HCI · Symbiotic interaction
Human–computer symbiosis is a relationship between a human and a computer in a cooperative interactive space, in which the computer acts as an organism distinct from the human and is not limited to inflexible dependence on predetermined programs. According to a recent review by Jacucci et al. [2], symbiotic interaction in a human–computer paradigm can be achieved by using technological resources to understand users and by making those resources understandable to users. Computation, sensing technology, and interaction design are key to monitoring and affecting users through the system's output in a closed-loop paradigm. Following the notion of symbiotic interaction in human–computer paradigms outlined by Jacucci et al. [2], we added an affective agent to the system to form affective symbiosis.
An affective agent can, in one respect, be described as an embodied, screen-based anthropomorphic entity that displays emotional expressions to affect users emotionally. A number of studies have demonstrated the potential of affective agents in interaction [3, 4, 5, 6, 7]. For example, the use of an embodied agent in an interface can enhance social interaction and make interaction with agents more lifelike. Of course, not every situation calls for affective agents: embodied, affective agents in utilitarian contexts can provoke negative user emotions (e.g., the Microsoft search puppy). Competitive contexts, such as computer games, on the other hand, might employ these agents more appropriately. In such systems, it is also much more likely that the relationship among a user, virtual agents, and the system grows and changes over time. Moreover, earlier studies have shown that the competitiveness of an interaction is positively associated with emotional synchrony between human users. In other words, people pay more attention to, and are more aware of, their adversary's emotions as the competitiveness of the interaction increases. In human–computer interaction, this could mean that the agent's affective state becomes more salient to the user as the competitiveness of the game increases.
However, creating affective symbiosis between user and agent requires more than simply integrating emotional expressions into a competitive context. The interaction with the system should be built on a dynamic loop that adapts the system to a user's body, behavior, and experience. In the present study, we took steps toward creating this type of system in a competitive game scenario, expecting that it would enhance engagement and immersion during interaction. At this point, the system's adaptiveness is limited to the adversary's performance: the adversary's performance improves directly with the user's performance. We expect that if a user engages in affective symbiosis, (a) the user's behavior and physiological states should be strongly affected by system events, (b) the emotional state of the user should be affected by the emotional state of the artificial agents in the system, and (c) engagement with the system should increase as a function of the cumulative degree of interaction. In the present study, we aimed to gain early insights into the relationship between these critical aspects of affective symbiosis, in the hope that the shortcomings identified will inform a full-scale affective symbiosis system in the future.
We implemented an interaction scenario similar to the table-top game of air hockey, featuring virtual reality, haptic feedback, psychophysiological measurements, and affective agents. The virtual agent displayed emotional expressions during play. We also built a haptic glove to display the different forms of tactile feedback during the game events. The air hockey game was selected because it represents a fast-changing competitive context in which the user is facing his/her opponent and is allowed to see his/her adversary’s facial expressions without disengaging attention from the game.
In order to investigate the user's engagement in affective symbiosis, we measured the effect of emotional expressions displayed by affective agents. We expected that interaction with the agent could result in two types of emotional relationship, which one might consider, in the context of symbiotic interaction, as either parasitic or mutualistic. From a primarily utilitarian point of view, competition might entail a parasitic emotional effect: a negative expression from one's opponent suggests a gain and therefore results in positive affect. However, many studies have suggested that emotional expressions usually elicit mirroring behavior, imitating the agent's expression, which, according to some, may evoke the associated emotions. In other words, positive emotions expressed by the agents might result in either negative (parasitic affective symbiosis) or positive (mutualistic affective symbiosis) affect in the user.
Because emotions are known to affect users at times unconsciously, if the system is affectively symbiotic, we expect that the more one interacts with the system, the more one becomes affected by game events, primarily events that happen to oneself (i.e., winning or losing). To measure emotional involvement, we complemented traditional self-report questionnaires with physiological data. To investigate how users implicitly responded to emotional events, we measured electrodermal activity (EDA) and heart rate variability (HRV) during the game.
We think that immersion might improve affective symbiosis because it creates a stronger impression of the importance of the game and its events, thus increasing the salience of the agent's emotions. For this reason, we used virtual reality and haptic feedback. The findings of the study contribute to an understanding of crucial points to consider before developing an affective symbiosis adaptation and/or game model, and point toward future possibilities.
2 Related Work
Both systems that adapt to user emotions and affective agents that influence user emotions have been popular research topics in human–computer interaction (e.g., [15, 16, 17, 18, 19]). Work on emotionally adapted games has demonstrated how adapting game events to a user's physiological activity can improve the game experience. Kuikkaniemi et al. [19], for instance, used EDA to control the level of difficulty in a first-person shooter game. Results revealed that players who were able to see their biosignal while playing were more immersed in, and more positively affected by, the game.
In addition to adaptive games, affective agents' emotional expressions have also been shown to influence users' emotional states and social behavior [5, 7, 20]. De Melo and colleagues investigated social decision-making between a human and an affective virtual agent, demonstrating that users were more willing to cooperate or make concessions if agents displayed certain emotions [21, 22]. Beyond making different decisions, users have also been shown to mimic the expressions of affective agents automatically and to respond to negative expressions with heightened autonomic arousal. In conclusion, affective agents indeed affect users' emotions and behavior.
Rather than using 2-D screen-based virtual reality, one can use head-mounted displays (HMDs) to achieve better immersion by shutting out the outside world. Recently, advancements in HMD technologies have increased the use of virtual reality and its applications, mostly for gaming [25, 26], providing the experience of immersion and the enjoyment of full engagement. Although HMDs handle the visualization of the virtual environment, the realistic feel comes from touching and receiving responses from virtual objects, which can be achieved through haptic technologies. Haptic feedback increases the player's sense of immersion and provides interesting ways to interact with the game while enhancing its entertainment value.
To summarize, systems that adapt to a user's emotions and systems in which affective agents influence user emotions have been popular topics of research. However, an immersive system that combines adaptive gaming performance with affective expressions requires more research. The present work aims to pave the way toward affective symbiosis by investigating changes in users' physiology and emotional state in a competitive, affective game environment.
3 The Game
An air hockey game (similar to Pong) was designed to investigate the effects of affective agents in an immersive, competitive context. A virtual puck was served from the center left of the air hockey table, alternately toward the agent and the player. Both the player and the agent blocked the puck using a virtual mallet, and slots at either end of the table served as goal bars. The serving direction was randomly generated at an angle between 15° and 75° toward either the agent or the player.
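The serve logic above can be sketched as follows. This is an illustrative reconstruction, not the study's actual game code; the function name `make_serves` and its parameters are assumptions.

```python
import random

def make_serves(n_serves, seed=None):
    """Generate serves that alternate between the agent's and the player's
    side, each at a random angle between 15 and 75 degrees (hypothetical
    sketch of the logic described in the text)."""
    rng = random.Random(seed)
    serves = []
    for i in range(n_serves):
        target = "agent" if i % 2 == 0 else "player"  # alternate serve direction
        angle = rng.uniform(15.0, 75.0)               # serve angle in degrees
        serves.append((target, angle))
    return serves
```

A seeded generator like this also makes a serve sequence reproducible across test sessions, which is convenient when comparing participants.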
3.1 Player Interaction
We tracked the user's right-hand movements using a Leap Motion, a small device that senses hand and finger motions in an interactive 3-D space using infrared cameras. The device was positioned underneath a glass table and made the player's hand visible in virtual reality. A transparent plastic sheet allowed the user's right hand to slide freely over the glass table. The default hand model, provided as part of Leap Motion's Unity3D package (www.leapmotion.com), was used to enable interaction in virtual reality, where the user could grip the virtual mallet and move it to meet the moving puck. The player's left-hand index, middle, and ring fingers rested on the arrow keys of a PC keyboard and were used to answer the self-report items shown during gameplay (Fig. 3).
3.2 Agent Interaction
3.3 Tactile Feedback
Seventeen university students and researchers (12 male, 5 female; 25.47 ± 4.4 years old) volunteered to take part in the study. They signed informed consent forms prior to the start of the experiment and received a movie ticket for their time afterward.
4.2 Test Setup
The Unity 3-D game engine (version 4.5.4, www.unity3d.com) was used to implement the experimental system, providing the game scenario and questionnaires visually in virtual reality via the HMD, recording user responses, and communicating with the physiological apparatus. Virtual reality was enabled using the Oculus Rift VR HMD (Oculus Rift Developer Kit 2, Resolution 960 × 1080 per eye, Refresh Rate 75 Hz, 100° nominal field of view). The player’s head movements were tracked at 1000 Hz using a three-axis accelerometer, gyroscope, magnetometer, and external positional tracker (www.oculusvr.com).
Affective agents in virtual reality were designed by combining a face model and a body model. The face model was the original one provided by Faceshift (www.faceshift.com), and the body was designed using the FUSE design tool provided by Mixamo (www.mixamo.com). The textures of the face and body were edited to build three different-looking agents. Each agent displayed three emotional expressions (happiness, anger, and neutral as a control condition). These were recorded prior to the study by capturing a live performance by a professional actress using Faceshift algorithms. Each expression animation lasted 4 s, ending with a neutral expression. Prior to the experiment, a pilot study was conducted in which a sample of human participants classified the expressions into six categories (disgust, fear, sadness, anger, happiness, and neutral). Expressions of anger, happiness, and neutrality were then selected for the final paradigm based on their relatively higher recognition accuracy (happy: 95 ± 14%, neutral: 90 ± 14%, angry: 90 ± 23%).
4.3 Procedure and Design
Participants undertook nine blocks of seven trials each. The virtual agent changed between blocks, and the order of agents was counterbalanced across participants. Trials were fully randomized across the 63 possibilities obtained by orthogonally crossing 3 repetitions × 3 agents (angry, happy, neutral) × 7 serves.
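One way to realize this design is sketched below: each agent's 21 trials (3 repetitions × 7 serves) are fully randomized and dealt into three blocks of seven, and the blocks are ordered by cycling through the counterbalanced agent order so that the agent changes between consecutive blocks. This is an assumed reconstruction of the design described above, not the study's code, and all names are illustrative.

```python
import itertools
import random

AGENTS = ("angry", "happy", "neutral")
REPS, SERVES, BLOCK_LEN = 3, 7, 7

def build_blocks(agent_order, seed=None):
    """Build 9 blocks of 7 trials for one participant, given that
    participant's counterbalanced agent order."""
    rng = random.Random(seed)
    per_agent = {}
    for agent in agent_order:
        # this agent's 21 trials: 3 repetitions x 7 serves, fully randomized
        trials = [(agent, rep, serve)
                  for rep, serve in itertools.product(range(REPS), range(SERVES))]
        rng.shuffle(trials)
        per_agent[agent] = [trials[i:i + BLOCK_LEN]
                            for i in range(0, len(trials), BLOCK_LEN)]
    # nine blocks: cycle through the agent order three times,
    # so the agent changes between consecutive blocks
    return [per_agent[agent].pop(0) for agent in agent_order * REPS]
```

Counterbalancing across participants would then amount to rotating `agent_order` (e.g., a Latin square over the three agents).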
User experience was investigated using a quantitative analysis of both explicit self-reported Likert scales and implicit behavioral and physiological measures.
Descriptive statistics of the self-reported questionnaire (N = 17). A 5-point Likert scale (1 = not at all, 5 = very much) was used for item scoring. Values are means ± standard errors.
- I felt frustrated
- I felt influenced by the agent's mood
- I felt tense
- I found it boring
- I had to put a lot of effort into the game
- I thought it was fun
- I deeply concentrated on the game
- I was good at it
- The agent caught my attention
5.1 Emotion Recognition, Game Experience, and Presence Measures
Participants' judgments concerning emotions, game experience, emotional interdependence, and co-presence were investigated using one-way repeated-measures ANOVAs (RMANOVAs) with the agent's emotional expression (neutral vs. angry vs. happy) as a factor. The agent's emotional expressions were judged correctly in 91% of cases. Furthermore, some expressions were recognized significantly better than others, F(1.36, 21.82) = 4.34, p = .04. A post hoc least significant difference (LSD) analysis revealed significantly higher recognition rates (p < .05) for happy (100%) compared to neutral (88 ± 14%) and angry (85 ± 23%) expressions. Moreover, the agent's emotional expression had a significant effect on experienced emotional interdependence, F(2, 32) = 9.01, p = .001, η2 = .36, and co-presence, F(1.43, 22.84) = 15.31, p < .001, η2 = .49. A post hoc LSD analysis revealed that participants felt more influenced by the agent's mood in the happy (2.81 ± 0.78) and angry (2.54 ± 0.77) conditions than in the neutral (2.04 ± 0.59) condition (p < .05). Along similar lines, participants reported a higher experience of co-presence while playing against happy (3.45 ± 0.66) than angry (3.05 ± 0.59) agents and rated neutral (2.48 ± 0.70) agents as the least attention-capturing (both emotions > neutral, p < .05). Finally, the agent's emotions affected participants' ratings of how deeply they concentrated on the game, F(2, 32) = 7.17, p = .003, η2 = .31. A post hoc LSD analysis revealed that participants concentrated more during the happy (3.45 ± 0.66) than the angry (3.05 ± 0.59) condition, with the neutral condition (2.48 ± 0.70) associated with the poorest concentration (p < .01).
5.2 Experience of Game Events: Behavior and Physiology
Trials were included in the analysis only if both the agent and the user successfully blocked or bounced the puck at least once. Seventeen users were tested, although the accelerometer recordings were missing for one; this user's data were omitted from movement-related analyses.
In order to find out whether the agent's emotion had a general effect on behavioral and physiological responses, RMANOVAs were conducted with the agent's emotion as the factor and performance, movement, tonic EDA, average heart rate, HRV (standard deviation of the inter-beat interval, IBI), and skin conductance response (SCR) rate as measures. Tonic EDA was significantly affected by the agent's emotional expression, F(2, 32) = 4.59, p = .03, but no effect was observed for any other measure, p > .1. Angry agents increased tonic EDA (883 µS) relative to the neutral (829 µS) and happy (802 µS) conditions.
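Three of the measures above can be derived from the raw recordings in a straightforward way. The sketch below is an assumed, simplified computation, not the authors' analysis pipeline: tonic EDA as the mean skin-conductance level over a window, HRV as the standard deviation of the IBI series, and average heart rate from the mean IBI.

```python
import statistics

def tonic_eda(eda_samples):
    """Tonic EDA: mean skin-conductance level over the analysis window."""
    return statistics.mean(eda_samples)

def hrv_sdnn(ibis_ms):
    """HRV: standard deviation of the inter-beat intervals (in ms),
    often referred to as SDNN."""
    return statistics.stdev(ibis_ms)

def mean_heart_rate(ibis_ms):
    """Average heart rate in beats per minute: 60,000 ms per minute
    divided by the mean inter-beat interval."""
    return 60000.0 / statistics.mean(ibis_ms)
```

Phasic SCR detection (counting transient conductance peaks above a threshold) would additionally require filtering the EDA signal, which is omitted here.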
Next, we investigated how winning or losing affected behavioral (movement) and physiological (EDA, SCR, IBI) responses. RMANOVAs were conducted on measures taken in the first 3 s after the game event of winning or losing, with performance (win vs. loss) as the factor. Performance significantly affected EDA, F(1, 16) = 15.20, p = .001, SCR, F(1, 16) = 17.92, p = .001, and movement, F(1, 15) = 20.15, p < .001. Losing increased EDA, SCR, and the amount of movement.
6 Discussion and Conclusion
The emerging interest in human–computer symbiosis and affective communication calls for examination of the possibilities of affective symbiosis in the human–computer paradigm. Here, we developed a competitive VR game environment as a stepping stone toward a system that allows affective symbiosis. Previously, symbiotic interaction has been considered a beneficial aspect of collaborative interaction; however, recent findings from human–human gaming have demonstrated that affective linkage between users is also crucial in competitive contexts. Developing a similar affective linkage in a human–computer paradigm would require new sources of information, such as biosignals, and a system able to adapt game events to that information. In the present study, we built a simple, behaviorally adaptive, competitive environment in order to investigate how an affective agent's emotional expressions affected the user's emotions and game performance.
The results of the present study showed that the agent’s expressions affected emotional interdependence and co-presence, indicating that the adversary’s happy and angry rather than neutral expressions caught users’ attention and affected their conscious emotional state. Moreover, users reported concentrating more on the game when the agent expressed happiness. However, neither the difficulty of the game nor other game-related experiences were influenced by the agent’s emotion. Thus, the results suggest that the computer agent’s affective expressions modified the user’s attention and emotional experience but did not change the degree to which the user found the game difficult or entertaining.
The implicit measures supported this conclusion. Users' tonic EDA level was higher when their adversaries were angry, whereas users' game performance was not influenced by the agent's emotion. EDA and other implicit measures also revealed that users were likewise affected by the game events: losing was related to larger tonic and phasic skin conductance responses than winning. Finally, users' EDA responses grew stronger as the game progressed, indicating greater personal investment or engagement over time. The findings suggest that affective agents and game events influence the user's emotional state on both the conscious and implicit levels. These effects do not, however, modify the user's performance in the game or its perceived difficulty.
The limited effect of the agents' affect could be due to the competitive nature of the task. In a negotiation context, another person's nonverbal cues might be more salient and thus influence the user's behavior more. In more competitive contexts, by contrast, the influence of social cues is limited because game events change more quickly and require continuous attention from the user. Nonetheless, the affectivity of the agent influenced users' emotional state despite the competitive context, an effect consistent with earlier findings. The results thus suggest that affective expressions presented in a competitive context do not necessarily make humans perform better in the game but do improve the overall social experience.
The findings of this study help us understand users' experiences while interacting with affective adversaries in an immersive, competitive environment. The results can also be used to build an affectively adaptive system that integrates users' behaviors with their emotional responses when calculating further game events. More knowledge of agents' adaptive emotional responses and their effect on the user is required. In the future, one could, for instance, investigate whether varying the agents' affective state according to game events would intensify the effect of the agent's emotions on the user's emotions and behavior.
This work was supported by the Academy of Finland, project number 268999 (HapCom).
References
- 2. Jacucci, G., Spagnolli, A., Freeman, J., Gamberini, L.: Symbiotic interaction: a critical definition and comparison to other human-computer paradigms. In: Jacucci, G., Gamberini, L., Freeman, J., Spagnolli, A. (eds.) Symbiotic 2014. LNCS, vol. 8820, pp. 3–20. Springer, Cham (2014). doi:10.1007/978-3-319-13500-7_1
- 3. Isbister, K.: Better Game Characters by Design: A Psychological Approach. Elsevier/Morgan Kaufmann, Amsterdam/Boston (2006)
- 5. Dietz, R., Lang, A.: Affective agents: effects of agent affect on arousal, attention, liking and learning. In: Proceedings of the Third International Cognitive Technology Conference, San Francisco (1999)
- 6. Melo, C.M., Carnevale, P., Gratch, J.: The influence of emotions in embodied agents on human decision-making. In: Allbeck, J., Badler, N., Bickmore, T., Pelachaud, C., Safonova, A. (eds.) IVA 2010. LNCS (LNAI), vol. 6356, pp. 357–370. Springer, Heidelberg (2010). doi:10.1007/978-3-642-15892-6_38
- 8. Yee, N., Bailenson, J.N., Rickertsen, K.: A meta-analysis of the impact of the inclusion and realism of human-like faces on user experiences in interfaces. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 1–10. ACM (2007)
- 14. Andrews, S., Mora, J., Lang, J., Lee, W.S.: Hapticast: a physically-based 3D game with haptic feedback. In: Proceedings of the International Conference on the Future of Game Design and Technology (Future Play) (October 2006)
- 15. Moghim, M., Stone, R., Rotshtein, P., Cooke, N.: Adaptive virtual environments: a physiological feedback HCI system concept. In: 2015 7th Computer Science and Electronic Engineering Conference (CEEC), pp. 123–128. IEEE (2015)
- 16. Edlinger, G., Holzner, C., Guger, C., Groenegress, C., Slater, M.: Brain-computer interfaces for goal orientated control of a virtual smart home environment. In: 4th International IEEE/EMBS Conference on Neural Engineering (NER 2009), pp. 463–465. IEEE (2009)
- 19. Kuikkaniemi, K., Laitinen, T., Turpeinen, M., Saari, T., Kosunen, I., Ravaja, N.: The influence of implicit and explicit biofeedback in first-person shooter games. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 859–868. ACM (2010)
- 21. de Melo, C.M., Carnevale, P., Gratch, J.: The effect of expression of anger and happiness in computer agents on negotiations with humans. In: The 10th International Conference on Autonomous Agents and Multiagent Systems, vol. 3, pp. 937–944. International Foundation for Autonomous Agents and Multiagent Systems (2011)
- 24. Sziebig, G., Rudas, I., Demiralp, M., Mastorakis, N.: Achieving total immersion: technology trends behind augmented reality - a survey. In: WSEAS International Conference Proceedings, Mathematics and Computers in Science and Engineering (WSEAS) (2009)
- 27. Morris, D., Joshi, N., Salisbury, K.: Haptic battle pong: high-degree-of-freedom haptics in a multiplayer gaming environment. In: Experimental Gameplay Workshop, GDC (2004)
- 28. Ahmed, I., Harjunen, V., Jacucci, G., Hoggan, E., Ravaja, N., Spapé, M.M.: Reach out and touch me: effects of four distinct haptic technologies on affective touch in virtual reality. In: Proceedings of the 2016 ACM International Conference on Multimodal Interaction. ACM (2016)
- 29. Van Erp, J.B.: Guidelines for the use of vibro-tactile displays in human computer interaction. In: Proceedings of Eurohaptics, pp. 18–22 (2002)
- 30. Poels, K., de Kort, Y.A.W., IJsselsteijn, W.A.: FUGA - the fun of gaming: measuring the human experience of media enjoyment. Deliverable D3.3: Game Experience Questionnaire. FUGA Project (2008)
Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.
The images or other third party material in this chapter are included in the chapter’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.