1 Introduction

Recent research has shown the importance of individual idiosyncrasies, such as personality traits, in different human-computer interaction (HCI) contexts such as entertainment [1], mobile application usage [2], games [3], and even abusive use of information technologies [4]. While the value of individual characteristics is acknowledged in models such as the task-technology fit (TTF) model [5], most attempts at operationalizing this part of the model have relied on self-reported assessments.

Researchers are calling for unobtrusive and non-invasive approaches to capturing users' idiosyncrasies in order to better infer cognitive and emotional states [6, 7]. The ultimate goal of neuroadaptive interfaces is to provide users with the right content at the right time [8]. To achieve this goal, the interface needs to use personal information such as past behavior or the user's current mental state. According to Parasuraman [9], neuroadaptive interfaces (i.e., interfaces that adapt to the user's current mental state) lead to a better synergy between technology and users, and therefore to a better user experience. These interfaces monitor the user's physiological signals to infer mental states and adapt automatically to them, with the goal of improving the interaction. Successfully implementing neuroadaptive interfaces requires reliable assessment of the user's mental states during the interaction.
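To make this loop concrete, the following minimal sketch (ours, not drawn from the cited works) shows how such an interface could map an inferred state to an adaptation. The signal values, threshold, and adaptation rules are all illustrative assumptions.

```python
# Minimal sketch of a neuroadaptive loop: infer a coarse mental state from a
# physiological signal window, then adapt the interface to that state.
# Threshold, labels, and adaptation rules are illustrative assumptions.

def infer_mental_state(signal_window):
    """Toy inference: map mean normalized signal amplitude to an arousal level."""
    mean_amplitude = sum(signal_window) / len(signal_window)
    return "high_arousal" if mean_amplitude > 0.7 else "low_arousal"

def adapt_interface(state):
    """Toy adaptation rule: simplify content under high arousal."""
    return {
        "high_arousal": "reduce_information_density",
        "low_arousal": "increase_challenge",
    }[state]

window = [0.62, 0.75, 0.81, 0.77]  # hypothetical normalized signal samples
print(adapt_interface(infer_mental_state(window)))  # -> reduce_information_density
```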

In this article, we explore the extent to which automatically extracted facial information can inform the assessment of users' idiosyncrasies such as personality traits. We use the facial emotions of players in an emotionally charged video game to explore the relationship between the emotions detected during the game and the players' personality traits. Our results contribute to the development of a neuroadaptive model by outlining a simple and effective way of modelling users' idiosyncrasies that can be used by physiological inference models [7].

2 Literature Review

It is generally agreed that basic human emotions can be associated with specific patterns of facial expressions. Each primary emotion has its own set of facial muscle contractions leading to a unique facial expression [10]. Facial expressions can be produced by pure emotional experiences as well as by many other events, and are caused by simultaneous temporary (between 250 ms and 5 s) contractions of different muscles, such as those of the eyelids, lips, and mouth [11]. The temporal dimension and intensity of a contraction are crucial for inferring a person's emotional state with good accuracy. For example, Fridlund and Izard [12] found a correlation between the zygomatic muscle (a group of muscles near the mouth used for smiling) and positive emotions, and between the corrugator muscle (a group of muscles near the eyes used for frowning) and negative emotions.

Two methods can be used to measure facial expressions: electromyography and automatic facial analysis. Electromyography (EMG) consists of recording the electrical activity induced by muscular contraction and relaxation [13–15]. EMG has been used in HCI to measure users' emotional valence in a gaming context [16].

The second method, Automatic Facial Analysis (AFA), seeks to infer human emotions using specialized software such as Facereader [17] and Affdex. Using an accurate and reliable AFA model is crucial in HCI, a field that relies heavily on emotional intelligence [18]. The chosen software should detect facial features and associate them with specific facial expressions [19]. Given the disparity of biological features and environmental settings, this type of analysis must account for many variables [11]. When analyzing facial expressions with a computer-based program, difficulties may arise from lighting (the face needs to be lit adequately), face orientation (the face needs to be facing the camera), and the expressions themselves (even when emotionally induced, expressions differ between individuals due to physiological differences) [20].

Difficulties such as face orientation can be mitigated by programs like Facereader (Noldus, The Netherlands). Such programs use the Active Template Method to fit the user's face to a generic face template, and the Active Appearance Model to recognize the variations related to the issues listed above [20]. Facial expressions can be classified into different physiological, cognitive, and emotional states using facial actions called action units. A set of action units can be associated with multiple facial actions, such as raising the eyebrows or pinching the lips. Valstar and Pantic [21] explained that different action units, individually or in combination associated with physiological, cognitive, or emotional states, can be detected by a computer program or a trained observer. See Riedl and Léger [22] for a review of the measurement of facial emotions.
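To illustrate how action units can map to emotion labels, the sketch below uses a fixed prototype lookup. The action-unit combinations follow commonly cited FACS-based heuristics (e.g., AU6 + AU12 for happiness), but commercial AFA tools such as Facereader rely on trained statistical models rather than a lookup table; this is only a conceptual sketch.

```python
# Conceptual sketch: classify a set of detected action units (AUs) against
# prototype AU combinations. Real AFA software uses trained models instead.

EMOTION_PROTOTYPES = {
    "happy": {6, 12},            # cheek raiser + lip corner puller
    "sad": {1, 4, 15},           # inner brow raiser + brow lowerer + lip corner depressor
    "surprised": {1, 2, 5, 26},  # brow raisers + upper lid raiser + jaw drop
    "angry": {4, 5, 7, 23},      # brow lowerer + lid raiser/tightener + lip tightener
}

def classify(detected_aus):
    """Return the emotion whose prototype AUs are best covered by the input."""
    scores = {emotion: len(aus & detected_aus) / len(aus)
              for emotion, aus in EMOTION_PROTOTYPES.items()}
    return max(scores, key=scores.get)

print(classify({6, 12, 25}))  # -> happy
```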

Prior research has explored the relationship between personality traits and facial emotions. According to Tomkins's ideo-affective structures [23] and Izard's affective-cognitive structures [24], repeated emotional experiences lead to structural changes that become consolidated in one's personality [25]. An experiment comparing physicians' ability to express emotions (happiness, sadness, anger, and surprise) with their personality traits found links between the two concepts [26]: the ability to express sadness was linked with a sadness personality trait, and the ability to express anger with a dominance personality trait. Another study, conducted by Malatesta-Magai et al. [27], compared Type A and Type B personalities on facial expressions of anxiety, depression, anger, and aggression. It found a significant difference in hostile emotional behavior, more specifically the suppression of anger facial expressions by Type A individuals.

Moreover, Ekman and Rosenberg [10] found a relationship between the extraversion trait of the Eysenck Personality Questionnaire-Revised [28] and facial responses. They classified the extraversion trait into three categories: introverts, ambiverts, and extraverts. For enjoyment displays, the facial response was stronger for extraverts than for introverts and ambiverts.

The HCI literature on personality also shows the importance and usefulness of adding personality variables to interfaces in order to better understand users' behavior [2]. Recent work by Bastarache-Roberge et al. [3] proposes a model to predict players' flow states in a multiplayer game, which achieves better accuracy once personality variables are taken into account. Another framework, presented by Bostan [1], holds that behavior is affected by both situational and personal factors; this extensive knowledge of participants' characteristics, such as personality, was added to improve a model predicting player and agent behavior in computer games.

3 Methodology

3.1 Methodological Strategy and Participants

For this experiment, we observed users' emotional reactions while they played an emotionally charged video game, Team Fortress 2 (Valve Corporation, Bellevue). This first-person shooter was chosen because it allows manipulating the game's difficulty levels and extracting players' actions. Eighty-eight university students participated in this study (51 male, 31 female).

Players alternated between playing the game and watching the other players play on their screen through a spectator mode. Successful and missed shots, as well as game results (win or lose), were extracted from the game logs with the SupStats2 plug-in (by F2).
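As a minimal sketch of how such shot events might be aggregated into per-player accuracy, the parser below assumes a hypothetical log-line format and event names; the exact SupStats2 output differs and should be checked against the plug-in's documentation.

```python
import re

# Hypothetical log-line format for shot events; the real SupStats2 format
# should be verified before use.
LOG_LINE = re.compile(r'"(?P<player>[^"]+)" triggered "(?P<event>shot_fired|shot_hit)"')

def accuracy(log_lines):
    """Return each player's hit ratio from raw log lines."""
    fired, hit = {}, {}
    for line in log_lines:
        match = LOG_LINE.search(line)
        if not match:
            continue
        player, event = match.group("player"), match.group("event")
        fired[player] = fired.get(player, 0) + (event == "shot_fired")
        hit[player] = hit.get(player, 0) + (event == "shot_hit")
    return {p: hit.get(p, 0) / fired[p] for p in fired if fired[p]}

lines = ['"Alice" triggered "shot_fired"',
         '"Alice" triggered "shot_fired"',
         '"Alice" triggered "shot_hit"']
print(accuracy(lines))  # -> {'Alice': 0.5}
```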

Players were all in the same room, around a hexagonal table, but were not facing each other. They each had their own headset (headphones and microphone) to communicate, thereby simulating an online environment.

After a practice game, only two players played the next game, alternating for each game. A total of 7 games were played per session. Teams were also randomly assigned to various difficulty-level combinations (easy, normal, and hard). Data from 120 games were collected for this study.

3.2 Measures

Videos of the participants' faces were recorded with webcams using MediaRecorder 2 (Noldus, The Netherlands) and then processed with Facereader 6 (Noldus, The Netherlands) to extract six basic emotions: happy, sad, scared, disgusted, surprised, and angry. Emotions were inferred over the whole gameplay and averaged for each participant, yielding six variables.
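The aggregation step can be illustrated as follows, assuming the frame-level intensities are exported as a table; the column names and values here are our assumptions, not Facereader's exact export format.

```python
import pandas as pd

# Sketch: collapse frame-level emotion intensities into one mean value per
# emotion per participant. Column names and values are illustrative.
frames = pd.DataFrame({
    "participant": ["p01", "p01", "p02", "p02"],
    "happy":       [0.61, 0.55, 0.10, 0.15],
    "angry":       [0.02, 0.05, 0.40, 0.35],
})

per_participant = frames.groupby("participant").mean()
print(per_participant)  # one row per participant, one column per emotion
```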

All participants completed the HEXACO Personality Inventory [29], which assesses personality along six dimensions: honesty-humility, emotionality, extraversion, agreeableness, conscientiousness, and openness to experience.

Based on Lee and Ashton [29], each dimension of the measurement scale is briefly summarized below; a generic scoring sketch follows the list.

  • The honesty-humility dimension describes someone who will not use others for personal profit (material or social) and will not be tempted to break the rules. A low score on this dimension describes someone who would do those things, and more, to get what they want.

  • The emotionality dimension describes a person who feels strong emotions and needs emotional support from peers; the most common feelings are fear, anxiety, empathy, and attachment. A low score on this dimension describes someone who is not in touch with their emotions and does not feel the need to share them with others.

  • For extraversion, people with high scores are positive and sociable: confident, enthusiastic, and energetic. Conversely, people with low scores on this dimension are uncomfortable under a lot of attention and do not particularly enjoy social activities.

  • Agreeableness describes an individual who is forgiving, non-judgmental, cooperative, and good at controlling their temper. Conversely, an individual with a low score is more stubborn, critical, and quick to anger when facing mistreatment.

  • Someone organized, disciplined, and perfectionistic will score high on the conscientiousness dimension; someone unconcerned, impulsive, and untroubled by work containing mistakes will score low.

  • The openness to experience dimension describes someone imaginative and curious about nature and other domains of knowledge. A low score describes someone who is neither curious nor creative and who does not appreciate radical or unconventional ideas.
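The sketch below shows generic Likert-inventory scoring with reverse-keyed items. The item numbers and keying are invented for illustration; the actual HEXACO scoring key is published by Lee and Ashton [29].

```python
# Generic Likert-scale scoring sketch with reverse-keyed items.
# Item numbers and keying are hypothetical, not the real HEXACO key.

REVERSE_KEYED = {2, 4}  # hypothetical reverse-keyed item numbers
SCALE_FLIP = 6          # 1-5 Likert scale: reversed value = 6 - response

def dimension_score(responses, items):
    """Average the responses to `items`, flipping reverse-keyed ones."""
    values = [SCALE_FLIP - responses[i] if i in REVERSE_KEYED else responses[i]
              for i in items]
    return sum(values) / len(values)

answers = {1: 4, 2: 2, 3: 5, 4: 1}  # item number -> Likert response (1-5)
print(dimension_score(answers, items=[1, 2, 3, 4]))  # -> 4.5
```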

Table 1. Correlation analysis between personality traits and facial emotions.

4 Results

To explore the relationship between users' personality dimensions and discrete emotions, a correlation analysis was performed. Table 1 summarizes the results. First, we observe multiple statistically significant correlations (2-tailed) between users' personality dimensions and their emotions during the game sessions, supporting our general contention that personality influences emotional reactions. Second, not all personality dimensions had an influence on emotional reactions (e.g., openness to experience), and some dimensions had influence over a greater number of emotions than others (e.g., emotionality and conscientiousness). Third, emotionality is the personality dimension correlated with the greatest number of emotions: we found four significant correlations, with disgust (p = 0.0262), happiness (p = 0.0091), anger (p = 0.0001), and sadness (p < 0.0001). This suggests that users high on emotionality are more expressive than others when interacting with an interface. Fourth, interestingly, two emotions (anger and surprise) had negative correlations with personality dimensions (emotionality and conscientiousness). Based on these results, it seems that high scores on the emotionality and agreeableness dimensions are related to lower expressed anger. This may suggest that users do not want to show their anger in social situations. In addition, the higher users scored on the conscientiousness dimension, the less they expressed surprise, suggesting that their need for control may inhibit the expression of surprise.
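For readers wishing to replicate this analysis, the sketch below shows how such two-tailed Pearson correlations can be computed; the data frame is synthetic and the column names are our assumptions.

```python
import pandas as pd
from scipy.stats import pearsonr

# Synthetic example: one row per participant, columns for a HEXACO dimension
# and a mean facial-emotion intensity. Values are illustrative only.
df = pd.DataFrame({
    "emotionality": [3.2, 4.1, 2.8, 3.9, 3.5],
    "sad":          [0.10, 0.22, 0.05, 0.19, 0.14],
})

r, p = pearsonr(df["emotionality"], df["sad"])  # two-tailed by default
print(f"r = {r:.3f}, p = {p:.4f}")
```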

5 Discussion and Concluding Comments

The results support our general hypothesis that personality influences emotional reactions. This contributes to the HCI literature by underlining the importance of taking users' personality traits into account when analyzing their emotional reactions to interfaces.

Our results suggest that neuroadaptive interfaces using emotional reactions would perform better by taking users' personality traits into account. Since these two concepts appear to be related, we suggest that personality and emotions are variables that should be considered when developing adaptive interfaces, in order to improve their accuracy and robustness.
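As a hypothetical example of such an integration, an adaptive system could condition its emotion-detection thresholds on trait scores. The rule below is an illustrative assumption motivated by our finding that high emotionality and agreeableness relate to lower expressed anger; the coefficients are not empirically fitted.

```python
# Hypothetical personality-conditioned detection threshold: users whose
# traits predict suppressed anger expression get a lower trigger threshold.
# Coefficients and the 3.0 scale midpoint are illustrative assumptions.

def anger_threshold(base, emotionality, agreeableness):
    """Lower the anger-detection threshold for trait profiles associated
    with suppressed anger expression."""
    return base - 0.05 * (emotionality - 3.0) - 0.05 * (agreeableness - 3.0)

print(anger_threshold(base=0.5, emotionality=4.2, agreeableness=3.8))  # -> 0.4
```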

Since the experiment was conducted in a social context (an online environment), our results cannot be generalized to an individual context. Alone in a room, users may express themselves differently. An individual setting would allow observation of a more direct relationship between personality traits and emotions, rather than one attenuated by a social context in which facial expression is part of communication and can be influenced by the presence of others. We therefore suggest replicating this experiment in an individual setting, in order to prime different personality dimensions. Another interesting possibility would be to replicate the experiment with other types of interfaces and observe different emotional reactions.

Because personality traits influence individuals' emotional reactions, neuroadaptive systems based on users' emotional reactions need to take personality traits into account to optimize their physiological inference.