Abstract
Knowledge production within the interdisciplinary field of human–robot interaction (HRI) with social robots has accelerated, despite the continued fragmentation of the research domain. Together, these features make it hard to remain at the forefront of research or assess the collective evidence pertaining to specific areas, such as the role of emotions in HRI. This systematic review of state-of-the-art research into humans’ recognition and responses to artificial emotions of social robots during HRI encompasses the years 2000–2020. In accordance with a stimulus–organism–response framework, the review advances robotic psychology by revealing current knowledge about (1) the generation of artificial robotic emotions (stimulus), (2) human recognition of robotic artificial emotions (organism), and (3) human responses to robotic emotions (response), as well as (4) other contingencies that affect emotions as moderators.
“Building our research on and relating it to existing knowledge is the building block of all academic research activities, regardless of discipline” ([227], p. 333).
1 Introduction
Due to demographic change and the related scarcity of skilled workers [25], as well as the increasing technological penetration of our private and working lives [232], social robots and the success factors of human–robot interaction (HRI) are becoming increasingly important. HRI is a multidisciplinary field of research that involves “the study of the humans, robots, and the ways they influence each other” ([84], p. 257). HRI research brings together various disciplines, such as robotics, engineering, computer science, human–computer interaction, cognitive science, and psychology [13]. Across these domains, a growing body of research focuses on human interactions with social robots. These robots “exist primarily to interact with people” ([142], p. 322) or evoke social responses from them [158]. Social robots appear in numerous roles, such as museum guides [226], receptionists [235, 236, 239], educational tutors [114, 127], household supporters [219], and caretakers [56, 114, 148, 252].
Due to their automated social presence, such robots make humans “feel that they are in the company of another social entity” ([277], p. 909). Social presence is often associated with a robot’s ability to express artificial emotions and facilitate social relationships [126]. Emotional signals have been shown to be important factors in human–human relationships [237], and emotions expressed by robots increase humans’ perceptions of the transparency of the HRI. Furthermore, these signals allow humans to interpret robotic behaviors using well-known social cues, learned from prior human–human interactions [75]. As social robots gain the capacity to approximate humans more closely, their emotional expressions increasingly facilitate social HRI [37, 107, 204]. Accordingly, this overview seeks to synthesize research knowledge pertaining to emotions during HRI with social robots and thereby contributes to robotic psychology, an interdisciplinary field that examines emotional, cognitive, social, and physical human responses to HRI, while also considering physical and social environments.
As the popularity of social robots continues to rise [124, 215], research into their displayed emotions also has accelerated; between 2000 and 2020, more than 1600 publications appeared in this area. The publication rate across the two decades indicates continuous increases, as Fig. 1 shows.
Empirical studies of emotions during HRI mainly revolve around three topics: (1) emotion expression by robots, (2) human recognition of robotic emotions, and (3) human responses to robotic artificial emotions. Some conceptual/overview articles, studies on emotion recognition during HRI, and publications in related fields also can inform the current literature synthesis. The complexity and fragmentation of this research domain make it challenging for researchers to keep up with state-of-the-art findings and to assess the collective evidence available in a particular research area. Therefore, a literature review is both timely and necessary. This comprehensive review aims to identify publications dealing with emotions during HRI with social robots (for studies on manufacturing robots, see [182, 186, 209]), following the process detailed in Fig. 2.
As a starting point, this review relies on an electronic search of digital libraries (Google Scholar, ScienceDirect, and Dimensions) using keywords such as “human–robot interaction / HRI AND emotion”, “robots AND emotion”, and “social robots AND emotions”. Next, the author conducted a manual search of key journals and conference proceedings (for a complete list of reviewed conferences and journals, see Web-Appendix 1, tiny.cc/IJSR20WebApp1). This initial scanning process revealed more than 1600 publications; however, further screening identified many of them as patent reports, short reports, or book chapters. After excluding these from further consideration, 473 articles remained for the review (Fig. 2).
As Fig. 2 indicates, several other exclusion criteria apply to the review process. First, the focus of this review is on emotions during HRI with social robots.
Second, in a broader sense, HRI involves a wide spectrum of research topics, such as industrial robots, telepresence, virtual reality, and wearables [9, 36]. This review takes a narrower perspective by focusing on embodied HRI [56]. Accordingly, 24 studies on emotions exhibited during interactions with non-humanoid or non-embodied agents, such as virtual avatars and wearables, are excluded from further consideration.
Third, to maintain a focus on dyadic interactions between typically developed humans and social robots, this review excludes several further types of studies. Specifically, it excludes studies in unique, particular settings, such as human–robot teams, which depend on additional dynamic effects (for an overview, see [287]). It also excludes HRIs specific to humans experiencing health issues, in which the medical diagnosis is pivotal to defining the HRI [269, 270]. Web-Appendix 2 (tiny.cc/IJSR20WebApp2) outlines the excluded studies; these references may serve as further reading for researchers with special interests in these excluded areas.
Fourth, this review requires original empirical publications that underwent a double-blind peer-review process. Conceptual contributions and dissertations thus are excluded. Because of their predominant emphasis on conceptual approaches, studies of the ethical implications of emotions during HRI are also excluded. After applying these criteria, a total of 175 papers remain for the survey.
This review also may facilitate theory development. In particular, the insights from extant research indicate several areas that demand more research, as manifested in calls for conceptual and empirical models of emotions during HRI with social robots. Accordingly, this review seeks to address five main research questions:
(1) What methods have been applied for robotic emotional expression generation?

(2) How well can robotic artificial emotions be recognized by humans?

(3) How do humans respond to artificial robotic emotions?

(4) How do contingencies affect the relationship between robotic emotions and human responses during HRI?

(5) What remains to be learned regarding emotions during HRI?
In accordance with the guidelines for systematic literature reviews [15, 261], the review effort involves explanations of underlying theoretical perspectives, empirical design issues, and major findings to establish a foundation of existing research that can advance knowledge. By integrating both theoretical perspectives and empirical findings, it identifies areas in which research findings are disparate or reflect differing interdisciplinary views.
To answer the research questions, Sect. 2 starts with a description of the conceptual framework as an organizing structure for this review. Section 3 presents the state of the art, organized by application domains. Finally, Sect. 4 outlines research directions for the field. The paper finishes with a conclusion in Sect. 5.
2 Framework of the Overview
A detailed review of published articles reveals four main streams of research on (1) emotional expressions by robots, (2) the human recognition of artificial robotic emotions, (3) human responses to robotic emotions, and (4) contingency factors. These research streams provide the basis for the review framework in Fig. 3.
To present extant research, this article organizes the discussions along a causal chain, which parallels the notions of the stimulus–organism–response (S–O–R) paradigm [178]. According to this framework, certain features of the environment (stimulus) affect people’s cognitive and emotional states (organism), which in turn drive behavioral responses (response) [66]. In parallel with the S–O–R paradigm, for the current analysis the stimulus (HRI) is the independent variable, and the organism (human participants, their cognitive or affective states, and their prior experiences) is the mediator. The response (behaviors during the HRI) is the dependent variable for this review (see also [267, 289]).
This relationship is not automatic but rather tends to be shaped by the context and people’s own experience. The summary in Fig. 3 displays the focal topic of emotions during HRI with social robots.
3 State of the Art
3.1 Research Stream 1: Robotic Expressions of Artificial Emotions
The first research stream includes 71 reviewed studies and relates to the stimulus, depicted in Fig. 3. Reviewing this research attempts to answer the first research question: What methods have been applied for robotic emotional expression generation?
In a classical S–O–R paradigm, a stimulus can include expressions of another person’s internal states [77]. For the current framework, it entails expressions of artificial emotions by a robot [72]. Emotions are perceived as strong feelings by observers [80], so they offer important stimuli during both human–human interactions [259] and HRI [39, 144].
Disciplines. For their foundation, these studies rely on contributions from robotics (e.g., [1, 11, 156]) and HRI [121, 122, 169, 225]. Further studies are rooted in human–computer interaction (e.g., [3, 4, 21, 99, 140, 173]), engineering [171], and philosophy [101].
Theoretical approaches. Some researchers (e.g., [167]) rely on a multilevel process theory of emotion [160,161,162]. It holds that people perceive emotions within a three-level hierarchical system, including a sensory level that generates primary emotions, a schematic level that integrates perceptions and responses in a memory system, and a conceptual level that integrates prior experiences with predictions about the future. This model provides valuable insights about the cognitive processes during HRI.
Other researchers base their proposed classification standards for robotic emotion expressions on conceptual considerations [51, 87], the facial action coding system (FACS) emotions [300], or a circumplex model of emotions [31, 125]. The FACS includes six basic emotions—happiness, surprise, anger, sadness, fear, and disgust—that arguably can be experienced by both humans and non-humans [74]. In contrast, secondary emotions such as interest and curiosity are particular to humans [59]. The circumplex model instead suggests characterizing robotic artificial emotional expressions according to valence and arousal dimensions (for reviews, see [152, 216]); its basic premise is that a person’s affective states appear along the circumference of a circle, on which the various states can be classified by their degrees of positive or negative affect and arousal. Still other classifications rely on user responses [139], emotion simulations [180, 181], or emotion animations [278].
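To illustrate how such a valence–arousal classification might be operationalized, the following sketch places the six basic emotions at assumed positions on the circumplex and assigns a given affective state to the nearest one. The coordinates are illustrative assumptions for demonstration purposes, not values taken from the reviewed studies.

```python
# A minimal sketch of a circumplex-style emotion representation.
# The valence/arousal coordinates are illustrative assumptions only.
import math

# (valence, arousal) in [-1, 1]; the position on the circumplex encodes the emotion category
EMOTION_COORDINATES = {
    "happiness": (0.8, 0.5),
    "surprise": (0.1, 0.8),
    "anger": (-0.7, 0.7),
    "fear": (-0.6, 0.6),
    "sadness": (-0.7, -0.4),
    "disgust": (-0.6, 0.2),
}

def classify(valence: float, arousal: float) -> str:
    """Return the basic emotion whose assumed circumplex position is closest to the given state."""
    return min(
        EMOTION_COORDINATES,
        key=lambda emotion: math.dist((valence, arousal), EMOTION_COORDINATES[emotion]),
    )

print(classify(0.7, 0.4))  # -> "happiness"
```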
Methods for generating artificial emotions. The applied methods can be classified into two major categories: (1) static and (2) dynamic.
Static approaches to robotic emotion generation create manually coded, pose-to-pose robotic animations based on stable system architectures of robotic emotions, using hand-crafted categorizations of robotic emotions. Authors have proposed robotic emotion architectures based on predefined scripts [2, 64], predefined emotional spaces [42, 143], movements of pneumatic actuators [106], or a fuzzy emotion system [271, 293].
Dynamic approaches can be either proactive or reactive. Proactive emotion generation may be inspired by graphic animation design, such as Disney’s 12 basic principles of animation [90], which can be applied to generate lifelike robotic emotions too. The underlying notion of these principles is to use “pose-to-pose animation, in which several keyframes are defined and interbetweened to get the intermediate postures” ([175], p. 546). This creative design-oriented approach generates high-quality robotic animations because it is adapted to the morphology of the robot. Furthermore, emotions might stem from a combination of features, hand-crafting, a creative design approach, and direct imitations of the human body [175].
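As a simple illustration of the pose-to-pose principle, the following sketch linearly “in-betweens” a few hand-crafted keyframes to obtain intermediate postures. The joint names, angles, and timing are hypothetical assumptions and would need to be adapted to a specific robot morphology.

```python
# A minimal sketch of pose-to-pose ("keyframe") emotion animation for a hypothetical robot.
# Joint names, angles (radians), and timings are illustrative assumptions.
import numpy as np

# Keyframes for an assumed "happiness" gesture: time (s) -> joint angles
keyframes = {
    0.0: {"head_pitch": 0.0, "arm_left": 0.0, "arm_right": 0.0},
    0.5: {"head_pitch": -0.2, "arm_left": 0.8, "arm_right": 0.8},
    1.0: {"head_pitch": -0.1, "arm_left": 0.4, "arm_right": 0.4},
}

def interpolate_pose(t: float) -> dict:
    """Linearly 'in-between' the two neighbouring keyframes to get the posture at time t."""
    times = sorted(keyframes)
    t = min(max(t, times[0]), times[-1])
    t0 = max(k for k in times if k <= t)
    t1 = min(k for k in times if k >= t)
    if t0 == t1:
        return keyframes[t0]
    w = (t - t0) / (t1 - t0)
    return {j: (1 - w) * keyframes[t0][j] + w * keyframes[t1][j] for j in keyframes[t0]}

# Sample the animation at 20 Hz for playback on the robot
trajectory = [interpolate_pose(t) for t in np.arange(0.0, 1.05, 0.05)]
```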
Reactive emotion generation instead relies on data, generated through the recognition of human emotions in general, as might be gleaned from humans’ faces, head movements, body motions/gestures, speech, touch, or brain feedback (for overviews, see [208, 211]); Table 1 provides an overview of the recognition areas.
Studies in this tradition mostly attempt a direct imitation by tracking human emotional expressions, such as with computer vision techniques, or special markers and sensors. The key positions then can be mapped to the robot’s movement space either with data-driven processing [290] or by defining some suitable transfer functions for the robot morphology [176].
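As an illustration of such a transfer function, the following sketch rescales tracked human joint angles into a robot’s more limited joint ranges. The joint names and ranges are assumptions about a hypothetical robot morphology, not a method reported in the reviewed studies.

```python
# A minimal sketch of reactive imitation via a linear transfer function that maps tracked
# human joint angles into a hypothetical robot's admissible joint ranges (all values assumed).

HUMAN_RANGE = {"elbow": (0.0, 2.6), "head_yaw": (-1.4, 1.4)}   # radians, approximate human limits
ROBOT_RANGE = {"elbow": (0.0, 1.5), "head_yaw": (-0.8, 0.8)}   # radians, assumed robot joint limits

def transfer(joint: str, human_angle: float) -> float:
    """Rescale a tracked human joint angle into the robot's range for that joint."""
    h_min, h_max = HUMAN_RANGE[joint]
    r_min, r_max = ROBOT_RANGE[joint]
    # Normalize to [0, 1] within the human range, then project onto the robot range.
    norm = (min(max(human_angle, h_min), h_max) - h_min) / (h_max - h_min)
    return r_min + norm * (r_max - r_min)

# Example: a tracked human elbow angle of 2.0 rad maps to roughly 1.15 rad on the robot
print(round(transfer("elbow", 2.0), 2))
```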
Summary of findings. Hand-coded robotic animations can offer high quality [175]. However, these static approaches are limited because “robot performance based on a static emotional state cannot vividly display dynamic and complex emotional transference” ([282], p. 160). Furthermore, the limited set of emotions increases the likelihood of repetitive behavior, which may appear inappropriate in HRI. Therefore, research suggests that robotic emotion generation should be based on dynamic algorithms that can recognize human emotion. Yet despite its strengths, this approach is challenging due to the differences in the movement possibilities between humans and robots [175].
3.2 Research Stream 2: Human Recognition of Artificial Robotic Emotions
The second research stream includes 43 reviewed studies and relates to the organism depicted in Fig. 3. Reviewing this research attempts to answer the second research question: How well can robotic artificial emotions be recognized by humans?
In the S–O–R paradigm, an organism refers to any “internal processes and structures intervening between stimuli external to the person and the final actions, reactions, or responses emitted” ( [12], p. 46). For this survey, the organism is represented by humans’ recognition of robotic artificial emotions, expressed via face, body, or both. Details about the reviewed studies in this stream can be found in Web-Appendix 3 (tiny.cc/IJSR20WebApp3).
Geographical origins. Most of the studies that fall into this stream focus on a single country (see for details Web-Appendix 3) in Europe (e.g., Austria [265], France [40, 68, 190], Germany [75, 95, 105, 110, 111, 170, 236], the Netherlands [98], Portugal [224], Switzerland [76], and the United Kingdom [16, 46, 174]), Asia (e.g., China [141, 223], India [215, 241], Japan [123, 124, 131, 229, 251, 254, 295], Korea [187], and Taiwan [102]), or America (e.g., United States [22, 29, 115, 117, 183, 256] and Mexico [214]). Only four studies investigate humans’ recognition of robotic emotions across multiple countries [19, 55, 82, 215]. For example, [19] find cultural differences in emotion recognition among participants from the United States, Asia, and Europe, as do [82] for Denmark, the United Kingdom, and Germany. Other studies capture data from Germany, Hungary, India, the Netherlands, Poland, Portugal, and the United Kingdom [215] or Germany, Slovakia, and Spain [55], without explicitly examining cultural differences.
Disciplines. For their foundation, these studies rely on contributions from robotics [20, 40, 55, 82, 187, 214, 223, 224, 242], human–computer interaction [253, 254, 256, 295], HRI [29, 75, 105, 110, 123, 170, 183, 215, 229], and social robotics [16, 76, 179, 265]. One study is rooted in neuroscience [68].
Theoretical approaches. From a theoretical perspective, most of the studies [17, 22, 29, 40, 68, 70, 86, 95, 174, 183, 214, 215, 224, 236] rely on the FACS model [72]. The circumplex model of emotions (see Sect. 3.1) also has been applied in several studies [18, 29, 55, 111, 117, 229, 251, 265]. A closely related approach is Plutchik’s wheel of emotions [205], which has been applied in two studies [253, 254].
Examined emotions and modes of expression. Table 2 summarizes the percentage recognition rates by which humans can recognize the six basic emotions of the FACS model [73, 74]. The determination of the average percentage values is based on the detailed list of reported recognition rates across the reviewed studies in Web-Appendix 4 (tiny.cc/IJSR20WebApp4). Recognized emotions provide a basis “for evaluating and judging events to assess their overall value with respect to the creature (e.g., positive or negative, desirable or undesirable, etc.)” ([31], p.273).
A closer look at the studies (see Table 2 and, in detail, Web-Appendix 4) reveals little consistency across the reviewed studies. Rather, they are heterogeneous in several important respects:

- Manipulated emotions: Most existing studies select manipulated emotions based on the FACS approach [71] or the circumplex model [216]. Therefore, a large proportion of these studies focuses on the emotions of happiness, surprise, anger, fear, sadness, and disgust.

- Robotic agents: Robotic agents can be distinguished as anthropomorphic (category a and b) or zoomorphic (category c) robots (see Fig. 4). Due to the different degrees of freedom in their bodies and/or faces, they exhibit different abilities to express emotions.

- Body parts for emotion expression: Consistent with notions from social psychology [63], existing publications focus on facial or bodily emotion expressions, or both, as exhibited by robots during HRI. This heterogeneity may also result from the availability of different robots (see Fig. 4).

- Context of HRI: Most of the studies have been conducted in a laboratory setting or online (see Web-Appendix 4); only two studies feature real-life settings, i.e., home settings [98] or clinical settings [183].

- Scenario for the HRI: With regard to the type of HRI, the studies differ in whether the interaction is face-to-face, video-based, or based on images (see Web-Appendix 4).
An interesting question is whether differences in humans’ ability to recognize robotic emotions occur when emotions are expressed by the robot’s face or body. Because most of the studies only reported average recognition rates as percentages (see Table 2 and, in detail, Web-Appendix 4), the requirements for a t-test for independent samples are not met. The test for potential differences in this review therefore relies on a Mann–Whitney U test [189, 217, 303]. This test indicates whether the central tendencies of two independent samples (e.g., studies on human recognition of facial robotic emotion expressions and studies on human recognition of bodily robotic emotion expressions) differ. Mathematically, the Mann–Whitney U statistics can be defined as follows [189]:

\[ U_x = n_x n_y + \frac{n_x (n_x + 1)}{2} - R_x, \qquad U_y = n_x n_y + \frac{n_y (n_y + 1)}{2} - R_y, \]

where \(n_x\) is the number of observations in the first group of studies (e.g., studies of facial expression recognition), \(n_y\) is the number of observations in the second group of studies (e.g., studies of bodily expression recognition), \(R_x\) is the sum of the ranks assigned to the first group, and \(R_y\) is the sum of the ranks assigned to the second group. That is, the Mann–Whitney U test is based on the idea of ranking the data; the measured values themselves are not used in the calculations but are replaced by ranks, which inform the actual test. Calculations thus rely solely on the order of the data (i.e., greater than, less than); absolute distances between values are not considered. Each U statistic can be interpreted as the number of times an observation from one group precedes an observation from the other group when all scores are placed in ascending order.
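For illustration, the following sketch shows how such a comparison could be run on study-level recognition rates. The rates listed are hypothetical placeholders rather than the values reported in Web-Appendix 4, and the scipy call is simply one common way to compute the test.

```python
# A minimal sketch of the Mann-Whitney U comparison described above, assuming two
# hypothetical groups of study-level average recognition rates (in %); the numbers
# are illustrative placeholders, not values from the reviewed studies.
from scipy.stats import mannwhitneyu

facial_rates = [62.0, 57.5, 71.0, 48.3, 66.2]   # studies on facial expression recognition
bodily_rates = [55.1, 60.4, 49.8, 68.0, 58.9]   # studies on bodily expression recognition

u_statistic, p_value = mannwhitneyu(facial_rates, bodily_rates, alternative="two-sided")
print(f"U = {u_statistic}, p = {p_value:.3f}")  # a large p indicates no significant difference
```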
Researchers suggest greater importance of facial relative to bodily expressions during HRI [295]. Of the reviewed studies in this stream, a large portion examines face-to-face human–robot interaction [16, 40, 75, 82, 97, 105, 117, 123, 131, 170, 187, 214, 223, 224, 229, 242, 251, 254, 256, 265], whereas other studies rely on images [20, 29, 55, 61, 68, 76, 82, 110, 117, 183, 214, 215, 295] to express robotic emotions. With their findings, they detail efficient methods to program robots’ artificial expressions, using both facial expressions and bodily features. However, in the current review, no significant differences arise in participants’ recognition rates of emotions, expressed by a robot using either facial or bodily expressions (see Table 2). The Mann–Whitney U test further shows that the recognition rates during HRI with a physically embodied robot are not higher than those in HRI with an image- or video-based robot, which is consistent with some extant findings [117].
Robot type. Robotic avatars can be categorized into three groups (see Fig. 4): robotic faces, fully embodied robots, and zoomorphic robots. Robotic faces (e.g., Flobi robot, Melvin robot, EMYS robot, Golem-X robot, ERWIN robot, android PKD, KISMET robot) have been used commonly in studies of facial emotional expressions. Fully embodied robots appear in studies of bodily emotional expressions (e.g., KOBIAN, WE-4RII, Nao, Pepper, Elenoide). Among zoomorphic robots, the Keepon, KAROTY, and Pepe robots have been investigated.
Research setting. Most of the reviewed studies feature laboratory settings [19, 29, 40, 68, 75, 76, 82, 86, 103, 105, 117, 123, 131, 170, 183, 187, 190, 214, 223, 224, 229, 242, 251, 253, 256, 265, 294, 295]. Another set of studies is designed as online experiments [19, 55, 82, 110, 117, 214, 215]. A study by De Graaf, Allough, and van Dijk [97] involves qualitative interviews. No studies use field settings.
As participants, the majority of studies rely on adult student samples [16, 76, 82, 105, 117, 123, 170, 214, 223, 224, 229, 242, 251, 254, 256, 265]; three feature adult non-student respondents, specified as household members [97], non-clinicians/clinicians [183], or frontline employees [236]. Others include mixed samples of adult participants, obtained through online channels [19, 55, 82, 110, 117, 214]. Among the notable exceptions, a few studies gathered data from children as participants [17, 29]. Canamero and Fredslund [40] compare the recognition capabilities of adults and children and find that children recognize robotic emotions better than adults. Several studies do not identify their participants clearly [22, 53, 68, 70, 75, 95, 131, 174, 188, 215, 242, 295].
Summary of findings. These studies affirm that robots can be programmed to express emotions, despite not actually having them. The average values of the recognition rates for different emotions serve as indicators, though various robotic agents require adequate programming and testing to establish expressions of robotic emotions.
Despite the differences across robots, the revealed recognition rates also offer some guidance regarding which robotic emotional expressions are associated with which emotions. In Table 2, the average recognition rates of most studies are well above the threshold value of 15% recommended in early HRI studies [29], for both facial expressions (58.76%) and bodily expressions (57.87%) (see in detail Web-Appendix 4). Thus, future HRI studies should strive for an emotion recognition rate of at least 50% for both facial and bodily expressions. Furthermore, these examined emotions provide an initial basis for creating standardized, posed emotional expressions that accurately and reliably convey information. The validated expressions in robotic research also are less likely to suffer the problems that have been associated with emotional expression stimuli developed without any standardized system. Yet few studies provide data about any mean differences in detection rates; instead, they report percentages, which limits the capacity for tests of significant differences across groups. Although the results provide initial indications, an empirically validated “gold standard” for expressing robotic emotions is not yet available.
3.3 Research Stream 3: Human Responses to Artificial Robotic Emotions
The third research stream includes 61 reviewed studies and relates to the response depicted in Fig. 3. Reviewing this research attempts to answer the third research question: How do humans respond to artificial robotic emotions? In the S–O–R paradigm, the response is a person’s reaction to a stimulus. Accordingly, research stream 3 includes studies of human reactions to robotic emotions (see for details Web-Appendix 5, tiny.cc/IJSR20WebApp5).
Geographical origins. The reviewed studies in this research stream mostly take place in single countries, which span most of the world: in Asia, Korea [136, 138, 202], Japan [128,129,130, 147, 188, 192, 193, 196, 253, 286], India [255], and China [223, 243, 301]; in Europe, France [5, 6, 43], the Netherlands [91, 92, 230, 258, 283, 284], Finland [102], Germany [195, 233, 264], Italy [17], Spain [89], Sweden [7], and the United Kingdom [28, 38, 146, 157, 213, 279]; as well as the United States [27, 49, 50, 93, 115, 119, 142, 145, 153, 159, 165, 218, 221], Canada [222], Israel [113, 114], Australia [231], New Zealand [33], and Brazil [262].
Only three studies include multiple countries [82, 104, 179]. These studies reveal insights on contingency factors that may affect the strength of human responses during HRI. One study includes both the United States and Japan [179]; robotic joy prompts similar ratings from humans in both countries, but a robot that appears to represent another culture is perceived as part of the outgroup. Another study includes Australia and Japan [104]; its results show that Australian participants perceived an android robot more positively than Japanese participants did. A comparative study of native language speakers from Denmark, the United Kingdom, and Germany found that different communities hold different expectations regarding robotic emotional expressions [82].
Disciplines: Most of the studies have their origins in the field of robotics [5, 6, 43, 115, 138, 145, 179, 196, 202, 223, 283, 284, 301], human–computer interaction [38, 93, 111, 128, 129, 230], or HRI [27, 49, 50, 91, 92, 113, 114, 136, 146, 157, 159, 165, 188, 195, 213, 218, 221, 243, 255, 258, 279]. Emotional reactions to HRI also have attracted considerations of social interactions, as detailed in research into behavioral robotics [28], cognitive science [102], ergonomics [119], social robotics [142, 147, 192], and psychology [253].
Theoretical approach. Social identity theory [179], first introduced by Tajfel and colleagues [246, 247, 249, 250], and the similarity attraction paradigm [5, 6] provide frameworks for examining whether humans perceive robots as part of their social ingroup or social outgroup. The unified theory of acceptance and use of technology and the technology acceptance model (TAM) [28, 222, 231], both rooted in information systems research on technology acceptance by humans [57, 58], also have been applied to HRI. In the TAM, perceived usefulness and ease of use determine behavioral intentions to use a system, which in turn predict actual use [57].
In addition, cognitive appraisal theory [83] and the hierarchical model of cognitive appraisal [198] provide a framework for developing artificial agents that are capable of exhibiting emotional responses [150]. Finally, the uncanny valley paradigm [146, 234, 264], first introduced by Mori [184, 185, 274], helps predict humans’ emotional responses to robots, according to their human-likeness (i.e., the extent to which they resemble humans [172]).
Examined relationships regarding emotions. Different variables represent robotic emotional actions (emotion-related input) and human reactions to robotic emotions (human reactions during HRI). Several variables also have been studied as outcomes of robotic emotions on the one hand and antecedents of human reactions to robotic emotions on the other; these are referred to as emotion-related mediators. The investigated variables can be organized into an input–process–output model, depicted in Fig. 5.
With regard to emotion-related input, studies show that robots’ characteristics, such as indications of their personality [5, 6, 202], empathy [43], or human-likeness [33, 234, 279], affect emotions during HRI. For example, a robot’s similarity to the human and its human-likeness affect its acceptance among humans.
Robotic emotion displays [7, 28, 89, 93, 115, 146, 159, 179, 221, 253, 263, 264] and emotional capabilities [286], such as using non-verbal cues [102, 147] or referring to humans by name [128, 218], also increase robot acceptance. Human characteristics, such as their emotional intelligence [50, 153] and experience with robots [119], similarly can affect emotions during HRI. Finally, social cues, including the length [301] or mode of emotional expression [27, 38, 49, 91, 92, 136, 213, 255], help determine emotions during HRI.
The emotion-related mediators help explain how an emotion-related input relates to the human response to an HRI [133]; see Fig. 5. Only one study examines indirect effects pertaining to emotions during HRI [230]. It shows that a robot’s emotional valence indirectly affects user perceptions through their emotional appraisals of the HRI.
Although not explicitly identified as investigations along these lines, several studies shed relevant light on potential emotion-related variables that likely mediate the input–human reaction relationship. In examining both antecedents and human responses to a set of constructs, they identify what is referred to as emotion-related mediators in Fig. 5. These potential mediators include a robot’s perceived social nature [27, 301], emotional responsiveness [27, 49, 113,114,115, 165, 221, 243], and pleasantness [38, 115, 138, 223, 230, 283, 284].
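As an illustration of how such indirect effects could be estimated, the following sketch regresses a hypothetical mediator (perceived pleasantness) on an emotion-related input, then regresses the human response on both, and multiplies the two paths to obtain the indirect effect. The data are simulated and all variable names are assumptions rather than constructs from a specific reviewed study.

```python
# A minimal sketch of a mediation analysis with two OLS regressions; data and
# variable names are illustrative assumptions only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
emotion_display = rng.integers(0, 2, n)                     # 0 = neutral robot, 1 = emotional robot
pleasantness = 0.6 * emotion_display + rng.normal(0, 1, n)  # hypothesized mediator
acceptance = 0.5 * pleasantness + 0.1 * emotion_display + rng.normal(0, 1, n)

# Path a: input -> mediator
a_model = sm.OLS(pleasantness, sm.add_constant(emotion_display)).fit()
# Path b (and direct effect c'): mediator + input -> response
X = sm.add_constant(np.column_stack([pleasantness, emotion_display]))
b_model = sm.OLS(acceptance, X).fit()

indirect_effect = a_model.params[1] * b_model.params[1]
print(f"indirect effect (a*b) = {indirect_effect:.2f}, direct effect (c') = {b_model.params[2]:.2f}")
```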
Finally, among human responses to HRI, the studies examine affective, cognitive, and behavioral reactions [191]. Affective responses include affect [91, 111, 145, 192, 196, 213], likability [102, 243], empathy [136], interest in the robot [129], uncanniness [146], emotional adaptation to robotic emotions [157, 195, 283, 284], and trust in the robot [43, 91].
Cognitive responses include the attention to the robot [115], social agency judgements [92], overall perception of the robot [255], perceived human-likeness [146], emotional interpretation [159], emotional valence [230], and perceived ingroup connections [179]. Some researchers [109, 238] leverage the TAM. From the TAM, only perceived usefulness has been included thus far as a dependent variable to examine emotions during HRI [223, 243]. The behavioral responses include variables such as intensity of the interaction [5, 6, 93, 114, 202], positive reactions [258], avoidance [142], altruistic behavior toward the robot [253], and human performance [145, 165, 188, 221].
Empirical design/sample. Most studies use laboratory experiments [5, 6, 14, 27, 28, 38, 44, 49, 91,92,93, 102, 104, 111, 113, 114, 129, 130, 135, 137, 142, 145,146,147, 157, 159, 165, 188, 190, 192, 195, 202, 213, 221, 223, 243, 253, 255, 258, 279, 283, 284, 299, 301] with students [5, 27, 28, 32, 44, 49, 91, 92, 104, 111, 113, 114, 157, 188, 193, 221, 223, 253, 255, 283, 284, 301]. The participants are usually adults, without further specification [14, 102, 129, 130, 137, 142, 146, 165, 192, 196, 218, 230, 301], or else are children [38, 128, 135, 145, 147, 159, 165, 190, 213, 258, 279]. Web-Appendix 5 (tiny.cc/IJSR20WebApp5) provides further details. The controlled simulation of HRI in laboratories may reflect the continued legitimacy of a positivist paradigm in mainstream robotics research, according to which only “positive” findings, that is, actual, sensory, perceivable, and verifiable observations, are granted the status of knowledge. Yet such research tends to be limited in its generalizability.
Few studies include online experiments [50, 119, 179, 230] or data from online participants who represent various backgrounds. Three experimental studies were conducted in a real-life setting, gathering data from visitors [129, 142] or clients in elder care [130]. Most studies involved small samples of fewer than 50 respondents; only about 20% feature 50 respondents or more.
All reviewed studies rely on participants’ self-ratings, which are useful for assessing their characteristics. However, gauging emotions or behaviors with self-ratings may create a threat of common method variance [207], which is “attributable to the measurement method rather than to the constructs the measures represent” ([206], p. 879). It creates false internal consistency, that is, an apparent correlation among variables that actually is generated by their common source.
In most cases, the studies focus on a single interaction with a robot, reflecting an implicit assumption that humans’ emotional reactions remain identical and do not change over time or through additional interactions with a robot [276]. In a few longitudinal studies, the same user interacts with a robot several times. For example, a six-month field experiment [93] shows that HRI lasts longer with emotional robots (which express happiness or sadness) than with neutral robots. A field study in a shopping mall over a period of 26 days [129] reveals that participants who evaluate the robot positively also express more interest in the interaction. A nine-week study determines that the degree of empathy humans offer in response to robotic emotional expressions does not differ from their degree of empathy after verbal expressions [142].
Such longitudinal studies are more laborious and time-consuming [88]. Furthermore, only recently has the technology been robust enough to allow some degree of autonomy when users interact with robots for extended periods. However, “longitudinal studies are extremely useful to investigate changes in user behaviour and experiences over time” ([158], p. 291).
Summary of findings. Emotions are particularly important during HRI with social robots. During HRI, humans express emotional, cognitive, and behavioral responses. In particular, robotic emotion-related characteristics, emotional capabilities, and displays of emotions matter for HRI. A robot that expresses positive emotions is more accepted as technology than one that does not.
3.4 Research Stream 4: Contingency Factors Affecting Emotions During HRI
The fourth research stream includes 14 studies that relate to the contingencies in which the interaction takes place (see Fig. 3). These studies are a subset of the reviewed studies in research stream 3 (see Sect. 3.3). Accordingly, details about these studies can also be found in Web-Appendix 5 (tiny.cc/IJSR20WebApp5).
Reviewing this research attempts to answer the fourth research question: How do contingencies affect the relationship between robotic emotions and human responses during HRI? A contingency or moderator variable either strengthens or weakens the relationship between two or more variables [10, 65]. By considering contingency factors, this research stream goes beyond the classical S–O–R logic by recognizing that the basic effects may not be equally strong in every situation; rather, the presence and strength of the basic effects may depend systematically on contingency factors.
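A moderation effect of this kind is typically tested with an interaction term. The following sketch simulates such an analysis with hypothetical variables (an emotion-display manipulation, a binary moderator, and an acceptance rating); all names, data, and coefficients are illustrative assumptions only.

```python
# A minimal sketch of a contingency (moderation) analysis: the effect of a robotic emotion
# display on acceptance is allowed to differ across levels of a moderator. Simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "emotion_display": rng.integers(0, 2, n),   # 0 = neutral, 1 = emotional robot
    "moderator": rng.integers(0, 2, n),         # e.g., a binary participant characteristic
})
df["acceptance"] = (0.4 * df.emotion_display
                    + 0.5 * df.emotion_display * df.moderator   # stronger effect for group 1
                    + rng.normal(0, 1, n))

# The interaction term captures whether the emotion-acceptance link depends on the moderator.
model = smf.ols("acceptance ~ emotion_display * moderator", data=df).fit()
print(model.params[["emotion_display", "emotion_display:moderator"]])
```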
Characteristics of the interacting parties. Five studies test whether the human’s gender affects emotions during HRI [7, 38, 50, 136, 188]. Several of these studies find gender differences (e.g., [136]); for example, men prefer to interact with a pleasant (vs. neutral) robot, whereas women indicate no such preference [38]. Robot characteristics, including emotional intelligence [50] and human-likeness [188], also have been examined as moderators.
Characteristics of the interaction. Gaze cues during a game with a robot increase participants’ perceptions of the social desirability of a geminoid robot, which is designed to look like a specific person, but not those of a less human-like robot, such as Robovie [188]. Control over the robot during the HRI also increases participants’ affect, expressed in response to an android’s facial expression [192].
Duration of the interaction. Six studies note long-term effects of emotions during HRI with a social robot [93, 128, 129, 142, 147, 218]. Relying on longitudinal data from adult participants, these studies consistently reveal that social cues and emotions, expressed by a social robot, trigger HRI over time [93, 128, 129, 142, 218]. However, one study indicates that children between 3 and 4 years of age lose interest in the robot over time [147].
Environmental factors. External factors, such as cultural differences [104, 179] or task characteristics [234], have been examined too (see Web-Appendix 5). The existing studies clearly reveal cultural differences regarding human responses to emotions during HRI. Furthermore, task complexity has been shown to matter for trust in robots during HRI [234].
Summary of findings. Environmental characteristics (e.g., culture of human participants) and characteristics of the involved parties (humans or robots) matter. Furthermore, control over the robot increases robot acceptance during HRI. The studies further indicate that the duration of the interaction matters for humans’ emotional responses to HRI. As most extant research is cross-sectional in nature, its conclusions should be treated with caution.
4 Discussion
4.1 What Do We Know?
The answers to the first four research questions of this survey provide insights into current empirical knowledge on emotions during HRI (see Sect. 1). This review reveals that the field is well researched, with many methodologically sound empirical studies. The domain integrates findings pertaining to robotics, HRI, and psychology, though the different disciplines reveal some variations in their research focus. For example, robotics research mainly seeks technical specifications to improve HRI, but researchers from the HRI, social robotics, or psychology traditions are primarily interested in human responses to interactions.
In terms of theoretical backgrounds, research in the latter domains appears more strongly theory driven, whereas robotics research is more technology focused. Accordingly, the specific theories used in prior research can be assigned to three categories: (1) Classical concepts of human emotions, such as FACS, the circumplex model of emotions, and Plutchik’s wheel of emotions, (2) approaches to social interaction, such as social identity theory, the similarity attraction paradigm, emotional contagion, or a social agency perspective, and (3) concepts related specifically to HRI, such as the uncanny valley paradigm.
Classical human emotion concepts specifically address expressions of basic human emotions, which can be transferred to robotic emotional expressions. The theories in the other two categories are broader, in the sense that they explain the underlying mechanisms that lead humans to respond in a particular manner during human–human interactions. Previous research suggests a fairly consistent pattern of human responses. Robot acceptance depends strongly on the robot’s exhibited characteristics (e.g., empathy, personality), emotional displays, and emotional capabilities (e.g., competence), as well as the human’s prior experience with robots.
Studies typically analyze the direct links of these emotion-related input variables with robot acceptance, without considering the possible indirect effects (e.g., through mediators). This gap is surprising, because several process variables, such as robotic perceived naturalness, emotional responsiveness, and pleasantness, have been studied as antecedents or outcomes in extant research.
Furthermore, the review provides clear evidence that moderator variables are relevant for studying emotions and robot acceptance. In other words, the strength of the links between emotional input variables and robot acceptance is systematically influenced by other variables. However, research related to such moderating effects is rather fragmented and more research is needed.
4.2 What Remains to be Learned?
Despite considerable progress achieved by empirical research on emotions during HRI, this review also reveals several limitations of previous empirical research. This section therefore relates to the fifth research question of this survey (see Sect. 1) that asks What remains to be learned regarding emotions during HRI? and outlines seven suggestions for continued empirical research on HRI and robotic psychology.
Suggestion 1: Gain a better understanding of the underlying mechanisms for human responses to robotic emotions.
Clarifying underlying mechanisms that drive human responses to robotic emotions would provide an answer to an important “Why” question: Why do humans respond in the way they do to artificial emotions? Is it because they compare their expectations toward robots with the perceived robotic emotions, experienced in the HRI, as suggested in the expectation-disconfirmation paradigm [34, 197, 272]? Is it because humans compare their emotions with those expressed by robots [81]? Do they assign robots to their own or another social group as indicated by social identity theory [248]? Or do humans become infected with robotic emotions, similar to the emotional contagion that takes place during human–human interactions [108, 112]?
Previous research offers a rich range of possible theoretical approaches for making predictions about suitable robotic emotions (e.g., FACS model [71], circumplex model of emotions [216]) or human responses to robotic emotions (e.g., emotional contagion theory [108], social identity theory [248], TAM [57]). However, many studies still fail to draw explicitly on theoretical approaches to establish or justify their hypotheses. Hypothesis development should be grounded firmly in theories that have been well established with respect to human–human interaction (e.g., [69, 168]), or new theories on HRI should be developed. Table 3 provides an overview of potentially fruitful psychological theories that could be applied (and extended), as well as some sample research questions, to gain a deeper understanding of the theoretical mechanisms at play during HRI (for an overview of robotic psychology, see [234]).
Suggestion 2: Investigate contingency effects to a greater extent.
The logic for examining contingency factors proposes that there is not one best HRI design [260]. Rather, the human-related outcomes of HRI depend on the culture (for an overview see [79]), the setting (for an overview see [177]), the scenario (for an overview see [285]), and the human participants. Some empirical studies that focused on emotions during HRI mention moderator variables (see Sect. 3.4), but research in this area is still scarce. Conceptual articles distinguish several categories of potentially relevant contingency factors [234], such as the interaction setting and its duration, but no integrative, empirical analysis of situational variables has been published. Researchers should pursue such a contribution.
Suggestion 3: Define uniform standards for the experimental investigation of emotions in HRI experiments.
Most of the studies in this review are based on experimental investigations. They are relatively heterogeneous in their experimental design, as is particularly evident in the repetition frequencies, study period, sample, and form of interaction (e.g., direct face-to-face versus indirect online or via virtual reality), as Web-Appendices 3 and 5 reveal. This heterogeneity is challenging in two respects. First, it makes it difficult to compare findings across studies. Second, the quality of the findings is difficult to assess, particularly due to the lack of design science research dedicated to investigating human reactions to HRI. The few available contributions [116, 285] deserve more attention; more work also is needed in this field.
Suggestion 4: Compare different HRI scenarios with regard to their effectiveness.
In terms of possible scenarios, the studies can be differentiated according to whether HRI takes place directly or indirectly (see Fig. 6):
- Media-supported HRI is mostly used in online studies or face-to-face experiments in which images or videos of robots are used.

- Direct HRI is mostly used in face-to-face experiments in which human participants interact with real robots or parts of robots (e.g., head, upper body).
Although this survey found no differences in emotion recognition rates across the different HRI scenarios, the varying degrees of immersion likely cause humans to react differently to images or videos of robots than to a face-to-face HRI in a real–world situation. The lack of differences between the scenarios in the Mann–Whitney test also should be interpreted cautiously, due to the strong heterogeneity across the experimental studies considered. Despite a few studies of these questions [285], no clear findings are available.
Suggestion 5: Use real–world environments to test the effects of emotions during HRI.
Some recent studies of HRI take place in real–world settings, such as homes [85, 244], workplaces [188], elderly care facilities [218], schools [128, 147], shopping malls [129, 130, 191], or a university campus [142]. But most studies continue to rely on laboratory settings (see Fig. 6). This setting has the advantage of limiting extraneous influences, due to the controlled nature of the experiment. However, the external validity of the results is limited, and they are difficult to generalize to real–world settings. That is, the results may be valid in an experimental setting but not in realistic settings [100]. Levitt and List [163, 164] explicitly note concerns about extrapolating interpretations of data from lab experiments to the world beyond. The lack of studies that move beyond the laboratory also is surprising, because a real–life, face-to-face HRI scenario is the most informative [285]. As robots take on more roles in society and business, continued research should examine emotions during HRI using real–world private environments and business settings, including both customer–robot [236, 238] and employee–robot interactions [234].
Suggestion 6: Analyze longitudinal effects of emotions during HRI.
Most studies rely on cross-sectional data, so their findings stem from a single interaction, which could reflect humans’ sense of surprise when they meet a robot for the first time. The few existing longitudinal studies clearly indicate that the duration and repetition of HRI matter for human emotional responses. Additional research should examine longitudinal effects of emotions, accounting for not only first-impression effects but also the effects of emotions and potential changes of HRI over time [158]. Understanding these long-term effects of emotions during HRI with social robots is important, because most real–world applications aim for long-term uses of robots. Researchers thus might investigate whether and how humans’ communication with the robot or with other humans changes over time.
Another interesting question relates to potential responsibility shifts over time, as famously exemplified by the increased automation bias resulting from the use of navigation systems in cars [94]. Extant research also indicates that an automation bias can arise during HRI [257]. Continued research could examine whether a similar responsibility shift occurs during long-term HRI and how this affects human emotions.
Suggestion 7: Examine feedback loops during HRI to a greater extent.
This review indicates that extant studies tend to analyze relationships between emotion-related input variables and robot acceptance by humans according to simplistic, “one–stage” models (see Fig. 5). Analyzing such simplistic models provides only a limited understanding of the driving forces of HRI, because they cannot distinguish direct versus indirect effects on robot acceptance. This limitation is critical, because some categories of success factors (e.g., robotic social cues) likely affect robot acceptance only indirectly, rather than directly. A systematic analysis of such structures is possible only if researchers use complex integrative models that support the simultaneous analysis of both direct and indirect effects in a single model. Such integrative studies also would be consistent with the logic of the S–O–R model [267, 289].
Furthermore, the logic of the S–O–R model should be extended with potential feedback loops to consider the dynamic robotic expression of emotion. For example, PSI theory [67] addresses the interplay among motivational stimuli, cognitive processes, and outcomes. An interactive feedback model also would account for the robot’s sensitivity to what the human is doing (such that robots need a sophisticated system to recognize human emotions). A cybernetic framework that can account for the dynamic and adaptive nature of emotions during HRI appears necessary (see Fig. 7, inspired by [291]).
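The following sketch illustrates, under strongly simplifying assumptions, what such a feedback loop could look like: the robot selects its next emotional expression (stimulus) based on the human affective state it senses, and the updated human state (organism/response) is fed back into the next cycle. All states, thresholds, and update rules are hypothetical and serve only to show the loop structure.

```python
# A minimal sketch of an S-O-R chain extended with a feedback loop; every numeric value
# and mapping below is an illustrative assumption, not an empirically grounded model.
def robot_select_emotion(sensed_human_valence: float) -> str:
    """Feedback step: choose the next robotic expression based on the sensed human state."""
    if sensed_human_valence < -0.3:
        return "empathic_concern"      # human seems negative -> consoling expression
    if sensed_human_valence > 0.3:
        return "happiness"             # human seems positive -> reinforce
    return "neutral"

def human_response(robot_emotion: str, prior_valence: float) -> float:
    """Simplified organism/response step: the human's affect drifts toward the robot's display."""
    target = {"happiness": 0.8, "empathic_concern": 0.2, "neutral": 0.0}[robot_emotion]
    return prior_valence + 0.5 * (target - prior_valence)

valence = -0.6  # the human starts the interaction in a negative state
for turn in range(5):
    emotion = robot_select_emotion(valence)        # stimulus
    valence = human_response(emotion, valence)     # organism -> response, fed back to the robot
    print(f"turn {turn}: robot shows {emotion}, human valence -> {valence:.2f}")
```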
5 Conclusion
Social robots are an increasingly pervasive reality in daily lives, and they have prompted more than 1600 studies in the past two decades. However, the interdisciplinary, fragmented state of research on emotions during HRI with social robots makes it difficult for researchers to develop new insights and ideas from extant studies. This review systematically condenses extant knowledge. In terms of human recognition of robotic emotions, studies that examine the six basic emotions suggested in the FACS model are identified. Although such studies include different robots and emotional expression modes (facial, bodily, both), humans can recognize on average about 50% of a robot’s emotions correctly; for some high-arousal emotions, such as happiness and anger, the average recognition rates are even higher. In terms of human responses to robotic emotions during HRI, extant research has made considerable progress. Emotions inform the interaction intensity and positive human responses to a robot. The findings from this review yield conceptual and methodological suggestions for further research, which hold the promise of generating meaningful impact and encouraging further empirical research in this field.
References
Abd Latif MH, Yusof HM, Sidek S, Rusli N (2015) Thermal imaging based affective state recognition. In: 2015 IEEE international symposium on robotics and intelligent sensors (IRIS). IEEE, pp 214–219
Acosta M, Kang D, Choi HJ (2008) Robot with emotion for triggering mixed-initiative interaction planning. In: 2008 IEEE 8th international conference on computer and information technology workshops. IEEE, pp 98–103
Ahmed TU, Hossain S, Hossain MS, Ul Islam R, Andersson K (2019) Facial expression recognition using convolutional neural network with data augmentation. In: 2019 Joint 8th international conference on informatics, electronics & vision (ICIEV) and 2019 3rd international conference on imaging, vision & pattern recognition (icIVPR). IEEE, pp. 336–341
Alonso-Martin F, Malfaz M, Sequeira J, Gorostiza JF, Salichs MA (2013) A multimodal emotion detection system during human–robot interaction. Sensors 13(11):15549–15581
Aly A, Tapus A (2013) A model for synthesizing a combined verbal and nonverbal behavior based on personality traits in human–robot interaction. In: 2013 8th ACM/IEEE international conference on human–robot interaction (HRI). IEEE, pp 325–332
Aly A, Tapus A (2016) Towards an intelligent system for generating an adapted verbal and nonverbal combined behavior in human–robot interaction. Auton Robot 40(2):193–209
Andreasson R, Alenljung B, Billing E, Lowe R (2018) Affective touch in human–robot interaction: conveying emotion to the NAO robot. Int J Soc Robot 10(4):473–491
Anjum M (2019) Emotion recognition from speech for an interactive robot agent. In: 2019 IEEE/SICE international symposium on system integration (SII). IEEE, pp 363–368
Anzai Y (1993) Human–robot–computer interaction: a new paradigm of research in robotics. Adv Robot 8(4):357–369
Arnold HJ (1982) Moderator variables: a clarification of conceptual, analytic, and psychometric issues. Organ Behav Hum Perform 29(2):143–174
Azuar D, Gallud G, Escalona F, Gomez-Donoso F, Cazorla M (2019) A story-telling social robot with emotion recognition capabilities for the intellectually challenged. In: Iberian robotics conference. Springer, pp 599–609
Bagozzi RP (1986) Principles of marketing management. Science Research Associates, Chicago
Baraka K, Alves-Oliveira P, Ribeiro T (2019) An extended framework for characterizing social robots. arXiv preprint arXiv:1907.09873
Bartneck C (2003) Interacting with an embodied emotional character. In: Proceedings of the 2003 international conference on designing pleasurable products and interfaces, pp 55–60
Baumeister RF, Leary MR (1997) Writing narrative literature reviews. Rev Gen Psychol 1(3):311–320
Beck A, Cañamero L, Bard KA (2010) Towards an affect space for robots to display emotional body language. In: 19th international symposium in robot and human interactive communication. IEEE, pp 464–469
Beck A, Cañamero L, Hiolle A, Damiano L, Cosi P, Tesser F, Sommavilla G (2013) Interpretation of emotional body language displayed by a humanoid robot: a case study with children. Int J Soc Robot 5(3):325–334
Beck A, Hiolle A, Mazel A, Cañamero L (2010) Interpretation of emotional body language displayed by robots. In: Proceedings of the 3rd international workshop on affective interaction in natural environments, pp 37–42
Becker-Asano C, Ishiguro H (2011a) Evaluating facial displays of emotion for the android robot geminoid f. In: 2011 IEEE Workshop on Affective Computational Intelligence (WACI), pp. 1–8. IEEE
Becker-Asano C, Ishiguro H (2011b) Intercultural differences in decoding facial expressions of the android robot geminoid f. J Artif Intell Soft Comput Res 1(3):215–231
Benamara NK, Val-Calvo M, Álvarez-Sánchez JR, Díaz-Morcillo A, Vicente JMF, Fernández-Jover E, Stambouli TB (2019) Real-time emotional recognition for sociable robotics based on deep neural networks ensemble. In: International work-conference on the interplay between natural and artificial computation. Springer, pp 171–180
Bennett CC, Šabanović S (2014) Deriving minimal features for human-like facial expressions in robotic faces. Int J Soc Robot 6(3):367–381
Bera A, Randhavane T, Manocha D (2019) Modelling multi-channel emotions using facial expression and trajectory cues for improving socially-aware robot navigation. In: Proceedings of the IEEE conference on computer vision and pattern recognition workshops
Bera A, Randhavane T, Prinja R, Kapsaskis K, Wang A, Gray K, Manocha D (2019) The emotionally intelligent robot: improving social navigation in crowded environments, pp 257–266. arXiv preprint arXiv:1903.03217
Bieling G, Stock RM, Dorozalla F (2015) Coping with demographic change in job markets: how age diversity management contributes to organisational performance. German J Hum Resour Manag 29(1):5–30
Bien ZZ, Kim JB, Kim DJ, Han JS, Do JH (2002) Soft computing based emotion/intention reading for service robot. In: AFSS international conference on fuzzy systems. Springer, pp 121–128
Birnbaum GE, Mizrahi M, Hoffman G, Reis HT, Finkel EJ, Sass O (2016) Machines as a source of consolation: robot responsiveness increases human approach behavior and desire for companionship. In: 2016 11th ACM/IEEE international conference on human–robot interaction (HRI). IEEE, pp 165–172
Bishop L, van Maris A, Dogramadzi S, Zook N (2019) Social robots: the influence of human and robot characteristics on acceptance. Paladyn J Behav Robot 10(1):346–358
Breazeal C (2003) Emotion and sociable humanoid robots. Int J Hum Comput Stud 59(1–2):119–155
Breazeal C, Aryananda L (2002) Recognition of affective communicative intent in robot-directed speech. Auton Robot 12(1):83–104
Breazeal C, Brooks R (2005) Robot emotion: a functional perspective. In: Fellous J-M, Arbib MA (eds) Who needs emotions? The brain meets the robot. Oxford University Press, Oxford, pp 271–310
Broadbent E, Kuo IH, Lee YI, Rabindran J, Kerse N, Stafford R, MacDonald BA (2010) Attitudes and reactions to a healthcare robot. Telemed e-Health 16(5):608–613
Broadbent E, Lee YI, Stafford RQ, Kuo IH, MacDonald BA (2011) Mental schemas of robots as more human-like are associated with higher blood pressure and negative emotions in a human-robot interaction. Int J Soc Robot 3(3):291–298
Brown SA, Venkatesh V, Goyal S (2014) Expectation confirmation in information systems research. MIS Q 38(3):729–756
Bryant D (2019) Towards emotional intelligence in social robots designed for children. In: Proceedings of the 2019 AAAI/ACM conference on AI, ethics, and society, pp 547–548
Bueno L, Brunetti F, Frizera A, Pons JL, Moreno J (2008) Human-robot cognitive interaction. In: Pons JL (ed) Wearable Robots Biomechatronic Exoskeletons, vol 1. Wiley, New York, pp 87–126
Butler EA, Egloff B, Wilhelm FH, Smith NC, Erickson EA, Gross JJ (2003) The social consequences of expressive suppression. Emotion 3(1):48–67
Cameron D, Millings A, Fernando S, Collins EC, Moore R, Sharkey A, Evers V, Prescott T (2018) The effects of robot facial emotional expressions and gender on child–robot interaction in a field study. Connect Sci 30(4):343–361
Cañamero D (1997) Modeling motivations and emotions as a basis for intelligent behavior. In: Proceedings of the first international conference on autonomous agents, pp 148–155
Cañamero LD, Fredslund J (2000) How does it feel? Emotional interaction with a humanoid LEGO robot. In: Proceedings of the American association for artificial intelligence fall symposium, FS-00-04, pp 7–16
Castillo JC, Castro-González Á, Alonso-Martín F, Fernández-Caballero A, Salichs MÁ (2018) Emotion detection and regulation from personal assistant robot in smart environment. In: Personal assistants: emerging computational technologies. Springer, pp 179–195
Chao-gang W, Jie-yu Z, Yuan-yuan Z (2008) An emotion generation model for interactive virtual robots. In: 2008 international symposium on computational intelligence and design, vol 2. IEEE, pp 238–241
Charrier L, Galdeano A, Cordier A, Lefort M (2018) Empathy display influence on human–robot interactions: a pilot study
Charrier L, Rieger A, Galdeano A, Cordier A, Lefort M, Hassas S (2019) The rope scale: a measure of how empathic a robot is perceived. In: 2019 14th ACM/IEEE international conference on human–robot interaction (HRI). IEEE, pp 656–657
Chastagnol C, Clavel C, Courgeon M, Devillers L (2014) Designing an emotion detection system for a socially intelligent human–robot interaction. In: Natural interaction with robots, knowbots and smartphones. Springer, pp 199–211
Chen C, Garrod OG, Zhan J, Beskow J, Schyns PG, Jack R E (2018) Reverse engineering psychologically valid facial expressions of emotion into social robots. In: 2018 13th IEEE international conference on automatic face & gesture recognition (FG 2018). IEEE, pp 448–452
Chen H, Gu Y, Wang F, Sheng W (2018) Facial expression recognition and positive emotion incentive system for human–robot interaction. In: 2018 13th world congress on intelligent control and automation (WCICA). IEEE, pp 407–412
Chen L, Su W, Feng Y, Wu M, She J, Hirota K (2020) Two-layer fuzzy multiple random forest for speech emotion recognition in human–robot interaction. Inf Sci 509:150–163
Chen TL, King CH, Thomaz AL, Kemp CC (2011) Touched by a robot: An investigation of subjective responses to robot-initiated touch. In: 2011 6th ACM/IEEE international conference on human–robot interaction (HRI). IEEE, pp 457–464
Chita-Tegmark M, Lohani M, Scheutz M (2019) Gender effects in perceptions of robots and humans with varying emotional intelligence. In: 2019 14th ACM/IEEE international conference on human–robot interaction (HRI). IEEE, pp 230–238
Chung S, Ryoo H (2018) Level values of robot visual interface factors based on users’ experience on screen, light, face. Int J Control Autom 11(5):117
Cid F, Manso LJ, Núnez P (2015) A novel multimodal emotion recognition approach for affective human robot interaction. In: Proceedings of fine, pp 1–9
Claret JA, Venture G, Basañez L (2017) Exploiting the robot kinematic redundancy for emotion conveyance to humans as a lower priority task. Int J Soc Robot 9(2):277–292
Dandıl E, Özdemir R (2019) Real-time facial emotion classification using deep learning. Data Sci Appl 2(1):13–17
Danev L, Hamann M, Fricke N, Hollarek T, Paillacho D (2017) Development of animated facial expressions to express emotions in a robot: roboticon. In: 2017 IEEE second ecuador technical chapters meeting (ETCM). IEEE, pp 1–6
Dautenhahn K (2007) Methodology & themes of human–robot interaction: a growing research field. Int J Adv Rob Syst 4(1):103–108
Davis FD (1985) A technology acceptance model for empirically testing new end-user information systems: theory and results. Ph.D. thesis, Massachusetts Institute of Technology
Davis FD (1989) Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Q 13(3):319–340
Demoulin S, Leyens JP, Paladino MP, Rodriguez-Torres R, Rodriguez-Perez A, Dovidio J (2004) Dimensions of “uniquely” and “non-uniquely” human emotions. Cognit Emot 18(1):71–96
Deng J, Pang G, Zhang Z, Pang Z, Yang H, Yang G (2019) cGAN based facial expression recognition for human–robot interaction. IEEE Access 7:9848–9859
Deshmukh A, Babu SK, Unnikrishnan R, Ramesh S, Anitha P, Bhavani RR (2019) Influencing hand-washing behaviour with a social robot: HRI study with school children in rural India. In: 2019 28th IEEE international conference on robot and human interactive communication (RO-MAN). IEEE, pp 1–6
Devillers L, Tahon M, Sehili MA, Delaborde A (2015) Inference of human beings’ emotional states from speech in human–robot interactions. Int J Soc Robot 7(4):451–463
Di Lorenzo G, Pinelli F, Pereira FC, Biderman A, Ratti C, Lee C, Lee C (2009) An affective intelligent driving agent: driver’s trajectory and activities prediction. In: 2009 IEEE 70th vehicular technology conference fall. IEEE, pp 1–4
Dodd W, Gutierrez R (2005) The role of episodic memory and emotion in a cognitive robot. In: ROMAN 2005. IEEE international workshop on robot and human interactive communication, 2005. IEEE, pp 692–697
Donaldson L (2001) The contingency theory of organizations. Sage, London
Donovan R, Rossiter J (1982) Store atmosphere: an environmental psychology approach. J Retail 58(1):34–57
Dörner D, Güss CD (2013) Psi: a computational architecture of cognition, motivation, and emotion. Rev Gen Psychol 17(3):297–317
Dubal S, Foucher A, Jouvent R, Nadel J (2011) Human brain spots emotion in non humanoid robots. Soc Cognit Affect Neurosci 6(1):90–97
Duncan S, Fiske DW (2015) Face-to-face interaction: research, methods, and theory. Routledge
Dziergwa M, Kaczmarek M, Kaczmarek P, Kędzierski J, Wadas-Szydłowska K (2018) Long-term cohabitation with a social robot: a case study of the influence of human attachment patterns. Int J Soc Robot 10(1):163–176
Ekman P (2004) Emotions revealed. BMJ 328(Suppl S5):0405184
Ekman P (2005) Basic emotions. In: Handbook of cognition and emotion
Ekman P, Friesen W (1978) Facial action coding system: a technique for the measurement of facial movement. Consulting Psychologists Press, Palo Alto
Ekman P, Sorenson ER, Friesen WV (1969) Pan-cultural elements in facial displays of emotion. Science 164(3875):86–88
Embgen S, Luber M, Becker-Asano C, Ragni M, Evers V, Arras KO (2012) Robot-specific social cues in emotional body language. In: 2012 IEEE RO-MAN: The 21st IEEE international symposium on robot and human interactive communication. IEEE, pp 1019–1025
Erden MS (2013) Emotional postures for the humanoid-robot NAO. Int J Soc Robot 5(4):441–456
Eroglu SA, Machleit KA, Davis LM (2001) Atmospheric qualities of online retailing: a conceptual model and implications. J Bus Res 54(2):177–184
Erol BA, Majumdar A, Benavidez P, Rad P, Choo KKR, Jamshidi M (2019) Toward artificial emotional intelligence for cooperative social human–machine interaction. IEEE Trans Comput Soc Syst 7(1):234–246
Evers V, Maldonado H, Brodecki T, Hinds P (2008) Relational vs. group self-construal: untangling the role of national culture in HRI. In: 2008 3rd ACM/IEEE international conference on human-robot interaction (HRI). IEEE, pp 255–262
Eysenck MW, Keane MT (2015) Cognitive psychology: a student’s handbook. Psychology Press, Philadelphia
Festinger L (1954) A theory of social comparison processes. Hum Relat 7(2):117–140
Fischer K, Jung M, Jensen LC, aus der Wieschen MV (2019) Emotion expression in HRI–when and why. In: 2019 14th ACM/IEEE international conference on human–robot interaction (HRI). IEEE, pp 29–38
Folkman S, Lazarus RS, Dunkel-Schetter C, DeLongis A, Gruen RJ (1986) Dynamics of a stressful encounter: cognitive appraisal, coping, and encounter outcomes. J Pers Soc Psychol 50(5):992–1003
Fong T, Thorpe C, Baur C (2003) Collaboration, dialogue, human–robot interaction. In: 10th international symposium on robotics research. Springer Tracts in Advanced Robotics, vol 6. Springer, pp 255–266
Forlizzi J (2007) How robotic products become social products: an ethnographic study of cleaning in the home. In: 2007 2nd ACM/IEEE international conference on human-robot interaction (HRI). IEEE, pp 129–136
Gácsi M, Kis A, Faragó T, Janiak M, Muszyński R, Miklósi Á (2016) Humans attribute emotions to a robot that shows simple behavioural patterns borrowed from dog behaviour. Comput Hum Behav 59:411–419
Galindo C, Fernández-Madrigal JA, González J (2008) Multihierarchical interactive task planning: application to mobile robotics. IEEE Trans Syst Man Cybern Part B (Cybern) 38(3):785–798
Ganster T, Eimler SC, von der Pütten A, Hoffmann L, Krämer NC (2010) Methodological considerations for long-term experience with robots and agents
Garrell A, Villamizar M, Moreno-Noguer F, Sanfeliu A (2017) Teaching robot’s proactive behavior using human assistance. Int J Soc Robot 9(2):231–249
Ghani DA, Ishak SBA (2012) Relationship between the art of wayang kulit and disney’s twelve principles of animation. Rev Res Soc Interv 37:162–179
Ghazali AS, Ham J, Barakova E, Markopoulos P (2019b) Assessing the effect of persuasive robots interactive social cues on users’ psychological reactance, liking, trusting beliefs and compliance. Adv Robot 33(7–8):325–337
Ghazali AS, Ham J, Markopoulos P, Barakova EI (2019a) Investigating the effect of social cues on social agency judgement. In: HRI, pp 586–587
Gockley R, Simmons R, Forlizzi J (2006) Modeling affect in socially interactive robots. In: ROMAN 2006-The 15th IEEE international symposium on robot and human interactive communication. IEEE, pp 558–563
Goddard K, Roudsari A, Wyatt JC (2012) Automation bias: a systematic review of frequency, effect mediators, and mitigators. J Am Med Inform Assoc 19(1):121–127
Gonsior B, Sosnowski S, Buß M, Wollherr D, Kühnlenz K (2012) An emotional adaption approach to increase helpfulness towards a robot. In: 2012 IEEE/RSJ international conference on intelligent robots and systems. IEEE, pp 2429–2436
Goulart C, Valadão C, Delisle-Rodriguez D, Funayama D, Favarato A, Baldo G, Binotte V, Caldeira E, Bastos-Filho T (2019) Visual and thermal image processing for facial specific landmark detection to infer emotions in a child–robot interaction. Sensors 19(13):2844
de Graaf MM, Allouch SB, van Dijk JA (2016) Long-term acceptance of social robots in domestic environments: insights from a user’s perspective. In: 2016 AAAI spring symposium series
de Graaf MM, Allouch SB, Van Dijk J (2015) What makes robots social? A user’s perspective on characteristics for social human–robot interaction. In: International conference on social robotics. Springer, pp 184–193
Greco A, Roberto A, Saggese A, Vento M, Vigilante V (2019) Emotion analysis from faces for social robotics. In: 2019 IEEE international conference on systems, man and cybernetics (SMC). IEEE, pp 358–364
Guala F (2002) On the scope of experiments in economics: comments on siakantaris. Camb J Econ 26(2):261–267
Gunes H, Celiktutan O, Sariyanidi E (2019) Live human-robot interactive public demonstrations with automatic emotion and personality prediction. Philos Trans R Soc B 374(1771):1–8
Han J, Campbell N, Jokinen K, Wilcock G (2012) Investigating the use of non-verbal cues in human-robot interaction with a NAO robot. In: 2012 IEEE 3rd international conference on cognitive infocommunications (CogInfoCom). IEEE, pp 679–683
Hanson D (2006) Exploring the aesthetic range for humanoid robots. In: Proceedings of the ICCS/CogSci-2006 long symposium: toward social mechanisms of android science. Citeseer, pp 39–42
Haring KS, Silvera-Tawil D, Matsumoto Y, Velonaki M, Watanabe K (2014) Perception of an android robot in Japan and Australia: a cross-cultural comparison. In: International conference on social robotics. Springer, pp 166–175
Häring M, Bee N, André E (2011) Creation and evaluation of emotion expression with body movement, sound and eye color for humanoid robots. In: 2011 RO-MAN. IEEE, pp 204–209
Hashimoto T, Hiramatsu S, Tsuji T, Kobayashi H (2006) Development of the face robot SAYA for rich facial expressions. In: 2006 SICE-ICASE international joint conference. IEEE, pp 5423–5428
Haslam N (2006) Dehumanization: an integrative review. Pers Soc Psychol Rev 10(3):252–264
Hatfield E, Cacioppo JT, Rapson RL (1993) Emotional contagion. Curr Dir Psychol Sci 2(3):96–100
Heerink M, Kröse B, Evers V, Wielinga B (2008) The influence of social presence on acceptance of a companion robot by older people. J Phys Agents 2(2):33–40
Hegel F, Eyssel F, Wrede B (2010) The social robot ‘Flobi’: key concepts of industrial design. In: 19th international symposium in robot and human interactive communication. IEEE, pp 107–112
Hegel F, Spexard T, Wrede B, Horstmann G, Vogt T (2006) Playing a different imitation game: interaction with an empathic android robot. In: 2006 6th IEEE-RAS international conference on humanoid robots. IEEE, pp 56–61
Hochschild AR (2012) The managed heart: commercialization of human feeling. University of California Press, Berkeley
Hoffman G, Birnbaum GE, Vanunu K, Sass O, Reis HT (2014) Robot responsiveness to human disclosure affects social impression and appeal. In: Proceedings of the 2014 ACM/IEEE international conference on human–robot interaction, pp 1–8
Hoffman G, Zuckerman O, Hirschberger G, Luria M, Shani-Sherman T (2015) Design and evaluation of a peripheral robotic conversation companion. In: 2015 10th ACM/IEEE international conference on human–robot interaction (HRI). IEEE, pp 3–10
Hollinger GA, Georgiev Y, Manfredi A, Maxwell BA, Pezzementi ZA, Mitchell B (2006) Design of a social mobile robot using emotion-based decision mechanisms. In: 2006 IEEE/RSJ international conference on intelligent robots and systems. IEEE, pp 3093–3098
Homburg N (2018) How to include humanoid robots into experimental research: a multi-step approach. In: Proceedings of the 51st Hawaii international conference on system sciences
Hu Y, Hoffman G (2019) Using skin texture change to design emotion expression in social robots. In: 2019 14th ACM/IEEE international conference on human–robot interaction (HRI). IEEE, pp 2–10
Huang JY, Lee WP, Dong BW (2019) Learning emotion recognition and response generation for a service robot. In: IFToMM international symposium on robotics and mechatronics. Springer, pp 286–297
Huang L, Gillan D (2014) An exploration of robot builders’ emotional responses to their tournament robots. In: Proceedings of the human factors and ergonomics society annual meeting, vol 58. SAGE, Los Angeles, pp 2013–2017
Hyun KH, Kim EH, Kwak YK (2007) Emotional feature extraction based on phoneme information for speech emotion recognition. In: RO-MAN 2007-The 16th IEEE international symposium on robot and human interactive communication. IEEE, pp 802–806
Ilić D, Žužić I, Brščić D (2019) Calibrate my smile: robot learning its facial expressions through interactive play with humans. In: Proceedings of the 7th international conference on human–agent interaction, pp 68–75
Inthiam J, Hayashi E, Jitviriya W, Mowshowitz A (2019) Mood estimation for human-robot interaction based on facial and bodily expression using a hidden Markov model. In: 2019 IEEE/SICE international symposium on system integration (SII). IEEE, pp 352–356
Itoh K, Miwa H, Matsumoto M, Zecca M, Takanobu H, Roccella S, Carrozza MC, Dario P, Takanishi A (2004) Various emotional expressions with emotion expression humanoid robot WE-4RII. In: IEEE conference on robotics and automation, 2004. TExCRA technical exhibition based. IEEE, pp 35–36
Itoh K, Miwa H, Zecca M, Takanobu H, Roccella S, Carrozza MC, Dario P, Takanishi A (2006) Mechanical design of emotion expression humanoid robot WE-4RII. In: Romansy 16. Springer, pp 255–262
Jimenez F, Yoshikawa T, Furuhashi T, Kanoh M (2015) An emotional expression model for educational-support robots. J Artif Intell Soft Comput Res 5(1):51–57
Jung MF (2017) Affective grounding in human–robot interaction. In: 2017 12th ACM/IEEE international conference on human–robot interaction (HRI). IEEE, pp 263–273
Kanda T, Hirano T, Eaton D, Ishiguro H (2004) Interactive robots as social partners and peer tutors for children: a field trial. Hum Comput Interact 19(1–2):61–84
Kanda T, Sato R, Saiwaki N, Ishiguro H (2007) A two-month field trial in an elementary school for long-term human–robot interaction. IEEE Trans Rob 23(5):962–971
Kanda T, Shiomi M, Miyashita Z, Ishiguro H, Hagita N (2009) An affective guide robot in a shopping mall. In: Proceedings of the 4th ACM/IEEE international conference on human robot interaction, pp 173–180
Kanda T, Shiomi M, Miyashita Z, Ishiguro H, Hagita N (2010) A communication robot in a shopping mall. IEEE Trans Rob 26(5):897–913
Kanoh M, Iwata S, Kato S, Itoh H (2005) Emotive facial expressions of sensitivity communication robot “Ifbot”. Kansei Eng Int 5(3):35–42
Kansizoglou I, Bampis L, Gasteratos A (2019) An active learning paradigm for online audio-visual emotion recognition. IEEE Trans Affective Computing
Kenny DA (2008) Reflections on mediation. Organ Res Methods 11(2):353–358
Keshari T, Palaniswamy S (2019) Emotion recognition using feature-level fusion of facial expressions and body gestures. In: 2019 international conference on communication and electronics systems (ICCES). IEEE, pp 1184–1189
Kim EH, Hyun KH, Kim SH, Kwak YK (2009) Improved emotion recognition with a novel speaker-independent feature. IEEE/ASME Trans Mechatron 14(3):317–325
Kim EH, Kwak SS, Kwak YK (2009) Can robotic emotional expressions induce a human to empathize with a robot? In: RO-MAN 2009—the 18th IEEE international symposium on robot and human interactive communication. IEEE, pp 358–362
Kim HR (2010) Hybrid emotion generation architecture with computational models based on psychological theory for human–robot interaction. Ph.D. dissertation, Korea Advanced Institute of Science and Technology, Daejeon, Korea
Kim HR, Kwon DS (2010) Computational model of emotion generation for human–robot interaction based on the cognitive appraisal theory. J Intell Robot Syst 60(2):263–283
Kim HR, Lee K, Kwon DS (2005) Emotional interaction model for a service robot. In: ROMAN 2005. IEEE international workshop on robot and human interactive communication, 2005. IEEE, pp 672–678
Kim JH, Kim BG, Roy PP, Jeong DM (2019) Efficient facial expression recognition algorithm based on hierarchical deep neural network structure. IEEE Access 7:41273–41285
Kim MG, Lee HS, Park JW, Jo SH, Chung MJ (2008) Determining color and blinking to support facial expression of a robot for conveying emotional intensity. In: RO-MAN 2008—The 17th IEEE international symposium on robot and human interactive communication. IEEE, pp 219–224
Kirby R, Forlizzi J, Simmons R (2010) Affective social robots. Robot Auton Syst 58(3):322–332
Kitagawa Y, Ishikura T, Song W, Mae Y, Minami M, Tanaka K (2009) Human-like patient robot with chaotic emotion for injection training. In: 2009 ICCAS-SICE. IEEE, pp 4635–4640
Klug M, Zell A (2013) Emotion-based human–robot–interaction. In: 2013 IEEE 9th international conference on computational cybernetics (ICCC). IEEE, pp 365–368
Kory-Westlund JM, Breazeal C (2019) Exploring the effects of a social robot’s speech entrainment and backstory on young children’s emotion, rapport, relationship, and learning. Front Robot AI 6(54):1–24
Koschate M, Potter R, Bremner P, Levine M (2016) Overcoming the uncanny valley: displays of emotions reduce the uncanniness of humanlike robots. In: 2016 11th ACM/IEEE international conference on human–robot interaction (HRI). IEEE, pp 359–366
Kozima H, Michalowski MP, Nakagawa C (2009) Keepon. Int J Soc Robot 1(1):3–18
Kozima H, Nakagawa C, Yasuda Y (2005) Interactive robots for communication-care: a case-study in autism therapy. In: ROMAN 2005. IEEE international workshop on robot and human interactive communication, 2005. IEEE, pp 341–346
Kurono Y, Sripian P, Chen F, Sugaya M (2019) A preliminary experiment on the estimation of emotion using facial expression and biological signals. In: International conference on human–computer interaction. Springer, pp 133–142
Kwon OH, Koo SY, Kim YG, Kwon DS (2010) Telepresence robot system for English tutoring. In: 2010 IEEE workshop on advanced robotics and its social impacts. IEEE, pp 152–155
Kwon OW, Chan K, Hao J, Lee TW (2003) Emotion recognition by speech signals. In: Eurospeech, Geneva, pp 125–128
Larsen RJ, Diener E (1992) Promises and problems with the circumplex model of emotion. In: Clark MS (ed) Review of personality and social psychology: emotion, vol 13. Sage, Newbury Park, pp 25–59
Law T, Chita-Tegmark M, Scheutz M (2020) The interplay between emotional intelligence, trust, and gender in human–robot interaction. Int J Soc Robot 86:1–3
Le BV, Lee S (2014) Adaptive hierarchical emotion recognition from speech signal for human–robot communication. In: 2014 tenth international conference on intelligent information hiding and multimedia signal processing. IEEE, pp 807–810
Le TL, Dong VT (2011) Toward a Vietnamese facial expression recognition system for human–robot interaction. In: The 2011 international conference on advanced technologies for communications (ATC 2011). IEEE, pp 252–255
Lee HS, Kang BY (2019) Continuous emotion estimation of facial expressions on JAFFE and CK+ datasets for human–robot interaction. Intell Serv Robot, pp 1–13
Lehmann H, Broz F (2018) Contagious yawning in human–robot interaction. In: Companion of the 2018 ACM/IEEE international conference on human–robot interaction, pp 173–174
Leite I, Martinho C, Paiva A (2013) Social robots for long-term interaction: a survey. Int J Soc Robot 5(2):291–308
Leite I, McCoy M, Lohani M, Ullman D, Salomons N, Stokes C, Rivers S, Scassellati B (2015) Emotional storytelling in the classroom: individual versus group interaction between children and robots. In: 2015 10th ACM/IEEE international conference on human–robot interaction (HRI). IEEE, pp 75–82
Leventhal H (1979) A perceptual-motor processing model of emotion. In: Pliner P, Blankstein K, Spigel IM (eds) Perception of emotion in self and others, vol 5. Springer, New York, pp 1–46
Leventhal H (1980) Toward a comprehensive theory of emotion. In: Berkowitz L (ed) Advances in experimental social psychology, vol 13. Academic Press, New York, pp 139–207
Leventhal H, Scherer K (1987) The relationship of emotion to cognition: a functional approach to a semantic controversy. Cogn Emot 1(1):3–28
Levitt SD, List JA (2007) What do laboratory experiments measuring social preferences reveal about the real world? J Econ Perspect 21(2):153–174
Levitt SD, List JA (2009) Field experiments in economics: the past, the present, and the future. Eur Econ Rev 53(1):1–18
Leyzberg D, Avrunin E, Liu J, Scassellati B (2011) Robots that express emotion elicit better human teaching. In: 2011 6th ACM/IEEE international conference on human-robot interaction (HRI). IEEE, pp 347–354
Li Y, Jiang Y, Tian D, Hu L, Lu H, Yuan Z (2019) Ai-enabled emotion communication. IEEE Netw 33(6):15–21
Lisetti CL, Marpaung A (2005) A three-layered architecture for socially intelligent agents: modeling the multilevel process of emotions. In: International conference on affective computing and intelligent interaction. Springer, pp 956–963
Littlejohn SW, Foss KA (2010) Theories of human communication. Waveland Press, Long Grove
Liu Z, Wu M, Cao W, Chen L, Xu J, Zhang R, Zhou M, Mao J (2017) A facial expression emotion recognition based human–robot interaction system. IEEE/CAA J Autom Sinica 4(4):668–676
Löffler D, Schmidt N, Tscharn R (2018) Multimodal expression of artificial emotion in social robots using color, motion and sound. In: Proceedings of the 2018 ACM/IEEE international conference on human–robot interaction, pp 334–343
Lopez-Rincon A (2019) Emotion recognition using facial expressions in children using the NAO robot. In: 2019 international conference on electronics, communications and computers (CONIELECOMP). IEEE, pp 146–153
MacDorman KF, Ishiguro H (2006) The uncanny advantage of using androids in cognitive and social science research. Interact Stud 7(3):297–337
Maeda Y, Geshi S (2018) Human–robot interaction using Markovian emotional model based on facial recognition. In: 2018 Joint 10th international conference on soft computing and intelligent systems (SCIS) and 19th international symposium on advanced intelligent systems (ISIS). IEEE, pp 209–214
Marmpena M, Lim A, Dahl TS (2018) How does the robot feel? perception of valence and arousal in emotional body language. Paladyn J Behav Robot 9(1):168–182
Marmpena M, Lim A, Dahl TS, Hemion N (2019) Generating robotic emotional body language with variational autoencoders. In: 2019 8th international conference on affective computing and intelligent interaction (ACII). IEEE, pp 545–551
Matsui D, Minato T, MacDorman KF, Ishiguro H (2005) Generating natural motion in an android by mapping human motion. In: 2005 IEEE/RSJ international conference on intelligent robots and systems. IEEE, pp 3301–3308
McColl D, Hong A, Hatakeyama N, Nejat G, Benhabib B (2016) A survey of autonomous human affect detection methods for social robots engaged in natural HRI. J Intell Robot Syst 82(1):101–133
Mehrabian A, Russell JA (1974) An approach to environmental psychology. The MIT Press, Cambridge
de Melo CM, Terada K (2019) Cooperation with autonomous machines through culture and emotion. PloS One 14(11):e0224758
Michaud F, Robichaud E, Audet J (2001) Using motives and artificial emotions for prolonged activity of a group of autonomous robots. In: Proceedings of the AAAI fall symposium on emotions, Cape Cod, Massachusetts
Miwa H, Takanishi A, Takanobu H (2001) Experimental study on robot personality for humanoid head robot. In: Proceedings 2001 IEEE/RSJ international conference on intelligent robots and systems. Expanding the societal role of robotics in the the next millennium (Cat. No. 01CH37180), vol 2. IEEE, pp 1183–1188
Mizanoor RS, Spencer DA, Wang X, Wang Y (2014) Dynamic emotion-based human–robot collaborative assembly in manufacturing: the preliminary concepts. In: Workshop on human–robot collaboration for industrial manufacturing at RSS’14
Moosaei M, Das SK, Popa DO, Riek LD (2017) Using facially expressive robots to calibrate clinical pain perception. In: 2017 12th ACM/IEEE international conference on human–robot interaction (HRI). IEEE, pp 32–41
Mori M, MacDorman KF, Kageki N (2012) The uncanny valley [from the field]. IEEE Robot Autom Mag 19(2):98–100
Mori M et al (1970) The uncanny valley. Energy 7(4):33–35
Müller NH, Truschzinski M (2014) An emotional framework for a real-life worker simulation. In: International conference on human–computer interaction. Springer, pp 675–686
Murray JC, Cañamero L, Bard KA, Ross MD, Thorsteinsson K (2009) The influence of social interaction on the perception of emotional expression: a case study with a robot head. In: FIRA RoboWorld Congress. Springer, pp 63–72
Mutlu B, Yamaoka F, Kanda T, Ishiguro H, Hagita N (2009) Nonverbal leakage in robots: communication of intentions through seemingly unintentional behavior. In: Proceedings of the 4th ACM/IEEE international conference on human robot interaction, pp 69–76
Nachar N et al (2008) The Mann–Whitney u: a test for assessing whether two independent samples come from the same distribution. Tutor Quant Methods Psychol 4(1):13–20
Nadel J, Simon M, Canet P, Soussignan R, Blancard P, Canamero L, Gaussier P (2006) Human responses to an expressive robot. In: Proceedings of the sixth international workshop on epigenetic robotics. Lund University
Niemelä M, Arvola A, Aaltonen I (2017) Monitoring the acceptance of a social service robot in a shopping mall: first results. In: Proceedings of the companion of the 2017 ACM/IEEE international conference on human–robot interaction, pp 225–226
Nishio S, Taura K, Sumioka H, Ishiguro H (2013) Teleoperated android robot as emotion regulation media. Int J Soc Robot 5(4):563–573
Nomura T, Kanda T, Suzuki T, Kato K (2004) Psychology in human–robot communication: an attempt through investigation of negative attitudes and anxiety toward robots. In: RO-MAN 2004. 13th IEEE international workshop on robot and human interactive communication (IEEE Catalog No. 04TH8759). IEEE, pp 35–40
Nunes ARV (2019) Deep emotion recognition through upper body movements and facial expression, student report spring semester, Aalborg University
Obaid M, Kuchenbrandt D, Bartneck C (2014) Empathy and yawn contagion: can we (humans) catch yawns from robots? In: Proceedings of the 2014 ACM/IEEE international conference on human-robot interaction, pp 260–261
Ogata T, Sugano S (2000) Emotional communication between humans and the autonomous robot Wamoeba-2 (Waseda amoeba) which has the emotion model. JSME Int J Ser C 43(3):568–574
Oliver RL, Balakrishnan PS, Barry B (1994) Outcome satisfaction in negotiation: a test of expectancy disconfirmation. Organ Behav Hum Decis Process 60(2):252–275
Ortony A, Clore G, Collins A (1988) The cognitive structure of emotions. Cambridge University Press, New York
Pandya H, Patel H (2019) Facial affect detection using transfer learning: a comparative study, PsyArXiv Preprints, pp 1–5
Park CH, Javed H, Jeon M (2019) Consensus-based human–agent interaction model for emotion regulation in ASD. In: International conference on human–computer interaction. Springer, pp 295–301
Park CH, Sim KB (2003) Emotion recognition and acoustic analysis from speech signal. In: Proceedings of the international joint conference on neural networks, 2003, vol 4. IEEE, pp 2594–2598
Park E, Jin D, del Pobil AP (2012) The law of attraction in human–robot interaction. Int J Adv Rob Syst 9(2):35
Park JS, Kim JH, Oh YH (2009) Feature vector classification based speech emotion recognition for service robots. IEEE Trans Consum Electron 55(3):1590–1596
Parkinson B (1996) Emotions are social. Br J Psychol 87(4):663–683
Plutchik RE, Conte HR (1997) Circumplex models of personality and emotions. American Psychological Association, Washington
Podsakoff PM, MacKenzie SB, Lee JY, Podsakoff NP (2003) Common method biases in behavioral research: a critical review of the literature and recommended remedies. J Appl Psychol 88(5):879
Podsakoff PM, Organ DW (1986) Self-reports in organizational research: problems and prospects. J Manag 12(4):531–544
Prasad V, Stock-Homburg R, Peters J (2021) Human–robot handshaking: a review. Int J Soc Robot in press. https://doi.org/10.1007/s12369-021-00763-2
Rahman S, Wang Y (2015) Dynamic affection-based motion control of a humanoid robot to collaborate with human in flexible assembly in manufacturing. In: ASME 2015 dynamic systems and control conference. American Society of Mechanical Engineers Digital Collection
Rani P, Sarkar N (2004) Emotion-sensitive robots-a new paradigm for human–robot interaction. In: 4th IEEE/ras international conference on humanoid robots, 2004, vol 1. IEEE, pp 149–167
Rawal N, Stock-Homburg R (2021) Facial emotion expressions in human–robot interaction: a survey. Int J Soc Robot, in press. arXiv preprint arXiv:2103.07169
Rázuri JG, Sundgren D, Rahmani R, Moran A, Bonet I, Larsson A (2015) Speech emotion recognition in emotional feedback for human–robot interaction. Int J Adv Res Artif Intell (IJARAI) 4(2):20–27
Read R, Belpaeme T (2012) How to use non-linguistic utterances to convey emotion in child–robot interaction. In: 2012 7th ACM/IEEE international conference on human–robot interaction (HRI). IEEE, pp 219–220
Reyes ME, Meza IV, Pineda LA (2019) Robotics facial expression of anger in collaborative human–robot interaction. Int J Adv Rob Syst 16(1):1729881418817972
Ribeiro T, Paiva A (2012) The illusion of robotic life: principles and practices of animation for robots. In: Proceedings of the seventh annual ACM/IEEE international conference on human–robot interaction, pp 383–390
Russell JA (1980) A circumplex model of affect. J Pers Soc Psychol 39(6):1161
Ruxton GD (2006) The unequal variance t-test is an underused alternative to student’s t-test and the Mann–Whitney u test. Behav Ecol 17(4):688–690
Sabelli AM, Kanda T, Hagita N (2011) A conversational robot in an elderly care center: an ethnographic study. In: 2011 6th ACM/IEEE international conference on human–robot interaction (HRI). IEEE, pp 37–44
Saerbeck M, Bartneck C (2010) Perception of affect elicited by robot motion. In: 2010 5th ACM/IEEE international conference on human–robot interaction (HRI). IEEE, pp 53–60
Schaaff K, Schultz T (2009) Towards an EEG-based emotion recognizer for humanoid robots. In: RO-MAN 2009-The 18th IEEE international symposium on robot and human interactive communication. IEEE, pp 792–796
Scheutz M, Schermerhorn P, Kramer J (2006) The utility of affect expression in natural language interactions in joint human–robot tasks. In: Proceedings of the 1st ACM SIGCHI/SIGART conference on human–robot interaction, pp 226–233
Seo SH, Griffin K, Young JE, Bunt A, Prentice S, Loureiro-Rodríguez V (2018) Investigating people’s rapport building and hindering behaviors when working with a collaborative robot. Int J Soc Robot 10(1):147–161
Shao M, Alves SFDR, Ismail O, Zhang X, Nejat G, Benhabib B (2019) You are doing great! Only one rep left: an affect-aware social robot for exercising. In: 2019 IEEE international conference on systems, man and cybernetics (SMC). IEEE, pp 3811–3817
Shayganfar M, Rich C, Sidner CL (2012) A design methodology for expressing emotion on robot faces. In: 2012 IEEE/RSJ international conference on intelligent robots and systems. IEEE, pp 4577–4583
Shi Y, Chen Y, Ardila LR, Venture G, Bourguet ML (2019) A visual sensing platform for robot teachers. In: Proceedings of the 7th international conference on human–agent interaction, pp 200–201
Siegwart R, Arras KO, Bouabdallah S, Burnier D, Froidevaux G, Greppin X, Jensen B, Lorotte A, Mayor L, Meisser M et al (2003) Robox at Expo.02: a large-scale installation of personal robots. Robot Auton Syst 42(3–4):203–222
Snyder H (2019) Literature review as a research methodology: an overview and guidelines. J Bus Res 104:333–339
Song KT, Han MJ, Wang SC (2014) Speech signal-based emotion recognition and its application to entertainment robots. J Chin Inst Eng 37(1):14–25
Song S, Yamada S (2017) Expressing emotions through color, sound, and vibration with an appearance-constrained social robot. In: 2017 12th ACM/IEEE international conference on human-robot interaction. IEEE, pp 2–11
Spekman ML, Konijn EA, Hoorn JF (2018) Perceptions of healthcare robots as a function of emotion-based coping: the importance of coping appraisals and coping strategies. Comput Hum Behav 85:308–318
Stafford RQ, MacDonald BA, Li X, Broadbent E (2014) Older people’s prior robot attitudes influence evaluations of a conversational robot. Int J Soc Robot 6(2):281–297
Stock R, Gross M (2016) How does knowledge workers’ social technology readiness affect their innovative work behavior? In: 2016 49th Hawaii international conference on system sciences (HICSS). IEEE, pp 2166–2175
Stock R, Merkle M, Eidens D, Hannig M, Heineck P, Nguyen MA, Völker J (2019) When robots enter our workplace: understanding employee trust in assistive robots
Stock R, Nguyen MA (2019) Robotic psychology. what do we know about human–robot interaction and what do we still need to learn? In: Proceedings of the 52nd Hawaii international conference on system sciences, pp 1936–1945
Stock RM (2014) How should customers be integrated for effective interorganizational NPD teams? An input-process-output perspective. J Prod Innov Manag 31(3):535–551
Stock RM (2016) Emotion transfer from frontline social robots to human customers during service encounters: testing an artificial emotional contagion model. In: 2016 international conference on information systems research (ICIS)
Stock RM, Hoyer WD (2005) An attitude-behavior model of salespeople’s customer orientation. J Acad Mark Sci 33(4):536–552
Stock RM, Merkle M (2017) A service robot acceptance model: User acceptance of humanoid robots during service encounters. In: 2017 IEEE international conference on pervasive computing and communications workshops (PerCom Workshops). IEEE, pp 339–344
Stock RM, Merkle M (2018) Can humanoid service robots perform better than service employees? a comparison of innovative behavior cues. In: Proceedings of the 51st Hawaii international conference on system sciences
Su Y, Li W, Bi N, Lv Z (2019) Adolescents environmental emotion perception by integrating EEG and eye movements. Front Neurorobot 13:46
Sugaya M (2019) Emotion aware robot by emotion estimation using biological sensors. In: 2019 IEEE international conference on pervasive computing and communications workshops (PerCom Workshops). IEEE, p 541
Sugunan N, Alekh V, Krishna S, Babu SK, Bhavani RR, et al (2018) Design and emotional evaluation of pepe jr: A cost-effective platform for human robot interaction studies. In: 2018 IEEE distributed computing, VLSI, electrical circuits and robotics (DISCOVER). IEEE, pp 76–81
Sun M, Mou Y, Xie H, Xia M, Wong M, Ma X (2019) Estimating emotional intensity from body poses for human–robot interaction. arXiv preprint arXiv:1904.09435
Sung J, Christensen HI, Grinter RE (2009) Robots in the wild: understanding long-term use. In: Proceedings of the 4th ACM/IEEE international conference on human robot interaction, pp 45–52
Tahon M, Delaborde A, Devillers L (2011) Real-life emotion detection from speech in human–robot interaction: experiments across diverse corpora with child and adult voices. In: Cosi P, De Mori R, Di Fabbrizio G, Pieraccini R (eds) Interspeech 2011, 12th annual conference of the international speech communication association, August 27–31, pp 3121–3124
Tajfel H (1969) Cognitive aspects of prejudice. J Biosoc Sci 1(S1):173–191
Tajfel H (1981) Human groups and social categories: studies in social psychology. Cambridge University Press, Cambridge
Tajfel H (1982) Social identity and intergroup relations, vol 7. Cambridge University Press, Cambridge
Tajfel H, Billig MG, Bundy RP, Flament C (1971) Social categorization and intergroup behaviour. Eur J Soc Psychol 1(2):149–178
Tajfel H, Turner JC, Austin WG, Worchel S (1979) An integrative theory of intergroup conflict. In: Organizational identity: a reader, vol 56, p 65
Taki R, Maeda Y, Takahashi Y (2010) Personal preference analysis for emotional behavior response of autonomous robot in interactive emotion communication. J Adv Comput Intell Intell Inform 4(7):852–859
Tanaka F, Cicourel A, Movellan JR (2007) Socialization between toddlers and robots at an early childhood education center. Proc Natl Acad Sci 104(46):17954–17958
Terada K, Takeuchi C (2017) Emotional expression in simple line drawings of a robot’s face leads to higher offers in the ultimatum game. Front Psychol 8(724):1–9
Terada K, Yamauchi A, Ito A (2012) Artificial emotion expression for a robot by dynamic color change. In: 2012 IEEE RO-MAN: The 21st IEEE international symposium on robot and human interactive communication. IEEE, pp 314–321
Thiessen R, Rea DJ, Garcha DS, Cheng C, Young J E (2019) Infrasound for HRI: a robot using low-frequency vibrations to impact how people perceive its actions. In: 2019 14th ACM/IEEE international conference on human–robot interaction (HRI). IEEE, pp 11–18
Thimmesch-Gill Z, Harder KA, Koutstaal W (2017) Perceiving emotions in robot body language: acute stress heightens sensitivity to negativity while attenuating sensitivity to arousal. Comput Hum Behav 76:59–67
Thompson LF, Gillan DJ (2016) Social factors in human-robot interaction. In: Barnes M, Jentsch F (eds) Human-robot interactions in future military operations. Ashgate, Surrey, pp 67–81
Tielman M, Neerincx M, Meyer JJ, Looije R (2014) Adaptive emotional expression in robot–child interaction. In: 2014 9th ACM/IEEE international conference on human–robot interaction (HRI). IEEE, pp 407–414
Tomkins SS (1984) Affect theory. Approach Emot 163:163–195
Tosi HL Jr, Slocum JW Jr (1984) Contingency theory: some suggested directions. J Manag 10(1):9–26
Tranfield D, Denyer D, Smart P (2003) Towards a methodology for developing evidence-informed management knowledge by means of systematic review. Br J Manag 14(3):207–222
Trovato G, Kishi T, Endo N, Hashimoto K, Takanishi A (2012) Development of facial expressions generator for emotion expressive humanoid robot. In: 2012 12th IEEE-RAS international conference on humanoid robots (humanoids 2012). IEEE, pp 303–308
Trovato G, Ramos JG, Azevedo H, Moroni A, Magossi S, Simmons R, Ishii H, Takanishi A (2017) A receptionist robot for Brazilian people: study on interaction involving illiterates. Paladyn J Behav Robot 8(1):1–17
Tschöpe N, Reiser JE, Oehl M (2017) Exploring the uncanny valley effect in social robotics. In: Proceedings of the companion of the 2017 ACM/IEEE international conference on human–robot interaction, pp 307–308
Tsiourti C, Weiss A, Wac K, Vincze M (2019) Multimodal integration of emotional signals from voice, body, and context: effects of (in) congruence on emotion recognition and attitudes towards robots. Int J Soc Robot 11(4):555–573
Tsuchiya S, Imono M, Watabe H (2015) Judging emotion from EEGs. Procedia Comput Sci 60:37–44
Turley LW, Milliman RE (2000) Atmospheric effects on shopping behavior: a review of the experimental evidence. J Bus Res 49(2):193–211
Val-Calvo M, Álvarez-Sánchez JR, Díaz-Morcillo A, Vicente JMF, Fernández-Jover E (2019) On the use of lateralization for lightweight and accurate methodology for EEG real time emotion estimation using Gaussian-process classifier. In: International work-conference on the interplay between natural and artificial computation. Springer, pp 191–201
Valenti A, Chita-Tegmark M, Gold M, Law T, Scheutz M (2019) In their own words: a companion robot for detecting the emotional state of persons with Parkinson’s disease. In: International conference on social robotics. Springer, pp 443–452
Valenti A, Chita-Tegmark M, Law T, Bock A, Oosterveld B, Scheutz M (2019) When your face and tone of voice don’t say it all: inferring emotional state from word semantics and conversational topics. In: Workshop on Cognitive Architectures for HRI: embodied models of situated natural language interactions of AAHAS 2019. Montreal, Canada
Vásquez BPEA, Matía F (2020) A tour-guide robot: moving towards interaction with humans. Eng Appl Artif Intell 88:103356
Venkatesh V, Brown SA, Bala H (2013) Bridging the qualitative-quantitative divide: Guidelines for conducting mixed methods research in information systems. MIS Q 37(4):21–54
Vithanawasam T, Madhusanka B (2019) Face and upper-body emotion recognition using service robot’s eyes in a domestic environment. In: 2019 international research conference on smart computing and systems engineering (SCSE). IEEE, pp 44–50
Wang S, Lilienfeld SO, Rochat P (2015) The uncanny valley: existence and explanations. Rev Gen Psychol 19(4):393–407
Wang W, Athanasopoulos G, Patsis G, Enescu V, Sahli H (2014) Real-time emotion recognition from natural bodily expressions in child–robot interaction. In: European conference on computer vision. Springer, pp 424–435
White RT, Arzi HJ (2005) Longitudinal studies: designs, validity, practicality, and value. Res Sci Educ 35(1):137–149
Wirtz J, Patterson PG, Kunz WH, Gruber T, Lu VN, Paluch S, Martins A (2018) Brave new world: service robots in the frontline. J Serv Manag 29(5):907–931
Wittig S, Kloos U, Rätsch M (2016) Emotion model implementation for parameterized facial animation in human–robot-interaction 11(6):439–445
Woods S, Dautenhahn K, Schulz J (2004) The design space of robots: investigating children’s views. In: RO-MAN 2004. 13th IEEE international workshop on robot and human interactive communication (IEEE Catalog No. 04TH8759). IEEE, pp 47–52
Wu M, Su W, Chen L, Liu Z, Cao W, Hirota K (2019) Weight-adapted convolution neural network for facial expression recognition in human–robot interaction. IEEE Trans Syst Man Cybern Syst 5(1):1473–1484
Wu Z, Zheng L (2019) Emotional communication robot based on 3d face model and ASR technology. In: 2019 IEEE 9th international conference on electronics information and emergency communication (ICEIEC). IEEE, pp 1–4
Xin L, Lun X, Zhi-liang W, Dong-mei F (2013) Robot emotion and performance regulation based on HMM. Int J Adv Rob Syst 10(3):160
Xu J, Broekens J, Hindriks K, Neerincx MA (2014) Robot mood is contagious: effects of robot body language in the imitation game. In: Proceedings of the 2014 international conference on autonomous agents and multi-agent systems. International Foundation for Autonomous Agents and Multiagent Systems, pp 973–980
Xu J, Broekens J, Hindriks K, Neerincx MA (2015) Mood contagion of robot body language in human robot interaction. Auton Agent Multi Agent Syst 29(6):1216–1248
Xu Q, Ng J, Tan O, Huang Z, Tay B, Park T (2015) Methodological issues in scenario-based evaluation of human–robot interaction. Int J Soc Robot 7(2):279–291
Yamashita Y, Ishihara H, Ikeda T, Asada M (2019) Investigation of causal relationship between touch sensations of robots and personality impressions by path analysis. Int J Soc Robot 11(1):141–150
Yan Z, Jouandeau N, Cherif AA (2013) A survey and analysis of multi-robot coordination. Int J Adv Rob Syst 10(12):399–417
Yang J, Wang R, Guan X, Hassan MM, Almogren A, Alsanad A (2020) Ai-enabled emotion-aware robot: the fusion of smart clothing, edge clouds and robotics. Futur Gener Comput Syst 102:701–709
Yoo C, Park J, MacInnis DJ (1998) Effects of store characteristics and in-store emotional experiences on store attitude. J Bus Res 42(3):253–263
Yoon Y, Ko WR, Jang M, Lee J, Kim J, Lee G (2019) Robots learn social skills: end-to-end learning of co-speech gesture generation for humanoid robots. In: 2019 international conference on robotics and automation (ICRA). IEEE, pp 4303–4309
You S, Robert L (2018) Teaming up with robots: an IMOI (inputs-mediators-outputs-inputs) framework of human–robot teamwork. Int J Robot Eng (IJRE) 2(3):1–7
Yu C, Tapus A (2019) Interactive robot learning for multimodal emotion recognition. In: International conference on social robotics. Springer, pp 633–642
Yu C, Xu L (2004) An emotion-based approach to decision making and self learning in autonomous robot control. In: Fifth world congress on intelligent control and automation (IEEE Cat. No. 04EX788), vol 3. IEEE, pp 2386–2390
Żarkowski M (2019) Multi-party turn-taking in repeated human–robot interactions: an interdisciplinary evaluation. Int J Soc Robot 11(5):693–707
Zecca M, Mizoguchi Y, Endo K, Iida F, Kawabata Y, Endo N, Itoh K, Takanishi A (2009) Whole body emotion expressions for kobian humanoid robot—preliminary experiments with different emotional patterns. In: RO-MAN 2009-The 18th IEEE international symposium on robot and human interactive communication. IEEE, pp 381–386
Zhang J, Xiao N (2020) Capsule network-based facial expression recognition method for a humanoid robot. In: Recent trends in intelligent computing, communication and devices. Springer, pp 113–121
Zhang L, Jiang M, Farid D, Hossain MA (2013) Intelligent facial emotion recognition and semantic-based topic detection for a humanoid robot. Expert Syst Appl 40(13):5160–5168
Zhang L, Mistry K, Jiang M, Neoh SC, Hossain MA (2015) Adaptive facial point detection and emotion recognition for a humanoid robot. Comput Vis Image Underst 140:93–114
Zhang T, Kaber DB, Zhu B, Swangnetr M, Mosaly P, Hodge L (2010) Service robot feature design effects on user perceptions and emotional responses. Intel Serv Robot 3(2):73–88
Zhang Z, Niu Y, Wu S, Lin SM, Kong L (2018) Analysis of influencing factors on humanoid robots’ emotion expressions by body language. In: International symposium on neural networks. Springer, pp 775–785
Zheng X, Shiomi M, Minato T, Ishiguro H (2019) What kinds of robot’s touch will match expressed emotions? IEEE Robot Autom Lett 5(1):127–134
Zhu C, Ahmad W (2019) Emotion recognition from speech to improve human–robot interaction. In: 2019 IEEE international conference on dependable, autonomic and secure computing, international conference on pervasive intelligence and computing, international conference on cloud and big data computing, international conference on cyber science and technology congress (DASC/PiCom/CBDCom/CyberSciTech). IEEE, pp 370–375
Zimmerman DW (1987) Comparative power of student t test and Mann–Whitney u test for unequal sample sizes and variances. J Exp Educ 55(3):171–174
Funding
Open Access funding enabled and organized by Projekt DEAL.
Ethics declarations
Compliance with Ethical Standards
The authors declare that there are no compliance issues with this research.
Funding
This research was funded by the German Research Foundation (DFG, Deutsche Forschungsgemeinschaft), Zentrum für verantwortungsbewusste Digitalisierung (ZEVEDI), and the Leap in Time Foundation.
Conflict of interest
The author declares that she has no conflict of interest.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Stock-Homburg, R. Survey of Emotions in Human–Robot Interactions: Perspectives from Robotic Psychology on 20 Years of Research. Int J of Soc Robotics 14, 389–411 (2022). https://doi.org/10.1007/s12369-021-00778-6