Abstract
Research suggests that emotionally responsive machines that can simulate empathy increase users’ acceptance of them, as the feeling of affinity towards the machine reduces negative perceptual feedback. In order to endow a robot with emotional intelligence, it must be equipped with sensors capable of capturing users’ emotions (sense), appraise the captured emotions to regulate its internal state (compute), and finally perform tasks whose actions are regulated by the computed “emotional” state (act). However, despite the impressive progress made in recent years in artificial intelligence, speech recognition and synthesis, computer vision, and many other disciplines directly and indirectly related to artificial emotion recognition and behavior, we are still far from being able to endow robots with the empathic capabilities of a human being. This article aims to give an overview of the implications of introducing emotional intelligence into robotic constructions by discussing recent advances in emotional intelligence in robotics.
1 Introduction
Should a robot have feelings? Based on the world of science fiction, the answer is clearly yes. Films such as Blade Runner; 2001: A Space Odyssey; I, Robot; and Ex Machina have shown machines capable of experiencing human feelings such as fear, anger or even love.
Assuming that research and technology were to reach the capacity to develop an artificial intelligence (AI) equal to human intelligence, or even capable of surpassing humans in problem solving, there is an open debate as to whether this AI would be capable of “feeling” emotions the way humans do. Furthermore, controversy exists as to whether such a capacity is necessary or desirable. Recent advances in the field of affective computing show applications with increasingly elaborate (though still very basic compared to human) emotional intelligence [1, 2], which leads one to believe that it is only a matter of time before the “fiction” tag disappears from the term science fiction, at least as far as emotional intelligence is concerned.
Despite the growing interest in AI, there are numerous disagreements about its implications for our society. The future development of machines with AI equal or superior to humans (also known as strong AI) will make human-machine interaction (HMI) one of the main challenges for the success and acceptance of robots. If user acceptance of robotics is already a challenge in itself [3, 4], developing robots with intelligence capable of matching or even surpassing human intelligence may be a further barrier to the integration of these devices into society [5].
Given the rapid evolution of AI in recent years, it is worth considering the convenience of incorporating these advances in the field of social robotics. Research on emotional intelligence oriented to robotics is based on two main pillars: emotion for enhancing social interaction, and emotion for improved performance. For interaction, emotion can be used to improve the robot’s likeability and believability, improving communication with users and enhancing the user experience. The second purpose builds on the belief that emotion is key to animals’ ability to survive and navigate the world, and can likewise be applied to robotics [6]. Therefore, this article aims to give an overview of the implications of introducing emotional intelligence in robotic constructions.
2 Emotional Intelligence in Robotics
Emotional intelligence has been defined as a set of skills hypothesized to contribute to the accurate appraisal and expression of emotion in oneself and in others, the effective regulation of emotion in self and others, and the use of feelings to motivate, plan, and achieve day-to-day actions [7]. On the other hand, among the many definitions of robots one of the most accepted is the one provided in [8], where they define a robot as an autonomous machine capable of sensing its environment, carrying out computations to make decisions, and performing actions in the real world. In short, a robot can typically do three things: sense, compute, and act.
Therefore, in order to endow a robot with emotional intelligence, it must be equipped with sensors capable of capturing users’ emotions (sense), appraise the captured emotions to regulate its internal state (compute), and finally perform tasks whose actions are regulated by the computed “emotional” state (act).
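The sense-compute-act loop described above can be sketched in code. The following is a minimal illustration, not an implementation from the literature: the sensor interface, the valence/arousal representation, the `empathy` blending factor and the behavior names are all assumptions chosen for the example.

```python
from dataclasses import dataclass

@dataclass
class Emotion:
    """Affective state as valence/arousal values in [-1, 1]."""
    valence: float
    arousal: float

def sense(raw_reading: dict) -> Emotion:
    """Map (hypothetical) sensor readings to a perceived user emotion."""
    return Emotion(raw_reading["valence"], raw_reading["arousal"])

def compute(internal: Emotion, perceived: Emotion, empathy: float = 0.3) -> Emotion:
    """Appraise the perceived emotion: shift the robot's internal state
    a fraction `empathy` of the way towards the user's state."""
    return Emotion(
        internal.valence + empathy * (perceived.valence - internal.valence),
        internal.arousal + empathy * (perceived.arousal - internal.arousal),
    )

def act(internal: Emotion) -> str:
    """Select a behavior regulated by the computed emotional state."""
    if internal.valence >= 0:
        return "approach" if internal.arousal >= 0 else "idle"
    return "soothe" if internal.arousal >= 0 else "withdraw"

# One pass of the loop: the robot perceives a distressed user
state = Emotion(valence=0.2, arousal=-0.1)
state = compute(state, sense({"valence": -0.8, "arousal": 0.6}))
print(act(state))  # → soothe: appraisal pulls valence negative, arousal positive
```

Real systems replace each of the three functions with far richer components (multimodal classifiers, appraisal models, behavior planners), but the division of responsibilities remains the same.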
2.1 Automatic Emotion Recognition (Sense)
Automatic emotion recognition is the process of identifying human emotion from different communication channels such as facial or body gestures, physiological signals, voice nuances, speech content, etc. Collecting and labeling such signals has been a great focus of research during the last decades, driven by the search to enhance the user experience in human-machine interactions. In this sense, there has been a shift from unimodal systems (in which only one channel is used) to the combination of information coming from several channels simultaneously (multimodal emotion recognition).
Automatic emotion recognition is a complicated task, given the enormous number of small nuances in the human expression of emotions, as well as the variability between users, between cultures, etc. To develop computerized emotion recognition systems, it is first necessary to parameterize emotions, so that labelling can be done using quantitative computational techniques. Thus, in recent decades, discrete models of emotions have evolved into multidimensional models. One of the most widely used has been the circumplex model of affect, proposed by James A. Russell [9], where emotions can be categorized along two dimensions: valence, from unpleasant (negative) to pleasant (positive); and arousal, from passive (weak emotion) to active (strong emotion). By varying the values of each dimension, emotions can be plotted on two coordinate axes. The problem with Russell’s model is that emotions such as fear and anger are located in the same quadrant and very close together in a 2D space (both are negative and active), so the model has been extended with a third dimension called dominance, which ranges from submissive to dominant and reflects the person’s ability to control a certain emotion (see Fig. 1).
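The benefit of the third dimension can be illustrated numerically. The coordinates below are illustrative placements in a valence-arousal-dominance space, not values taken from Russell’s paper: fear and anger are nearly indistinguishable in the valence-arousal plane but separate clearly once dominance is included.

```python
import math

# Illustrative (valence, arousal, dominance) placements, each in [-1, 1].
# Fear and anger share a quadrant in the valence-arousal plane, but fear
# is submissive (low dominance) while anger is dominant (high dominance).
emotions = {
    "fear":  (-0.6, 0.7, -0.7),
    "anger": (-0.6, 0.8,  0.7),
    "joy":   ( 0.8, 0.5,  0.4),
}

def distance(a, b, dims=3):
    """Euclidean distance using only the first `dims` dimensions."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a[:dims], b[:dims])))

d2 = distance(emotions["fear"], emotions["anger"], dims=2)  # valence-arousal only
d3 = distance(emotions["fear"], emotions["anger"], dims=3)  # with dominance
print(f"2D: {d2:.2f}, 3D: {d3:.2f}")  # the dominance axis pulls the two apart
```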
Once the categorization of human expression has been attained, thanks to the latest developments in technologies for capturing movements [11], physiological parameters [10] or voice [12], complemented by the latest advances in deep learning techniques for 2D and 3D face and body expression identification [13], voice analysis or speech interpretation [14], multimodal recognition systems have achieved high recognition rates that are little affected by variation between users [15]. The ultimate challenge, however, is to achieve such success rates in what is known as emotion recognition “in the wild”; that is, multimodal emotion recognition in uncontrolled environments with highly variable conditions (light, noise, occlusions, etc.) [16, 17].
2.2 Emotional AI (Compute)
Once emotions have been categorized from the information captured by different sensors, the appraisal of these emotions is needed to regulate the robot’s internal state, which in turn determines the following actions to take. In human psychology, emotions are recognized as functional in decision-making by influencing motivation and action selection. For that reason, computational emotion models are usually embedded in the agent’s decision-making architecture [18].
The factors that influence robots’ affective state and its evolution over time can be analyzed based on affective psychology theory [19]. Many important advancements in machine learning (ML) and AI are based on biological principles, such as neural networks or evolutionary algorithms. Thus, a computational approach to emotions can be considered as another example of bio-inspiration in computational science. Emotions can be seen as a response to a certain stimulus that elicits a tendency towards a certain action [20], and as complex feedback signals that shape behavior [21]. Therefore, processing emotions should be approached from a dual perspective: motivated action and feedback.
From an emotionally intelligent robot perspective, this refers to how the system transitions from one state to the other using emotional signals and feedback as one of the state inputs. Thus, emotional artificial intelligence in robotics can be seen as how the system processes emotion, focusing on how input is translated through an algorithm to an output and whether or not it contains a knowledge of past events or history [6].
Several types of algorithms are used in artificial emotional intelligence: fuzzy models [22], Markov models [17], neural networks [23], probability tables [24], reinforcement learning [18], and unsupervised machine learning approaches such as K-means, K-medoids or self-organizing maps [25].
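As a minimal illustration of one such approach, a Markov-style emotional state machine can make the transition from one internal state to the next depend on both the current state and the incoming stimulus. The states, stimulus labels and transition probabilities below are invented for the example; a real model would learn them from interaction data.

```python
import random

# Hypothetical transition probabilities between internal emotional states,
# conditioned on the valence of the latest stimulus.
TRANSITIONS = {
    ("calm", "positive"):  {"happy": 0.7, "calm": 0.3},
    ("calm", "negative"):  {"sad": 0.5, "calm": 0.5},
    ("happy", "positive"): {"happy": 0.9, "calm": 0.1},
    ("happy", "negative"): {"calm": 0.6, "sad": 0.4},
    ("sad", "positive"):   {"calm": 0.7, "sad": 0.3},
    ("sad", "negative"):   {"sad": 0.8, "calm": 0.2},
}

def step(state: str, stimulus: str, rng: random.Random) -> str:
    """Sample the next emotional state given the current one and a stimulus."""
    outcomes = TRANSITIONS[(state, stimulus)]
    return rng.choices(list(outcomes), weights=list(outcomes.values()))[0]

# A short interaction: two negative stimuli followed by two positive ones.
rng = random.Random(0)
state = "calm"
for stimulus in ["negative", "negative", "positive", "positive"]:
    state = step(state, stimulus, rng)
print(state)
```

Because the next state is sampled rather than fixed, the robot’s affective trajectory retains a memory of its history (via the current state) while still reacting to each new stimulus.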
The future of artificial emotional intelligence is not only linked to the capacity to increase processing power, but also to a paradigm shift in artificial intelligence as it is understood today. The ultimate goal is to move from what is known as weak or narrow AI, which focuses on performing a specific task, to strong AI or artificial general intelligence (AGI), a hypothetical type of artificial intelligence that would surpass the AI known so far. It would be an artificial intelligence whose purpose would be to emulate human intelligence as closely as possible, enabling general problem solving and activities. In this sense, substantial work has been performed in the field of artificial agents [26] since the early Unified Theories of Cognition (UTC) and the SOAR system [27]. In fact, recent works continue to propose enhancements to the cognitive architecture proposed in SOAR, for example by combining the long-term cognitive planning ability of SOAR with the powerful feature detection ability of artificial neural networks (ANNs) [28].
It should be noted that “general” means that, instead of specializing in solving a single type of problem, the system would emulate what any human being can do. For example, current chatbots are built focused on developing a conversation, using increasingly complex algorithms and databases to determine an appropriate reply to the user. On the contrary, an ideal strong AI interface would be developed with the same sensory perception capabilities as a human being, and would go through the same education and learning processes. Thus, instead of building the emotional knowledge for the robot, developers would need to provide the robot with the ability to interact with the environment and learn from those interactions.
2.3 Robots’ Ability to Show Emotions (Act)
Emotions are psychophysiological reactions that represent ways in which an individual adapts to certain stimuli when perceiving an object, person, place, event, or memory. Psychologically, emotions alter attention and activate relevant associative networks in the memory. Physiologically, emotions rapidly organize the responses of different biological systems, including facial expressions, muscles, voice, endocrine system, in order to establish an optimal internal environment for the most effective behavior [26].
Given the definition of emotion, from a physiological point of view an emotional response requires more than an evolved AI. With an adequate level of technological development, it would be possible to create a machine capable of adapting to external stimuli, of changing its behavior by activating different internal systems, and showing emotions by generating a combination of body, facial and vocal expressions.
The way such an emotional reaction is expressed highly depends on the robot’s degree of anthropomorphism [27]. For robots with a simple appearance, it may be sufficient to express emotions by means of e.g. lights or sounds [28]. However, as the degree of anthropomorphism increases, it becomes necessary to match the robot’s behavior with its appearance to avoid falling into the uncanny valley (see Fig. 2).
Behaviorally, emotions serve to establish our position in relation to our environment, driving us towards certain people, objects, actions, ideas and away from others. Emotions also act as a reservoir of innate and learned influences and possess certain invariant characteristics and others that may vary between individuals [30], groups [3, 5], and cultures [31].
For this reason, it is important to take into account the environment in which the robot operates. As with humans, the ability of robots to convincingly show emotions depends on being able to adapt their behavior to different interlocutors and environments.
3 Users’ Preferences for Emotionally Intelligent Robots
Empathy is defined as the cognitive ability to perceive what another being may feel and can be divided into two main components: affective empathy, also called emotional empathy, which is the ability to respond with appropriate emotional reaction to the mental states of another; and cognitive empathy, which is the ability to understand another's point of view or mental state.
The added value of the empathic communication comes from different sources. On the one hand, empathy improves the efficiency of interaction. Thus, while performing actions, human beings send signals that communicate their intentions (glances, hand movements, body movements, etc.), which may enable their interlocutors (if trained to perceive such signals) to identify them and to collaborate more efficiently to achieve joint goals. On the other hand, empathic interaction could help decrease people's reluctance to interact with robotic devices and help make humans more comfortable with robots and other machines.
Studies suggest that emotionally responsive machines that can simulate empathy increase users’ acceptance of them, as the feeling of affinity towards the machine reduces negative perceptual feedback [32,33,34]. Thus, emotional AI is founded on behaviors in human sociology, as communication, personality, and comprehension help promote understanding and empathy during human–human interaction [35]. In the case of robots designed to interact with humans, the ability to respond appropriately to the emotional states of the users can enhance the users’ acceptance, as the robots’ behavior appears more believable and responds to expectations [36].
It seems that endowing a robot with emotional intelligence can increase empathy and improve the user experience, for example because the user does not get bored of interacting with a machine with repetitive and predictable behavior. However, the fact that many users are still reluctant to use robotic technology means that special care must be taken when developing the robot’s emotional intelligence and designing emotions for behavior control. The degree of control that emotional intelligence has over robot behaviors can make robot actions better suited from an empathic interaction point of view, but may generate unexpected behaviors leading to user rejection. In this sense, early studies of human-robot interaction in home environments suggested that users do not want a robot companion to be a friend, but rather to perform the tasks it is intended for, whereas humanlike behavior and appearance were less essential [40]. Contrary to these results, newer studies suggest that robots able to accentuate their own personality are preferred by users [41].
These divergences may be explained by considering the aesthetic approach to designing anthropomorphic robots, which can lead deep into the uncanny valley if the user perceives a mismatch between the robot’s appearance and behavior. In addition, cultural, age and gender differences have been seen to influence the appearance of the uncanny valley during human-robot interaction. For example, many studies show that eastern cultures often report lower levels of the uncanny valley effect than western cultures [42,43,44]. Also, studies show that children are less susceptible to the uncanny valley as they are naturally more curious than adults, which is attributed to a lack of media influence and risk perception [44,45,46].
In view of the disparity of results and the difficulties to make generalizations, we can conclude that emotional human-robot interaction represents a challenge for robotic developers. As is the case of strong artificial intelligence discussed before, the tendency should be to develop robots that can adapt to each particular user or group of users, learning from the experiences gained from interacting with humans in their environment and adapting to user preferences, in the same way that we humans develop our social skills evolving from interaction with our circle of family and friends and learning as we gain new experiences.
4 Conclusions
Endowing robots with behavior that simulates human emotional behavior is one of the ultimate goals of robotics. Such emotional behavior could allow robots to display their moods as well as to perceive the moods of users interacting with them.
However, despite the impressive progress made in recent years in terms of artificial intelligence, speech recognition and synthesis, computer vision and many other disciplines directly and indirectly related to artificial emotional recognition and behavior, we are still far from being able to endow robots with the empathic capabilities of a human being.
As such, research in emotional robotics should focus on overcoming current challenges in emotional sensing, modelling & computing, and expression:
- It is necessary to continue investigating what empathy means for different types of robots, such as exoskeletons, social robots, service robots, manufacturing robots, etc., and to see how they can express empathy in their respective application contexts.
- Empathic interaction should be a dynamic process that evolves with the aim of building a relationship with the user over time. Pre-programmed repetitive behaviors are not perceived as empathic by the user, especially when the behavioral cues used to trigger the robot’s actions are known to the user.
- Since robots do not possess the physiological processes that allow them to be empathic, the short-term solution is to detect the socio-emotional cues transmitted by humans and have the robots mimic the empathic behavioral responses that would be displayed by humans. However, developments should evolve from this approach towards providing the robot with the ability to interact with the environment and learn from those interactions.
- During experimentation with empathic robots, it is necessary to develop new complex systems with the capabilities to investigate the different aspects of empathic behavior and to quantitatively assess their impact.
References
Aranha, R.V., Corrêa, C.G., Nunes, F.L.S.: Adapting software with affective computing: a systematic review. IEEE Trans. Affect. Comput. 1 (2019). https://doi.org/10.1109/TAFFC.2019.2902379
Yadegaridehkordi, E., Noor, N.F.B.M., Ayub, M.N.B., Affal, H.B., Hussin, N.B.: Affective computing in education: a systematic review and future research. Comput. Educ. 142, 103649 (2019). https://doi.org/10.1016/j.compedu.2019.103649
Dudek, M., Baisch, S., Knopf, M., Kolling, T.: This isn’t me!: the role of age-related self- and user images for robot acceptance by elders. Int. J. Soc. Robot. (2020). https://doi.org/10.1007/s12369-020-00678-1
Mele, C., et al.: Understanding robot acceptance/rejection: the SAR model. In: 2020 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), pp. 470–475 (2020). https://doi.org/10.1109/RO-MAN47096.2020.9223577
Meissner, A., Trübswetter, A., Conti-Kufner, A.S., Schmidtler, J.: Friend or foe? Understanding assembly workers’ acceptance of human-robot collaboration. ACM Trans. Hum.-Robot Interact. 10(1), 3:1–3:30. https://doi.org/10.1145/3399433
Savery, R., Weinberg, G.: A survey of robotics and emotion: classifications and models of emotional interaction. In: 2020 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), pp. 986–993 (2020). https://doi.org/10.1109/RO-MAN47096.2020.9223536
Salovey, P., Mayer, J.D.: Emotional intelligence. Imagin. Cogn. Pers. 9(3), 185–211 (1990). https://doi.org/10.2190/DUGG-P24E-52WK-6CDG
What is a robot? - ROBOTS: your guide to the world of robotics. (n.d.). https://robots.ieee.org/learn/what-is-a-robot/. Accessed: 28 Apr 2021
Russell, J.A.: A circumplex model of affect. J. Pers. Soc. Psychol. 39(6), 1161–1178 (1980). https://doi.org/10.1037/h0077714
Shu, L., et al.: A review of emotion recognition using physiological signals. Sensors 18(7), 2074 (2018). https://doi.org/10.3390/s18072074
van der Kruk, E., Reijne, M.M.: Accuracy of human motion capture systems for sport applications; state-of-the-art review. Eur. J. Sport Sci. 18(6), 806–819 (2018). https://doi.org/10.1080/17461391.2018.1463397
Rohlfing, M.L., Buckley, D.P., Piraquive, J., Stepp, C.E., Tracy, L.F.: Hey Siri: How effective are common voice recognition systems at recognizing dysphonic voices? The Laryngoscope (2021). https://doi.org/10.1002/lary.29082
Samadiani, N., et al.: A review on automatic facial expression recognition systems assisted by multimodal sensor data. Sensors 19(8), 1863 (2019). https://doi.org/10.3390/s19081863
Khalil, R.A., Jones, E., Babar, M.I., Jan, T., Zafar, M.H., Alhussain, T.: Speech emotion recognition using deep learning techniques: a review. IEEE Access 7, 117327–117345 (2019). https://doi.org/10.1109/ACCESS.2019.2936124
Abdullah, S.M.S.A., Ameen, S.Y.A., Sadeeq, M.A.M., Zeebaree, S.: Multimodal emotion recognition using deep learning. J. Appl. Sci. Technol. Trends 2(2), 52–58 (2021). https://doi.org/10.38094/jastt20291
Tzirakis, P., Chen, J., Zafeiriou, S., Schuller, B.: End-to-end multimodal affect recognition in real-world environments. Inf. Fusion 68, 46–53 (2021). https://doi.org/10.1016/j.inffus.2020.10.011
Li, S., et al.: Bi-modality fusion for emotion recognition in the wild. In: 2019 International Conference on Multimodal Interaction, pp. 589–594 (2019). https://doi.org/10.1145/3340555.3355719
Moerland, T.M., Broekens, J., Jonker, C.M.: Emotion in reinforcement learning agents and robots: a survey. Mach. Learn. 107(2), 443–480 (2017). https://doi.org/10.1007/s10994-017-5666-0
Zhou, Q.: Multi-layer affective computing model based on emotional psychology. Electron. Commer. Res. 18(1), 109–124 (2017). https://doi.org/10.1007/s10660-017-9265-8
Calvo, R.: The Oxford Handbook of Affective Computing. Oxford University Press (2015). https://doi.org/10.1093/oxfordhb/9780199942237.001.0001
Broekens, J., Bosse, T., Marsella, S.C.: Challenges in computational modeling of affective processes. IEEE Trans. Affect. Comput. 4(3), 242–245 (2013). https://doi.org/10.1109/T-AFFC.2013.23
Taverner, J., Vivancos, E., Botti, V.: A fuzzy appraisal model for affective agents adapted to cultural environments using the pleasure and arousal dimensions. Inf. Sci. 546, 74–86 (2021). https://doi.org/10.1016/j.ins.2020.08.006
Ashwin, T.S., Guddeti, R.M.R.: Automatic detection of students’ affective states in classroom environment using hybrid convolutional neural networks. Educ. Inf. Technol. 25(2), 1387–1415 (2019). https://doi.org/10.1007/s10639-019-10004-6
Bird, J.J., Ekárt, A., Faria, D.R.: Chatbot interaction with artificial intelligence: human data augmentation with T5 and language transformer ensemble for text classification. ArXiv: 2010.05990 [Cs] (2020). http://arxiv.org/abs/2010.05990
Fiorini, L., Mancioppi, G., Semeraro, F., Fujita, H., Cavallo, F.: Unsupervised emotional state classification through physiological parameters for social robotics applications. Knowl.-Based Syst. 190, 105217 (2020). https://doi.org/10.1016/j.knosys.2019.105217
Ivanovic, M., et al.: Emotional agents—state of the art and applications. Comput. Sci. Inf. Syst. (2015). https://doi.org/10.2298/CSIS141026047I
Newell, A.: SOAR as a unified theory of cognition: issues and explanations. Behav. Brain Sci. 15(3), 464–492 (1992). https://doi.org/10.1017/S0140525X00069740
Zuo, G., Pan, T., Zhang, T., Yang, Y.: SOAR improved artificial neural network for multistep decision-making tasks. Cogn. Comput. 13(3), 612–625 (2020). https://doi.org/10.1007/s12559-020-09716-6
Schindler, S., Bublatzky, F.: Attention and emotion: an integrative review of emotional face processing as a function of attention. Cortex 130, 362–386 (2020). https://doi.org/10.1016/j.cortex.2020.06.010
Marcos, S., Gómez-García-Bermejo, J., Zalama, E.: A realistic, virtual head for human–computer interaction. Interact. Comput. 22(3), 176–192 (2010). https://doi.org/10.1016/j.intcom.2009.12.002
Fernandez, R., John, N., Kirmani, S., Hart, J., Sinapov, J., Stone, P.: Passive demonstrations of light-based robot signals for improved human interpretability. In: 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), pp. 234–239. https://doi.org/10.1109/ROMAN.2018.8525728
MacDorman, K.F., Ishiguro, H.: The uncanny advantage of using androids in cognitive and social science research. Interact. Stud.: Soc. Behav. Commun. Biol. Artif. Syst. 7(3), 297–337 (2006). https://doi.org/10.1075/is.7.3.03mac
Matthews, G., et al.: Evolution and revolution: personality research for the coming world of robots, artificial intelligence, and autonomous systems. Pers. Individ. Differ. 169, 109969 (2021). https://doi.org/10.1016/j.paid.2020.109969
Korn, O., Akalin, N., Gouveia, R.: Understanding cultural preferences for social robots: a study in German and Arab communities. ACM Trans. Hum.-Robot Interact. 10(2), 12:1–12:19 (2021). https://doi.org/10.1145/3439717
Nishio, T., Yoshikawa, Y., Ogawa, K., Ishiguro, H.: Development of an effective information media using two android robots. Appl. Sci. 9(17), 3442 (2019). https://doi.org/10.3390/app9173442
Doering, M., Glas, D.F., Ishiguro, H.: Modeling interaction structure for robot imitation learning of human social behavior. IEEE Trans. Hum.-Mach. Syst. 49(3), 219–231 (2019). https://doi.org/10.1109/THMS.2019.2895753
Pablos, S.M., García-Bermejo, J.G., Zalama Casanova, E., López, J.: Dynamic facial emotion recognition oriented to HCI applications. Interact. Comput. 27(2), 99–119 (2015). https://doi.org/10.1093/iwc/iwt057
Strathearn, C., Ma, M.: Modelling user preference for embodied artificial intelligence and appearance in realistic humanoid robots. Informatics 7(3), 28 (2020). https://doi.org/10.3390/informatics7030028
Cañamero, L.: Emotion understanding from the perspective of autonomous robots research. Neural Netw. 18(4), 445–455 (2005). https://doi.org/10.1016/j.neunet.2005.03.003
Dautenhahn, K., Woods, S., Kaouri, C., Walters, M. L., Koay, K.L., Werry, I.: What is a robot companion—Friend, assistant or butler? In: 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 1192–1197 (2005). https://doi.org/10.1109/IROS.2005.1545189
Whittaker, S., Rogers, Y., Petrovskaya, E., Zhuang, H.: Designing Personas for expressive robots: personality in the new breed of moving, speaking, and colorful social home robots. ACM Trans. Hum.-Robot Interact. 10(1), 8:1–8:25 (2021). https://doi.org/10.1145/3424153
Schoenherr, J.R., Burleigh, T.J.: Uncanny sociocultural categories. Front. Psychol. 5. (2015). https://doi.org/10.3389/fpsyg.2014.01456
Cheetham, M.: Editorial: the uncanny valley hypothesis and beyond. Front. Psychol. 8 (2017). https://doi.org/10.3389/fpsyg.2017.01738
Gee, F.C., Browne, W.N., Kawamura, K.: Uncanny valley revisited. In: ROMAN 2005. IEEE International Workshop on Robot and Human Interactive Communication, pp. 151–157 (2005). https://doi.org/10.1109/ROMAN.2005.1513772
Brink, K.A., Gray, K., Wellman, H.M.: Creepiness creeps in: uncanny valley feelings are acquired in childhood. Child Dev. 90(4), 1202–1214 (2019). https://doi.org/10.1111/cdev.12999
Feng, S., et al.: The uncanny valley effect in typically developing children and its absence in children with autism spectrum disorders. PLoS One 13(11), e0206343 (2018). https://doi.org/10.1371/journal.pone.0206343
Tinwell, A., Sloan, R.J.S.: Children’s perception of uncanny human-like virtual characters. Comput. Hum. Behav. 36, 286–296 (2014). https://doi.org/10.1016/j.chb.2014.03.073
Acknowledgments
This research was partially funded by the Spanish Government Ministry of Economy and Competitiveness through the DEFINES project grant number (TIN2016-80172-R) and the Ministry of Science and Innovation through the AVisSA project grant number (PID2020-118345RB-I00).
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
Marcos-Pablos, S., García-Peñalvo, F.J. (2022). Emotional Intelligence in Robotics: A Scoping Review. In: de Paz Santana, J.F., de la Iglesia, D.H., López Rivero, A.J. (eds) New Trends in Disruptive Technologies, Tech Ethics and Artificial Intelligence. DiTTEt 2021. Advances in Intelligent Systems and Computing, vol 1410. Springer, Cham. https://doi.org/10.1007/978-3-030-87687-6_7
Print ISBN: 978-3-030-87686-9
Online ISBN: 978-3-030-87687-6