A Preliminary Framework for a Social Robot “Sixth Sense”
Building a social robot able to interact naturally with people is a challenging task, and it becomes even more ambitious when the robot's interlocutors are children in crowded scenarios such as a classroom or a museum. In such scenarios, the main concern is enabling the robot to track the subjects' social and affective states and to modulate its behaviour on the basis of the engagement and emotional state of its interlocutors. To reach this goal, the robot needs to gather visual and auditory data, but also to acquire physiological signals, which are fundamental for understanding the interlocutors' psycho-physiological state. To this end, several Human-Robot Interaction (HRI) frameworks have been proposed in recent years, most of them based on wearable sensors. However, wearable equipment is not the best acquisition technology for crowded multi-party environments, for obvious reasons (e.g., every subject must be fitted with the acquisition devices before the experiment). Furthermore, wearable sensors, even when designed to be minimally intrusive, add an extra factor to HRI scenarios, introducing a measurement bias due to psychological stress. To overcome these limitations, in this work we present an unobtrusive method for acquiring both visual and physiological signals from multiple subjects involved in HRI. The system integrates the acquired data and associates them with unique subject IDs. The implemented system has been tested with the FACE humanoid in order to assess the technical features of the integrated devices and algorithms. Preliminary tests demonstrated that the developed system can extend FACE's perception capabilities, giving it a sort of sixth sense that will improve the robot's empathic and behavioural capabilities.
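The abstract describes integrating multimodal acquisitions and associating them with unique subject IDs. As a minimal sketch of that idea, the snippet below buffers timestamped visual and physiological samples per subject; all class and field names (`Sample`, `FusionBuffer`, the modality labels) are illustrative assumptions, not the paper's actual implementation.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical sample type; fields are illustrative, not from the paper.
@dataclass
class Sample:
    timestamp: float   # acquisition time in seconds
    modality: str      # e.g. "video", "hr" (heart rate)
    value: object      # raw reading or feature vector

@dataclass
class SubjectRecord:
    subject_id: str
    samples: List[Sample] = field(default_factory=list)

class FusionBuffer:
    """Associates incoming multimodal samples with per-subject records."""
    def __init__(self) -> None:
        self._records: Dict[str, SubjectRecord] = {}

    def push(self, subject_id: str, sample: Sample) -> None:
        # Create the subject record on first sight of this ID.
        rec = self._records.setdefault(subject_id, SubjectRecord(subject_id))
        rec.samples.append(sample)

    def latest(self, subject_id: str, modality: str) -> Sample:
        # Most recent sample of a given modality for one subject.
        matching = [s for s in self._records[subject_id].samples
                    if s.modality == modality]
        return max(matching, key=lambda s: s.timestamp)

buf = FusionBuffer()
buf.push("S01", Sample(0.0, "video", "face_crop_0"))
buf.push("S01", Sample(0.5, "hr", 72))
buf.push("S01", Sample(1.0, "hr", 75))
print(buf.latest("S01", "hr").value)  # → 75
```

Keying every stream by a shared subject ID is what lets the later behavioural-modulation stage look up a coherent per-interlocutor state rather than loose, unattributed signals.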
Keywords: Affective computing · Behaviour monitoring · Human-Robot Interaction · Social robotics · Synthetic tutor
This work was partially funded by the European Commission under the 7th Framework Programme project EASEL (Expressive Agents for Symbiotic Education and Learning), Grant 611971-FP7-ICT-2013-10. Special thanks to Daniela Gasperini for her fundamental contribution to organizing the experiments.