A Preliminary Framework for a Social Robot “Sixth Sense”

  • Lorenzo Cominelli (Email author)
  • Daniele Mazzei
  • Nicola Carbonaro
  • Roberto Garofalo
  • Abolfazl Zaraki
  • Alessandro Tognetti
  • Danilo De Rossi
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9793)

Abstract

Building a social robot that is able to interact naturally with people is a challenging task, and it becomes even more ambitious when the robot's interlocutors are children in crowded scenarios such as a classroom or a museum. In such scenarios, the main concern is enabling the robot to track the subjects' social and affective states and to modulate its behaviour on the basis of the engagement and the emotional state of its interlocutors. To reach this goal, the robot needs to gather not only visual and auditory data but also physiological signals, which are fundamental for understanding the interlocutors' psycho-physiological state. To this end, several Human-Robot Interaction (HRI) frameworks have been proposed in recent years, although most of them rely on wearable sensors. However, wearable equipment is not the best acquisition technology for crowded multi-party environments, for obvious reasons (e.g., every subject must be prepared before the experiment by putting on the acquisition devices). Furthermore, wearable sensors, even if designed to be minimally intrusive, add an extra factor to HRI scenarios, introducing a bias in the measurements due to psychological stress. To overcome these limitations, in this work we present an unobtrusive method to acquire both visual and physiological signals from multiple subjects involved in HRI. The system is able to integrate the acquired data and associate them with unique subject IDs. The implemented system has been tested with the FACE humanoid in order to assess the technical features of the integrated devices and algorithms. Preliminary tests demonstrated that the developed system can be used to extend FACE's perception capabilities, giving it a sort of sixth sense that will improve the robot's empathic and behavioural capabilities.
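The integration step described in the abstract, merging per-subject visual and physiological data streams and tagging every sample with a unique subject ID, might be sketched as below. This is a minimal illustration only: the class and field names (`SubjectRecord`, `FusionBuffer`, `hr_bpm`, etc.) are hypothetical and are not taken from the paper's actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class SubjectRecord:
    """All data gathered for one tracked subject, keyed by a unique ID."""
    subject_id: int
    visual: list = field(default_factory=list)   # e.g. face-tracking frames
    physio: list = field(default_factory=list)   # e.g. heart-rate samples

class FusionBuffer:
    """Associates incoming multi-modal samples with per-subject records."""

    def __init__(self):
        self._records = {}

    def _record(self, subject_id):
        # Create the record lazily the first time a subject is seen.
        if subject_id not in self._records:
            self._records[subject_id] = SubjectRecord(subject_id)
        return self._records[subject_id]

    def add_visual(self, subject_id, frame):
        self._record(subject_id).visual.append(frame)

    def add_physio(self, subject_id, sample):
        self._record(subject_id).physio.append(sample)

    def get(self, subject_id):
        return self._records.get(subject_id)

# Usage: samples from two independent sensing pipelines are unified
# under the same subject ID.
buf = FusionBuffer()
buf.add_visual(7, {"t": 0.0, "gaze": "robot"})
buf.add_physio(7, {"t": 0.1, "hr_bpm": 82})
rec = buf.get(7)
print(rec.subject_id, len(rec.visual), len(rec.physio))  # 7 1 1
```

In a real deployment the visual and physiological pipelines would run asynchronously, so the association key (the subject ID produced by the vision-based tracker) is what keeps the two streams aligned.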

Keywords

Affective computing · Behaviour monitoring · Human-Robot Interaction · Social robotics · Synthetic tutor

Notes

Acknowledgment

This work was partially funded by the European Commission under the 7th Framework Programme project EASEL (Expressive Agents for Symbiotic Education and Learning), Grant 611971-FP7-ICT-2013-10. Special thanks to Daniela Gasperini for her fundamental contribution to the organization of the experiments.

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Lorenzo Cominelli (1) (Email author)
  • Daniele Mazzei (1)
  • Nicola Carbonaro (1)
  • Roberto Garofalo (1)
  • Abolfazl Zaraki (1)
  • Alessandro Tognetti (1, 2)
  • Danilo De Rossi (1, 2)
  1. Faculty of Engineering, Research Center “E. Piaggio”, University of Pisa, Pisa, Italy
  2. Department of Information Engineering, University of Pisa, Pisa, Italy