
Evaluating the Emotional Valence of Affective Sounds for Child-Robot Interaction

  • Silvia Rossi
  • Elena Dell’Aquila
  • Benedetta Bucci
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11876)

Abstract

Socially assistive robots are increasingly used in paediatric healthcare environments. In this domain, the development of effective strategies to keep children engaged during the interaction with a social robot is still an open research area. To this end, some approaches investigate combining distraction strategies, as used in human-human interaction, with the display of emotional behaviours. In this study, we present the results of a pilot study aimed at evaluating with children the valence of emotional behaviours enhanced with non-verbal sounds. The objective is to endow the NAO robot with emotion-like sounds, selected from a set of para-linguistic behaviours validated for valence. Results show that children aged 3–8 years perceive the robot’s behaviours and the associated semantic-free emotional sounds in terms of different degrees of arousal, valence and dominance: while valence and dominance are clearly perceived by the children, arousal is harder for them to distinguish.
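
For illustration only, here is a minimal sketch (not the authors' implementation) of how such a behaviour-sound pairing could be triggered on NAO with the NAOqi Python SDK; the robot address, behaviour name, and sound file path are placeholder assumptions:

    # Minimal sketch: pair a NAO body behaviour with a semantic-free
    # affective sound via the NAOqi Python SDK (Python 2).
    from naoqi import ALProxy

    NAO_IP, NAO_PORT = "192.168.1.10", 9559  # hypothetical robot address

    posture = ALProxy("ALRobotPosture", NAO_IP, NAO_PORT)
    behaviours = ALProxy("ALBehaviorManager", NAO_IP, NAO_PORT)
    audio = ALProxy("ALAudioPlayer", NAO_IP, NAO_PORT)

    def play_emotional_behaviour(behaviour_name, sound_path):
        # post.* runs the behaviour asynchronously, so the gesture
        # and the non-verbal sound overlap in time.
        behaviours.post.runBehavior(behaviour_name)
        audio.playFile(sound_path)  # blocking playback on the robot

    posture.goToPosture("Stand", 0.8)
    # Both arguments below are illustrative placeholders.
    play_emotional_behaviour("animations/Stand/Emotions/Positive/Happy_1",
                             "/home/nao/sounds/happy_chirp.wav")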

Keywords

Para-verbal sounds · Emotional behaviour · User study


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Silvia Rossi (1)
  • Elena Dell’Aquila (1, 2)
  • Benedetta Bucci (1)
  1. Department of Electrical Engineering and Information Technologies, University of Naples Federico II, Naples, Italy
  2. CRDC Tecnologie, Naples, Italy
