Conveying Audience Emotions Through Humanoid Robot Gestures to an Orchestra During a Live Musical Exhibition

  • Marcello Giardina
  • Salvatore Tramonte
  • Vito Gentile
  • Samuele Vinanzi
  • Antonio Chella
  • Salvatore Sorce
  • Rosario Sorbello
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 611)


In the last twenty years, robotics has been applied in many heterogeneous contexts. Among them, the use of humanoid robots during musical concerts has been proposed and investigated by many authors. In this paper, we propose a contribution in the area of robotics applications in music: a system for conveying audience emotions to the performers during a live musical exhibition by means of a humanoid robot. In particular, we provide all spectators with a mobile app through which they can select a specific color while listening to a piece of music (act). Each color is mapped to an emotion, and the audience preferences are then processed to select the next act to be played. This decision, based on the overall emotion felt by the audience, is communicated by the robot to the orchestra through body gestures. Our first results show that spectators enjoy this kind of interactive musical performance, and they encourage further investigation.
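The flow described above (color choices mapped to emotions, aggregated into an overall audience emotion that drives the choice of the next act) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the specific color-emotion pairs, the majority-vote aggregation, and all names below are assumptions for the sake of the example.

```python
from collections import Counter

# Hypothetical color-to-emotion mapping (illustrative; the paper does not
# specify the actual pairs used in the performance).
COLOR_TO_EMOTION = {
    "red": "anger",
    "yellow": "joy",
    "blue": "sadness",
    "green": "calm",
}

def aggregate_emotion(color_votes):
    """Map each spectator's color choice to an emotion and return the most
    frequent one, i.e. the overall emotion felt by the audience."""
    emotions = [COLOR_TO_EMOTION[c] for c in color_votes if c in COLOR_TO_EMOTION]
    if not emotions:
        return None
    return Counter(emotions).most_common(1)[0][0]

def select_next_act(overall_emotion, playlist):
    """Pick the next act whose annotated emotion matches the audience's;
    fall back to the default running order if none matches."""
    for act in playlist:
        if act["emotion"] == overall_emotion:
            return act
    return playlist[0]

votes = ["red", "yellow", "yellow", "blue", "yellow"]
playlist = [
    {"title": "Act I", "emotion": "calm"},
    {"title": "Act II", "emotion": "joy"},
]
print(select_next_act(aggregate_emotion(votes), playlist)["title"])  # Act II
```

In the actual system, the selected act would then be communicated to the orchestra via the robot's body gestures rather than printed.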


HRI · Musical robotics · Humanoid robotics



The authors wish to thank the musical ensemble members, the students Corsello and Caravello, and the professors Betta, Correnti, D’Aquila and Rapisarda from the Conservatorio di Musica “Vincenzo Bellini” di Palermo for their fundamental contribution to the realization of the musical performance and for their invaluable open-mindedness.



Copyright information

© Springer International Publishing AG 2018

Authors and Affiliations

  • Marcello Giardina (1)
  • Salvatore Tramonte (1)
  • Vito Gentile (2)
  • Samuele Vinanzi (1, 3)
  • Antonio Chella (1)
  • Salvatore Sorce (2)
  • Rosario Sorbello (1)

  1. Robotics Lab, Università degli Studi di Palermo - Dipartimento dell’Innovazione Industriale e Digitale (DIID), Palermo, Italy
  2. Ubiquitous Systems and Interfaces Group (USI), Università degli Studi di Palermo - Dipartimento dell’Innovazione Industriale e Digitale (DIID), Palermo, Italy
  3. Centre for Robotics and Neural Systems, Plymouth University, Plymouth, UK
