Collaborative creativity: The Music Room

Personal and Ubiquitous Computing, Volume 18, Issue 5, pp 1187–1199

  • Fabio Morreale
  • Antonella De Angeli
  • Raul Masu
  • Paolo Rota
  • Nicola Conci
Original Article


Abstract

In this paper, we reflect on our experience of designing, developing and evaluating interactive spaces for collaborative creativity. In particular, we are interested in designing spaces that allow anyone to compose and play original music. The Music Room is an interactive installation in which couples compose original music by moving through the space. Following the metaphor of love, the music is automatically generated and modulated in pleasantness and intensity according to proxemic cues extracted by a visual tracking algorithm. The Music Room was exhibited during the EU Researchers' Night in Trento, Italy.


Keywords

Musical interfaces · User experience · Performing art · Active listening · Proxemics



Acknowledgements

First and foremost, we wish to thank all the visitors of the 2012 EU Researchers' Night in Trento, who often faced a long wait to experience The Music Room. We sincerely hope you enjoyed the installation; we are working on an improved version based on your valuable suggestions. We are much obliged to the members of our team, who helped us set up and run The Music Room over two weeks of hard voluntary work. Special thanks to Patrizio Fava, Andrea Pecchielan, Maria Menendez, Chiara Gadotti, Zeno Menestrina and Michele Bianchi. Finally, we wish to express our gratitude to Costanza Vettori, who helped us edit this paper.



Copyright information

© Springer-Verlag London 2013

Authors and Affiliations

  • Fabio Morreale (1)
  • Antonella De Angeli (1)
  • Raul Masu (1)
  • Paolo Rota (1)
  • Nicola Conci (1)

  1. Experiential Music Lab, MMLab, Department of Information Engineering and Computer Science, University of Trento, Trento, Italy
