
Collaborative creativity: The Music Room

  • Original Article
  • Published in Personal and Ubiquitous Computing

Abstract

In this paper, we reflect on our experience of designing, developing, and evaluating interactive spaces for collaborative creativity. In particular, we are interested in designing spaces that allow everybody to compose and play original music. The Music Room is an interactive installation where couples can compose original music by moving in the space. Following the metaphor of love, the music is automatically generated and modulated in terms of pleasantness and intensity, according to proxemic cues extracted by a visual tracking algorithm. The Music Room was exhibited during the EU Researchers’ Night in Trento, Italy.
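The mapping the abstract describes, from proxemic cues to the two musical control dimensions of pleasantness and intensity, can be sketched as follows. This is a minimal illustration under assumed conventions, not the authors' implementation: the function name, the distance and speed thresholds, and the linear mappings are all assumptions made for the example.

```python
def proxemics_to_music(distance_m: float, speed_m_s: float,
                       max_distance_m: float = 5.0,
                       max_speed_m_s: float = 2.0) -> tuple:
    """Map proxemic cues for a couple to (pleasantness, intensity),
    both normalized to [0, 1].

    Hypothetical rule of thumb following the love metaphor:
    the closer the couple, the more pleasant (consonant) the music;
    the faster they move, the more intense the music.
    """
    # Closeness: 1.0 when the couple touches, 0.0 at max tracked distance.
    closeness = 1.0 - min(distance_m, max_distance_m) / max_distance_m
    pleasantness = closeness

    # Intensity: scales linearly with movement speed, capped at max speed.
    intensity = min(speed_m_s, max_speed_m_s) / max_speed_m_s
    return pleasantness, intensity
```

A generative music engine would then poll these two values each bar and adjust, for example, mode and tempo accordingly; the linear mappings above are placeholders for whatever perceptual scaling the tracking data actually calls for.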


[Figures 1–4 appear in the full article.]


Notes

  1. Videos from The Music Room can be viewed at http://youtu.be/92UDoy8QCDs and http://youtu.be/qbmETsxcVc0.


Acknowledgments

First and foremost, we wish to thank all the visitors of the 2012 EU Researchers’ Night in Trento, who often faced a long wait to experience The Music Room. We sincerely hope you enjoyed the installation; we are working on an improved version of it, based on your valuable suggestions. We are much obliged to the members of our team, who helped us set up and run The Music Room through two weeks of hard voluntary work. Special thanks to Patrizio Fava, Andrea Pecchielan, Maria Menendez, Chiara Gadotti, Zeno Menestrina and Michele Bianchi. Finally, we wish to express our gratitude to Costanza Vettori, who helped us edit this paper.

Author information

Correspondence to Fabio Morreale.


About this article

Cite this article

Morreale, F., De Angeli, A., Masu, R. et al. Collaborative creativity: The Music Room. Pers Ubiquit Comput 18, 1187–1199 (2014). https://doi.org/10.1007/s00779-013-0728-1

