iSoundScape: Adaptive Walk on a Fitness Soundscape

  • Reiji Suzuki
  • Souichiro Yamaguchi
  • Martin L. Cody
  • Charles E. Taylor
  • Takaya Arita
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6625)

Abstract

Adaptive walk on a fitness soundscape [7] is a new kind of interactive evolutionary computation for musical works. The system provides a virtual two-dimensional grid called a “soundscape” in which each point corresponds to a genotype that generates a sound environment. Using the human abilities of sound localization and selective listening, the user can “walk” toward genotypes that generate more favorable sounds, which corresponds to a hill-climbing process on the “fitness soundscape.” This environment can be realized with multiple speakers or headphones creating “surround sound.” In this work we describe two new applications of adaptive walk. The first is designed for creating spatially grounded musical pieces as interactive art based on fitness soundscapes. The second provides a new way to explore the ecology and evolution of bird songs, from scientific and educational viewpoints, by exploring the ecological space of “nature’s music” produced by populations of virtual songbirds.
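
The abstract describes the walk only in words. As a rough illustration, the following Python sketch shows one minimal reading of the hill-climbing loop on a 2D genotype grid; every name in it (GRID_SIZE, genotype_at, adaptive_walk, the toy preference callback) is a hypothetical stand-in rather than the authors' implementation, and the sound rendering, spatialization, and actual human listening that define the real system are deliberately omitted.

    import random

    GRID_SIZE = 16                      # assumed size of the square soundscape grid
    NEIGHBOURS = {                      # the four directions a listener can walk toward
        "north": (0, 1), "south": (0, -1), "east": (1, 0), "west": (-1, 0),
    }

    def genotype_at(x, y):
        """Map a grid point to a genotype (here just a fixed random parameter vector);
        in the actual system the genotype would parameterize a sound environment."""
        rng = random.Random(x * GRID_SIZE + y)   # same point always yields the same genotype
        return [rng.random() for _ in range(8)]

    def adaptive_walk(steps, choose_direction):
        """Hill-climb on the soundscape: at each step the listener compares the sounds
        of the neighbouring points and the walk moves toward the preferred one."""
        x, y = GRID_SIZE // 2, GRID_SIZE // 2    # start in the middle of the grid
        for _ in range(steps):
            # In the real system each neighbour's sound would be rendered and spatialized
            # (surround sound); here the listener is replaced by an injected callback.
            options = {d: genotype_at((x + dx) % GRID_SIZE, (y + dy) % GRID_SIZE)
                       for d, (dx, dy) in NEIGHBOURS.items()}
            dx, dy = NEIGHBOURS[choose_direction(options)]
            x, y = (x + dx) % GRID_SIZE, (y + dy) % GRID_SIZE
        return (x, y), genotype_at(x, y)

    # Toy stand-in for the human listener: prefer the neighbour whose first gene is largest.
    pos, genotype = adaptive_walk(20, lambda opts: max(opts, key=lambda d: opts[d][0]))
    print(pos, genotype)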

Keywords

interactive evolutionary computation, musical composition, fitness landscape, surround sound, birdsongs, artificial life


References

  1. Biles, J.A.: GenJam: A Genetic Algorithm for Generating Jazz Solos. In: Proceedings of the 1994 International Computer Music Conference (1994)
  2. Cherry, E.C.: Some Experiments on the Recognition of Speech with One and with Two Ears. Journal of the Acoustical Society of America 25, 975–979 (1953)
  3. Dahlstedt, P.: Creating and Exploring Huge Parameter Spaces: Interactive Evolution as a Tool for Sound Generation. In: Proceedings of the International Computer Music Conference 2001, pp. 235–242 (2001)
  4. Knees, P., Schedl, M., Pohle, T., Widmer, G.: Exploring Music Collections in Virtual Landscapes. IEEE Multimedia 14(3), 46–54 (2007)
  5. Rocchesso, D., Bresin, R., Fernström, M.: Sounding Objects. IEEE Multimedia 10(2), 42–52 (2003)
  6. Sims, K.: Interactive Evolution of Dynamical Systems. In: Towards a Practice of Autonomous Systems: Proceedings of the First European Conference on Artificial Life, pp. 171–178 (1992)
  7. Suzuki, R., Arita, T.: Adaptive Walk on Fitness Soundscape. In: Proceedings of the Tenth European Conference on Artificial Life (ECAL 2009), LNCS/LNAI (2009) (in press)
  8. Takagi, H.: Interactive Evolutionary Computation: Fusion of the Capabilities of EC Optimization and Human Evaluation. Proceedings of the IEEE 89(9), 1275–1296 (2001)
  9. Unemi, T.: SBEAT3: A Tool for Multi-part Music Composition by Simulated Breeding. In: Proceedings of the Eighth International Conference on Artificial Life, pp. 410–413 (2002)
  10. Wright, S.: The Roles of Mutation, Inbreeding, Crossbreeding, and Selection in Evolution. In: Proceedings of the Sixth International Congress on Genetics, pp. 355–366 (1932)

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Reiji Suzuki 1, 2
  • Souichiro Yamaguchi 1
  • Martin L. Cody 2
  • Charles E. Taylor 2
  • Takaya Arita 1

  1. Graduate School of Information Science / SIS, Nagoya University, Nagoya, Japan
  2. Department of Ecology and Evolutionary Biology, University of California, Los Angeles, USA