
Multimodal Techniques for Human/Robot Interaction

  • Ajay Kapur
Part of the Springer Tracts in Advanced Robotics book series (STAR, volume 74)

Abstract

Is it possible for a robot to improvise with a human performer in real time? This chapter describes a framework for interdisciplinary research aimed at answering this question. Custom-built controllers, influenced by the Human-Computer Interaction (HCI) community, serve as new interfaces for gathering musical gestures from a performing artist. Designs for modifying the sitar, a 19-stringed North Indian instrument, with sensors and electronics are described. Experiments using wearable sensors to capture ancillary gestures of a human performer are also included. A twelve-armed solenoid-based robotic drummer was built to perform on a variety of traditional percussion instruments from around India. The chapter describes experimentation on interfacing a human sitar performer with the robotic drummer, including automatic tempo tracking and accompaniment methods. This chapter shows contributions in the areas of musical gesture extraction, musical robotics and machine musicianship. However, one of the main novelties was closing the loop and fusing all three of these areas into a real-time framework.
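The chapter's tempo-tracking experiments build on Kalman filtering (see the references to Brown & Hwang and van Kasteren below). As a simplified, illustrative stand-in for that approach, the sketch below estimates tempo from onset times by smoothing inter-onset intervals with a first-order exponential filter; the function name and parameters are illustrative, not the chapter's implementation.

```python
# Minimal sketch (not the chapter's implementation): estimate tempo in BPM
# from a sequence of detected onset times, assuming one onset per beat.
# Exponential smoothing stands in for the Kalman filter used in the chapter.

def estimate_tempo_bpm(onset_times, alpha=0.5):
    """Return a smoothed tempo estimate in BPM from onset times (seconds)."""
    if len(onset_times) < 2:
        raise ValueError("need at least two onsets")
    # Inter-onset intervals: seconds per beat between consecutive onsets.
    iois = [b - a for a, b in zip(onset_times, onset_times[1:])]
    # Each new interval nudges the running period estimate, damping
    # jitter from imperfect onset detection.
    period = iois[0]
    for ioi in iois[1:]:
        period = (1 - alpha) * period + alpha * ioi
    return 60.0 / period

# At 120 BPM the beat period is 0.5 s:
print(round(estimate_tempo_bpm([0.0, 0.5, 1.0, 1.5, 2.0]), 1))  # 120.0
```

A full Kalman formulation would additionally track tempo drift as a hidden state and weight each observation by its estimated noise, which matters when onsets come from a live percussion signal rather than a clean click.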

Keywords

Musical Performance · Wearable Sensor · Late Fusion · Computer Music · Musical Expression


References

  1. Trimpin: SoundSculptures: Five Examples. MGM MediaGruppe, Munich (2000)
  2. Solis, J., Bergamasco, M., Isoda, S., Chida, K., Takanishi, A.: Learning to Play the Flute with an Anthropomorphic Robot. In: International Computer Music Conference, Miami, Florida (2004)
  3. Jorda, S.: Afasia: the Ultimate Homeric One-Man-Multimedia-Band. In: International Conference on New Interfaces for Musical Expression, Dublin, Ireland (2002)
  4. MacMurtrie, C.: Amorphic Robot Works
  5. Baginsky, N.A.: The Three Sirens: A Self-Learning Robotic Rock Band
  6. Raes, G.W.: Automatons by Godfried-Willem Raes
  7. Kapur, A.: A History of Robotic Musical Instruments. In: International Computer Music Conference, Barcelona, Spain (2005)
  8. Machover, T., Chung, J.: Hyperinstruments: Musically Intelligent and Interactive Performance and Creativity Systems. In: International Computer Music Conference, pp. 186–190 (1989)
  9. Trueman, D., Cook, P.R.: BoSSA: The Deconstructed Violin Reconstructed. In: International Computer Music Conference, Beijing, China (1999)
  10. Rowe, R.: Machine Musicianship. MIT Press, Cambridge (2004)
  11. Dannenberg, R.B.: An On-line Algorithm for Real-Time Accompaniment. In: International Computer Music Conference, Paris, France, pp. 193–198 (1984)
  12. Lewis, G.: Too Many Notes: Computers, Complexity and Culture in Voyager. Leonardo Music Journal 10, 33–39 (2000)
  13. Pachet, F.: The Continuator: Musical Interaction with Style. In: International Computer Music Conference, Goteborg, Sweden (2002)
  14. Singer, E., Larke, K., Bianciardi, D.: LEMUR GuitarBot: MIDI Robotic String Instrument. In: International Conference on New Interfaces for Musical Expression, Montreal, Canada (2003)
  15. Weinberg, G., Driscoll, S., Parry, M.: Haile - A Perceptual Robotic Percussionist. In: International Computer Music Conference, Barcelona, Spain (2005)
  16. Weinberg, G., Driscoll, S., Thatcher, T.: Jam'aa - A Middle Eastern Percussion Ensemble for Human and Robotic Players. In: International Computer Music Conference, New Orleans, pp. 464–467 (2006)
  17. Kapur, A., Lazier, A., Davidson, P., Wilson, R.S., Cook, P.R.: The Electronic Sitar Controller. In: International Conference on New Interfaces for Musical Expression, Hamamatsu, Japan (2004)
  18. Menon, R.R.: Discovering Indian Music. Somaiya Publications Pvt. Ltd., Mumbai (1974)
  19. Vir, R.A.: Learn to Play on Sitar. Punjab Publications, New Delhi (1998)
  20. Bagchee, S.: NAD: Understanding Raga Music. Eeshwar Business Publications, Inc., Mumbai (1998)
  21. Wilson, R.S., Gurevich, M., Verplank, B., Stang, P.: Microcontrollers in Music Education - Reflections on our Switch to the Atmel AVR Platform. In: International Conference on New Interfaces for Musical Expression, Montreal, Canada (2003)
  22. Sharma, S.: Comparative Study of Evolution of Music in India & the West. Pratibha Prakashan, New Delhi (1997)
  23. Wait, B.: Mirror-6 MIDI Guitar Controller Owner's Manual. Zeta Music Systems, Inc., Oakland (1989)
  24. Puckette, M.: Pure Data: Another Integrated Computer Music Environment. In: Second Intercollege Computer Music Concerts, Tachikawa, Japan, pp. 37–41 (1996)
  25. Zolzer, U.: DAFX: Digital Audio Effects. John Wiley and Sons, Ltd., England (2002)
  26. Merrill, D.: Head-Tracking for Gestural and Continuous Control of Parameterized Audio Effects. In: International Conference on New Interfaces for Musical Expression, Montreal, Canada (2003)
  27. Wang, G., Cook, P.R.: ChucK: A Concurrent, On-the-fly Audio Programming Language. In: International Computer Music Conference, Singapore (2003)
  28. Kapur, A., Yang, E.L., Tindale, A.R., Driessen, P.F.: Wearable Sensors for Real-Time Musical Signal Processing. In: IEEE Pacific Rim Conference, Victoria, Canada (2005)
  29. Till, B.C., Benning, M.S., Livingston, N.: Wireless Inertial Sensor Package (WISP). In: International Conference on New Interfaces for Musical Expression, New York City (2007)
  30. Wright, M., Freed, A., Momeni, A.: OpenSound Control: State of the Art 2003. In: International Conference on New Interfaces for Musical Expression, Montreal, Canada (2003)
  31. Schloss, A.W.: Using Contemporary Technology in Live Performance: The Dilemma of the Performer. Journal of New Music Research 32, 239–242 (2003)
  32. Singer, E., Feddersen, J., Redmon, C., Bowen, B.: LEMUR's Musical Robots. In: International Conference on New Interfaces for Musical Expression, Hamamatsu, Japan (2004)
  33. Trimpin: Portfolio. Seattle, Washington
  34. Brown, R.G., Hwang, P.Y.C.: Introduction to Random Signals and Applied Kalman Filtering. John Wiley & Sons, Inc., Chichester (1992)
  35. van Kasteren, T.: Realtime Tempo Tracking using Kalman Filtering. Master's thesis, University of Amsterdam (2006)
  36. Gouyon, F., Klapuri, A., Dixon, S., Alonso, M., Tzanetakis, G., Uhle, C., Cano, P.: An Experimental Comparison of Audio Tempo Induction Algorithms. IEEE Transactions on Speech and Audio Processing 14 (2006)

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Ajay Kapur (1, 2)
  1. California Institute of the Arts, Valencia, USA
  2. New Zealand School of Music, Victoria University of Wellington, Wellington, New Zealand
