Interacting with a Virtual Conductor

  • Pieter Bos
  • Dennis Reidsma
  • Zsófia Ruttkay
  • Anton Nijholt
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4161)


This paper presents a virtual embodied agent that can conduct musicians in a live performance. The virtual conductor conducts music specified by a MIDI file and uses input from a microphone to react to the tempo of the musicians. The current implementation of the virtual conductor can interact with musicians, leading and following them as they play. Different time signatures and dynamic markings are supported.
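The abstract (and the keywords below) point at comb-filter-based tempo tracking from a microphone signal and a lead/follow interaction with the musicians. The paper itself gives no code, so the following is only a minimal sketch of the general technique: a Scheirer-style resonant comb filter bank that scores candidate tempi on an onset-strength envelope, plus a simple proportional rule that nudges the conductor's tempo toward the detected ensemble tempo. The function names (`estimate_tempo`, `follow`), the fixed per-beat decay `alpha`, and the `gain` parameter are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def estimate_tempo(onset_env, frame_rate, bpm_range=(60, 180)):
    """Score each candidate BPM with a resonant comb filter over the
    onset-strength envelope; return the BPM whose filter output has
    the most energy (echoes reinforce when the delay matches the beat)."""
    onset_env = np.asarray(onset_env, dtype=float)
    best_bpm, best_energy = None, -1.0
    for bpm in range(bpm_range[0], bpm_range[1] + 1):
        delay = int(round(frame_rate * 60.0 / bpm))  # frames per beat
        alpha = 0.5 ** 0.25  # assumed decay: half energy after ~4 beats
        y = np.zeros(len(onset_env))
        for n in range(len(onset_env)):
            feedback = y[n - delay] if n >= delay else 0.0
            y[n] = (1.0 - alpha) * onset_env[n] + alpha * feedback
        energy = float(np.dot(y, y))
        if energy > best_energy:
            best_bpm, best_energy = bpm, energy
    return best_bpm

def follow(conductor_bpm, detected_bpm, gain=0.3):
    """Lead/follow sketch: move a fraction of the way toward the
    musicians' tempo rather than jumping to it, so the conductor
    still leads while accommodating the ensemble."""
    return conductor_bpm + gain * (detected_bpm - conductor_bpm)
```

Note the usual octave ambiguity of comb-filter tempo induction (a delay of two beats also resonates), which is why real systems add heuristics or restrict the search range; the sketch simply relies on `bpm_range` for that.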


Keywords: Live Performance, Comb Filter, Audio Processing, Tempo Change, MIDI File (added by machine, not by the authors)




Copyright information

© IFIP International Federation for Information Processing 2006

Authors and Affiliations

Pieter Bos, Dennis Reidsma, Zsófia Ruttkay, Anton Nijholt
HMI, Dept. of CS, University of Twente, Enschede, The Netherlands
