Autonomous Robots, Volume 31, Issue 2–3, pp 133–153

Interactive improvisation with a robotic marimba player

Abstract

Shimon is an interactive robotic marimba player, developed as part of our ongoing research in Robotic Musicianship. The robot listens to a human musician and continuously adapts its improvisation and choreography while playing simultaneously with the human. We discuss the robot’s mechanism and motion control, which uses physics simulation and animation principles to achieve both expressivity and safety. We then present an interactive improvisation system based on the notion of physical gestures for both musical and visual expression. The system also uses anticipatory action to enable real-time improvised synchronization with the human player.
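
The abstract only names the anticipatory-action technique; as a rough illustration, the sketch below shows one common way anticipatory timing can be realized: extrapolating the human’s next note onset from recent inter-onset intervals and starting the robot’s stroke early enough to absorb actuation latency. The class name, latency value, and API are hypothetical and are not taken from the paper.

```python
import time
from collections import deque


class AnticipatorySync:
    """Predict the human's next note onset from recent inter-onset
    intervals and trigger the robot early to absorb actuator latency.
    (Illustrative sketch only; not the system described in the paper.)"""

    def __init__(self, actuation_latency=0.15, history=8):
        self.actuation_latency = actuation_latency  # seconds the arm needs to reach a bar (assumed)
        self.onsets = deque(maxlen=history)         # recent human note-onset times

    def observe_onset(self, t):
        """Record a detected human note onset (e.g. from MIDI input)."""
        self.onsets.append(t)

    def predicted_next_onset(self):
        """Extrapolate the next onset from the mean inter-onset interval."""
        if len(self.onsets) < 2:
            return None
        intervals = [b - a for a, b in zip(self.onsets, list(self.onsets)[1:])]
        mean_ioi = sum(intervals) / len(intervals)
        return self.onsets[-1] + mean_ioi

    def time_to_strike(self, now=None):
        """Seconds to wait before starting the stroke so it lands on the predicted beat."""
        now = now if now is not None else time.monotonic()
        target = self.predicted_next_onset()
        if target is None:
            return None
        return max(0.0, target - now - self.actuation_latency)
```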

We describe a study evaluating the effect of embodiment on one of our improvisation modules: antiphony, a call-and-response musical synchronization task. We conducted a 3×2 within-subject study manipulating the level of embodiment and the accuracy of the robot’s response. Our findings indicate that synchronization is aided by visual contact when uncertainty is high, but that pianists can resort to internal rhythmic coordination in more predictable settings. We also find that visual coordination is more effective for synchronization in slow sequences, and that occluded physical presence may be less effective than audio-only note generation.
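
The abstract reports a 3×2 within-subject design but does not spell out the conditions; a minimal sketch of how such a condition matrix could be enumerated and randomized per participant is shown below. The embodiment level names (visible, occluded, audio-only) are inferred from the findings sentence above, and the per-participant randomization is an assumption, not the paper’s actual procedure.

```python
from itertools import product
from random import Random

# Hypothetical labels inferred from the abstract: three embodiment levels
# crossed with two response-accuracy levels give six within-subject conditions.
EMBODIMENT = ["visible", "occluded", "audio_only"]
ACCURACY = ["accurate", "inaccurate"]


def conditions_for(participant_id, seed=0):
    """Return the six conditions in a per-participant randomized order."""
    conditions = list(product(EMBODIMENT, ACCURACY))
    Random(seed + participant_id).shuffle(conditions)
    return conditions


# Example: condition order for participant 3
print(conditions_for(3))
```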

Finally, we test the effects of visual contact and embodiment on audience appreciation. We find that visual contact in joint Jazz improvisation makes for a performance in which audiences rate the robot as playing better, as more human-like, as more responsive, and as more inspired by the human. They also rate the duo as better synchronized, more coherent, better communicating, and better coordinated; and the human player as more inspired and more responsive.

Keywords

Human-robot interaction · Robotic musicianship · Musical robots · Embodied cognition · Gestures · Anticipation · Joint action · Synchronization · User studies



Copyright information

© Springer Science+Business Media, LLC 2011

Authors and Affiliations

Georgia Institute of Technology, Center for Music Technology, Atlanta, USA