Anthropomorphic Musical Robots Designed to Produce Physically Embodied Expressive Performances of Music

Chapter

Abstract

Recent technological advances in robotics, music information retrieval, artificial intelligence, and related fields may enable anthropomorphic robots to roughly emulate the physical dynamics and motor dexterity of humans playing musical instruments. In particular, research on musical robots provides an opportunity to study several questions beyond robotics itself, including understanding human motor control from an engineering point of view, understanding how humans generate expressive music performances, and finding new methods for interactive musical expression. Research into computer systems for expressive music performance has intensified in recent decades; such systems are usually designed to convert a musical score into an expressive performance, typically by introducing timing, dynamics, and timbre deviations from a deadpan realization of the score, and to reproduce the result on a MIDI-enabled instrument. However, the lack of a physical response (embodiment) limits the unique experience of a live performance as found in human performances. New research paradigms can be conceived from research on musical robots, which focuses on producing a live performance by mechanical means. However, several technical issues remain to be solved: enabling musical robots to analyze and synthesize musical sounds as musicians do, to understand and reason about music, and to adapt their behaviors accordingly. In this chapter, an overview of current research trends in wind-instrument-playing musical robots is given through a number of examples. In particular, the development of an anthropomorphic flutist robot is presented, describing its mechanical design, the implementation of intelligent control strategies, and the analysis of a number of musical parameters that enable the robot to play an instrument expressively.
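To illustrate the score-to-performance mapping described above, the following minimal Python sketch applies a single hypothetical "phrase-arch" rule to a deadpan note list, shifting onsets and velocities toward the middle of a phrase before the notes would be sent to a MIDI-enabled instrument. The Note class, the rule, and the deviation magnitudes are illustrative assumptions, not the specific rule systems discussed in this chapter.

    # Minimal sketch: applying expressive deviations to a deadpan score.
    # The rule and the deviation values are illustrative assumptions,
    # not the chapter's actual performance-rule system.

    from dataclasses import dataclass

    @dataclass
    class Note:
        pitch: int        # MIDI note number
        onset: float      # nominal onset time in seconds (deadpan)
        duration: float   # nominal duration in seconds
        velocity: int     # nominal MIDI velocity (loudness)

    def apply_phrase_arch(notes, max_delay=0.03, max_accent=12):
        """Hypothetical phrase-arch rule: notes near the middle of the
        phrase are played slightly later and louder than the deadpan score."""
        n = len(notes)
        expressive = []
        for i, note in enumerate(notes):
            # arch is 0 at the phrase boundaries and 1 at the phrase centre
            arch = 1.0 - abs((i / max(n - 1, 1)) * 2.0 - 1.0)
            expressive.append(Note(
                pitch=note.pitch,
                onset=note.onset + arch * max_delay,
                duration=note.duration,
                velocity=min(127, note.velocity + int(arch * max_accent)),
            ))
        return expressive

    if __name__ == "__main__":
        # A deadpan eight-note phrase (C major scale fragment), one note every 0.5 s
        deadpan = [Note(60 + p, onset=0.5 * i, duration=0.45, velocity=64)
                   for i, p in enumerate([0, 2, 4, 5, 7, 5, 4, 2])]
        for note in apply_phrase_arch(deadpan):
            print(note)

In an actual system of this kind, many such rules (for timing, articulation, dynamics, and timbre) would be combined and the resulting events rendered on a MIDI-enabled instrument; a musical robot instead has to realize the same deviations through its physical actuators.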

Acknowledgments

A part of this research was carried out at the Humanoid Robotics Institute (HRI), Waseda University. This research was supported in part by a Grant-in-Aid for the WABOT-HOUSE Project by Gifu Prefecture. This work was also supported in part by the Global COE Program “Global Robot Academia” from the Ministry of Education, Culture, Sports, Science and Technology of Japan.


Copyright information

© Springer-Verlag London 2013

Authors and Affiliations

  1. Department of Physics & Electrical Engineering, Karlstad University, Karlstad, Sweden
  2. Research Institute for Advanced Science and Engineering, Waseda University, Tokyo, Japan
  3. Department of Modern Mechanical Engineering & Humanoid Robotics Institute, Waseda University, Tokyo, Japan
