Abstract
This paper introduces a luminance-intensity interface driven by grasping force, and its application as a musical instrument. In traditional musical instruments, the relationship between an action and the generated sound is determined by the physical structure of the instrument, and the freedom of musical performance is limited by that structure. We developed a handheld, ball-shaped interface. A photodiode embedded in the translucent rubber ball reacts to the performer's grasping force, which is detected as a change in luminance intensity. The performer can use the ball interface not only by grasping it directly but also by holding it up to the environmental light or shading it with the hands. The output of the interface is fed to a sound generator, and the relationship between the performer's action and the generated sound is determined by an instrumental program, making the interface a universal musical instrument.
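The mapping described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it assumes a raw luminance reading on a 10-bit ADC scale (0–1023), treats a darker reading (a tighter grasp or heavier shading) as a stronger press, and maps it to a pitch. The function names and the particular luminance-to-frequency mapping are hypothetical; the point of the paper's design is precisely that this "instrumental program" is freely replaceable.

```python
def luminance_to_frequency(luminance, lum_min=0, lum_max=1023,
                           f_min=220.0, f_max=880.0):
    """Map a raw photodiode reading to a pitch in Hz.

    A darker reading (stronger grasp or more shading) is treated as a
    harder press and mapped to a higher frequency. The mapping itself
    is arbitrary and stands in for any 'instrumental program'.
    """
    # Normalize the reading to [0, 1], clamping out-of-range values.
    span = lum_max - lum_min
    norm = min(max((luminance - lum_min) / span, 0.0), 1.0)
    grasp = 1.0 - norm  # dark ball => strong grasp
    return f_min + grasp * (f_max - f_min)

# Open hand (bright ball) gives the lowest pitch; a full grasp the highest.
print(luminance_to_frequency(1023))  # 220.0
print(luminance_to_frequency(0))     # 880.0
```

Because the sensor output is a single scalar, the same reading could just as easily drive amplitude, timbre, or tempo; swapping the mapping function changes the "instrument" without changing the hardware.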
© 2009 Springer-Verlag Berlin Heidelberg
Yamaguchi, T., Hashimoto, S. (2009). Grasping Interface with Photo Sensor for a Musical Instrument. In: Jacko, J.A. (eds) Human-Computer Interaction. Novel Interaction Methods and Techniques. HCI 2009. Lecture Notes in Computer Science, vol 5611. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-02577-8_59
Print ISBN: 978-3-642-02576-1
Online ISBN: 978-3-642-02577-8