
2001: Problems and Prospects for Intimate Musical Control of Computers

A NIME Reader

Part of the book series: Current Research in Systematic Musicology (CRSM, volume 3)

Abstract

In this paper we describe our efforts towards the development of live performance computer-based musical instrumentation. Our design criteria include initial ease of use coupled with a long term potential for virtuosity, minimal and low variance latency, and clear and simple strategies for programming the relationship between gesture and musical result. We present custom controllers and unique adaptations of standard gestural interfaces, a programmable connectivity processor, a communications protocol called Open Sound Control (OSC), and a variety of metaphors for musical control. We further describe applications of our technology to a variety of real musical performances and directions for future research.


Notes

  1. See http://www.cnmat.berkeley.edu/OSC for more detail and downloadable OSC software.

  2. http://www.buchla.com.

  3. The Tactex company is now out of business; see (Jones 2001) and https://web.archive.org/web/20100317144002/http://www.tactex.com for more information about their then-revolutionary MTC Express multi-touch controller product.

  4. The demonstration video http://www.youtube.com/watch?v=vOvQCPLkPt4 is compelling.

  5. http://cnmat.berkeley.edu/user/david_wessel/blog/2009/01/15/slabs_arrays_pressure_sensitive_touch_pads.

References

  • Avizienis, R., Freed, A., Suzuki, T., & Wessel, D. (2000). Scalable connectivity processor for computer music performance systems. In Proceedings of the International Computer Music Conference (pp. 523–526).

  • Campion, E. (1999). Playback. Musical composition. Paris: CNMAT and IRCAM. Retrieved from http://www.edmundcampion.com/project_playback/.

  • Cook, P. R. (2001). Principles for designing computer music controllers. In Proceedings of the International Conference on New Interfaces for Musical Expression, Seattle, WA.

  • Dannenberg, R. (1989). Real-time scheduling and computer accompaniment. In M. V. Mathews & J. R. Pierce (Eds.), Current Directions in Computer Music Research (pp. 225–261). Cambridge, MA: MIT Press.

  • Freed, A., & Avizienis, R. (2000). A new music keyboard featuring continuous key-position sensing and high-speed communication options. In Proceedings of the International Computer Music Conference (pp. 515–516), San Francisco, CA.

  • Freed, A., & Isvan, O. (2000). Musical applications of new, multi-axis guitar string sensors. In Proceedings of the International Computer Music Conference (pp. 543–546), San Francisco, CA.

  • Freed, A., Avizienis, R., Suzuki, T., & Wessel, D. (2000). Scalable connectivity processor for computer music performance systems. Presented at the International Computer Music Conference, Berlin, Germany.

  • Freed, A., Chaudhary, A., & Davila, B. (1997). Operating systems latency measurement and analysis for sound synthesis and processing applications. In Proceedings of the 1997 International Computer Music Conference (pp. 479–481), San Francisco, CA.

  • Gurevich, M., Stapleton, P., & Marquez-Borbon, A. (2010). Style and constraint in electronic music instruments. In Proceedings of the International Conference on New Interfaces for Musical Expression (pp. 106–111), Sydney, Australia.

  • Henning, G. B., & Gaskell, H. (1981). Monaural phase sensitivity measured with Ronken’s paradigm. Journal of the Acoustical Society of America, 70(6), 1669–1673.

  • Jordà, S., & Mealla, S. (2014). A methodological framework for teaching, evaluating and informing NIME design with a focus on expressiveness and mapping. In Proceedings of the International Conference on New Interfaces for Musical Expression, London, UK.

  • Jota, R., Ng, A., Dietz, P., & Wigdor, D. (2013). How fast is fast enough? A study of the effects of latency in direct-touch pointing tasks. In Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI) (pp. 2291–2300), Paris, France.

  • Lakoff, G., & Johnson, M. (1999). Philosophy in the flesh: The embodied mind and its challenge to western thought. New York: Basic Books.

  • Lakoff, G., & Núñez, R. E. (2000). Where mathematics comes from: How the embodied mind brings mathematics into being. New York: Basic Books.

  • Lee, M. A., & Wessel, D. (1992). Connectionist models for real-time control of synthesis and compositional algorithms. In Proceedings of the 1992 International Computer Music Conference (pp. 277–280), San Francisco, CA.

  • McMillen, K. (1994). ZIPI: Origins and motivations. Computer Music Journal, 18(4), 47–51.

  • Moore, R. F. (1988). The dysfunctions of MIDI. Computer Music Journal, 12(1), 19–28.

  • Pressing, J. (1990). Cybernetic issues in interactive performance systems. Computer Music Journal, 14(1), 12–25.

  • Ronken, D. (1970). Monaural detection of a phase difference between clicks. Journal of the Acoustical Society of America, 47(4), 1091–1099.

  • Rovan, J. B., Wanderley, M. M., Dubnov, S., & Depalle, P. (1997). Instrumental gestural mapping strategies as expressivity determinants in computer music performance. In A. Camurri (Ed.), Kansei, The Technology of Emotion: Proceedings of the AIMI International Workshop (pp. 68–73). Genoa: Associazione di Informatica Musicale Italiana.

  • Ryan, J. (1992). Effort and expression. In Proceedings of the 1992 International Computer Music Conference (pp. 414–416), San Francisco, CA.

  • Wessel, D., Wright, M., & Khan, S. A. (1998). Preparation for improvised performance in collaboration with a khyal singer. In Proceedings of the 1998 International Computer Music Conference (pp. 497–503), Ann Arbor, MI.

  • Wessel, D., & Wright, M. (2002). Problems and prospects for intimate musical control of computers. Computer Music Journal, 26(3), 11–22.

  • Wright, M., & Wessel, D. (1998). An improvisation environment for generating rhythmic structures based on north Indian ‘tal’ patterns. In Proceedings of the 1998 International Computer Music Conference (pp. 125–128), Ann Arbor, MI.

  • Wright, M. (1994). A comparison of MIDI and ZIPI. Computer Music Journal, 18(4), 86–91.

  • Zbyszynski, M., Wright, M., Momeni, A., & Cullen, D. (2007). Ten years of tablet musical interfaces at CNMAT. In Proceedings of the International Conference on New Interfaces for Musical Expression, New York, NY.


Acknowledgements

Special thanks to Rimas Avizienis, Adrian Freed, and Takahiko Suzuki for their work on the connectivity processor and to Gibson Guitar, DIMI, and the France Berkeley Fund for their generous support.

Corresponding author

Correspondence to Matthew Wright.

Appendices

Unsolved Problems and Continuing Prospects for Intimate Musical Control of Computers

Matthew Wright

When we wrote this article in 2001 (and expanded it into a fuller article the following year (Wessel and Wright 2002)) there seemed to be a small community of people making music interactively with computer-based instruments of their own design, and as far as I could tell all of them were friends with David Wessel. We jumped on the opportunity to share our thoughts on goals, difficulties, and techniques for achieving a quality of intimacy that we strongly desired in our own instruments and found lacking in most of the triggering-based instruments then readily available. We were excited that the esteemed ACM SIGCHI was to some degree turning its attention to questions of musical interaction, and welcomed the opportunity to shape this discussion by injecting our own values and methods. We had no idea this small fringe gathering would develop into the NIME conference and community as it now exists.

Today I question the relationship between NIME and mainstream CHI. Although computer music has always benefitted from and indeed depended on advances in EE and computer science generally not made with music in mind, it does not seem that the concerns of NIME have had much influence on CHI in particular. In the other direction, much of the apparent influence of CHI on NIME has been shoehorning the inherently creative and exploratory artistic process of designing musical instruments into the objectively quantifiable mindset of evaluating task performance. Today the minimum viable NIME paper seems to be a slightly novel combination of technologies that is demonstrably good at something, with only the best making theoretical contributions to guide the exploration of the infinite-dimensional space of possible instruments with ethical consideration of the potential of the human performer. Perhaps if we could better capture and theorize the experiences of skilled practitioners as contributions to knowledge, then themes of performance, expertise, experience, artistry, and control could become more prominent to the betterment of both communities.

Low entry fee with no ceiling on virtuosity, aka Low Floor High Ceiling (a phrase popularized by Seymour Papert that reached us via Brian Harvey), is perhaps the most-cited idea in this paper, though of course it is much easier to demonstrate the floor than the ceiling. We disparaged low-ceiling NIMEs as “toy-like”; I now regret this phrase and prefer to frame the question as how to design high-ceiling toys. Toys and their enclosing adaptive systems can harm or help the continued musical evolution of kids of all ages, inviting and rewarding human motor learning in many ways. Gurevich et al. rightly question the importance of an instrument’s inherent properties in light of people’s seemingly inexhaustible creativity in adapting their music making to even the simplest of NIMEs (Gurevich et al. 2010). Nevertheless, I would still argue that the remarkably high ceilings (e.g., variety of musical results) of their beeping button box and other excellent toys can be predicted and partially attributed to low latency/jitter and to musicians’ drive to apply continuous control even to an instrument that presents a discrete-only interface.

Latency raises the floor and jitter lowers the ceiling; indeed, these issues have been a major theme of NIME, and questions about them are today almost obligatory for proposed NIMEs and their component technologies. A few papers focus exclusively on these issues. We arrived at our 10 ± 1 ms criterion through subjective personal experience using instruments with various measured latencies, and at our flam timbre justification through a thought experiment; to my knowledge the literature still lacks better empirical justification for these engineering constraints. In the context of graphical objects following a user’s finger on a touch screen, 1 ms total latency is clearly (see footnote 4) much better than 10 ms (Jota et al. 2013). Perceptual and cognitive studies of various musical activities and results under various latency/jitter conditions could also yield better models of how skilled performers adapt, and possibly shed light on the development of musicality in general.
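
For concreteness, the 10 ± 1 ms criterion constrains two distinct numbers: the mean of the gesture-to-sound delay and its spread (jitter). A minimal sketch, using hypothetical timestamp data rather than any real measurement from the paper, of how the two might be separated when characterizing a controller:

```python
import statistics

# Hypothetical paired timestamps (seconds): gesture onset vs. resulting
# sound onset. Real numbers would come from a loopback measurement rig.
gesture_times = [0.000, 0.250, 0.500, 0.750, 1.000]
sound_times   = [0.011, 0.260, 0.509, 0.762, 1.010]

latencies_ms = [(s - g) * 1000.0 for g, s in zip(gesture_times, sound_times)]
mean_ms = statistics.mean(latencies_ms)     # average latency
jitter_ms = statistics.stdev(latencies_ms)  # sample std. dev. = jitter

print(f"mean latency: {mean_ms:.1f} ms, jitter: {jitter_ms:.1f} ms")
```

Reporting both numbers, rather than a single "latency" figure, is the point: an instrument with 10 ms mean delay but high jitter fails the criterion even though its average looks acceptable.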

Wessel’s SLABS (see footnote 5) represent the fullest embodiment of this paper’s goals and ideas, with extremely low latency, OSC (whose huge impact on the NIME community is addressed in the commentaries to the chapter “2003: Open Sound Control: State of the Art 2003”), a representation of gestural data as audio signals, a direct descendant of the programmable connectivity processor isochronously scanning a poly-point touch-sensitive gestural interface, and the “dipping” metaphor in its most satisfying form. Many NIMEs centering on a digitizing tablet (Zbyszynski et al. 2007) have joined my continuing work, which includes recent experiments spatially arranging sound grains in 3D according to analyzed acoustical properties and then flying about this space, still a rich and powerful metaphor. Scrubbing remains vital to my tablet interfaces, but the analysis portion of sinusoidal modeling is still difficult and likely to produce unsatisfactory results without extensive fussing, hampering widespread adoption of this technique. Granular scrubbing seems more promising, with its high-dimensional control space of window size, rate, etc. being both a challenge and an opportunity.
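
The granular scrubbing idea can be sketched compactly: a stream of scrub positions (e.g., from a tablet's x axis) selects where windowed grains are read from a sample buffer, and the grains are overlap-added into the output. The function and parameter names below are illustrative, not from CNMAT's software; grain length and hop are two of the control dimensions mentioned above.

```python
import numpy as np

def hann(n):
    # Hann window to avoid clicks at grain boundaries.
    return 0.5 - 0.5 * np.cos(2 * np.pi * np.arange(n) / n)

def scrub_grains(buf, positions, grain_len=1024, hop=256):
    """Overlap-add grains read from `buf` at the given positions.

    `positions` holds one read position (sample index) per grain:
    a frozen position sustains one spot in the sample, a moving
    one sweeps through it.
    """
    out = np.zeros(hop * len(positions) + grain_len)
    win = hann(grain_len)
    for i, pos in enumerate(positions):
        start = int(np.clip(pos, 0, len(buf) - grain_len))
        out[i * hop : i * hop + grain_len] += buf[start : start + grain_len] * win
    return out

# Scrub slowly through one second of a 440 Hz test tone.
sr = 44100
tone = np.sin(2 * np.pi * 440 * np.arange(sr) / sr)
trajectory = np.linspace(0, sr - 2048, num=64)  # hypothetical gesture path
audio = scrub_grains(tone, trajectory)
```

Unlike sinusoidal-model scrubbing, nothing here requires a fragile analysis stage; the price is that window size, hop, and trajectory speed all interact audibly, which is the high-dimensional control space the text refers to.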

Understandably, the number of cases of a performer developing a deep, longstanding, and fruitful connection to a NIME is far smaller than the number of NIMEs that have been proposed. One could argue that modular synthesis’ current popularity, with many performers developing intimate relationships with their personal instruments, stems from the fact that modular systems “treat gestures with a sample-synchronous signal processing approach”; unfortunately, it is still very common for NIMEs to sample or delay control signals much more slowly and erratically, to our detriment. I would like to see and perform additional research into these control signals (e.g., their spectral and phase properties) in sophisticated musical contexts.
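
One minimal step toward a sample-synchronous treatment of slowly sampled gestures is to resample an irregular stream of control events onto a uniform audio-rate grid, so later DSP can treat the gesture like any other signal. A sketch with made-up event data (the timings and values are purely illustrative):

```python
import numpy as np

# Hypothetical control stream: (time_s, value) pairs arriving erratically,
# e.g. from a controller polled over a slow, jittery bus.
events = [(0.000, 0.0), (0.013, 0.4), (0.021, 0.5), (0.040, 1.0)]

sr = 48000
t = np.arange(int(0.040 * sr)) / sr  # uniform audio-rate time grid
times, values = zip(*events)

# Linear interpolation onto the grid yields a signal whose spectral and
# phase properties can then be studied or filtered like audio.
control = np.interp(t, times, values)
```

Linear interpolation is the crudest choice; the point is that once the gesture lives on the audio-rate grid, the spectral and phase questions raised above become well-posed signal-processing questions.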

The Sempiternal Quest for Intimacy

Sergi Jordà

2001 was the year the first NIME took place: a small workshop inside the huge CHI conference in Seattle, and I was lucky enough to be one of the first dozen NIME participants ever. There I had the honor of meeting some of the pioneers and gurus of the burgeoning “live computer music world” (for lack of a better term, “NIME” not yet existing before the workshop): Max Mathews, Bill Verplank, Perry Cook, and Joe Paradiso were all there, giving presentations that would enlighten and guide the younger ones such as myself. If I remember correctly, David Wessel did not attend the workshop (I would have to wait one or two more years to meet him in person and become good friends), and Matt Wright gave their presentation.

Wessel and Wright’s paper introduces one of the concepts that would come to recur most frequently in our NIME community: that of “low entry fee with no ceiling on virtuosity.” The paper also opens with a sentence that, while less popular, still constitutes one of my favourite NIME quotes, one I subsequently often borrowed and even used to open my PhD thesis: “When asked what musical instruments they play, there are not many computer music practitioners who would respond spontaneously with ‘I play the computer.’” But the paper encompasses much more than some popular NIME quotes; it does, in fact, contain almost too much. It ranges from conceptual topics, such as control intimacy and its importance for attaining a hypothetical instrumental virtuosity, to fully technical solutions such as the OSC protocol, which had been specifically designed to address the low-latency and high-resolution requirements the authors had previously identified as essential for attaining the aforementioned intimate control. On top of this, it introduces some musical interfaces and controllers then being designed, revamped, or customised at CNMAT, and it concludes with a discussion of some of the control metaphors that inspired the development of these interfaces. As if that weren’t enough, some essential knowledge (then novel and original) is dropped here and there, such as the empirically observed toy-like behaviour of “one-to-one mappings” (although the authors do not employ that term) and their unsuitability for the development of virtuosity.
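
OSC's wire format is itself a good illustration of the low-latency, high-resolution design goals mentioned above: a message is just an address-pattern string, a type-tag string, and big-endian binary arguments, each string null-terminated and padded to a four-byte boundary. A minimal encoder sketch; the address `/slab/3/pressure` is a made-up example, not a documented CNMAT address:

```python
import struct

def osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated, then padded to a 4-byte boundary
    # (a string whose length is already a multiple of 4 gets 4 nulls).
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Encode a minimal OSC message with float32 arguments."""
    type_tags = "," + "f" * len(floats)       # e.g. ",f" for one float
    msg = osc_pad(address.encode()) + osc_pad(type_tags.encode())
    for f in floats:
        msg += struct.pack(">f", f)           # big-endian 32-bit float
    return msg

packet = osc_message("/slab/3/pressure", 0.72)
```

A 32-bit float argument and a symbolic, hierarchical address space are exactly what MIDI's 7-bit controller values and fixed channel/controller numbers lacked, which is the resolution argument the paper makes.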

Considering its generous loquacity, it is very likely that by today’s standards a paper like this would have been rejected for publication: “lack of clear objective and structure, and lack of evidence and proof for most of the authors’ statements.” While its overwhelming flood of ideas was tamed for the extended and matured journal version published a year later (Wessel and Wright 2002), which probably constitutes a more cohesive read, I would like to stress that urgency to communicate and effervescence were indeed the trademarks of most of the papers presented at this first NIME workshop in 2001. Much knowledge and know-how had been accumulated and was pressurized, eager to be shared with a sympathetic audience! But this specific paper, together with Perry Cook’s “Principles for Designing Computer Music Controllers” (Cook 2001), also presented at the 2001 workshop, clearly stand as two of the early pillars or compendiums of NIME knowledge. Together with two even earlier seminal publications by Pressing (Pressing 1990) and Ryan (Ryan 1992), they still constitute the first four introductory must-read papers in any NIME-related course I have ever organized!

In 2001, NIME knowledge was necessarily based on personal experience and empiricism. This type of knowledge is not easily replicable, not always demonstrable, not very scientific in sum, but it is no less valid for these reasons. Fifteen years later, the NIME community has matured and a growing number of university courses are being taught (Jordà and Mealla 2014). Yet creating DMIs is still, in many respects, very similar to creating music. It involves a great deal and variety of know-how and technicalities, while at the same time, as in music, there are no inviolable laws. Indeed, in his aforementioned contemporaneous paper (Cook 2001), Cook sceptically and romantically affirmed: “musical interface construction proceeds as more art than science, and possibly this is the only way it can be done?” Because I tend to agree with him, I believe we should never forget the previous successes and failures of experienced practitioners. In that sense, it comes as no surprise that the first tentative and informal (but still very valid) NIME design and conceptual frameworks were proposed by experienced digital luthiers, performers and educators, and David Wessel, regrettably no longer with us, definitely excelled in all these categories.


Copyright information

© 2017 Springer International Publishing AG

About this chapter

Cite this chapter

Wessel, D., Wright, M. (2017). 2001: Problems and Prospects for Intimate Musical Control of Computers. In: Jensenius, A., Lyons, M. (eds) A NIME Reader. Current Research in Systematic Musicology, vol 3. Springer, Cham. https://doi.org/10.1007/978-3-319-47214-0_2

  • DOI: https://doi.org/10.1007/978-3-319-47214-0_2

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-47213-3

  • Online ISBN: 978-3-319-47214-0

