Instrumental Modality. On Wanting to Play Something

Chapter in Musical Instruments in the 21st Century

Abstract

Composer/Performer Jeff Carey and Performer/Composer Bjørnar Habbestad have collaborated since 2002, developing a chain of works, performances and improvised events as the duo USA/USB. Parallel to their artistic work runs a development effort in SuperCollider, now branched into what is known as Modality (http://modality.bek.no), a network of developers collaborating on the creation of a toolkit to support live electronic performance environments. In this interview Habbestad and Carey share insights into this development process whilst discussing what it means to play an instrument in a computer music context.


Notes

  1. http://steim.org/2012/01/lisa-x-v1-25/.

  2. http://www.paristransatlantic.com/magazine/interviews/parker.html.

  3. MKeys (2005), or modal keys, was the first-generation software written by Jeff Carey that allowed for a layered keyboard. MKeys later became the foundation for the work on Modality.

  4. A solo piece for flute and electronics by Bjørnar Habbestad, using eponymous software and hardware: https://respira2or.wordpress.com/about/.

  5. Human Interface Device, a class of computer devices and a communication protocol allowing for interaction between humans and computers.

  6. Musical Instrument Digital Interface, a communication protocol describing musical information.

  7. Open Sound Control, a protocol for networking sound synthesizers, computers and other devices.

Author information

Correspondence to Bjørnar Habbestad.


Appendix

MKeys,

a basic software utility that allowed for paged toggling of musical functions: starting/stopping of synthesis processes, sample playback and recording, algorithmic tasks, transformations, etc. Developed by Jeff Carey between 2005 and 2010.
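
The paging idea behind MKeys can be illustrated with a short SuperCollider sketch: pages are stored as a dictionary of key-to-function mappings, and one page is active at a time. The page names and actions below are illustrative assumptions, not the original MKeys code.

    // Illustrative sketch of paged key bindings (not the original MKeys code).
    (
    ~pages = (
        synthesis: (
            a: { "start granular synth".postln },    // placeholder actions
            s: { "stop granular synth".postln }
        ),
        sampling: (
            a: { "start sample recording".postln },
            s: { "play back the last sample".postln }
        )
    );
    ~currentPage = \synthesis;

    // switch pages, e.g. from a dedicated key or foot pedal
    ~setPage = { |name| ~currentPage = name; ("page: " ++ name).postln };

    // dispatch an incoming key to whatever the current page defines for it
    ~handleKey = { |key|
        var action = ~pages[~currentPage][key];
        action !? { action.value };
    };

    ~handleKey.(\a);       // -> start granular synth
    ~setPage.(\sampling);
    ~handleKey.(\a);       // -> start sample recording
    )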

Modality,

a software framework bringing control input for electro-instruments into a unified structure. Various control sources such as HID, OSC and MIDI are accessed through a uniform interface, so that control sources can be easily swapped and traded among different processes. Developed by the Modality team: http://modality.bek.no/.
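
The uniform access that Modality aims for can be sketched with the toolkit's MKtl class; the device description name and element identifiers below are placeholders chosen for illustration, not details given in the chapter.

    // Minimal sketch, assuming the MKtl class from http://modality.bek.no/;
    // the controller name and element IDs are placeholders.
    (
    // open a controller by its description name (HID, MIDI or OSC behind the scenes)
    ~ktl = MKtl(\nk2, "korg-nanokontrol2");

    // attach an action to one element; Modality normalises element values to 0..1
    ~ktl.elAt(\sl, 0).action = { |el|
        ("slider 1 -> " ++ el.value).postln;
    };
    )

Swapping the control source then amounts, in principle, to opening a different device description while keeping the element actions unchanged.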

Machine Gun Etiquette,

composed in 2005 by Jeff Carey and Bjørnar Habbestad, premiered May 13th at NuMusic, Stavanger, and later performed in Norway, Holland and Germany. The piece comprised three large-form sections: (1) a duo for notated acoustic flute with SuperCollider processing, (2) a duo for amplified flute with live transformations and electro-instrument, and (3) a trio for transformed sampled flute and two live electro-instruments.

On the technical side, MGE included networked control over the flute processing, a paged transformation instrument with foot-pedal controllers for the flute, and paged transformation and synthesis with mouse, keyboard, foot pedals and a bank of four joysticks. Compositionally, the goal was to make a piece characterised by a progression of material and interactive development, starting out from the internal sound of the acoustic flute. The exploration of the technicalities and 'administrative' responsibilities of the piece, coupled with the desire for interaction between the two players, led to developing a single instrument from two networked computers. Computer A was used for sampling and transforming flute sounds. Computer B, via network control, triggered various transformations in the flute signal chain and administered sample recording on computer A. Samples from computer A were also copied over the network for further processing on computer B.
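
The two-computer setup described above can be sketched with SuperCollider's built-in OSC messaging; the address, port and message path below are hypothetical placeholders rather than the actual MGE implementation.

    // Sketch of networked control between two sclang instances; the
    // hostname and the '/mge/record' path are hypothetical.

    // -- on computer A (sampling and transforming the flute) --
    OSCdef(\mgeRecord, { |msg|
        var on = msg[1];
        ("flute sample recording " ++ if(on == 1, "started", "stopped")).postln;
        // here: start or stop a recording Synth / Buffer capture
    }, '/mge/record');

    // -- on computer B (triggering transformations remotely) --
    ~computerA = NetAddr("192.168.1.10", 57120);   // placeholder address, default sclang port
    ~computerA.sendMsg('/mge/record', 1);          // start recording on A
    ~computerA.sendMsg('/mge/record', 0);          // stop it again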

Respirator,

composed in 2009 by Bjørnar Habbestad, premiered at Borealis 2010. A multi-layered interaction framework where each layer was associated with 'musical behaviours' or sections of a piece. A layer would have its own signal-routing framework, in which physical controls such as foot pedals or computer keyboards would be redefined. Each layer, running and sounding simultaneously with the others, allowed up to four series of transformations on any number of parallel transformations. Entering or exiting a layer would execute a collection of necessary functions: turning transformations on or off, recording samples, muting inputs, changing mix settings, starting or stopping algorithmic functions, etc.
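
The layer mechanism can be sketched as a collection of layers, each bundling entry and exit functions together with its own binding for a shared foot pedal; the layer names, actions and controller number below are illustrative assumptions, not the Respirator source.

    // Illustrative layer structure: entering a layer runs the previous layer's
    // exit functions, then its own entry functions, and redefines the pedal.
    (
    ~layers = (
        breath: (
            enter: { "unmute flute input, start recording".postln },
            exit:  { "stop recording, fade transformations".postln },
            pedal: { |val| ("breath layer pedal: " ++ val).postln }
        ),
        texture: (
            enter: { "start layered playback of stored samples".postln },
            exit:  { "mute playback bus".postln },
            pedal: { |val| ("texture layer pedal: " ++ val).postln }
        )
    );

    ~enterLayer = { |name|
        ~current !? { ~layers[~current][\exit].value };   // exit the old layer
        ~current = name;
        ~layers[name][\enter].value;                      // run the entry functions
    };

    // one physical pedal, redefined by whichever layer is active (CC 64 assumed)
    MIDIIn.connectAll;
    MIDIdef.cc(\pedal, { |val|
        ~current !? { ~layers[~current][\pedal].value(val) };
    }, 64);

    ~enterLayer.(\breath);
    )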

Chop-Chop,

composed in 2010 by Jeff Carey at Landmark, Bergen; later performed at DNK in 2011 and in Baltimore in 2012. A 16-channel surround-sound piece for electro-instrument based on the development and transformation of a small collection of source sound-synthesis materials (phase-modulation and feedback-based voices), with direct physical control over voices, transformations, signal routing and panning.

The piece, divided into nine subsections based on differing musical behaviour, was meant to focus on the physical development of sonic material, spatialization and compositional logic over time. As such it set out to explore the idea of a changeable interface. The mappings changed per section, allowing the performer to move back and forth and to turn various transformers on and off. In some sections the mappings changed from toggles to drum pads, where a pair of pads would inform a particular algorithmic process by 'pumping' energy into a leaky integrator mapped to time-stretch buffer pointers, filter cutoffs and amplitudes. This piece was the proof of concept for the larger Modality project. Its experimental implementation had some limitations, but the notion that collections of musical behaviours could be matched with a variety of control scenarios in a manageable instrumental setup proved a qualified success, leading towards the full development of Modality.
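
The 'pumping' behaviour can be sketched with SuperCollider's leaky Integrator UGen driving a buffer read pointer; the sound file, decay coefficient and scaling below are illustrative assumptions, not values from the piece.

    // Sketch: pad hits add energy to a leaky integrator, and the decaying
    // energy drives the rate of a time-stretch-style buffer pointer.
    (
    // assumes the server is booted; the sound file is the standard SC example
    b = Buffer.read(s, Platform.resourceDir +/+ "sounds/a11wlk01.wav");

    SynthDef(\pumpStretch, { |out = 0, t_pad = 0|
        var energy, pointer, sig;
        // each pad trigger adds a unit of energy; 0.999 makes it leak away slowly
        energy  = Integrator.kr(t_pad, 0.999).clip(0, 10);
        // more energy -> faster movement through the buffer
        pointer = Phasor.ar(0, BufRateScale.kr(b) * energy * 0.2, 0, BufFrames.kr(b));
        sig     = BufRd.ar(1, b, pointer, loop: 1);
        Out.ar(out, sig ! 2);
    }).add;
    )

    x = Synth(\pumpStretch);
    x.set(\t_pad, 1);   // one 'pump' per pad hit; repeated hits raise the energy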


Copyright information

© 2017 Springer Nature Singapore Pte Ltd.

About this chapter

Cite this chapter

Habbestad, B., Carey, J. (2017). Instrumental Modality. On Wanting to Play Something. In: Bovermann, T., de Campo, A., Egermann, H., Hardjowirogo, SI., Weinzierl, S. (eds) Musical Instruments in the 21st Century. Springer, Singapore. https://doi.org/10.1007/978-981-10-2951-6_17

  • DOI: https://doi.org/10.1007/978-981-10-2951-6_17

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-10-2950-9

  • Online ISBN: 978-981-10-2951-6

  • eBook Packages: Engineering (R0)
