
2012: TouchKeys: Capacitive Multi-touch Sensing on a Physical Keyboard


Part of the book series: Current Research in Systematic Musicology ((CRSM,volume 3))

Abstract

Capacitive touch sensing is increasingly used in musical controllers, particularly those based on multi-touch screen interfaces. However, in contrast to the venerable piano-style keyboard, touch screen controllers lack the tactile feedback many performers find crucial. This paper presents an augmentation system for acoustic and electronic keyboards in which multi-touch capacitive sensors are added to the surface of each key. Each key records the position of fingers on the surface, and by combining this data with MIDI note onsets and aftertouch from the host keyboard, the system functions as a multidimensional polyphonic controller for a wide variety of synthesis software. The paper will discuss general capacitive touch sensor design, keyboard-specific implementation strategies, and the development of a flexible mapping engine using OSC and MIDI.
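The abstract's core idea — combining per-key touch positions with MIDI note onsets to drive a polyphonic mapping engine — can be illustrated with a minimal, hypothetical sketch. This is not the TouchKeys implementation; the class names, the OSC address scheme (`/touchkeys/...`), and the bend mapping are all invented for illustration.

```python
# Hypothetical sketch of the mapping idea in the abstract (NOT the actual
# TouchKeys code): per-key touch data is combined with MIDI note onsets from
# the host keyboard to produce OSC-style (address, arguments) messages.

from dataclasses import dataclass, field

@dataclass
class KeyState:
    note: int                 # MIDI note number for this key
    sounding: bool = False    # set by MIDI note-on/off from the host keyboard
    touches: list = field(default_factory=list)  # (x, y) positions in 0.0-1.0

class MappingEngine:
    """Turns raw key/touch events into OSC-style (address, args) tuples."""

    def __init__(self):
        self.keys = {n: KeyState(n) for n in range(21, 109)}  # 88-key range

    def midi_note_on(self, note, velocity):
        self.keys[note].sounding = True
        return ("/touchkeys/note/on", [note, velocity])

    def touch_update(self, note, touches):
        key = self.keys[note]
        key.touches = touches
        if not key.sounding:
            return None  # this example mapping ignores touches on silent keys
        # Map the first touch's lengthwise position to a bend-like value
        # centred on 0.0 (one illustrative choice among many).
        x, y = touches[0]
        bend = round((y - 0.5) * 2.0, 3)
        return ("/touchkeys/note/bend", [note, bend])

engine = MappingEngine()
print(engine.midi_note_on(60, 100))        # ('/touchkeys/note/on', [60, 100])
print(engine.touch_update(60, [(0.4, 0.75)]))
```

In a real system each message would be sent over OSC or translated to MIDI per the host's mapping configuration; here the tuples simply stand in for that output.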


Notes

  1. http://createdigitalmusic.com/2012/11/endeavours-evo-touch-sensitive-keyboard-reimagined-now-from-eur499-gallery-videos/.

  2. http://www.cypress.com/products/psoc-1.

  3. http://www.cypress.com/documentation/application-notes/an65973-cy8c20xx6ahas-capsense-design-guide.

  4. https://www.fxpansion.com/products/dcamsynthsquad/.

References

  • Cook, P. R. (2001). Principles for designing computer music controllers. In Proceedings of the International Conference on New Interfaces for Musical Expression, Seattle, WA.

  • Dahlstedt, P. (2015). Mapping strategies and sound engine design for an augmented hybrid piano. In Proceedings of the International Conference on New Interfaces for Musical Expression (pp. 271–276), London, UK.

  • Eaton, J., & Moog, R. (2005). Multiple-touch-sensitive keyboard. In Proceedings of the International Conference on New Interfaces for Musical Expression, Vancouver, Canada.

  • Haken, L., Tellman, E., & Wolfe, P. (1998). An indiscrete music keyboard. Computer Music Journal, 22(1), 30–48.

  • Lamb, R., & Robertson, A. (2011). Seaboard: A new piano keyboard-related interface combining discrete and continuous control. In Proceedings of the International Computer Music Conference.

  • McPherson, A. (2012). TouchKeys: Capacitive multi-touch sensing on a physical keyboard. In Proceedings of the International Conference on New Interfaces for Musical Expression, Ann Arbor, MI.

  • McPherson, A., & Kim, Y. (2010). Augmenting the acoustic piano with electromagnetic string actuation and continuous key position sensing. In Proceedings of the International Conference on New Interfaces for Musical Expression, Sydney, Australia.

  • McPherson, A., & Kim, Y. (2011). Design and applications of a multi-touch musical keyboard. In Proceedings of the Sound & Music Computing Conference.

  • McPherson, A. (2015). Buttons, handles, and keys: Advances in continuous-control keyboard instruments. Computer Music Journal, 39(2), 28–46.

  • Miranda, E. R., & Wanderley, M. M. (2006). New digital musical instruments: Control and interaction beyond the keyboard. Middleton, WI: A-R Editions.

  • Moog, R. A., & Rhea, T. L. (1990). Evolution of the keyboard interface: The Bösendorfer 290 SE recording piano and the Moog multiply-touch-sensitive keyboards. Computer Music Journal, 14(2), 52–60.

  • Nicolls, S. (2010). Seeking out the spaces between: Using improvisation in collaborative composition with interactive technology. Leonardo Music Journal, 20, 47–55.

  • Paradiso, J., & Gershenfeld, N. (1997). Musical applications of electric field sensing. Computer Music Journal, 21(2), 69–89.

  • Roads, C. (1996). The computer music tutorial. Cambridge, MA: MIT Press.

  • Star-Ledger, N. (2006). Waiting to be heard, wanted: Someone to bring the successor to the moog synthesizer to life.

  • Tremblay, P. A., & Schwarz, D. (2010). Surfing the waves: Live audio mosaicing of an electric bass performance as a corpus browsing interface. In Proceedings of the International Conference on New Interfaces for Musical Expression, Sydney, Australia.

  • Wessel, D., Avizienis, R., Freed, A., & Wright, M. (2007). A force sensitive multi-touch array supporting multiple 2-D musical control structures. In Proceedings of the 7th International Conference on New Interfaces for Musical Expression (pp. 41–45), New York, NY.



Correspondence to Andrew McPherson.


Appendices

Author Commentary: Taking an Augmented Keyboard from the Lab to the Stage

Andrew McPherson

While the basic layout of the keyboard has changed little in over 500 years, generations of inventors have attempted to circumvent a basic limitation of most keyboard instruments: that the keyboard is essentially discrete. The performer can control the onset and release of each note but has comparatively few ways to shape what happens in the middle. From Leonardo da Vinci’s 15th-century sketches of the ‘viola organista’ to 18th-century keyboard variants of Benjamin Franklin’s glass harmonica to Hugh Le Caine’s 1948 Electronic Sackbut, luthiers have used the technology of their day to pursue something of an instrumental holy grail: an instrument combining the flexibility of the keyboard with the continuous note shaping of the violin or voice (McPherson 2015).

The role of the keyboard within the NIME community is complex. On one hand, many augmented keyboards and keyboard-inspired interfaces have appeared at NIME. On the other hand, the keyboard might also be seen as a mental obstacle to be overcome on the way to new forms of musical interaction. Indeed, the subtitle of Miranda and Wanderley’s comprehensive 2006 text on digital musical instruments is “Control and Interaction Beyond the Keyboard” (Miranda and Wanderley 2006).

TouchKeys follows in a long line of new keyboard technology, most notably Robert Moog’s Multiply-Touch-Sensitive keyboard (Moog and Rhea 1990). In my view, its most important lesson for the NIME community is to highlight the power and musical potential of performer familiarity. For novel instruments, the performer’s experience level quickly becomes the dominant limitation, since few musicians are able or willing to devote years of practice to become proficient on an unfamiliar instrument.

In its basic physical layout and in its particular mapping strategies, TouchKeys builds on existing keyboard skills, letting pianists control new effects while minimally interfering with traditional playing technique. It can be seen as a manifestation of Tremblay and Schwarz's principle of 'recycling virtuosity' (Tremblay and Schwarz 2010) or of Perry Cook's recommendation on 'leveraging expert technique' (Cook 2001).

Partly because of the close connection to existing piano technique, TouchKeys has gone on to be used by quite a few musicians. In 2013, after 5 generations of hardware design, I ran a Kickstarter campaign which shipped self-install kits and prebuilt instruments to musicians in 20 countries. Since then, I’ve built and shipped another large production run, and the project is now in the process of spinning out of the university into an independent startup company. It has been enjoyable, and also a learning experience, to build a community of performers and composers around this instrument.

A limitation inherent to the TouchKeys design is that it is a controller, generating MIDI or OSC messages to manipulate external sounds. Compared to my other augmented keyboard project, the magnetic resonator piano (McPherson 2015), the lack of an identifiable ‘TouchKeys’ sound has meant that it less easily generates a signature identity for new compositions. Of course, this is balanced by the flexibility that it gives performers to integrate it into their own workflow.

There remains a question about whether a generic separation between controller and sound generator will ever achieve the level of control intimacy provided by fully integrated systems (whether acoustic or electronic). Of course, even for re-mappable controllers, the performer can decide to settle on just one configuration and learn it over many years. However, there may be a temptation to regularly change settings, in which case the performer’s relationship with the instrument may relate more to its breadth of possibilities than the depth and subtlety of any one configuration. A related question is whether the designer, in seeking to make a controller generically applicable, will forego certain idiosyncrasies of behaviour that could be exploited in a fully integrated instrument.

One general challenge I have found, which is not uncommon in NIME, is navigating the tension between academic and commercial/musical expectations. Most academic fields prioritise novelty over continuing development; the pressure is to develop a project sufficiently to publish a high-quality paper, then leave it aside in favour of the next research project. But creating a digital musical instrument suitable for the highest echelons of performance can take many rounds of development, user testing and refinement, and not all of these technical details are publishable. Instead, the most interesting research outcomes are the things we can learn about the players themselves and the way they engage with technology. In my opinion, the human aspect of NIME is an area where many broad and fascinating questions remain to be explored.

Expert Commentary: Turning the Piano Keyboard Inside Out

Palle Dahlstedt

Many musicians have tried to extend control over piano tone beyond the initial attack. Electronics have been used for sound processing, sensors have been added, and pianists have reached inside the instrument to interact directly with the strings. But in my title, I refer not to Sarah Nicolls' Inside out piano (Nicolls 2010), but to what Andrew McPherson did conceptually to the piano keyboard with the introduction of his TouchKeys sensor system (McPherson 2012).

There is a kind of inverse correlation between the polyphony of sound and the intimacy of control in musical instruments, distributing the cognitive load of the player over the available variables. At one end of the scale is the human voice, with supreme control over a monophonic sound. Somewhere in the middle is the violin, with intimate continuous control and some polyphonic capability, though in polyphonic playing individual control is reduced. Near the other end, just before the church organ, is the piano, where tonal control stops when the hammer leaves the string; this is compensated for by structural complexity and a focus on the relationships (in time and dynamics) between simultaneous and succeeding notes.

With TouchKeys, McPherson has turned the piano keyboard inside out, and effectively transformed it from one end of the scale to the other, at least on the interface side. High-resolution multitouch sensing on each key turns the focus of the pianist from structure and multiplicity in pitch and time, to the potential of intimate gestural micro-narratives, and high-dimensional control of single sounds. The mental charts have to be redrawn, and as a pianist, my attention lands carefully on the key surface to explore this potential. The rest of the keyboard is still there in the background, blurred and distant.

McPherson is an instrument maker, a toolmaker, in the classic sense: a virtuoso engineer and craftsman, economical and rational, who constructs an extension of an existing interface in close collaboration with musicians, and makes it available to them. McPherson’s respect for musicians is evident, e.g., in his detailed discussion about the key surface material—he is aware that minute differences in texture and friction affect playability and musicianship. So he asks the players.

Further research and experimentation are needed, since TouchKeys is an open-ended interface, and as such only one of three main components of an instrument, the others being the mapping and the sound engine. To design mappings that fully (whatever that means) utilize the gestural possibilities in a musically meaningful way is highly nontrivial, and different solutions will be musically meaningful in very different ways. McPherson only touches upon mapping issues in his text, just making sure that core gestural data is available. I think this is a good decision. It makes his invention more open, and allows for diverse approaches and continued research.

In comparison to related interfaces, such as the Haken Continuum (Haken et al. 1998), TouchKeys allows the pianist to dive into the single key (or single note) instead of expanding the whole keyboard into a large continuum. It keeps the keyboard paradigm, but expands it inwards. This means a lot for playability. It makes it accessible for keyboard players, easy to conceptualize with regards to synthesis and processing, and does not get in the way of hard-earned playing skills, but augments them instead.

On a normal multi-touch surface, e.g., a tablet computer, each finger generates a blob with a location (and possibly an area). The identity of the blob has to be inferred from its initial coordinates or a similar mechanism. With TouchKeys you instead have a large set of separate multitouch surfaces (albeit small ones), each with continuous properties. This reminds me more of David Wessel's last performance instrument (Wessel et al. 2007), based around an array of small touch surfaces, allowing for intimate complex x/y/z gestures with distinct identities.
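The contrast between blob inference and per-key identity can be sketched in a few lines of hypothetical code (the key width and both functions are invented for illustration, not taken from either system):

```python
# Illustrative contrast, not code from either system: on a generic multi-touch
# surface a blob's key must be inferred from its coordinates, while a per-key
# sensor delivers the key identity as part of the reading itself.

KEY_WIDTH_MM = 23.5  # an assumed, typical white-key width

def key_from_blob(x_mm):
    """Generic surface: guess which key a blob belongs to from its position."""
    return int(x_mm // KEY_WIDTH_MM)

def per_key_reading(key_index, x, y):
    """Per-key sensor: no inference step; identity comes with the data."""
    return {"key": key_index, "x": x, "y": y}

print(key_from_blob(30.0))            # blob at 30 mm is attributed to key 1
print(per_key_reading(1, 0.3, 0.6))   # identity is explicit in the reading
```

The inference step in the first function is where generic surfaces can mis-assign touches near key boundaries; the per-key design removes that failure mode entirely.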

As a pianist and keyboard improviser, I see many possibilities for musical applications. The most obvious (and boring) is to add continued control of a sounding note (pitch, dynamics, timbre) through relative movement of the finger. The ability to sense touch on unpressed keys could be used for control of formant resonances and filtering, or spectral muting, where pressed keys generate the carrier sound. Touch gestures on keys (pressed or unpressed) allow for bowing, scratching or tapping metaphors to inject energy into physical synthesis models. In my own approaches to augmented piano (Dahlstedt 2015), I have tried to transcend the zeroth-order mapping paradigm using dynamic mappings and virtual string models. In the next iteration, TouchKeys will play an important role. The possibilities are endless, and will keep us busy.
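The 'relative movement' mapping mentioned above can be sketched as follows; this is a hypothetical illustration (class name, semitone range, and scaling are all assumptions), showing why taking the onset position as the origin keeps a resting finger from detuning the note:

```python
# Minimal sketch of a relative-movement pitch mapping (hypothetical code):
# the bend is measured from where the finger was at note onset, so a finger
# that simply rests on the key produces no pitch change.

class RelativeBend:
    def __init__(self, semitone_range=2.0):
        self.range = semitone_range  # bend for a full-length finger slide
        self.origin = None           # finger position captured at note onset

    def note_on(self, y):
        self.origin = y              # remember the onset position

    def update(self, y):
        if self.origin is None:
            return 0.0               # no sounding note, no bend
        return round((y - self.origin) * self.range, 3)

b = RelativeBend()
b.note_on(0.5)
print(b.update(0.5))   # finger has not moved: 0.0
print(b.update(0.75))  # slide a quarter of the key length: 0.5 semitones
```

An absolute mapping (bend proportional to position alone) would instead jump whenever a note started away from the key's centre, which is exactly the behaviour this design choice avoids.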

In a non-invasive way, keeping the main structure of the original interface, TouchKeys adds so many degrees of freedom that it has the potential to completely transform keyboard playing. It is no longer only about discrete entities—inside those, McPherson has opened an expressive continuum, effectively turning the keyboard inside out. And I’m going in.


Copyright information

© 2017 Springer International Publishing AG

About this chapter

Cite this chapter

McPherson, A. (2017). 2012: TouchKeys: Capacitive Multi-touch Sensing on a Physical Keyboard. In: Jensenius, A., Lyons, M. (eds) A NIME Reader. Current Research in Systematic Musicology, vol 3. Springer, Cham. https://doi.org/10.1007/978-3-319-47214-0_27

  • DOI: https://doi.org/10.1007/978-3-319-47214-0_27

  • Publisher: Springer, Cham

  • Print ISBN: 978-3-319-47213-3

  • Online ISBN: 978-3-319-47214-0
