
Understanding Music Interaction, and Why It Matters

  • Simon Holland
  • Tom Mudd
  • Katie Wilkie-McKenna
  • Andrew McPherson
  • Marcelo M. Wanderley
Part of the Springer Series on Cultural Computing book series (SSCC)

Abstract

This is the introductory chapter of a book dedicated to new research in, and emerging new understandings of, music and human-computer interaction—known for short as music interaction. Music interaction research plays a key role in innovative approaches to diverse musical activities, including performance, composition, education, analysis, production and collaborative music making. Music interaction is pivotal in new research directions in a range of activities, including audience participation, interaction between music and dancers, tools for algorithmic music, music video games, audio games, turntablism and live coding. More generally, music provides a powerful source of challenges and new ideas for human-computer interaction (HCI). This introductory chapter reviews the relationship between music and human-computer interaction and outlines research themes and issues that emerge from the collected work of researchers and practitioners in this book.

1.1 Introduction

Music is part of what it means to be human (Brown 1991; D'Errico et al. 2003). All known human cultures have music, although what is considered to be music varies widely across cultures, both structurally and functionally (Cross 2016). The earliest human artefacts identified unambiguously as musical are bone flutes dating back some 38,000 years (Cross 2016), though other probable artefacts have been dated a little earlier. These dates coincide more or less with the emergence of so-called ‘behavioural and cognitive modernity’—the suite of traits distinguishing modern Homo sapiens from earlier anatomically modern humans. While the exact social function of these flutes is likely to remain obscure, D'Errico et al. (2003) note that their sophisticated and subtle design features suggest that substantial importance was attached to music. Music may predate spoken language (Mithen 2006) and may have played a key role in the development of human speech (Fitch 2006).

Music is universally accessible; it is one of the first social activities to engage young children (Mehr et al. 2016). Despite the intense cognitive and perceptual demands involved in music (Justus and Bharucha 2002), even newborn children can detect the beat in music (Honing et al. 2009). Healthy toddlers react with their whole bodies to music (Fenichel 2002), and there is evidence that music therapy offers clinical benefits to premature infants in intensive care (Standley 2011). Similarly, music is one of the last social activities still able to emotionally engage those with a range of neurodegenerative diseases (Zhang et al. 2017), and music also plays valued roles in Deaf culture (Darrow 2006).

While active engagement with music is more or less open to all, many amateur musicians take pains to develop virtuosic musical skills far beyond those required for universally accessible musical activities, dedicating lifetimes to refining highly specialised musical expertise for its own sake, in the absence of financial incentive or extrinsic motivation (Wallis et al. 2013). Music plays a rich variety of roles in social activities (Hargreaves and North 1997). Centuries-old musical genres are still regularly studied and performed, yet music continually develops radically new and unanticipated forms.

Music involves the integration of a wide range of cognitive, perceptual, motor and emotional human capabilities (Vuilleumier and Trost 2015). While many human activities draw on relatively limited parts of the brain, imaging studies have shown that playing and listening to music creates co-ordinated activity all over the brain: the prefrontal, motor, sensory, visual and auditory cortices; the corpus callosum, which connects the two sides of the brain; and structures governing memory, emotion and balance—the hippocampus, nucleus accumbens, amygdala and cerebellum (Brown et al. 2004; Salimpoor et al. 2013). Indeed, Chap. 11 of this book, “Detecting and Adapting to Users’ Cognitive and Affective State to Develop Intelligent Musical Interfaces” by Yuksel et al. (2019), explores the direct use in musical interaction of brain activity unconnected with conscious intention.

But human engagement with music is not just a matter of the brain. Music is a highly embodied activity in which performers and listeners routinely engage in complex synchronised movements in time with sound. Performance on many traditional as well as digital musical instruments requires precise real-time control of tight sensory-motor loops: not just note onset, but also pitch and timbre may be subject to millisecond-scale muscular manipulation—giving rise to interaction design issues considered in detail in two chapters: Chap. 2, “A Design WorkBench for Interactive Music Systems” by Malloch et al. (2019), and Chap. 9, “Embodied Musical Interaction: Body Physiology, Cross Modality, and Sonic Experience” by Tanaka (2019).
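
To make these timescales concrete, the following sketch (illustrative arithmetic only, not drawn from any chapter in this book) computes how long an audio buffer of a given size lasts; this is a common proxy for the minimum latency a digital instrument adds between a player's action and the resulting sound. The sample rate and buffer sizes are assumptions for illustration; figures of around 10 ms are often cited as the point at which performers begin to notice delay.

```python
# Illustrative arithmetic: why digital instruments must operate on
# millisecond timescales to sustain tight sensory-motor loops.

SAMPLE_RATE = 44_100  # samples per second (CD-quality audio); an assumption

def buffer_latency_ms(buffer_size: int, sample_rate: int = SAMPLE_RATE) -> float:
    """Time to fill one audio buffer, in milliseconds."""
    return 1000.0 * buffer_size / sample_rate

for size in (64, 128, 256, 512, 1024):
    # A full input -> process -> output path typically queues several
    # buffers, so real action-to-sound latency is a multiple of this figure.
    print(f"{size:5d} samples -> {buffer_latency_ms(size):6.2f} ms per buffer")
```

At 44.1 kHz, only the smallest buffer sizes keep the per-buffer delay comfortably within a millisecond-scale budget, which is why low-latency musical systems are engineered around small buffers and tight scheduling.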

Alongside its richly embodied social role, music has a complex abstract side, and has played key roles historically in the development of science, mathematics, psychology and technology (Xenakis 1992). Early in the history of science, music appears to have provided the impetus for Pythagoras’ seminal realisation (via tuning relationships) that empirical physical laws could be expressed mathematically. Some of the earliest formal notational systems (Duchesne-Guillemin 1963) and programmable automata (Koetsier 2001) were focused on music.

Conversely, theories of music from mathematics (Milne et al. 2011), computational neuroscience (Angelis et al. 2013) and cognitive psychology (Balzano 1980; Longuet-Higgins 1976; Milne and Holland 2016; Holland et al. 2018) have been used to illuminate aspects of musical practice and even aesthetic and subjective musical judgement—one notable example of which is explored by Milne (2019) in Chap. 6 (“XronoMorph: Investigating Paths Through Rhythmic Space”).

Technological developments in music diffuse into musical cultures and can alter music profoundly over timescales ranging from days to centuries, with unpredictable effects. Reciprocally, the innovative uses to which musicians put technologies can affect the path of technological development in unforeseen ways, both inside and beyond music (Holland et al. 2013). Computers in particular have had a profound influence on most aspects of music, ranging from how music is made, to how it is learned, performed, listened to, distributed, and valued. In this book, we use the term ‘music interaction’ specifically as a shorthand for music and human-computer interaction (although the term can be used in a much wider sense to encompass musical engagement through any kind of technology, however primitive).

1.2 Early Links Between Music and HCI

Because of the particularly demanding nature of musical activities, as outlined above, and the deep involvement of music with so many diverse human capabilities, music interaction can be particularly challenging for human-computer interaction (HCI) researchers and designers, and can act as a valuable source of challenges, new ideas and new techniques for HCI generally. At the same time, new forms of music interaction can alter what is possible in diverse areas of music, and thus have serious implications for music, musicians, educators, learners and anyone seeking deeper involvement in music.

The links between HCI and music go back to the beginnings of HCI. There are essentially just three key HCI research landmarks that predate any musical examples: work at Lincoln Labs, including Ivan Sutherland’s visionary Sketchpad (Sutherland 1964); Doug Engelbart’s ‘mother of all demos’ (Engelbart and English 1968); and Baecker’s (1969) GENESYS animation system.

In 1971, the most advanced interaction design in any computer system anywhere in the world usable by non-technical users appears to have been the ‘Music Machine’ (NRC 1970, 1971), a music system developed at Queen’s University, Canada. This system (Buxton 2008) offered bimanual input and musical output, and was designed for, and in daily use by, non-expert users, some 11 years before the Xerox Star—widely considered foundational to HCI.

In the eighties, Jaron Lanier’s pioneering development of virtual reality (Blanchard et al. 1990) grew out of Thomas Zimmerman’s devising and patenting of the first generally practical real-time data glove (Zimmerman 1982; Zimmerman et al. 1986)—motivated, according to Lanier, by Zimmerman’s desire to turn air guitar into a precise musical interface (Lanier 1989).

A seminal survey by Bruce Pennycook in 1985 laid out many of the issues common to computer music and HCI (Pennycook 1985). Holland (1989) analysed and compared a variety of innovative user interfaces for music from an HCI perspective, focusing in particular on music interfaces for learners. A subsequent review of applications of artificial intelligence in music education critically examined diverse interaction design strategies for music systems aimed at education (Holland 2000).

In 2000, a wide-ranging collection of developments in the area was presented in an electronic publication, ‘Trends in Gestural Control of Music’ (Wanderley and Battier 2000). Chapters in that collection by authors such as Bongers (2000), Hunt et al. (2000), Ungvary and Vertegaal (2000) and Tanaka (2000) discussed issues spanning music and HCI. The collection also addressed a range of interdisciplinary topics via a round table discussion between a remarkable gathering of electronic and computer music pioneers: William Buxton, Don Buchla, Chris Chafe, Tod Machover, Max Mathews, Bob Moog, Jean-Claude Risset, Laetitia Sonami and Michel Waisvisz. This collection was one of the references supporting the proposal for a wider dialogue between the HCI and music communities through a workshop held in Seattle in 2001 as part of SIGCHI 2001.

Conferences have played important roles in developing the dialogue between the music and HCI communities. The ACM Conference on Human Factors in Computing Systems (CHI) is the premier scientific conference for research in human-computer interaction. The International Conference on New Interfaces for Musical Expression (NIME) began life as a workshop at CHI in 2001, but rapidly opted to become an independent international conference focusing on new musical instruments and new means of musical expression (Poupyrev et al. 2001). Since 2002, NIME has been a forum for the diffusion of research and performances in these and wider related areas.

The focus of the NIME community is generally and justifiably on musical ends, rather than on balancing these with reflections on implications for wider issues in HCI. Consequently, opportunities exist to widen and deepen the dialogue between the music interaction and wider HCI communities, and to highlight new insights and ideas that emerge from this two-way exchange. To capitalise on this opportunity, Music and Human-Computer Interaction (Holland et al. 2013) presented fifteen refereed chapters analysing and reflecting on the latest research in music and human-computer interaction. This collection covered a broad spectrum of musical activities and dealt with perspectives ranging from embodied cognition, spatial cognition, evolutionary interaction, and affective interaction, to novel methodologies.

1.3 Origins of This Book

Taking this previous book as a starting point, the present book was developed to further widen and deepen the dialogue between research in music interaction and HCI and to examine new developments. The book grew out of an international refereed workshop focused on addressing new directions in Music and HCI as part of the 2016 CHI conference (Holland et al. 2016). This workshop brought together three overlapping communities: HCI researchers; music interaction researchers; and musical practitioners. Following the workshop, a selection of papers was extended, refined and refereed for inclusion in this book. Three specially commissioned chapters were developed and refereed in the same way (Chaps. 5, 6 and 12). The book also includes in-depth interviews with five highly experienced researchers and practitioners working in different aspects of music interaction. These interviewees reflect on their own work and illuminate key cross-cutting themes that emerge from the research in this book.

The chapters in this book are grouped by three loose themes—Part I: Design; Part II: Interaction; and Part III: Collaboration. In practice, most chapters deal with at least two of these themes and explore a wide range of other issues.

1.4 Part I: Design

Music raises many distinctive challenges for interaction design. The chapters in the first part of this book (four research papers and two interviews with practitioners) illustrate these challenges and ways of addressing them. The following research questions, formulated for the 2016 CHI workshop (Holland et al. 2016), serve as a general backdrop for all of the chapters in this part:
  • What is the relationship between creative practice and music interaction design?

  • How can research questions in music interaction be productively framed?

  • What methodologies, theories, models and approaches can guide music interaction designers?

  • How can research in music interaction be evaluated?

The six chapters (Chaps. 2–7) comprising Part I focus on the following six specific research areas:
  • Tools, methods and guidelines for the design of interactive music systems (Chap. 2).

  • Design for technologically mediated audience participation (Chap. 3).

  • Reflections on historical connections between HCI and music interaction (Chap. 4).

  • Interaction design for playful exploration and analysis of rhythm through reflection on existing genres and repertoire (Chap. 5).

  • Mathematical theories of rhythms as challenges for interaction design (Chap. 6).

  • Matching opportunities and challenges from music interaction with insights from HCI and vice versa (Chap. 7).

1.4.1 Tools, Methods and Guidelines for the Design of Interactive Music Systems

In Chap. 2 (“A Design WorkBench for Interactive Music Systems”), Malloch et al. (2019) consider how to design interactive musical systems for extreme users who are committed to developing their own instrumental expertise. Tools, methods and guidelines to assist in designing for this exacting user group are explored through a series of case studies. The chapter explores a series of challenges: exact timing is critical; there are no easily definable tasks to evaluate; bi-manual input is the rule; the practices of performers and composers are highly diverse; meaningful evaluations require extended time periods; and the goal of interactions is typically not the orderly transfer of information. In order to deal with these and other challenges, a range of models to support the design and evaluation of interactive music systems is considered. For example, Rasmussen’s Human Information Processing framework (Rasmussen 1986; Malloch et al. 2019) is explored to help make sense of the wide span of temporal scales involved in musical interactions, ranging from tightly coupled sensory-motor loops measured in milliseconds to the more leisurely timescales of live coding and algorithmic music. This framework can be related in part to the concept of liveness in HCI as discussed at a 2012 CHI workshop (Hook et al. 2012) and to Tanimoto’s theoretical framework for levels of liveness in programming environments (Tanimoto 1990)—as adapted to music systems by Church et al. (2010).

The authors note that many innovative music interaction tools tend to be implemented as one-off features that cannot easily be re-used in other systems. Beaudouin-Lafon’s HCI technique of Instrumental Interaction (Beaudouin-Lafon 2000) is considered as a framework with the potential to allow much greater re-use of tools such as instrumental ensembles, rhythmic grids, harmonic systems, sequences and scores—with Garcia’s ‘substrates for composers’ (Garcia et al. 2012) noted as an illuminating case study. Finally, a range of approaches and models is considered, including Mackay’s notion of Co-adaptive Systems (Mackay 2000), and eight guidelines are presented to support the design of interfaces for musical expression.

1.4.2 Design for Technologically Mediated Audience Participation

Historically, audiences have had time-honoured but limited ways of participating in live music performances to influence the behaviour of performers. However, mobile, wearable and pervasive technologies have created new opportunities for such interactions. Designing such systems can be challenging, as it is easy to unintentionally disturb balances between the coherence of the music, the needs of musicians, audience expectations and feelings of collective engagement. In Chap. 3 (“TMAP Design Cards for Technology-Mediated Audience Participation in Live Music”), Hödl et al. (2019) outline a framework for supporting interaction design and evaluation created specifically to help interaction designers explore and balance such tensions. This chapter reports on the design and evaluation of an innovative card-based system based on this framework.

1.4.3 Reflections on Historical Connections Between HCI and Music Interaction

The first of the book’s five interviews, Chap. 4 (“The Poetry of Strange Connections: An Interview with Bill Verplank”), considers the career of HCI pioneer Bill Verplank (McPherson and Verplank 2019). Verplank’s career as a designer stretches from the earliest days of human-computer interaction at Stanford and MIT (when it was known as ‘man-machine systems’) through to his current influential role as champion, amongst other things, of sketching as a vital design practice. During his career, Verplank has contributed to key research groups and organisations in the development of human-computer interaction, design, the arts and music, including periods at Xerox PARC, IDEO (a seminal design and consultancy firm), Interval Research (an influential technology incubator), CCRMA (Stanford’s Center for Computer Research in Music and Acoustics), the Royal College of Art in London, and the Interaction Design Institute at Ivrea. Together with Bill Moggridge of IDEO, he coined the term ‘Interaction Design’.

Drawing on his experiences, Verplank discusses how his career as a designer has periodically intersected with musical interaction, and reflects on subtle interconnections between engineering, computer science, art and music. Verplank explores the genealogy of some of the well-known tools and methods in music and maker technology and highlights the importance of haptics in musical interfaces.

1.4.4 Interaction Design for Playful Exploration and Analysis of Rhythm Through Reflection on Existing Genres and Repertoire

The preceding book (Holland et al. 2013) stressed approaches to developing rhythm skills (Bouwer et al. 2013) that drew on notions of embodiment (Wilkie et al. 2009, 2010), haptic interaction (Holland et al. 2018) and whole-body interaction (Holland et al. 2011). However, two chapters in this book (Chaps. 5 and 6) explore powerful approaches to interaction with rhythm based on entirely different starting points. While both draw heavily on visualisation and use visual representations that look superficially similar, they exploit two different sources of power and propose contrasting approaches to learning about rhythm. Milne’s XronoMorph (discussed below) draws its power from recent mathematical theories of rhythm. Hein and Srinivasan’s (2019) “Groove Pizza” (Chap. 5) draws its power from a clear, readily manipulable tool for exploring existing genres and repertoire, well suited to helping those with and without formal musical training understand the evolution of rock, funk, hip-hop, electronic dance music, Afro-Latin and related styles. Examples from popular music are mapped onto visual patterns to help support the identification of the elements of rhythm that characterise and distinguish disparate idioms. In line with its inclusive aims, Groove Pizza has been developed as a free web app to encourage the widest possible access and inclusion.
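
To illustrate the general idea (a hypothetical sketch, not Groove Pizza's actual code or data), a groove can be represented as parallel cycles of sixteenth-note steps; characteristic genre patterns then become directly visible and easy to manipulate:

```python
# Hypothetical sketch of the kind of representation a tool like Groove
# Pizza visualises: drum patterns as steps on a repeating 16-step cycle.
# The pattern below is an assumed basic rock backbeat, for illustration.

PATTERN = {
    "kick":  [1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0],
    "snare": [0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0],
    "hihat": [1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0],
}

for name, steps in PATTERN.items():
    # Render each part as a row of the step grid: 'x' = hit, '.' = rest.
    print(f"{name:>5}: " + "".join("x" if hit else "." for hit in steps))
```

Moving the snare onto different steps, or thinning the hi-hat pattern, immediately suggests how such a grid can let learners both see and hear the differences between idioms.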

1.4.5 Mathematical Theories of Rhythms as Challenges for Interaction Design

Design challenges arise in the transformation of theories about aspects of music from mathematics (Prechtl et al. 2009; Milne et al. 2011), physics and cognitive psychology (Holland et al. 2016; Bouwer et al. 2013) into useful interactive tools for composers, performers and learners. In Chap. 6 (“XronoMorph: Investigating Paths Through Rhythmic Space”), Milne (2019) takes two recent mathematical theories of rhythm—balanced rhythms and well-formed rhythms (Toussaint 2013)—and considers applications to musical performance and composition. Milne pursues the implications of these theories, creating an interactive tool that allows composers and performers to create new rhythms either offline or in real time, visualising relevant structures, varying relevant parameters, and hearing transformations in real time with a high level of liveness (Church et al. 2010). Both categories of rhythm are amenable to musically meaningful stepwise or continuous change in real time. Rhythms that are well known in various genres can be readily recreated, but it is equally possible to produce unfamiliar but engaging polyrhythms, such as rare examples catalogued by Arom (1991), or to navigate through unexplored rhythmic territory. Given Milne’s tool, no great musical knowledge is required to create interesting rhythms or to perform them in real time (though to use this to follow the lead of other improvisers, for example, might be challenging). However, despite its openness to beginners, Milne’s tool has the capacity to repay extensive practice and study on the part of expert musicians.
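
For readers new to these theories, the following minimal sketch (an illustration of the mathematics, not Milne's implementation) generates one closely related family of rhythms discussed by Toussaint (2013): Euclidean rhythms, which distribute k onsets as evenly as possible over n pulses:

```python
# Minimal sketch of the Euclidean rhythm E(k, n) (Toussaint 2013): k onsets
# spread as evenly as possible over n pulses, a maximally even pattern
# closely related to the well-formed rhythms discussed in Chap. 6.

def euclidean_rhythm(k: int, n: int) -> list:
    """Return n pulses, with 1 marking each of the k evenly spread onsets."""
    return [1 if (i * k) % n < k else 0 for i in range(n)]

# E(3, 8) yields the Cuban tresillo pattern: x..x..x.
print(euclidean_rhythm(3, 8))  # [1, 0, 0, 1, 0, 0, 1, 0]

# E(5, 8) yields a rotation of the Cuban cinquillo: x.x.xx.x
print(euclidean_rhythm(5, 8))  # [1, 0, 1, 0, 1, 1, 0, 1]
```

Varying k and n stepwise, or rotating the resulting pattern, gives a small taste of the kind of rhythmic space that XronoMorph lets performers navigate continuously and in real time.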

1.4.6 Matching Opportunities and Challenges from Music Interaction with Insights from HCI and Vice Versa

Wendy Mackay, interviewed in Chap. 7 (“HCI, Music and Art: An Interview with Wendy Mackay”) (Wanderley and Mackay 2019), is well placed to match opportunities and challenges from music interaction with powerful tools, methods and frameworks drawn from HCI. Mackay is a past chair of CHI and a former holder of all of the CHI offices. She has pioneered research that spans HCI and the arts, including several collaborations with IRCAM composers, performers and researchers. Mackay co-developed the influential ‘technology probe’ design methodology (Hutchinson et al. 2003), which combines engineering aspects of iterative design, social science enquiry and participative problem framing. This methodology has proved well suited to illuminating design challenges in music interaction research (for example, Garcia et al. 2012). In the interview, Mackay discusses diverse aspects of her work, including issues that arise when designing for particular individuals (e.g. individual composers and performers) as opposed to classes of users, together with methods appropriate to evaluating systems in this category. She highlights the importance of evaluating not just what a new technology is supposed to do, but what people actually do with it—often unexpectedly—and stresses the importance of discoverability, appropriability and expressivity for interaction designers and evaluators.

1.5 Part II: Interaction

The second major theme in this volume is Interaction. Several chapters engage with the aesthetic nature of interactions with musical tools, and with the aesthetic nature of the tools themselves. Tools shape the way we think; they encourage certain possibilities and discourage others. For examples of such influences, see Magnusson’s (2010) discussion of the affordances and constraints of instruments and Tuuri et al. (2017) on the push and pull effects of interfaces. The research questions posed at the head of Part I serve once more as a backdrop for the chapters in this part, augmented by a further research question formulated at the 2016 workshop:

  • How does the relationship between performer and digital musical instrument differ from more familiar human-computer relationships?

The six chapters (Chaps. 8–13) comprising this part focus on the following six research areas:
  • How musical tools affect what musicians do: communication-oriented vs material-oriented perspectives (Chap. 8).

  • Music interaction and HCI: a long view (Chap. 9).

  • Interaction design without explicit planning: difficulty and error as creative tools (Chap. 10).

  • Music interaction by non-intentional cognitive and affective state (Chap. 11).

  • Investigating unstated assumptions in instruments for novices and the effects of these assumptions (Chap. 12).

  • Learning about music interaction from artists, musicians, designers, ethnographers and HCI researchers (Chap. 13).

1.5.1 How Musical Tools Affect What Musicians Do: Communication-Oriented Versus Material-Oriented Perspectives

In Chap. 8 (“Material-Oriented Musical Interactions”), Mudd (2019) develops the notion of a material-oriented perspective further, exploring questions of influence and agency in both the design and use of musical tools. A material-oriented perspective considers instruments as active agents in the creative process, and the tools themselves may be the subject of the artistic work. This is contrasted with a communication-oriented perspective, where the instrument is viewed as an ideally transparent conduit for communicating ideas. These ideas are considered in relation both to the design of new musical interfaces and to the nature of the programming environments that such tools are often built with, such as Max, SuperCollider or Pure Data.

1.5.2 Music Interaction and HCI: A Long View

In Chap. 9 (“Embodied Musical Interaction: Body Physiology, Cross Modality, and Sonic Experience”), Tanaka (2019) takes a long view, bringing together key historical paradigms, taxonomies and accounts from HCI and investigating the perspectives they bring to musical interaction. First, Dourish’s (2004) four stages of interaction (electrical, symbolic, textual, and graphical) are considered in the context of musical interaction (e.g. analog synthesizers, text-based musical coding languages, direct manipulation interfaces and skeuomorphic visual metaphors). Second, the distinct “waves” of HCI, as summarised by Harrison et al. (2007), are related to corresponding developments in music. Third, three accounts of interaction design presented by Fallman (2003) are explored in the design of musical tools: a conservative account rooted in rational, engineering-focused thinking; a pragmatic account where design responds to specific situations, cultures and histories; and a romantic account where design specifications and engagement with defined problems are subordinate to the designer’s creative freedom. Finally, these paradigms are brought to bear on three recent projects from the author’s research: haptic interfaces for audio editing, user-centred design for investigating everyday experiences of sound, and muscle sensor interfaces for musical control.

1.5.3 Interaction Design Without Explicit Planning: Difficulty and Error as Creative Tools

In Chap. 10 (“Making as Research: An Interview with Kristina Andersen”), Kristina Andersen (Mudd and Andersen 2019) outlines some of the key considerations emerging from music, art and design that play an important role in new HCI research. She highlights the focus on process, making and building in the arts—prior to any fixed plans or formal requirements—as an essential component: making becomes an important method for understanding objects and interactions. As a researcher at STEIM (Studio for Electro-Instrumental Music) in Amsterdam for 15 years, more recently a researcher in the Future Everyday group at Industrial Design at TU Eindhoven, and an independent design practitioner, Andersen has extensive experience of design in situations that lack clear pre-existing requirements. One approach to design in such situations is to employ the material perspective: to follow the idioms and limitations of the design materials being used. Andersen draws deeply on this perspective, but advocates a nuanced and balanced path that may, for example, involve switching between materials mid-design in order to better distinguish tacit design goals from idioms bound up with particular materials. Andersen’s explorations of design in such contexts have interesting implications for music interaction. She points out:

The difficulty of a traditional instrument means that your playing errors are going to flavour your music, and you have the option at some point to decide that this is your style and then develop that. That’s a little harder to do with something that has a designed or HCI-informed interface, which has so far been tending towards wanting to assist and support you to eliminate those errors…

Andersen notes that her approach relates to ‘classic art strategies of improvisation and estrangement’ and discusses lessons both for music interaction specifically and for HCI more generally.

1.5.4 Music Interaction by Non-intentional Cognitive and Affective State

In Chap. 11 (“Detecting and Adapting to Users’ Cognitive and Affective State to Develop Intelligent Musical Interfaces”), Yuksel et al. (2019) explore musical interfaces and interactions that adapt to the user’s cognitive state. They demonstrate an approach to the application of brain sensing and neuroimaging technologies in music education that is potentially applicable to any instrument. One key aspect involves balancing the learner’s level of challenge. When learning any instrument, an appropriate level of challenge facilitates learning, whereas too much challenge can overwhelm and intimidate. Yuksel et al. describe how they passively measure cognitive load and affective state as key mechanisms in two intelligent piano tutors. The two systems employ near-infrared forehead sensors tuned to frequencies at which the skull is transparent. These intelligent tutors, developed by the authors, work in two different ways. The first dynamically adjusts the level of challenge to match the learner’s cognitive workload. The second focuses on a piano-based improvisation system that adds and removes harmonies based on the user’s state, to test the use of cognitive adaptations in creative situations. The authors reflect on how implicit interactions of this kind, which do not depend on conscious intention from the player, might be used to improve players’ performance and learning.
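
The core adaptation logic can be sketched in a few lines (a minimal illustration with hypothetical thresholds and function names; the actual systems derive their workload estimate from fNIRS brain sensing, not from the stub below):

```python
# Minimal sketch of a workload-adaptive difficulty loop, in the spirit of
# the tutors described above. read_workload() is a stand-in for a real
# sensor pipeline; the thresholds are illustrative assumptions.

import random

def read_workload() -> float:
    """Stub for a sensor-derived cognitive workload estimate in [0, 1]."""
    return random.random()

def adapt_level(level: int, workload: float,
                low: float = 0.3, high: float = 0.7) -> int:
    """Raise the challenge when the learner has spare capacity, lower it
    when they are overloaded, and hold steady in between."""
    if workload < low:
        return level + 1          # under-challenged: add material
    if workload > high:
        return max(1, level - 1)  # overloaded: remove material
    return level                  # productive zone: no change

level = 3
for step in range(5):
    w = read_workload()
    level = adapt_level(level, w)
    print(f"step {step}: workload={w:.2f} -> level {level}")
```

Where such thresholds should sit, and how adaptation should behave in open-ended creative tasks rather than graded exercises, are among the questions the chapter explores.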

1.5.5 Investigating Unstated Assumptions in Instruments for Novices and the Effects of These Assumptions

In Chap. 12 (“Musical Instruments for Novices: Comparing NIME, HCI and Crowdfunding Approaches”), McPherson et al. (2019) survey recent digital musical instruments released through academic and crowdfunding channels, with a particular focus on instruments that advertise their accessibility to novice musicians (“anyone can make music”). The authors highlight a range of unstated assumptions in the design of these instruments which may have a strong aesthetic influence on performers. Many of the instruments, particularly the crowdfunded ones, are MIDI controllers, which adopt a keyboard-oriented musical outlook that assumes the note as a fundamental unit. Another common assumption amongst these instruments is that music is made easier to play by limiting the pitch space or quantising the timing. Many of the instruments surveyed in this chapter advertise their versatility (“play any sound you can imagine”), and they seem to target transparency or communication metaphors. Yet, curiously, they may exhibit as strong an aesthetic influence as any material-oriented exploration (see Chap. 8 of this book, described above), and may in fact encode ideas of aesthetics and style to a point that becomes limiting to the performer.

1.5.6 Learning About Music Interaction from Artists, Musicians, Designers, Ethnographers and HCI Researchers

The Interaction part of the book concludes with an interview with Steve Benford (McPherson and Benford 2019) in Chap. 13 (“Music, Design and Ethnography: An Interview with Steve Benford”). He explains how his HCI research has become increasingly involved with the arts, and specifically with music, in recent years. He discusses the value of ethnography for understanding how communities operate, and he explores the many ways that engaging with artists and practice-led research can inform HCI. Benford points to intermediate design knowledge—heuristics, concepts and guidelines—as one way of bridging between outcomes in practice-led research and outcomes in Ph.D. research more generally. Benford’s long experience of bringing together artists, musicians, designers and HCI researchers, and his pioneering work on multi-user virtual and mixed reality experiences, dovetail into the third major theme of this book: Collaboration. The chapters in this part explore some of the ways in which technology can assist, augment, and provide new models for creative collaborations.

1.6 Part III: Collaboration

Following the lead of the previous two parts, the research questions posed at the head of Parts I and II are carried forward to serve as a backdrop for the chapters in this part, augmented by the following further research questions formulated at the 2016 workshop:
  • How can embodied interaction deepen the understanding of, and engagement with, music by dancers?

  • What can music interaction learn from games design?

  • How can performance-oriented musical interfaces be evaluated during group improvisation?

  • What is the role of machine learning in music interaction?

The five chapters comprising this part focus on the following five research areas:
  • Innovative multi-performer music performance environments inspired by game mechanics (Chap. 14).

  • Enriching immersive environments for music composition and performance with new forms of game-inspired interaction (Chap. 15).

  • New forms of music interaction that draw on machine learning to help promote inclusion, participation, and accessibility (Chap. 16).

  • Group rehearsal by free improvisers for research and evaluation (Chap. 17).

  • Giving dancers precise, engaging live agency over music (Chap. 18).

1.6.1 Innovative Multi-performer Music Performance Environments Inspired by Game Mechanics

Chapters 14 and 15 explore game techniques and mechanics in the context of interactive musical systems. In Chap. 14 (“Applying Game Mechanics to Networked Music HCI Applications”), Camci et al. (2019) provide an overview of ways in which methods and theories from game design can be used in collaborative and distributed musical applications, focusing on introducing the notions of competition and reward. They introduce and evaluate a system called Monad—a collaborative audio-visual game-like environment—in two versions, differing according to whether the system distributes the rewards or the players themselves distribute them.

1.6.2 Enriching Immersive Environments for Music Composition and Performance with New Forms of Game-Inspired Interaction

In Chap. 15 (“Mediated Musical Interactions in Virtual Environments”), Hamilton (2019) investigates musical interaction in virtual environments, focusing on the opportunities such environments offer to define new musical interactions—or to influence more traditional ones—thus escaping the real-world limitations of classical instruments. In particular, this research expands the classical view of a musical instrument to include “mediation layers” (e.g. rules governing object behaviours, agents, third-party actors, etc.) as a major factor influencing, and even disrupting, musical interactions. Having argued the case for the importance of such layers, the author describes three main works—Carillon, OscCraft and Echo::Canyon (sic)—detailing the types of mediation layers and the interaction possibilities in each.

1.6.3 New Forms of Music Interaction that Draw on Machine Learning to Help Promote Inclusion, Participation, and Accessibility

Chapter 16 (Holland and Fiebrink 2019) is an interview with Rebecca Fiebrink, who discusses her work at the intersection of HCI and music, and the role that machine learning can play there. She talks about the origins of the Wekinator, her open-source tool for real-time machine learning, and her more recent research helping to make machine learning processes accessible and readily implementable for artists, musicians, and therapists working with people with disabilities. She reflects on different possible models for collaboration between human creators and machine learning algorithms, and on the different roles that the algorithms might play in such collaborations.

1.6.4 Group Rehearsal by Free Improvisers for Research and Evaluation

Chapters 17 and 18 both investigate the role of new technologies in existing musical practices. In Chap. 17 (“Free-Improvised Rehearsal-as-Research for Musical HCI”), Martin and Gardner (2019) explore the idea of free-improvised group rehearsals as contexts for the formal evaluation of new performance-oriented musical interfaces. The authors elaborate a methodology for qualitative and quantitative data collection that retains the freedom and the creative, collaborative atmosphere of the rehearsal. Two recent studies that use this methodology to evaluate touch-screen musical applications are presented.

1.6.5 Giving Dancers Precise Engaging Live Agency Over Music

In Chap. 18 (“A Case Study in Collaborative Learning via DMIs for Participatory Music: Interactive Tango Milonga”), Brown and Paine (2019) focus on a problem faced by amateur dancers—more specifically, those learning to dance the Argentine Tango in a social context. The problem is that learners often find it hard to balance the highly improvisational nature of the dance form with the equally important need to harmonise both with their partner’s movements and with the accompanying musicians. As a result, many fail to develop the skills to engage as dancers with the accompanying music. As part of an effort to address this problem, Brown and Paine developed an interactive dance system that gives tango dancers detailed real-time control, via their dance movements, over algorithmically generated music. This makes possible a process of simultaneously making music and dancing to it, promoting otherwise difficult-to-acquire listening skills. The chapter includes reflections on matters of wide relevance, including methods for developing sustainable participatory music systems for non-musicians and methods for studying the long-term effects of such systems.
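
As a purely hypothetical illustration of the kind of movement-to-music mapping such a system needs (the chapter's actual design will differ), normalised features extracted from dancers' movements might drive parameters of the generated music along these lines:

```python
# Hypothetical mapping sketch, not Brown and Paine's actual system:
# movement features (assumed to be extracted elsewhere and normalised
# to [0, 1]) are mapped to control parameters of generated tango music.

def movement_to_music(step_energy: float, sway: float, paused: bool) -> dict:
    """Map dance-movement features to music-generation parameters."""
    if paused:
        # A held pause thins the texture rather than stopping the music.
        return {"rhythmic_density": 0.1, "melodic_activity": 0.2}
    return {
        "rhythmic_density": 0.3 + 0.7 * step_energy,  # busier steps, busier rhythm
        "melodic_activity": 0.2 + 0.8 * sway,         # more sway, more ornament
    }

print(movement_to_music(step_energy=0.8, sway=0.4, paused=False))
```

Even a toy mapping like this suggests why such a system gives dancers agency over the music: musical features respond directly and predictably to how they move.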

1.7 Conclusions

The beginning of this chapter observed the long-shared history of music and computing, noting that the use of computers to make music may be viewed as the latest expression of a fundamental human tendency to use technology for musical purposes. Musical practice itself evolves in response to the available technology, such that it is hardly a novel observation to say that computing has changed the kind of music we make.

What is perhaps less obvious, as this book illustrates, is that the study of musical interaction from an HCI perspective—even if the musical scenarios being studied do not involve digital instruments—can yield not only a better understanding of how humans and machines interact, but also new forms of artistic practice and culture. There are several aspects to this. Musical HCI research has long sought to create opportunities for new communities to participate in established forms of music making; Chaps. 3, 5 and 12 of this book (Hödl et al. 2019; Hein and Srinivasan 2019; McPherson et al. 2019) highlight some of the ways this problem is being approached in research and commercial domains. Other parts of the book, such as Chaps. 8, 9 and 11 (Mudd 2019; Tanaka 2019; Yuksel et al. 2019), present musical practices that depend not only on computing, but on specific insights into human-technology interaction. In yet other cases, such as Chaps. 14, 15 and 18 (Camci et al. 2019; Hamilton 2019; Brown and Paine 2019), musical interaction research yields a hybrid practice between traditional music-making and other domains such as virtual reality, gaming or dance.

Many of the directions in this book, perhaps more than those of its predecessor “Music and Human-Computer Interaction” (Holland et al. 2013), respond to developments in second-wave and especially third-wave HCI (discussed further in Chap. 9). Third-wave HCI considers how computing integrates into our lives and becomes deeply intertwined with our cultural values, while bringing in wider disciplinary perspectives ranging from design to psychology to philosophy. In the musical interaction community at large, there is increasing awareness of the domain as concerned not only with control and ergonomics, but also with the cultural values that are embedded in every tool we make and with the way those tools are situated in particular musical communities. The importance of culture in music interaction research comes out particularly strongly in the interviews in this book, but it is a theme that can be found throughout many of the chapters.

Looking ahead, then, we might expect that the “new directions” suggested in the title of this book are evolving, and will continue to evolve, not only from a combination of traditional musical practice, engineering and computer science, but from a broad multidisciplinary space spanning the arts, humanities, social sciences, engineering, mathematics and philosophy. This steady widening of scope is already underway in HCI, and musical interaction stands to be both a beneficiary and one of the drivers of this trend.

Acknowledgements

The editors would like to thank workshop co-organisers not represented by chapters: Sile O’Modhrain, Michael Gurevich and Andrew Johnston. We would also like to thank workshop participants not otherwise represented in this book who made valued contributions to discussions at the workshop: Ge Wang, Gian-Marco Schmid, Jordi Janer, Jeff Gregorio, Sam Ferguson, Frédéric Bevilacqua, Edgar Berdahl and Mathieu Barthet. Finally, we would like to thank Helen Desmond at Springer.

References

  1. Angelis V, Holland S, Clayton M, Upton PJ (2013) Testing a computational model of rhythm perception using polyrhythmic stimuli. J New Music Res 42(1)
  2. Arom S (1991) African polyphony and polyrhythm: musical structure and methodology. Cambridge University Press, Cambridge
  3. Baecker R (1969) Picture driven animation. In: Proceedings of the AFIPS spring joint computer conference, vol 34, pp 273–288
  4. Balzano GJ (1980) The group-theoretic description of 12-fold and microtonal pitch systems. Comput Music J 4(4)
  5. Beaudouin-Lafon M (2000) Instrumental interaction: an interaction model for designing post-WIMP user interfaces. In: Proceedings of ACM CHI ’00, pp 446–453
  6. Blanchard C, Burgess S, Harvill Y, Lanier J, Lasko A, Oberman M, Teitel M (1990) Reality built for two: a virtual reality tool. ACM SIGGRAPH Comput Graph 24(2):35–36
  7. Bongers B (2000) Physical interfaces in the electronic arts. In: Wanderley MM, Battier M (eds) Trends in gestural control of music. IRCAM, Centre Pompidou, pp 41–70
  8. Bouwer A, Holland S, Dalgleish M (2013) The haptic bracelets: learning multi-limb rhythm skills from haptic stimuli while reading. In: Holland S, Wilkie K, Mulholland P, Seago A (eds) Music and human-computer interaction. Springer, London
  9. Brown DE (1991) Human universals. McGraw-Hill, New York
  10. Brown C, Paine G (2019) A case study in collaborative learning via DMIs for participatory music: interactive Tango Milonga. In: Holland S, Mudd T, Wilkie-McKenna K, McPherson A, Wanderley MM (eds) New directions in music and human-computer interaction. Springer, London
  11. Brown S, Martinez MJ, Parsons LM (2004) Passive music listening spontaneously engages limbic and paralimbic systems. NeuroReport 15:2033–2037
  12. Buxton W (2008) My vision isn’t my vision: making a career out of getting back to where I started. In: Erickson T, McDonald D (eds) HCI remixed: reflections on works that have influenced the HCI community. MIT Press, Cambridge, MA, pp 7–12
  13. Camci A, Cakmak C, Forbes AG (2019) Applying game mechanics to networked music HCI applications. In: Holland S, Mudd T, Wilkie-McKenna K, McPherson A, Wanderley MM (eds) New directions in music and human-computer interaction. Springer, London
  14. Church L, Nash C, Blackwell A (2010) Liveness in notation use: from music to programming. In: Proceedings of the Psychology of Programming Interest Group (PPIG 2010)
  15. Cross I (2016) The nature of music and its evolution. In: Hallam S, Cross I, Thaut M (eds) Oxford handbook of music psychology, 2nd edn. Oxford University Press, Oxford, pp 3–17
  16. Darrow A (2006) The role of music in deaf culture: deaf students’ perception of emotion in music. J Music Ther 43(1):2–15. https://doi.org/10.1093/jmt/43.1.2
  17. D’Errico F, Henshilwood C, Lawson G, Vanhaeren M, Tillier A-M, Soressi M et al (2003) Archaeological evidence for the emergence of language, symbolism, and music—an alternative multidisciplinary perspective. J World Prehistory 17(1):1–70
  18. Dourish P (2004) Where the action is: the foundations of embodied interaction. MIT Press
  19. Duchesne-Guillemin M (1963) Découverte d’une gamme babylonienne. Revue de Musicologie 49(126):3–17
  20. Engelbart DC, English WK (1968) A research center for augmenting human intellect. In: Proceedings of the December 9–11, 1968, fall joint computer conference, part I (AFIPS ’68). ACM, New York, pp 395–410
  21. Fallman D (2003) Design-oriented human-computer interaction. In: Proceedings of the SIGCHI conference on human factors in computing systems (CHI ’03). ACM, New York, pp 225–232. https://doi.org/10.1145/642611.642652
  22. Fenichel E (2002) The musical lives of babies and families. Zero to Three 23(1). National Center for Infants, Toddlers and Families, Washington, DC
  23. Fitch WT (2006) The biology and evolution of music: a comparative perspective. Cognition 100:173–215
  24. Garcia J, Tsandilas T, Agon C, Mackay W (2012) Interactive paper substrates to support musical creation. In: Proceedings of the SIGCHI conference on human factors in computing systems. ACM, pp 1825–1828
  25. Hamilton R (2019) Mediated musical interactions in virtual environments. In: Holland S, Mudd T, Wilkie-McKenna K, McPherson A, Wanderley MM (eds) New directions in music and human-computer interaction. Springer, London
  26. Hargreaves D, North A (1997) The social psychology of music. Oxford University Press, Oxford
  27. Harrison S, Tatar D, Sengers P (2007) The three paradigms of HCI. In: alt.chi session at the SIGCHI conference on human factors in computing systems, San Jose, CA, pp 1–18
  28. Hein E, Srinivasan S (2019) The Groove Pizza. In: Holland S, Mudd T, Wilkie-McKenna K, McPherson A, Wanderley MM (eds) New directions in music and human-computer interaction. Springer, London
  29. Hödl O, Kayali F, Fitzpatrick G, Holland S (2019) TMAP design cards for technology-mediated audience participation in live music. In: Holland S, Mudd T, Wilkie-McKenna K, McPherson A, Wanderley MM (eds) New directions in music and human-computer interaction. Springer, London
  30. Holland S (1989) Artificial intelligence, education and music. PhD thesis, The Open University, Milton Keynes, UK
  31. Holland S (2000) Artificial intelligence in music education: a critical review. In: Miranda ER (ed) Readings in music and artificial intelligence. Routledge, pp 239–274
  32. Holland S, Fiebrink R (2019) Machine learning, music and creativity: an interview with Rebecca Fiebrink. In: Holland S, Mudd T, Wilkie-McKenna K, McPherson A, Wanderley MM (eds) New directions in music and human-computer interaction. Springer, London
  33. Holland S, Wilkie K, Bouwer A, Dalgleish M, Mulholland P (2011) Whole body interaction in abstract domains. In: England D (ed) Whole body interaction. Human-computer interaction series. Springer, London
  34. Holland S, Wilkie K, Mulholland P, Seago A (2013) Music interaction: understanding music and human-computer interaction. In: Holland S, Wilkie K, Mulholland P, Seago A (eds) Music and human-computer interaction. Springer, London, pp 1–28
  35. Holland S, Wright RL, Wing A, Crevoisier T, Hödl O, Canelli M (2014) A gait rehabilitation pilot study using tactile cueing following hemiparetic stroke. In: Proceedings of the 8th international conference on pervasive computing technologies for healthcare (PervasiveHealth 2014), pp 402–405
  36. Holland S, McPherson AP, Mackay WE, Wanderley MM, Gurevich MD, Mudd TW, O’Modhrain S, Wilkie KL, Malloch J, Garcia J, Johnston A (2016) Music and HCI. In: Proceedings of the 2016 CHI conference extended abstracts on human factors in computing systems (CHI EA ’16). ACM, New York, pp 3339–3346
  37. Holland S, Bouwer A, Hödl O (2018) Haptics for the development of fundamental rhythm skills, including multi-limb coordination. In: Papetti S, Saitis C (eds) Musical haptics. Springer series on touch and haptic systems. Springer International Publishing, pp 215–237
  38. Honing H, Ladinig O, Winkler I, Háden G (2009) Is beat induction innate or learned? Probing emergent meter perception in adults and newborns using event-related brain potentials (ERP). Ann N Y Acad Sci 1169:93–96
  39. Hook J, Schofield G, Taylor R, Bartindale T, McCarthy J, Wright P (2012) Exploring HCI’s relationship with liveness. In: CHI ’12 extended abstracts on human factors in computing systems (CHI EA ’12). ACM, New York, pp 2771–2774
  40. Hunt A, Wanderley MM, Kirk R (2000) Towards a model for instrumental mapping in expert musical interaction. In: Proceedings of the international computer music conference (ICMC)
  41. Hutchinson H, Mackay W, Westerlund B, Bederson BB, Druin A, Plaisant C, Beaudouin-Lafon M et al (2003) Technology probes: inspiring design for and with families. In: Proceedings of the SIGCHI conference on human factors in computing systems. ACM, pp 17–24
  42. Justus T, Bharucha J (2002) Music perception and cognition. In: Stevens’ handbook of experimental psychology. Wiley, pp 453–492
  43. Koetsier T (2001) On the prehistory of programmable machines: musical automata, looms, calculators. Mech Mach Theory 36:589–603
  44. Lanier J (1989) Personal communication with Simon Holland and other attendees at the 1989 NATO advanced research workshop on multimedia interface design in education, Lucca, Italy
  45. Longuet-Higgins HC (1976) Perception of melodies. Nature 263:646–653
  46. Mackay WE (2000) Responding to cognitive overload: co-adaptation between users and technology. Intellectica 30(1):177–193
  47. Magnusson T (2010) Designing constraints: composing and performing with digital musical systems. Comput Music J 34(4):62–73
  48. Malloch J, Garcia J, Wanderley MM, Mackay WE, Beaudouin-Lafon M, Huot S (2019) A design workbench for interactive music systems. In: Holland S, Mudd T, Wilkie-McKenna K, McPherson A, Wanderley MM (eds) New directions in music and human-computer interaction. Springer, London
  49. Martin CP, Gardner H (2019) Free-improvised rehearsal-as-research for musical HCI. In: Holland S, Mudd T, Wilkie-McKenna K, McPherson A, Wanderley MM (eds) New directions in music and human-computer interaction. Springer, London
  50. McPherson A, Benford S (2019) Music, design and ethnography: an interview with Steve Benford. In: Holland S, Mudd T, Wilkie-McKenna K, McPherson A, Wanderley MM (eds) New directions in music and human-computer interaction. Springer, London
  51. McPherson A, Verplank B (2019) The poetry of strange connections: an interview with Bill Verplank. In: Holland S, Mudd T, Wilkie-McKenna K, McPherson A, Wanderley MM (eds) New directions in music and human-computer interaction. Springer, London
  52. McPherson A, Morreale F, Harrison J (2019) Musical instruments for novices: comparing NIME, HCI and crowdfunding approaches. In: Holland S, Mudd T, Wilkie-McKenna K, McPherson A, Wanderley MM (eds) New directions in music and human-computer interaction. Springer, London
  53. Mehr SA et al (2016) For 5-month-olds melodies are social. Psychol Sci 27:486–501
  54. Milne A (2019) XronoMorph: investigating paths through rhythmic space. In: Holland S, Mudd T, Wilkie-McKenna K, McPherson A, Wanderley MM (eds) New directions in music and human-computer interaction. Springer, London
  55. Milne AJ, Holland S (2016) Empirically testing Tonnetz, voice-leading, and spectral models of perceived triadic distance. J Math Music 10(1):59–85
  56. Milne AJ, Carlé M, Sethares WA, Noll T, Holland S (2011) Scratching the scale labyrinth. In: Proceedings of the international conference on mathematics and computation in music. Springer, Berlin, Heidelberg, pp 180–195
  57. Mithen S (2006) The ‘Singing Neanderthals’: the origins of music, language, mind and body. Camb Archaeol J 16:97–112
  58. Mudd T (2019) Material-oriented musical interactions. In: Holland S, Mudd T, Wilkie-McKenna K, McPherson A, Wanderley MM (eds) New directions in music and human-computer interaction. Springer, London
  59. Mudd T, Andersen K (2019) Making as research: an interview with Kristina Andersen. In: Holland S, Mudd T, Wilkie-McKenna K, McPherson A, Wanderley MM (eds) New directions in music and human-computer interaction. Springer, London
  60. NRC (1970) From Handel to Haydn to the headless musician. Science Dimension 2(3), June 1970. http://ieee.ca/millennium/electronic_music/em_headless.html
  61. NRC (1971) The music machine. 11-minute 16 mm film produced by the National Research Council of Canada. http://www.billbuxton.com
  62. Pennycook BW (1985) Computer-music interfaces: a survey. ACM Comput Surv 17(2):267–289
  63. Poupyrev I, Lyons MJ, Fels S, Blaine TB (2001) New interfaces for musical expression. Workshop proposal for SIGCHI 2001, Seattle, WA. http://www.nime.org/2001/docs/proposal.pdf
  64. Prechtl A, Milne AJ, Holland S, Laney R, Sharp DB (2009) A MIDI sequencer that widens access to the compositional possibilities of novel tunings. Comput Music J 36(1):42–54
  65. Rasmussen J (1986) Information processing and human-machine interaction: an approach to cognitive engineering. Elsevier Science, New York
  66. Salimpoor VN et al (2013) Interactions between the nucleus accumbens and auditory cortices predict music reward value. Science 340:216–219
  67. Standley JM (2011) Efficacy of music therapy for premature infants in the neonatal intensive care unit: a meta-analysis. Arch Dis Child Fetal Neonatal Ed 96:Fa52
  68. Sutherland IE (1964) Sketchpad: a man-machine graphical communication system. In: Proceedings of the SHARE design automation workshop. ACM, pp 6.329–6.346
  69. Tanaka A (2000) Musical performance practice on sensor-based instruments. In: Wanderley MM, Battier M (eds) Trends in gestural control of music. IRCAM, Centre Pompidou, pp 389–405
  70. Tanaka A (2019) Embodied musical interaction: body physiology, cross modality, and sonic experience. In: Holland S, Mudd T, Wilkie-McKenna K, McPherson A, Wanderley MM (eds) New directions in music and human-computer interaction. Springer, London
  71. Tanimoto SL (1990) VIVA: a visual language for image processing. J Vis Lang Comput 1(2):127–139
  72. Toussaint GT (2013) The geometry of musical rhythm: what makes a “good” rhythm good? CRC Press, Boca Raton
  73. Tuuri K, Parviainen J, Pirhonen A (2017) Who controls who? Embodied control within human–technology choreographies. Interact Comput 29(4):494–511
  74. Ungvary T, Vertegaal R (2000) Cognition and physicality in musical cyberinstruments. In: Wanderley MM, Battier M (eds) Trends in gestural control of music. IRCAM, Centre Pompidou, pp 371–386
  75. Vuilleumier P, Trost W (2015) Music and emotions: from enchantment to entrainment. Ann N Y Acad Sci 1337:212–222. https://doi.org/10.1111/nyas.12676
  76. Wallis I, Ingalls T, Campana E, Vuong C (2013) Amateur musicians, long-term engagement, and HCI. In: Holland S, Wilkie K, Mulholland P, Seago A (eds) Music and human-computer interaction. Springer, London, pp 49–66
  77. Wanderley MM, Battier M (eds) (2000) Trends in gestural control of music. IRCAM, Centre Pompidou
  78. Wanderley MM, Mackay W (2019) HCI, music and art: an interview with Wendy Mackay. In: Holland S, Mudd T, Wilkie-McKenna K, McPherson A, Wanderley MM (eds) New directions in music and human-computer interaction. Springer, London
  79. Wilkie K, Holland S, Mulholland P (2009) Evaluating musical software using conceptual metaphors. In: Blackwell A (ed) Proceedings of the 23rd British HCI group annual conference on people and computers, pp 232–237
  80. Wilkie K, Holland S, Mulholland P (2010) What can the language of musicians tell us about music interaction design? Comput Music J 34(4):34–48
  81. Xenakis I (1992) Formalized music: thought and mathematics in composition, 2nd revised English edn, translated with additional material by Sharon Kanach. Harmonologia series no 6. Pendragon Press, Stuyvesant, NY
  82. Yuksel BF, Oleson KB, Chang R, Jacob RJK (2019) Detecting and adapting to users’ cognitive and affective state to develop intelligent musical interfaces. In: Holland S, Mudd T, Wilkie-McKenna K, McPherson A, Wanderley MM (eds) New directions in music and human-computer interaction. Springer, London
  83. Zhang Y, Cai J, An L, Hui F, Ren T, Ma H et al (2017) Does music therapy enhance behavioral and cognitive function in elderly dementia patients? A systematic review and meta-analysis. Ageing Res Rev 35:1–11
  84. Zimmerman TG (1982) An optical flex sensor. US patent 4542291, VPL Research Inc, Redwood City, CA
  85. Zimmerman TG, Lanier J, Blanchard C, Bryson S, Harvill Y (1986) A hand gesture interface device. In: Carroll JM, Tanner PP (eds) Proceedings of the SIGCHI conference on human factors in computing systems (CHI ’87). ACM, New York, pp 189–192

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Simon Holland (1)
  • Tom Mudd (2)
  • Katie Wilkie-McKenna (3)
  • Andrew McPherson (4)
  • Marcelo M. Wanderley (5, 6)

  1. Music Computing Lab, Centre for Research in Computing, The Open University, Milton Keynes, UK
  2. Reid School of Music, University of Edinburgh, Edinburgh, UK
  3. Music Computing Lab, Centre for Research in Computing, The Open University, Milton Keynes, UK
  4. School of Electronic Engineering and Computer Science, Centre for Digital Music, Queen Mary University of London, London, UK
  5. Centre for Interdisciplinary Research in Music Media and Technology, McGill University, Montreal, Canada
  6. Inria Lille – Nord Europe, Villeneuve d’Ascq, France
