
1 Introduction

In 2019 the Coachella music festival featured an “interactive augmented reality (AR) stage,” at which audiences could use a mobile application [1] to enhance the experience with fantastical content that appeared superimposed on the Sahara venue when viewed through a camera [2,3,4]. Through the app, audiences could see “space objects like planets, asteroids, and stars” [5] within the performance space, providing digital visual content that complements the audience’s experience of the music. This is just one example of extended reality (XR) technologies being brought into live performance spaces for electronic dance music. XR is an umbrella term that encompasses augmented/mixed reality (AR/MR), virtual reality (VR), and, more broadly, associated technologies such as multi-projection environments, 360° projection systems, and fulldomes, all of which are increasingly used by VJs and digital artists to complement or visualize music at electronic dance music events.

The use of these technologies points toward a future in which public performance spaces for music are enhanced through immersive XR content. Music performance spaces are already immersive—sounds, lighting, and communal experiences of dance can all contribute toward generating powerful social experiences that are emotive and meaningful [6]. Yet while sound can fully engulf the audience as a spatial, aural experience, visual accompaniments can go further still in surrounding and immersing the audience. Through the use of XR, we can begin to conceive of music performances in which, eventually, a mesh of synthetic 3D graphics engulfs the performance space, creating hallucinatory computer graphics that visualize the music [7]. However, these technologies also generate a variety of new issues that must be considered through research.

The main purpose of this chapter, then, is to explore how audio-visual experiences of music can be designed in XR. Specifically, I will discuss compositional approaches that can be used to design these experiences. I will argue for an approach in which designs in XR provide visualizations of music that go beyond basic audio reactivity, embody similar forms of symbolism to those present in the music, and enhance the audience experience by reinforcing meaningful resonances with the music. Towards this aim, I will first review some examples of XR music visualizations and VJ performances, in order to evidence current activity in this field. I will then turn to consider my own practice-led research creating artistic works such as AR paintings, VJ performances, and a VR application: Cyberdream VR. The exploration of these artifacts, performances, and software applications will demonstrate compositional approaches for realizing music visualizations that cross boundaries from analog media such as paint, into the digital; and from digital projections into immersive forms of XR. Through the course of this chapter, I therefore aim to illuminate possible approaches and open new conversations about visualizing music in XR.

2 The Current State of the Art

Visual music is an established art form with considerable history dating back over a hundred years to the early color organs; works by artists such as Kandinsky; and later the films of artists such as Len Lye, Oskar Fischinger, John Whitney, and others [8]. The essential priority of visual music and related forms such as the psychedelic light show [9] is the representation of sound and music through complementary visual media. From the 1980s to the present, the rapid expansion and democratization of home computer and video technologies has seen significant growth in the related forms of music visualization, music video, and VJ performance [10]. Of course, there are many possible journeys through this expansive area of work, and it is beyond the scope of this chapter to provide an extended history. As explored in a recent panel discussion in which I participated at London South Bank University [11], some VJs connect their work with visual music, yet inspiration may also come from the wider sphere of music culture, motion graphics, film, and video. For our purposes here, it will be sufficient to acknowledge that music visualization is a rich and varied field, but also one that has been shaped significantly by new digital technologies over the past century. It therefore comes as no surprise that XR technologies are gradually being assimilated into the panoply of tools used for visualizing music, yet with this, they generate specific new affordances and concerns, which we will now discuss through a selection of examples.

2.1 Multi-projection VJ Performances

VJ performance typically involves an individual mixing live video as a complement to electronic dance music performances, in a manner approximately equivalent to that of the DJ [12]. In recent decades, VJ performance has developed to use multiple projections and video-mapping techniques, which allow video to be projected onto custom, irregular surfaces. At electronic dance music festivals such as Mo:Dem (Croatia, 2017), elaborate sculptures provide the stage design, onto which the VJ projects video-mapped visuals. Besides the main stage, smaller projection screens are also mounted in the trees (Fig. 12.1), complementing other aspects of the festival decor such as ultra-violet canopies. Here the VJ projections are audio-reactive, responding to the beat, but the designs also reflect otherworldly, alien, or shamanic symbolism, which complements the psychedelic themes of the music (as discussed in [13]).

Fig. 12.1 Psychedelic decor with VJ projections among the treetops at Mo:Dem festival, Croatia, 28 July 2017. Photo credit: J. Weinel

Along similar lines, Burning Man (USA, 2016) was one of many recent festivals to feature a fulldome theatre. Here, the fulldome provided an ad hoc movie theatre, which the audience could enter to view 360° films such as Samskara [14], a cinematic experience based on the concept of a psychedelic journey analogous to an LSD trip. The film incorporates electronic music and relates to the overall psychedelic theme and ethos of the festival. Elsewhere, video-mapped domes have also been used for VJ performances; for instance, United VJs provide performances [15] and workshops [16] specifically addressing techniques for VJing in fulldomes.

Besides fulldome, other custom multi-projection setups are often used. For instance, the Resolution [17] series of events in London has utilized various projection configurations to provide extended immersive visuals. The series presented Sim Hutchins, whose performances revisit 1990s rave nostalgia through combinations of electronic music and projected visuals, on a 270° projection system in the G05 venue [18]. Meanwhile, another event at G05 featured music by Bobby Tank with a VJ performance by L’Aubaine (Fig. 12.2, [19]), who performed on a 360° projection system.

Fig. 12.2 L’Aubaine performing visuals on a 6-screen projector setup at the Bobby Tank Oxygen EP release event, by Not Like That, G05, London, 2 February 2019. Photo credit: Laurie Bender (L’Aubaine)

2.2 Augmented Reality Companion Apps

Mobile apps are often used to provide companion experiences at music events; for instance, the Notting Hill Carnival mApp [20] provides useful features such as a map of the carnival’s sound systems. The Coachella app fundamentally fulfills a similar function, providing digital marketing and informative features; however, it also goes further with the incorporation of a ‘Coachella Camera’. This provides AR features similar to those of the popular app Snapchat, allowing the user to take selfies and photographs with AR enhancements. More interesting for our purposes here, however, is the capability for real-time (synchronous) experiences of location-specific AR content, such as the space-themed 3D imagery described in the opening of this chapter, which could be seen at the Sahara stage when viewed through the app. Here the space theme reinforces the identity of the Coachella festival, suggesting a fun, exotic festival experience that seductively indicates escape from the everyday.

Elsewhere, AR is also being used to complement or visualize music in other ways. The Kybalion album by Øresund Space Collective is accompanied by a mobile application created with artist Batuhan Bintaş. The application is activated by viewing the illustrations of the album artwork through a mobile device, bringing the artwork to life through AR/MR and VR computer graphics. Here the app allows the audience “to not only listen to the songs but also to learn about the Hermetic teachings of Thoth by interacting with the album artwork” [21]. In this regard it extends the conceptual universe suggested by the space rock music, effectively allowing the audience to enter the imaginative virtual world that the music describes. Bintaş sees the work as revitalizing the album cover as an artifact to be enjoyed alongside the music, and in this regard it could be understood as complementary to the vinyl revival [22]. However, this piece can also be considered a ‘cyberdelic’ (a portmanteau of ‘cyberculture’ and ‘psychedelics’) experience, using a term popularized in the 1990s by Timothy Leary [23] and now used by The Cyberdelics Society [24, 25], of which Bintaş is an affiliate. Kybalion fits the idea of cyberdelics because it utilizes the illusory capabilities of the technology to produce a digital art experience analogous to psychedelic hallucinations.

2.3 Music Visualizations for Mixed Reality Headsets

Although we have noted the Coachella AR app as an example of XR aimed at the dancefloor, mobile phones may be poorly suited as a means of enhancing immersion in the live experience of music; indeed, their use at concerts has proven contentious for audiences [26,27,28]. An alternative approach could be to use MR devices such as the Microsoft HoloLens or Magic Leap: holographic headsets that superimpose 3D content over the surrounding natural environment. MR headsets have already been used to enhance digital arts exhibitions by bringing animated visualizations to life. For example, the recent exhibition Leonardo da Vinci and Perpetual Motion: Visualising Impossible Machines [29] used tablet devices and a HoloLens to provide AR/MR 3D holograms, visualizing Leonardo da Vinci’s perpetual motion sketches as 3D computer graphics animations in the gallery setting.

This area remains relatively unexplored for music visualization, but there is some early work, such as Synesthesia, a HoloLens app that provides an MR music visualization experience based on the generation of audio-reactive graphics [30]. It is possible that technologies such as these could be brought into live music events. However, the obvious current limitation is that the devices are too expensive to be widely used in the music festival environment. Significantly, the field of view of these devices is also relatively narrow [31], which, combined with cumbersome headsets, would be likely to have a significant impact on immersion during a live music event. At the present time, MR music visualization is therefore unsuited to the music festival environment, though we could certainly see visualizations of music in smaller-scale settings such as gallery installations. As the technology improves and decreases in cost, it may become more feasible to organize larger-scale events where audiences use MR glasses to view live visualizations of music. Alternatively, technologies such as the Holo-Gauze [32] provide possibilities for projecting Pepper’s ghost holograms on invisible reflective materials, which can be viewed by audiences without headsets. A similar approach using Pepper’s ghost holograms has been utilized in visual music performances by Carl Emil Carlsen for his work with Silicum [33], and it is possible that future technologies using approaches such as these may provide other ways to introduce MR into the dancefloor context.

2.4 Music Visualizations in Virtual Reality

VR is also gradually being incorporated at music festivals. For example, Psych-Fi [34] provides immersive experiences at music festivals such as Boomtown Fair and the Sci-Fi London film festival. At the 2016 edition of the Liverpool International Festival of Psychedelia, their app Dioynsia was included in the PZYK Gallery, a multisensory arts installation designed to complement the festival experience [35]. Dioynsia provides a short journey into a hallucinatory landscape in VR, thereby realizing the idea of a psychedelic trip through synthetic computer graphics and sound [36].

Elsewhere, L’Aubaine has also created work that translates the aesthetics of her VJ performances into a 360° narrative piece, 360 Life #1, which explores “introspection versus outrospection and the boundary between reality and surreality” [37]. Works of this type can be situated in the gallery context, but are also sometimes featured at VJ events; for instance, Fathomable’s Gnosis [38] is a cyberpunk VR experience created by VJ Rybyk, which was featured at VJ London’s AV Depot event [39]. There are also various commercial VR experiences of music that are relevant to consider in this section. For example, Fantasynth [40] is a short VR experience providing a journey through a landscape of audio-reactive graphics, while The Wave VR [41, 42] and Amplify VR [43] are platforms that aim to provide music video or virtual concert experiences in VR. There are also bespoke music-related VR experiences, such as Fabulous wonder.land VR [44], based on wonder.land, a National Theatre musical created by Damon Albarn, Moira Buffini, and Rufus Norris.

2.5 Discussion

This section has outlined a selection of examples where XR technologies have been used to visualize music. It is significant to note that similar XR technologies underpin many of these productions, since 360° production software and video game engines such as Unity can be used to adapt XR experiences across multiple platforms. Thus, in some cases, where a work is designed for VR, it can be realized for other formats such as AR/MR or fulldome projection. The fluidity between these platforms is one reason why it is pragmatic for our discussion here to look at the bigger picture regarding the use of XR for visualizing music. Yet in considering various types of XR, we also find that these technologies afford different forms of audience experience. In some cases, multi-projection environments may be used to provide VJ experiences in the electronic dance music context, extending the lightshow. Yet in festival environments, XR technologies such as AR/MR and VR are finding new contexts, such as their use to provide ‘side-shows’ that audience members may experience between the main acts.

It is perhaps worthwhile to consider how XR technologies may impact on dancefloor immersion. Dancefloor immersion is often characterized as arising from the experience of losing oneself in communal experiences of dance and music [45]. Yet Rietveld [46] argues that electronic dance music culture has undergone a shift from dimly lit nightclubs and warehouse parties, toward visual spectacle, which can be associated with the power structures of consumer capitalism. In her argument, elaborate stage designs enhance the visual and redirect the gaze of dancers away from each other, toward all-powerful ‘superstar DJs’. For Rietveld then, enhancing the visual spectacle of events may lead to a negative effect on collective experiences of dancefloor immersion. From this perspective, XR clearly carries a risk. When we consider examples such as the Coachella app, audiences are encouraged to direct their gaze to a mobile device, shifting activity away from the dancefloor toward social media interaction and the narcissistic taking of ‘selfies’. The negative impact of mobile phones on dancefloor immersion is already recognized in popular music press—for instance, a recent article in DJ Mag [47] even calls for a ban on mobile phones on the dancefloor, referring to academic research by Henkel [48] that suggests a ‘photo-taking-impairment’ effect on memory, underscoring the idea that such activities may reduce the presence and immersion of individuals in real-world contexts.

However, XR may also be capable of immersing the participant in other virtual spaces that are distinct from the dancefloor [49]. Where VR ‘side shows’ are provided at festivals, these may provide virtual spaces that relate to the themes and symbolic meaning of the event as a whole. In St. John’s [50] discussion, festivals and raves provide liminal spaces of physical and social activity that are removed from the everyday. While the dancefloor experience may be of critical importance for these events, activity in these spaces is diffuse and encompasses multisensory experiences in which aspects such as clothing, costumes, and conversation are also significant. Here, XR may be complementary, since condensed experiences of digital content away from the dancefloor may stimulate conversation, reinforcing the meaning of, and immersion in, the event as a whole.

3 Case Study: Projects Visualizing Music in Extended Reality

Having outlined various examples related to the visualization of music in XR, I will now turn to consider my own practice-led research in this area, which includes work across the areas of AR painting, VJ performance, and a VR application, Cyberdream VR. Notably, some earlier iterations of this work were presented and discussed in the Carbon Meets Silicon exhibitions at Wrexham Glyndŵr University (2015, 2017). Beginning with a brief outline of earlier artistic works, in this section I will provide a personal journey through my creative practice, in order to demonstrate how the work visualizes music in different ways across a variety of media, eventually moving into XR territories. In doing so, I aim to illuminate some possible compositional strategies for visualizing music in XR.

3.1 Background

My background is in electronic music and visual arts, and was significantly developed through my Ph.D. [51], completed at the Keele University music studios. Here my work focused on the composition of electroacoustic music based on altered states of consciousness. In summary, this work seeks to design music that is analogous to the form and structure of psychedelic hallucinations, through corresponding sonic materials and structures. This resulted in a series of compositions that were released on the Entoptic Phenomena in Audio 12” vinyl [52]; software tools that were used to realize these compositions; and an audiovisual composition entitled Tiny Jungle [53, 54]. These works organize sounds, and (for audiovisual works) visual materials, in order to construct experiences analogous to hallucinatory journeys [55].

I later extended these ideas through further audiovisual compositions: Mezcal Animations [56], Cenote Zaci [57], and Cenote Sagrado [58] are three fixed-media visual music films that seek to provide synaesthetic experiences of electronic music and abstract visuals, based on the concept of altered states of consciousness. These were widely performed at international festivals for electronic music and visual music, such as the International Computer Music Conference [59], Seeing Sound [60, 61], and others, and were included in Technoshamanic Visions from the Underworld [62], a loop of collected audiovisual works presented at the first Carbon Meets Silicon exhibition. Notably, these works use the technique of ‘direct animation’, where 8 mm film is hand-painted, projected, and digitized, and then combined with other materials such as stop-motion animation and computer graphics. Around this time at Wrexham Glyndŵr University, I also created Quake Delirium [63, 64], a video game modification that seeks to represent a hallucinatory state in the form of an interactive video game; and Psych Dome, an interactive installation for mobile fulldome, in which participants wear an EEG headset that captures brainwaves, which in turn affect parameters of an audio-visualization based on the visual patterns of hallucination seen during altered states of consciousness [65].

3.2 Augmented Reality Paintings

While working at Aalborg University in Denmark, I created several new paintings, which explored similar ideas to my earlier work [66]. For example, Vortex (2017) is based on the concept of visual patterns of hallucination, providing a funnel image related to Klüver’s [67] ‘form constants’ (honeycomb, cobweb, funnel, and spiral forms seen during hallucinations). Alongside this work, I also began working with the creative coding environment Processing, designing motion graphics sketches related to altered states, while also drawing influence from demoscene art [68] and related VJ mixes [69]. I began experimenting with mixing video live to music using the VJ software VDMX, combining direct animation with materials created in Processing and footage made using other techniques such as stop-motion animation. This resulted in Technoshamanic Visions from the Underworld II [70], a pre-recorded video loop created by mixing video live to music by the Japanese psychedelic rock band Hibushibire, which was presented at the Carbon Meets Silicon II exhibition at Wrexham Glyndŵr University. The exhibition also featured Vortex and several of my other paintings, which are essentially companion pieces that test similar visual ideas to those I explore in the videos.
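By way of illustration, the short Processing sketch below gives a sense of this kind of motion graphics experiment. It is a simplified, hypothetical example rather than one of the original sketches: rings of dots stream outward from a vanishing point while rotating, loosely evoking the ‘funnel’ among Klüver’s form constants.

```processing
// A simplified sketch of a Klüver-style 'funnel': rings of dots
// stream outward from a vanishing point while slowly rotating.
float t = 0;

void setup() {
  size(800, 800);
  noStroke();
}

void draw() {
  background(0);
  translate(width / 2, height / 2);
  int rings = 24;
  int dotsPerRing = 36;
  for (int r = 0; r < rings; r++) {
    // Each ring drifts outward over time, wrapping back to the centre,
    // so the viewer appears to be travelling down the funnel.
    float depth = (r / (float) rings + t * 0.1) % 1.0;
    float radius = pow(depth, 2) * width * 0.7;  // perspective-like spacing
    float c = 128 + 127 * sin(t + r * 0.5);      // slow color cycling
    fill(c, 255 - c, 200);
    for (int i = 0; i < dotsPerRing; i++) {
      float angle = TWO_PI * i / dotsPerRing + t * (0.2 + depth);
      ellipse(radius * cos(angle), radius * sin(angle),
              2 + 6 * depth, 2 + 6 * depth);
    }
  }
  t += 0.02;
}
```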

Continuing to explore both painting and audiovisual composition in parallel, I created a series of works that interpret music through synaesthetic, psychedelic forms of visual art. Technically, these works explore the use of flow techniques, airbrushing, and digitally cut stencils. They also incorporate aesthetic influences from music: the artwork of L.A. punk bands such as Excel [71] and Suicidal Tendencies [72], and hip-hop via the artist Rammellzee [73]. For example, Trip at the Brain (2017) interprets a Suicidal Tendencies song of the same name as a pen sketch, which was converted into a digitally cut stencil and rendered in airbrush. 31 Seconds (2017) incorporates airbrushed lettering referencing a sample from Origin Unknown’s jungle track ‘Valley of Shadows’ [74], and uses acrylic flow techniques and patterns that reference the designs of the rave collective Spiral Tribe. Bug Powder Dust (2017) references the Bomb the Bass featuring Justin Warfield song ‘Bug Powder Dust’ [75], which is based on the William S. Burroughs novel The Naked Lunch [76]. For the latter piece, the painting uses airbrushed skeleton stencils reminiscent of Burroughs’s ‘shotgun paintings’ to provide a form of visual quotation (or ‘sampling’, to use a music production metaphor) [77].

In many cases these paintings were created alongside the VJ work and vice versa, developing similar aesthetic ideas and symbolism across both forms. I began carrying out some initial experiments that integrate these practices, by video-mapping my VJ content onto the paintings, thereby providing visual art with moving elements. Later, I created a series of three paintings that link the practices of painting and VJing by incorporating printed stills from my VJ work as collage elements: Enter Soundcat (2017), Soundcat S-101 (2017), and Soundcat 2000 (2017). These paintings were later developed through the use of an AR app, which brings the still images to life as VJ animations when the paintings are viewed through a mobile device, thereby reinserting the moving image elements into the paintings [78]. The AR paintings provide symbolic interpretations of sound and music, utilizing XR to link the physical media of paint with computer-generated motion graphics.

3.3 VJ Performances

My exploration of VJ performance began with improvisational jamming, in which video loops created with direct animation, stop-motion animation, and computer graphics techniques were mixed live to various kinds of music, including psychedelic rock and electronic dance music DJ mixes. This allowed me to experiment with different combinations of sounds and images. I eventually formulated this work into a live DJ/VJ performance under the alias Soundcat [79], which consisted of a DJ set of breakbeat music from the 1990s rave era and beyond (e.g., [80]). In 2018 this was performed as part of audiovisual concerts for VJ London at New River Studios, London (Fig. 12.3, [81]), and at a concert held at Tŷ Pawb arts centre as part of the ACM Audio Mostly conference in Wrexham [82].

Fig. 12.3 Soundcat DJ/VJ performance at VJ London, New River Studios, July 2018. Photo credit: Laurie Bender (L’Aubaine)

The visual materials for the Soundcat performance are based on my previous explorations of psychedelia, while also drawing on graphics inspired by 1990s VJ mixes and demoscene graphics [83,84,85]. I incorporate 3D tunnel effects and geometric animations; 3D scenes reminiscent of the ‘cyberdelic’ science-fiction landscapes seen on fliers for mega-raves such as Fantazia or Dreamscape; dancing 3D figures; scrolling patterns referencing acid house culture through smiley faces; détournements of the London Underground and Intel Inside logos; and other Discordian [86] or absurdist imagery. Branching out into the area of ‘video music’ (in which video samples or loops are used to create music, as exemplified by artists such as Addictive TV, Coldcut, or Eclectic Method), one section remixes video trailers from the Planet of the Apes films to match the samples used in a track by Unkle (‘Ape Shall Never Kill Ape’ [87]). During this period I also became interested in vaporwave [88], an Internet music subculture that provides a surrealistic or hyperreal interpretation of 1990s computer graphics and techno-utopian culture, and some sections incorporate symbolic references to these forms through the use of computer software user interfaces and related symbols and designs.

Using these materials, I created original music videos for all of the tracks that I wanted to include in the VJ mix. These videos were mixed live in VDMX using audio-reactive effects, layering, and synchronized looping techniques, all of which were manipulated in real time using a MIDI controller (a Korg NanoKontrol). For some sections, I used an Akai MPC Studio to rhythmically trigger video clips live in synchronization with the music by improvising with the percussion pads. For a section based on Equinox ‘Acid Rain V.I.P. (Breakage Final Chapter Mix)’ [89], I used the tracker music sequencer Renoise to program a MIDI sequence in synchronization with the drum track, which was then used to trigger closely synchronized 3D graphics within VDMX. For each song, I created a different VJ mix, which was performed live in the studio and recorded using a Blackmagic HyperDeck Shuttle. In some cases, further video overdubs were carried out in order to provide additional layering of visuals. This process resulted in a collection of original music videos, one for each track, which could then be used to create the final DJ/VJ mixes.
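The principle of triggering visuals from MIDI events can be illustrated with a minimal Processing sketch. The example below is a hypothetical stand-in for the VDMX/MPC setup described above, assuming theMidiBus library is installed and a MIDI input device (such as percussion pads or a sequenced track from Renoise) is available; each incoming note flashes a different generative pattern in time with the music.

```processing
// A minimal sketch of MIDI-triggered visuals, assuming theMidiBus
// library and an available MIDI input (e.g., pads or a sequencer).
import themidibus.*;

MidiBus bus;
int currentPattern = 0;  // which visual to flash
int flashFrames = 0;     // frames remaining for the current flash

void setup() {
  size(800, 600);
  MidiBus.list();                 // list MIDI devices in the console
  bus = new MidiBus(this, 0, 0);  // adjust device indices to your setup
}

// Called by theMidiBus whenever a note-on message arrives.
void noteOn(int channel, int pitch, int velocity) {
  currentPattern = pitch % 3;       // map pitches to three patterns
  flashFrames = 4 + velocity / 16;  // harder hits flash for longer
}

void draw() {
  background(0);
  if (flashFrames <= 0) return;
  flashFrames--;
  translate(width / 2, height / 2);
  if (currentPattern == 0) {         // expanding ring
    noFill();
    stroke(255);
    strokeWeight(8);
    ellipse(0, 0, (12 - flashFrames) * 40, (12 - flashFrames) * 40);
  } else if (currentPattern == 1) {  // strobing bars
    noStroke();
    fill(0, 255, 200);
    for (int i = -4; i <= 4; i++) rect(i * 90 - 20, -height / 2, 40, height);
  } else {                           // spinning triangle
    noFill();
    stroke(255, 0, 128);
    strokeWeight(6);
    rotate(frameCount * 0.2);
    triangle(-120, 100, 120, 100, 0, -120);
  }
}
```

In the performances themselves this role was fulfilled by VDMX’s MIDI mapping and the MPC’s pads, but the underlying note-to-visual logic is the same.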

The final live performances were created using the DJ software Serato Scratch, the MixEmergency video plugin, an Akai AMX mixer, and the Akai MPC Studio. This allowed the music videos to be mixed in the same way a DJ mix would usually be created, where changes to pitch can be made to synchronize the beat and blend between music tracks. Visually, further composites were created as the tracks were blended, which could also be manipulated with effects in MixEmergency (for example, linking EQ adjustments to color contrast). In addition, I used the percussion pads of the Akai MPC Studio to trigger ‘one shot’ audiovisual materials, which were layered as composites with the video mix. The resulting DJ/VJ mix has some limitations, in that various aspects of the visuals are pre-recorded; however, by carrying out intensive work beforehand (in both artistic and computational terms), the approach allows for a highly varied and efficient end result. This approach also prioritizes the mixing of audio and the overall structure of the DJ mix as the focus of effort during live performances, which is an appropriate strategy for solo performances where the music takes priority and the visuals are complementary.
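The broader principle at work here, mapping features of the audio signal to visual parameters, can likewise be sketched in Processing. The example below is an illustration of the general technique rather than MixEmergency’s implementation, assuming the Minim library and a live audio input: an FFT measures low-frequency energy, which is mapped to the brightness and weight of a simple pattern so that kick drums visibly punch the image.

```processing
// A sketch of audio-reactive visuals, assuming the Minim library:
// low-frequency energy from a live input drives image contrast.
import ddf.minim.*;
import ddf.minim.analysis.*;

Minim minim;
AudioInput in;
FFT fft;
float level = 0;  // smoothed low-band energy

void setup() {
  size(800, 600);
  minim = new Minim(this);
  in = minim.getLineIn(Minim.STEREO, 1024);
  fft = new FFT(in.bufferSize(), in.sampleRate());
}

void draw() {
  fft.forward(in.mix);

  // Average the lowest FFT bands (roughly the kick-drum range)
  // and smooth the result so the visuals pulse rather than flicker.
  float bass = 0;
  for (int i = 0; i < 8; i++) bass += fft.getBand(i);
  level = lerp(level, bass / 8.0, 0.3);

  // Map the band energy to brightness, like tying an EQ to contrast.
  float bright = constrain(map(level, 0, 10, 30, 255), 30, 255);
  background(0);
  stroke(bright);
  strokeWeight(1 + level);
  noFill();
  for (int r = 40; r < 400; r += 40) {
    ellipse(width / 2, height / 2, r + level * 10, r + level * 10);
  }
}
```

Smoothing the band energy with lerp() is a deliberate choice: it makes the visuals pulse with the kick rather than flicker with every frame of the analysis.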

3.4 Cyberdream VR

Cyberdream VR is a recent project that extends many of the approaches discussed above, across electronic music, painting, and VJing, into XR in the form of VR. The project is based on the concept of a hallucinatory journey through the broken techno-utopias of cyberspace, providing a surrealistic world of psychedelic rave visuals and vaporwave music. Cyberdream VR was created for the Samsung Gear VR, was adapted for a cardboard VR version, and has been shown at various events including Cyberdelic Incubator Melbourne [90], the Sci-Fi London festival [91], and MIND: Past, Present + Future/Cyberdelics/Remote Viewing [92].

The experience provides a short ‘fly-through’ (approximately 5 min long), consisting of a series of scenes. The menu screen is based on the flier for the Fantazia NYE 1991/1992 rave event, presenting a surrealistic virtual landscape with a large face suspended over it. Upon entering Cyberdream VR, the viewer flies across a chequerboard bridge surrounded by statues of strange creatures, entering a fractal structure based on the Sierpinski triangle (Fig. 12.4). Next, we travel over a vast infinity pool with broken mannequin heads floating in it. Following this, the viewer is suspended in a large room with airbrushed walls, which were created by digitally scanning paintings made with a real airbrush. In this room, an animated effect creates vortex patterns based on visual patterns of hallucination.

In the next sequence we fly through a virtual sky bombarded with pop-up windows: in a pastiche of the John Carpenter movie They Live [93], the spam adverts are revealed as signals of capitalist control. The next scene depicts a virtual chequerboard island on which Atari ST cursors (pixel art rendered as 3D graphics) bounce manically or lie derelict among Grecian statues suggestive of techno-utopianism (also a vaporwave trope, see [94, 95]). After this, the next two rooms consist of cycling waves of brightly colored cubes with oscillating color patterns and sizes. These are based on the classic ‘plasmas’ of demoscene videos [96], which generate fluid animations using oscillating patterns; here the technique is translated into 3D, giving an impression of being inside the pixels of a computer monitor, while also subtly referencing the design of the Windows 95 logo. Following this, the viewer enters another airbrushed room (again, created using digitized hand-painted art), in which spherical objects move in Lissajous figures.

The final scene consists of a dark, chaotic room with bouncing stroboscopic arrows, and the text “the future is lost, crash the system, back to the tribes.” This message is a comment on the loss of the techno-utopian futures once imagined by cyberculture [97, 98], while calling for a dissolution and ecstatic recombination of these digital structures. The comment ‘back to the tribes’ also playfully hints at the idea of ‘technoshamanism’ (the use of technologies to access shamanic forms of experience, see [99]) and references free-party rave culture (e.g., Spiral Tribe).
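To indicate how the plasma and Lissajous effects work, the Processing (P3D) sketch below offers a simplified reconstruction of the two techniques, not the code of Cyberdream VR itself: a grid of cubes whose colors, sizes, and depths oscillate with summed sine waves over position and time, in the manner of a demoscene plasma, together with a sphere tracing a Lissajous figure.

```processing
// A simplified reconstruction of two effects described above: a 3D
// 'plasma' of oscillating cubes, and a sphere on a Lissajous path.
float t = 0;

void setup() {
  size(800, 600, P3D);
  noStroke();
}

void draw() {
  background(0);
  lights();
  translate(width / 2, height / 2, -300);

  // Classic plasma: each cell's value is a sum of sine waves over
  // position and time; here it drives color, size, and depth.
  int grid = 12;
  float spacing = 60;
  for (int x = 0; x < grid; x++) {
    for (int y = 0; y < grid; y++) {
      float v = sin(x * 0.5 + t) + sin(y * 0.3 + t * 1.3)
              + sin((x + y) * 0.25 + t * 0.7);  // roughly -3..3
      pushMatrix();
      translate((x - grid / 2) * spacing, (y - grid / 2) * spacing, v * 40);
      fill(128 + 127 * sin(v + t), 128 + 127 * sin(v * 2), 200);
      box(map(v, -3, 3, 8, 48));  // cube size oscillates with the field
      popMatrix();
    }
  }

  // A Lissajous figure: sinusoidal motion with a different
  // frequency (and phase) on each axis.
  pushMatrix();
  translate(300 * sin(3 * t), 200 * sin(2 * t + HALF_PI), 100 * sin(t));
  fill(255);
  sphere(25);
  popMatrix();

  t += 0.02;
}
```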

Fig. 12.4 Still image from Cyberdream VR. Image: J. Weinel (2019)

The soundtrack for Cyberdream VR includes short pieces of acid house, hardcore rave, and vaporwave music. Just as the scenes of the project are essentially artistic sketches, these pieces of music are audio sketches. Thematically, both the visuals and audio relate to and reinforce the overall concept of the piece. Rave music suggests the futuristic aspects of hardcore techno [100], while the vaporwave clips use a plunderphonic approach (music made from existing audio recordings, see [101]), slowing down imperfect loops of corporate library music intended to enhance productivity in the workplace. In this regard, the piece sonically mirrors the visuals through combinations of symbolic elements from psychedelic rave cyberculture and corporate computer culture. The overall result aims to elicit a broken, hallucinatory vision of the techno-utopianism of cyberculture, revealing the artificiality or hyperreality of these visions, while also hinting at the ecstatic possibility in the dissolution of these structures. Described another way, the piece takes Douglas Rushkoff’s [102] concept of the Internet as a hallucination, and attempts to visualize that hallucination as a synaesthetic XR experience that allows the viewer to enter into the virtual worlds suggested by rave culture and vaporwave music.
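The slowed-loop technique can also be sketched in code. The Processing example below is a hypothetical illustration, assuming the Minim library and a file named loop.wav placed in the sketch’s data folder: a FilePlayer is patched through a TickRate unit generator so that the loop plays back below its original speed, lowering its pitch in the vaporwave manner.

```processing
// A sketch of the slowed-loop (plunderphonic) technique, assuming the
// Minim library and a file 'loop.wav' in the sketch's data folder.
import ddf.minim.*;
import ddf.minim.ugens.*;

Minim minim;
FilePlayer filePlayer;
TickRate rate;
AudioOutput out;

void setup() {
  size(400, 200);
  minim = new Minim(this);
  out = minim.getLineOut();
  filePlayer = new FilePlayer(minim.loadFileStream("loop.wav"));
  rate = new TickRate(0.7f);          // 70% speed, lowering the pitch too
  rate.setInterpolation(true);        // smooth the resampled audio
  filePlayer.patch(rate).patch(out);  // file -> rate change -> speakers
  filePlayer.loop();                  // repeat the (imperfect) loop endlessly
}

void draw() {
  background(0);
  stroke(255);
  // Trace the slowed output waveform across the window.
  for (int i = 0; i < width - 1; i++) {
    line(i, 100 + out.left.get(i) * 80, i + 1, 100 + out.left.get(i + 1) * 80);
  }
}
```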

3.5 Discussion

The work I have outlined in this section spans over a decade of creative practice related to the concept of altered states of consciousness. In different ways, these works represent hallucinations and synaesthetic experiences of sound and music through combinations of sound and visual art. One of the distinguishing features of this work is that it prioritizes the visualization of music by focusing on the symbols and conceptual meanings that are suggested by music, rather than on the physical properties of acoustic soundwaves through audio reactivity (though some parts of the work do also include audio-reactive or closely synchronized elements). In this regard, the work follows Dannenberg’s view that music visualizers based primarily on audio reactivity may be relatively uninteresting, because they render only simplistic, readily apparent features of sound. Instead, he argues that composers should “make connections between deep compositional structure and images” [103]. With my own work, I interpret this ‘deep structure’ at a conceptual, symbolic level, where the visualization becomes a means to unlock the imaginary spatial environments and visual associations suggested by the music. While these symbolic visualizations can be realized with static visual art, animated visuals provide a way for these to be realized as time-based audiovisual media. XR technologies then provide a way to extend this idea further still, by providing spatial portals into these synaesthetic virtual worlds.

4 Conclusions

The first half of this chapter explored various examples where XR technologies are being used to visualize music. Through this discussion, we saw how XR provides new possibilities for constructing immersive visual experiences that complement music. These may extend the idea of the concert lightshow, or provide complementary ‘side’ experiences that reinforce the cultural meaning of these events. The latter half of the chapter then discussed my own practice-led research, creating AR paintings, VJ performances, and a VR experience related to music and altered states of consciousness. These works broadly seek to elicit synaesthetic experiences of sound and music through various forms of visualization. Through the exploration of these works, I have demonstrated some possible approaches for visualizing music using XR, and I also hope to have shown that XR technologies need not be approached as novelty gadgets, but rather as a means through which to extend fundamental artistic concepts for visualizing music. The approach that I have emphasized here is one in which XR visualizations do not lean heavily on audio reactivity, but rather seek to access deeper symbolic meanings, in order to manifest the imaginative worlds suggested by music as synaesthetic, immersive 3D spaces. The unique potential of XR is to go through the portals into music that visual music paintings, films, and VJing have so tantalizingly provided in the past. Now, it is possible for the listener to enter into the music as an audiovisual space, and to feel as if they are inside the visual worlds suggested by the music, or for the visual forms of music to spill out into the concert hall or living room. Whether these technologies are used to visualize the psychedelic music, rave music, and vaporwave discussed here, or other genres, their potential is to radically transform the way we experience music.