
In our contemporary society, digital media dominates our lives, and it is well understood that technology and media are primary tools for influencing culture and facilitating social change. The beginnings of what would become known as the metaverse started to emerge in the 1980s with the arrival of consumer computers such as the Atari and the Amiga. While the term Metaverse itself was coined by science fiction writer Neal Stephenson in his 1992 novel Snow Crash, computer systems of the early 1990s had already pioneered ways for users to interact in computer-generated virtual worlds. Early examples of social interaction in these virtual worlds were often found in arcades, where people played computer games together on terminals. With the ubiquity of the Internet, gamers quickly found ways to create parallel lives in a digital paradigm. These connected virtual worlds have allowed millions of people around the world to play in simulations and story worlds together with others while physically remaining at home. Technologies have evolved rapidly, from the early Nintendo consoles and PlayStations to Oculus head-mounted displays for virtual reality (VR) and smartphone apps for augmented reality (AR), to the latest in mixed reality (MR/XR) with the Microsoft HoloLens.

Each of these technologies offers excellent examples of how storytelling can be incorporated into the user experience as a powerful tool for enhanced entertainment and engagement and, in more recent decades, for informal education and social change, such as the serious games movement that began in the late 1990s (Ritterfeld et al., 2009). America’s Army (2002) is an early example. While this game was designed by the United States Army as a popular first-person shooter, it also had real-world content built in. At first, it was designed solely to engage users in the hope of recruiting them and to offset the need for subscriptions, but it later became a tool for the actual training of new recruits (Shen et al., 2009). A well-known example of a game designed to change social perspectives is Darfur Is Dying (2006), a web-based Flash game designed to raise awareness of the genocide taking place in Darfur, with the aim of eliciting empathy and motivating players to help stop the crisis (Peng et al., 2010). Another example is BREAKAWAY (2010), which uses character role modeling in a story-driven, scenario-based soccer game to address the complex issue of violence against women and girls. The game employs a branching narrative structure that facilitates self-reflection as it highlights the behaviors that lead to gender inequity and bullying (Wang et al., 2019). Games for Change (2022) announced a national challenge, supported by NOAA, on student game development about climate change. There is an increasing number of commercial games addressing the climate crisis, and what they all have in common is that players play on their own discrete personal devices. They are separated by geography and can participate in the Metaverse only virtually, without any social interaction in a shared physical space.

Also in the 1990s, the same decade in which the term metaverse was coined, I-MMERSiON (Immersion Studios Inc.) was launched. Inspired by the cinematic innovations of IMAX, our dream was to expand audience engagement with and understanding of the natural world by combining state-of-the-art digital technologies with large-scale, real-time social interaction, all in one physical space. The idea was to create an immersive cinematic experience in which audiences could participate in a self-directed exploration, alongside many others simultaneously, through various narrative elements and gamification mechanisms. While loosely based on the analog “Choose Your Own Adventure” books popularized in the late 1970s, our ambition was to design a solution that would allow large audiences to be emotionally immersed in a story world and to affect the learning outcomes of that narrative experience through mediated and nonmediated social interactions in real time.

The earliest example of this approach was the “Immersion Theater,” which opened in 2000 and was installed for 2 years at the Smithsonian Institution’s National Museum of Natural History. The experience we created, and later called “Immersion Reality,” utilized a large-format 220° wrap-around immersive screen combined with an interactive touchscreen terminal at every seat, allowing each audience member to explore the subject, gain new insights, and, by either competing or voting, influence the outcome of the story. At I-MMERSiON we called this combination of game and movie a “govie,” and it seemed logical that this format would be an ideal platform for the entertainment industry.

As one considers the growing enthusiasm for social media and digital games, the most obvious difference an Immersion Theater offered was the physical social engagement of the audience. The Immersion Theater offered compelling immersive cinematics along with shared physical presence and dialog as the players interacted with each other to collectively influence the outcome of the story. The opportunity to connect in a physical setting while exploring the Metaverse was a unique and attractive feature for museums worldwide interested in developing compelling digital experiences. By sharing several exemplary projects in this chapter, we hope to demonstrate how combining interactive storytelling in immersive environments with simulation, games, telepresence, social learning, and govies can provide mass audiences with a deeply meaningful and transformative learning experience about the natural world and help inspire climate action.

Interactive storytelling is a technique that experience designers use to create a narrative world and weave original content through it, sharing information with users in an engaging manner through elements such as real scientific problems, instantaneous feedback, and data aggregation via emerging technologies (Hoguet, 2014). By design, individual users will not have the same experience in the same sequence; they must make their own choices along the way as they explore new content to solve a problem or overcome a challenge with other users. Immersive environments are computer-generated spaces that allow users to have a sensory experience of physical immersion with an awareness of one’s self and often the presence of others. Three popular types of immersive environments are massively multiplayer online role-playing games (MMORPGs) such as World of Warcraft, multiuser simulated virtual worlds such as Second Life, and cave automatic virtual environments (CAVEs) with projections on large surrounding screens, such as those at Ars Electronica (Lui, 2014). CAVEs may be the least known of these three types, but they have been around for decades and have become more accessible to the public through recent exhibits such as the Van Gogh: Immersive Experience and Immersive Klimt. The advantage of CAVEs lies in the immersive experience taking place in a shared physical space that allows users to engage in meaningful dialogs about the content in real time, which is critical for deep learning.

Simulation: Storm Over Stellwagen

The growing popularity of museums around the world meant they were constantly searching for meaningful ways to better communicate contemporary science to the public. As the content they wished to share grew more complex, the demand for new ways to entertain and educate the public became critical. This led to the first series of Immersion Cinema experiments in social interaction. We were looking for evidence that interactivity, used as a control element within a storyline, would generate a social dialog that engaged the audience more deeply in the meaning of the content.

The first large-scale test of this approach was a collaboration between the US National Oceanic and Atmospheric Administration (NOAA) and the New England Aquarium in Boston, which engaged I-MMERSiON to produce a show titled Storm Over Stellwagen (1999). The pressing issues these groups wanted to highlight were the impacts of climate change and human behavior on a particularly sensitive marine sanctuary near Boston called Stellwagen Bank. The approach was to create an interactive and immersive experience that combined a fictional storyline with a scientifically sound marine environment, a host of accurately represented marine species, and a unique simulation engine. The storyline set the stage for the audience to determine, by interacting on their terminals, the combined impact of environmental, climatic, and human intervention on Stellwagen Bank. The audience was able to choose from a host of factors (which were expressed interactively), and the combination of these factors was fed to the simulation engine for evaluation. The result was a set of parameters, and each set matched one of five potential future scenarios. Once the potential future scenario was identified by the software, the cinema system called up a sophisticated pre-rendered and animated sequence of what Stellwagen Bank would look like in that projected future.
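
To make the mechanism concrete, the sketch below shows one way such a choice-to-scenario pipeline could work: individual terminal inputs are averaged into a parameter set, and the aggregate pressure is mapped onto one of five pre-rendered futures. The factor names, weighting, and thresholds are hypothetical placeholders; the internals of the original Immersion simulation engine are not documented here.

```python
# Minimal sketch of the scenario-selection logic described above.
# Factor names and thresholds are hypothetical, not the original engine's.
from statistics import mean

# Each audience member submits factor choices from their terminal, scaled 0-1.
audience_choices = [
    {"fishing": 0.8, "shipping": 0.6, "warming": 0.7},
    {"fishing": 0.2, "shipping": 0.3, "warming": 0.4},
    {"fishing": 0.5, "shipping": 0.5, "warming": 0.6},
]

def aggregate(choices):
    """Combine everyone's input into a single parameter set."""
    keys = choices[0].keys()
    return {k: mean(c[k] for c in choices) for k in keys}

def select_scenario(params):
    """Map the aggregated pressure onto one of five pre-rendered futures."""
    pressure = mean(params.values())
    scenarios = [
        "thriving sanctuary",   # very low combined pressure
        "stable ecosystem",
        "stressed food web",
        "degraded habitat",
        "collapsed fishery",    # very high combined pressure
    ]
    index = min(int(pressure * len(scenarios)), len(scenarios) - 1)
    return scenarios[index]

params = aggregate(audience_choices)
print(select_scenario(params))  # the cinema system would then cue the matching sequence
```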

After several years of running this show, the New England Aquarium found that audiences were often emotionally invested in the impact they had caused and returned to test out alternative solutions. Students responded particularly well to the association they experienced between cause and effect, recognizing the complexity each future scenario entailed. The embedded games that the audience interacted with were all text-based with simple simulation parameter controls, rather than the real-time 3D graphical games we would later come to employ. One notable finding was the level of dialog among audience members as they experienced the show. This was often a very noisy experience, with people engaging those around them in discussion to find alternative answers and debate potential strategies for approaching the simulation.

Simulation turned out to provide an effective emotional connection and strong audience engagement with the story. A key characteristic of the way simulation was used in this show was how the results were shown to the audience. They were presented on two levels: individual choices were seen privately (on each person’s touchscreen terminal), while the accumulation of everyone’s input was seen publicly (on the large immersive screen). These public-private interactions became a common characteristic of many productions that followed, providing a concrete result that encouraged enthusiastic dialog both during and after the experience. In addition, we decided to recognize the highest achievers through a scoring system, something we carried forward in all our future productions as an additional means of fostering a high level of participation. Individual scores, which grew with interaction and content discovery, were shown to participants throughout the experience, and top performers were highlighted to the entire audience at the end. Tying this back to the concept of the metaverse, in this scenario-based experience the choices the audience made could actually predict the future in reality, with a direct one-to-one connection between the five scenarios and the five outcomes.
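
The sketch below illustrates this private/public scoring pattern: each interaction updates a score shown only on that participant’s terminal, and a ranked list of top performers is computed for the public reveal at the end. Point values and seat identifiers are invented for illustration and are not taken from the original production.

```python
# Hypothetical sketch of the private/public scoring pattern described above.
scores = {}  # seat id -> running score (the private view on each terminal)

def record_interaction(seat: str, points: int = 1) -> int:
    """Credit a participant for an interaction or content discovery; return their score."""
    scores[seat] = scores.get(seat, 0) + points
    return scores[seat]  # shown privately on that seat's touchscreen

def top_performers(n: int = 3):
    """Rank participants for the public reveal on the big screen at the end."""
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)[:n]

record_interaction("seat_14", 5)
record_interaction("seat_02", 3)
record_interaction("seat_14", 2)
print(top_performers())  # -> [('seat_14', 7), ('seat_02', 3)]
```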

Immersive Game: Sharks: Predator/Prey

An example pointing to the challenge of changing contemporary thinking was an immersive game called Sharks: Predator/Prey (2001). As marine science evolved, it had become evident how important it was to communicate to the general public that the relationship between species, which in earlier times was called the food chain, was now more accurately understood as a food web. How can we show that changes in the life cycle of one species can influence the links in the web in minor or catastrophic ways? And that the diversity of the food web itself can affect its ability to withstand changes in water temperature or the impact of other animal populations? The solution we came up with was Sharks: Predator/Prey, a fully 3D real-time multiplayer game that could be played by a hundred or more people simultaneously in the Immersion Theater (Fig. 1). The experience began with players starting as one of the smallest organisms in the ocean, a type of phytoplankton, and maturing through the game to try to become a great white shark. Each level of success resulted in players taking the role of more complex creatures in the ecosystem and trying to meet increasingly complex goals for survival. Players had to discover who was predator and who was prey in order to survive among an ocean of creatures all controlled by other audience members. The game was set in a very competitive environment against the backdrop of a large-format immersion screen showing high-resolution animations of who is eating whom – in graphic detail. Players were introduced to the action through an opening narrative and then thrust into the immersive game, free to unearth the key relationships more deeply through their gameplay. As one might expect, this particular game was far more popular with audience members who were avid gameplayers and more of a challenge for those who needed time to absorb the game mechanics before they could focus on the game content. Sharks: Predator/Prey was a good example of a multiplayer experience designed for experienced gamers with readily available motivations and skills. Digital games have long been powerful tools for entertainment and in more recent years have evolved to include serious games purposefully designed for learning, development, and change (Ritterfeld et al., 2009; Spiegel, 2006; Spiegel & Hoinkes, 2009). Sharks: Predator/Prey was our first project to push for a real-time game concept in an immersive environment. At that time, most museum visitors could only endure about 30 minutes of cinematic presentation due to the intensity of the visual effects. What we found from this project was that the game-based experience significantly extended the time people spent in the Immersion Theater, with dramatically improved user engagement. In this case of the metaverse, players were learning about the interdependency of species in the food web, and the immersive experience helped them better connect to real-world conditions in the marine environment, the consequences of what happens there, and their own relationship with these events.
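
The sketch below shows, in simplified form, the kind of predator-prey lookup such a game relies on: a food web encoded as a mapping from each species to the species it eats, and an encounter rule derived from it. The species ladder and survival rules here are illustrative assumptions, not the actual game design.

```python
# Illustrative sketch of the predator/prey progression described above.
FOOD_WEB = {
    "phytoplankton": set(),                 # eats nothing (photosynthesizes)
    "zooplankton": {"phytoplankton"},
    "herring": {"zooplankton"},
    "seal": {"herring"},
    "great white shark": {"seal", "herring"},
}

def can_eat(predator: str, prey: str) -> bool:
    """True if the predator species preys on the given species."""
    return prey in FOOD_WEB.get(predator, set())

def encounter(player_species: str, other_species: str) -> str:
    """Resolve a meeting between the player's creature and another creature."""
    if can_eat(player_species, other_species):
        return "eat"      # player feeds and moves toward the next level
    if can_eat(other_species, player_species):
        return "eaten"    # player is prey and must restart the level
    return "ignore"       # neither is predator of the other

# Example: a player currently playing a herring meets a seal.
print(encounter("herring", "seal"))  # -> "eaten"
```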

Fig. 1
A photograph of a group of people playing the games in the immersion theater. The sharks are displayed on the large immersion screen.

Sharks: Predator/Prey Immersion Cinema

Telepresence: Ring Road

As we continued to explore new technological solutions to enhance large-scale social interaction in interactive storytelling and immersive environments, we had an opportunity to experiment with telepresence – the use of VR for remote control of machinery to participate in an event at a distance – in Ring Road (2001). In this project, we collaborated with Dr. Robert Ballard, the scientist best known for discovering the Titanic, and connected the underwater habitats of Monterey Bay in California to visitors at the Mystic Aquarium in Connecticut. A remotely operated vehicle (ROV) equipped with a live high-definition camera was tethered in the water in Monterey Bay and connected across the United States via Internet2 (an advanced video-streaming technology at that time) to visitors sitting in a dome structure at the Mystic Aquarium configured as an Immersion Cinema. In this design, the high-definition video captured by the ROV was projected on the large-format immersive screen in the dome through telepresence, using a live stream instead of prescripted and pre-rendered content. At the same time, visitors could use their individual touchscreen terminals to interact with a matching but simulated undersea environment (Fig. 2). All of these events were happening simultaneously and were not driven by a preprogrammed narrative. This parallel-world experience was dynamic and allowed instructors to take visitors on a personal investigation of the animal and plant species in the virtual habitat in relation to what was observed in their natural habitat in reality. Instructors also had the option to activate simple trivia games as part of the experience and give the winning participants control of the ROV in Monterey Bay (Fig. 2). If actual underwater conditions were poor on a particular day, alternative video recordings of the live environment could be called up on the Immersion Cinema screen as a comparison for the audience. The live environment inputs and controls were seen as very compelling and made the content experience real to the participants in a way that virtual environments and gameplay could not achieve (Fig. 2). The metaverse is about leveraging technologies in the virtual world to enhance the experience and impact in the real world. In this case, we used the interactive and immersive experience to create conditions for participants to better understand what was happening in real time by observing the changing conditions in Monterey Bay.
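
To make the session flow concrete, here is a hypothetical sketch of two decisions described above: falling back to recorded footage when the live feed is poor, and handing ROV control to a trivia winner. The class, function names, and seat identifiers are invented; the original system’s software is not documented here.

```python
# Hypothetical sketch of the Ring Road session logic described above.
from dataclasses import dataclass

@dataclass
class Session:
    live_feed_ok: bool    # underwater visibility / link quality on the day
    trivia_scores: dict   # participant -> number of correct answers

    def screen_source(self) -> str:
        """Pick what the dome projects: the live stream or archived video."""
        return "live ROV stream" if self.live_feed_ok else "recorded reference footage"

    def rov_operator(self) -> str:
        """Grant ROV control to the participant with the best trivia score."""
        return max(self.trivia_scores, key=self.trivia_scores.get)

session = Session(live_feed_ok=False, trivia_scores={"seat_12": 4, "seat_07": 6})
print(session.screen_source())  # -> "recorded reference footage"
print(session.rov_operator())   # -> "seat_07"
```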

Fig. 2
2 photographs. Left. A photo of a person operating the ROV controller. Right. A photo of Stacey Spiegel, President and CEO of Immersion Studios.

Screenshots from the Ring Road Project Video (left: A student in Mystic Aquarium uses the controller to maneuver the ROV on the dry land of Monterey Bay; right: The live stream of videos projected on the immersive screens in the dome while visitors explore the content at their individual touchscreen terminals and Spiegel explains how immersive environments coupled with real-time content feeds and meaningful group dialogs can help visitors understand complex scientific phenomena through self-directed and socially engaging learning experiences)

Telepresence + Social Learning: Exploration: Sea Lions

Another collaboration we had with Bob Ballard that combined live and virtual immersive experiences was Exploration: Sea Lions (2003), part of the JASON Project, Ballard’s initiative to promote scientific exploration among school-aged groups across America (Cohen, 2020). We offered a significant expansion of Ballard’s proposition of using only telepresence as an educational tool by creating enhanced features for social learning through Immersion Cinemas and Immersive Classrooms. The show ran at Mote Marine Laboratory and Aquarium in Florida, the Mystic Aquarium and Institute for Exploration in Connecticut, and The Lamphere Schools in Michigan, reaching over 2000 students (mostly 9–14 years old) in a 2-week period. Through a live broadcast, a real-life scientist, Kathleen, presented the students with an urgent problem facing the California sea lion pups on San Miguel Island and allowed them to connect with one another, as well as with experts, to figure out the major cause. These pups were dying at a higher rate that year than in previous years. Students were invited to help the scientists determine which of three competing hypotheses – predators, pollution, or climate – was correct (Fig. 3).

Fig. 3
4 screenshots. Top left. A study area map of San Miguel Island with the locations where sea lions were dying. Top right. A graph of change in deaths versus year. Three lines, for predators, pollution, and climate, follow an increasing trend. Bottom left. A photo of a sea lion with pups. Bottom right. A tutorial note.

Screenshots from the Exploration: Sea Lions Project Video (top left: map of San Miguel Island showing different locations where sea lion pups were dying for different reasons; top right: an example of data visualization the host used to explain the correct hypothesis: climate; bottom left: touchscreen terminal interface with facts about the sea lion pups on San Miguel Island and other resources for scientific exploration to solve the problem; bottom right: personalized field notes based on each student’s interaction with various features in the program to facilitate self-reflection and group discussions)

Students at all locations had live-streamed video of the sea lions on San Miguel Island projected on a large-format Immersion Cinema screen, while a touchscreen terminal with an interactive interface allowed them to conduct research on their own or with other students in the same physical space, discuss with students virtually across all locations, and ask the experts (Fig. 3). Their discoveries were transferred to their field notes, and when time was up, students reviewed their notes and submitted the facts that they believed supported or rejected their hypothesis. Participants could then reflect upon their hypotheses after viewing each of the alternatives and supporting presentations to make a final assessment of the cause of the problem. A final presentation on each hypothesis was made virtually to the entire audience, and the correct hypothesis was revealed and explained by the scientist host, Kathleen (Fig. 3).
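
As a rough illustration of this evidence-gathering flow, the sketch below tallies the facts students attach to each hypothesis before the reveal. Only the three hypothesis labels come from the project; the fact entries and tallying logic are invented for illustration.

```python
# Rough illustration of the field-note evidence flow described above.
from collections import Counter

field_notes = [
    {"fact": "pup deaths cluster in warm-water years", "supports": "climate"},
    {"fact": "no unusual toxin levels in tissue samples", "rejects": "pollution"},
    {"fact": "shark sightings unchanged from previous years", "rejects": "predators"},
    {"fact": "prey fish moved offshore as sea temperature rose", "supports": "climate"},
]

def tally(notes):
    """Count supporting and rejecting facts submitted for each hypothesis."""
    support, reject = Counter(), Counter()
    for note in notes:
        if "supports" in note:
            support[note["supports"]] += 1
        if "rejects" in note:
            reject[note["rejects"]] += 1
    return support, reject

support, reject = tally(field_notes)
for hypothesis in ("predators", "pollution", "climate"):
    print(hypothesis, "support:", support[hypothesis], "reject:", reject[hypothesis])
```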

In the end, the telepresence experience with enhanced social learning features proved to be a powerful means of fostering deeper concern and active discovery among the students in this scientific exploration. One of the unique features introduced in this project was the use of avatars to represent the students (notably, this was the same year Second Life was released). This introduction of a virtual representation of the students themselves added to the value of play and personal identity in the pursuit of a scientific inquiry. As students mastered their chosen area of inquiry in this program, they could directly offer assistance to other students through their avatars, advancing their collective understanding of the problem.

Exploration: Sea Lions is similar to Ring Road in that both connected live on-site experiences with virtual investigations. However, they had very different results. Ring Road tried to weave the live feed directly into the virtual experience and found itself challenged by constant context-switching, which made it hard to establish a strong enough sense of presence for the participants, thereby reducing motivation and engagement and ultimately the learning outcomes. Exploration: Sea Lions placed the live and virtual experiences next to one another but left each in its own format, and this proved by far the more successful approach. The live broadcast helped to anchor the interactive elements of the experience while still leaving room for personal exploration and discovery. Students and teachers both rated the Exploration: Sea Lions experience highly, noting that it encouraged dialog, was of strong value to weaker students, and motivated them to learn more about the topic (Ritterfeld et al., 2004).

Govie: Dolphin Bay

The most intriguing and complex narrative highlighting the human impact on the marine environment that we created is called Dolphin Bay (2005). It represents our state-of-the-art approach to storytelling and digital technologies, blending cinematic narrative with game interactivity into the type of immersive experience we called a govie. Dolphin Bay was a compelling action-adventure production that illustrated the human impact upon dolphin habitat, with a seamless integration of gameplay and narrative throughout. This govie played for many years exclusively at the Immersion Theater of the Mote Marine Laboratory and Aquarium in Sarasota, Florida. Its highly evolved storyline interweaves a series of complex games that, through personal interaction, cast the participant in the role of a marine biologist trying to save dolphins within a fictitious bay. The production brought together movie directors, screenwriters, and Hollywood actors with sophisticated game development. The experience blurred the lines between cinema and games to the extent that a player could not identify either as the clear driving factor. Intense emotion and action fostered engagement with the storytelling and pushed audience dialog on environmental issues to occur during the gameplay portions of the experience. As this production demonstrated, the intertwining of narrative and gameplay can serve a key entertainment and educational role in engaging emotional participation, provided each aspect is of the highest quality.

Real Change Is Possible: Sparking Reaction

Sparking Reaction (2001) is an excellent example of how a self-directed, interactive, and immersive experience enabled the public to influence a major corporation to get out of the business of nuclear energy production. As part of a visitor center that we built for the British Nuclear Fuels Ltd. (BNFL) headquarters in Sellafield, Cumbria, Sparking Reaction was intentionally designed as an open public forum on energy where visitors were empowered to join the debate about nuclear energy versus alternative solutions directly through a unique experience. Anchored by simulator engines in an Immersion Cinema, visitors from all walks of life were challenged to build nuclear reactors, generate the fuel, power the grid, and then dispose of the waste (Fig. 4). The outcomes resulting from their individual choices led to a collective discovery that nuclear energy production left behind a major toxic waste problem that was extremely difficult to resolve, as we know to be the case today. Visitors were also able to explore multiple alternative and renewable energy solutions, such as solar and wind, and confront the same power demands and production challenges. We worked closely with BNFL and content experts from the London Science Museum to ensure that the interactive narrative experience we created was grounded in the scientific evidence available at the time. So when visitors came to the Immersion Theater, they were put at the center of this real-world energy debate and used the simulated experiences to explore and better understand the pros and cons of each option, weigh the risks and benefits of generating nuclear energy, and voice their personal opinions. In fact, BNFL was at a crossroads in defining its business priorities and directions for future development. With most of its business in nuclear energy generation at that time, BNFL was contemplating a new role in fuel reclamation. Sparking Reaction served as a tool for public engagement, gauging views on the role of energy production from the perspective of the nuclear industry. This experience, combined with a series of related interactive exhibitions in the venue, provided BNFL with direct input from 500,000 members of the public (both adults and children) as to what corporate roles and responsibilities the company should have regarding the nuclear industry. Based on this and other public feedback, BNFL ultimately got out of the business of nuclear energy generation and moved its primary focus to the reprocessing of spent nuclear fuels.
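
As a rough sketch of the comparison visitors were making, the code below tallies emissions, accumulated waste, and supply shortfall for a single-source energy choice over several decades. All names and figures are invented placeholders on a relative scale, not values from the actual Sparking Reaction simulators or from industry data.

```python
# Hypothetical sketch of the kind of trade-off the Sparking Reaction simulators
# let visitors discover. All profile numbers are invented, relative placeholders.
ENERGY_PROFILES = {
    # source: (relative emissions, long-lived waste per unit energy, share of demand reliably met)
    "nuclear": (0.05, 1.00, 0.95),
    "solar":   (0.10, 0.00, 0.30),
    "wind":    (0.05, 0.00, 0.35),
    "coal":    (1.00, 0.00, 0.90),
}

def run_energy_choice(source: str, years: int = 40, annual_demand: float = 100.0) -> dict:
    """Tally the long-term consequences of meeting demand with a single source."""
    emissions, waste, reliability = ENERGY_PROFILES[source]
    return {
        "source": source,
        "relative_emissions": emissions * annual_demand * years,
        "accumulated_waste": waste * annual_demand * years,     # the legacy problem for nuclear
        "annual_shortfall": annual_demand * (1 - reliability),  # the gap intermittent sources leave
    }

for choice in ("nuclear", "solar", "wind"):
    print(run_energy_choice(choice))
```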

Fig. 4
A photograph of a visitor at the British Nuclear Fuels visitor center learning about energy production. A large screen for fuel fabrication is placed in front of the visitor.

A young visitor learning about nuclear energy production at Sparking Reaction

Conclusion

Raising awareness and engaging the public in meaningful dialogs are critical for social change. Digital media can help transform scientific information into an aesthetic and compelling experience. When combined with interactivity in an immersive environment, the narrative experience becomes even more engaging as users explore the content in a self-directed way. Early public immersive experiences can be traced back to the nineteenth century, when large-scale 360° circular panoramic paintings (known as cycloramas in America and panoramas in Europe) allowed the public to appreciate religious tales, historical events, and urban landscapes through realistic representations (Fletcher’s Mutiny Cyclorama, n.d.). These immersive settings allowed for social interaction and meaningful discussion, but the narrative content was not at all interactive. Fast-forward to more recent years: the success of cinematic environments such as IMAX prepared the public for the Van Gogh Immersive Experience and Immersive Klimt, both of which have become international attractions drawing huge audiences. Yet they, too, miss the opportunity for interactive storytelling and meaningful engagement for deeper learning. There is a constant demand for innovation in digital media to create new and unique experiences in the metaverse: 360° videos, AR, VR, MR/XR, and most recently AI-generated experiences, with each technology appearing more seductive than the one before. However, most of these commercially popularized and personalized digital technologies can be physically, socially, and emotionally isolating, separating people from reality – especially from the meaningful dialogs that can take place in a shared space – and leaving individual users feeling disembodied, dislocated, and disoriented, not knowing where they are in the real world.

Climate change is urgent and complex. Urgent stories need to be told, yet complex stories are hard to tell. What we have learned over the past 20+ years is that technological solutions exist, and will continue to improve, for interactive storytelling in immersive environments, with powerful capabilities such as the simulation, gamification, telepresence, and cinematic presentation shown in the examples in this chapter. Equally important is the seamless integration of science-based content, dramatic narratives, interactive mechanisms, and meaningful dialog in a shared physical, social, and emotional experience of self-directed exploration and collective discovery. When people can get up close and personal with the challenges and are then put front and center to contemplate the moral dilemmas and seemingly intractable problems, they are more compelled to step up and respond with action, faith, and reverence. The examples above, from roughly 20 years ago, come from a different era, when we were experimenting with various cutting-edge technologies in our creative solutions while working with different museums, aquariums, and research institutions. Twenty years later, we see some of these creative solutions becoming accessible and popular public attractions, and we firmly believe that they provide exciting opportunities for climate storytelling and empowerment.

For example, Arcadia Earth (n.d.) uses video projections, AR, VR, and large-scale installations by more than a dozen environmental artists to provide a multisensory immersive experience that awakens the public’s conscience about the ecological crisis and inspires climate action. “Entertainment, coupled with enlightenment, is the purpose of this pop-up exhibition,” and “Together, they evoke the landscapes, marine depth and life-forms that global warming threatens” (Graeber, 2019). ARTECHOUSE (n.d.) is another public venue for metaverse experiences with nature, from traversing the cherry blossoms of Washington, D.C., in PIXELBLOOM to an imagined future 100 years from now in RENEWAL 2121. In the meantime, researchers have been working with Indigenous communities, using technologies such as Insta360 ONE X cameras, to create immersive videos and share their stories about their culture and the environment they live in (Westervelt et al., 2022), and the scientific community has been collaborating with technology experts to develop online tools such as the En-ROADS simulator (n.d.) from Climate Interactive and the MIT Sloan Sustainability Initiative.

The new wave of AI-generated media has arrived with many cautionary tales and unprecedented ethical concerns (Mets, 2023). As with any new technological innovation, it will serve as a double-edged sword and will require dedicated support, effective guidelines, and thoughtful collaboration to serve the public good. Despite the steep challenges we face, one can only envision the possibilities when tech innovators, digital artists, climate scientists, and professional storytellers all come together to help citizens of the world and policymakers better understand what humanity is facing and what can be done right here and right now.