The development of CAULDRON has been guided by feedback from players, used both to assess whether it was achieving its aims and to gather suggestions for improving the game. The main sessions from which feedback was collected are summarized in Table 1. An initial prototype was played as part of the Africa Climate Conference 2013, and the game was then played in Warsaw during COP19. Six additional sessions were then carried out, with between 14 and 44 participants each, chiefly climate scientists and students from a range of backgrounds on courses designed to enhance learning on complex environmental and development issues (Table 1). While these were not the key intended players of the game, they were able to provide insightful comments to guide its development. CAULDRON aims to encourage learning and reflection for players regardless of their previous knowledge of probabilities, climate change, and PEA itself. The scientists had some knowledge of PEA and were able to provide informed insights on the presentation of the science in the game. Feedback from students was valuable despite their different levels of prior knowledge compared to policymakers. Information and feedback from all these sessions was accumulated and fed into the game's development. Following this, the game was played with policymakers in Senegal as part of a workshop focussing on PEA science and addressing loss and damage, which involved 40 government, civil society, and scientist stakeholders.
Feedback was collected via comments players made on debrief sheets where possible, along with the negotiated texts. In some cases, players completed a questionnaire immediately after the game about their prior understanding of PEA and L&D, their main insights from playing, and whether they thought PEA could be used in L&D. A further short survey was sent to some players in the months following to learn about longer-term influences of playing the game but this had limited response (n = 7). Game facilitators also made systematic observations of sessions. After the game in Senegal, players reflected in groups on what they felt were the key learning points. Comments from the feedback sheets and sessions are used here to illustrate some of the key themes that emerged.
Learning About PEA
One of the key aims of CAULDRON was that players would learn about the science of attributing extremes to climate change and its probabilistic nature. How players learned depended on their previous understanding of probabilities, as well as their perceptions of climate change and of how it is affecting extremes such as droughts. The goal was that players with any level of scientific understanding would be able to benefit from experiencing changing probabilities and reflect on how easy it was to tell whether there had been a change.
Players commented that using dice to highlight climate changes was helpful; as one said, “Even though the knowledge was already there, the concept of ‘chance’ and ‘probability’ that is very important became clearer through the experience gained over playing the game.” How players chose to assess whether their drought probability had changed was up to them. Players with more scientific backgrounds often worked out how many droughts they would have expected to see with a normal die. Many players noted the difficulty of assessing changes in probability with such limited data; players in Senegal commented that the short timescales made it hard to tell whether drought likelihood had changed or whether they were simply seeing variability. This is also a challenge faced by scientists carrying out PEA studies. Another player commented that such analysis could be especially challenging for developing countries, particularly those lacking reliable long-term observations and the scientific capacity to analyze changes. Reflecting on the link to real-life challenges, players in Senegal recognized how the model results could differ from the observations, and that further work was needed to understand how best to use both model and observational data.
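The players’ detection problem can be illustrated with a small simulation. This sketch is not part of the game materials: the drought probabilities (1/6 for an unchanged die, 1/3 for an altered one) and the naive decision rule (“call the climate changed if droughts exceed the fair-die expectation”) are assumptions chosen for illustration only.

```python
import random

def count_droughts(p, n_rolls, rng):
    """Number of droughts seen in n_rolls, where each roll is a
    drought with probability p (e.g. rolling a 1 on a die)."""
    return sum(rng.random() < p for _ in range(n_rolls))

def detection_rate(p_true, n_rolls, threshold, trials=10_000, seed=0):
    """Fraction of simulated playthroughs in which the drought count
    exceeds `threshold`, the naive rule for calling the climate 'changed'."""
    rng = random.Random(seed)
    hits = sum(count_droughts(p_true, n_rolls, rng) > threshold
               for _ in range(trials))
    return hits / trials

for n in (10, 50, 200):
    expected_fair = n / 6  # droughts expected from an unchanged (fair) die
    changed = detection_rate(1/3, n, expected_fair)      # altered climate
    false_alarm = detection_rate(1/6, n, expected_fair)  # unchanged climate
    print(f"n={n:4d} rolls: change flagged {changed:.0%} of the time, "
          f"false alarm {false_alarm:.0%}")
```

With only a handful of rolls, the false-alarm rate stays high: natural variability alone often produces more droughts than expected, which mirrors the players’ (and attribution scientists’) difficulty in separating a genuine change in likelihood from chance over short records.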
The negotiation phase was also useful for seeing how players viewed the science. The views of players with previous knowledge of the science will of course not just be based on the game, but this still provides an opportunity for them to apply their understanding. For example, one region decided not to use attribution information as they felt it was too uncertain, saying “There is still more work to be done on the attribution science to provide evidence for country-specific climate change impacts.” In the game context data are limited and models may be imperfect. Real-world PEA also faces these challenges, but with additional difficulties including framing scientific questions and defining the event to attribute (Otto et al. 2015). These are not incorporated into CAULDRON as the game would become too complex. There are also different methodological approaches being developed to overcome these difficulties (Stott et al. 2016), which are also left out of the game for simplicity. Another player noted that “distinguishing damages from chance, (climate) change, risk taking is tough,” highlighting that L&D can be affected by vulnerability and exposure, which require consideration alongside the meteorological hazard (Huggel et al. 2013).
Most players began the game with no knowledge of PEA (except the Met Office group). After playing, the majority of players reported in questionnaires that their knowledge of PEA had improved (73% slightly improved, 17% greatly improved). We have not systematically analyzed players’ understandings of PEA, but the few responses to the follow-up survey showed very varied understandings. This very small sample included views that the chance of extreme events could be affected by climate change, that events could be made more extreme by climate change, that most extreme events are attributable to climate change, and that it is all random. It would therefore be interesting to analyze more systematically what players understood before and after playing the game, as their own perceptions of how much they learnt may not reflect whether their understanding is correct.
Improving Learning About PEA in CAULDRON in the Future
The Met Office group suggested ways to improve how CAULDRON represents PEA science, including having more consideration of uncertainties in the modeling part. Players often found differences between the observational and model data and were uncertain which to trust. It was suggested that models could also be provided of the unchanged climate to assess the model skill, and that models should not imply that they are perfect representations of the real world.
However, there is a careful balance to be drawn between keeping the game relatively simple and easily understandable in many contexts, and incorporating all of the complexities of PEA, which CAULDRON is obviously unable to portray. This requires the game facilitator to have the skill to judge the needs and understandings of the players and to tailor the game accordingly. The game documents have been provided so that anyone can learn to run CAULDRON; the skills required for the game to have maximum benefit for participants, by encouraging engagement, reflection, and learning between players from different backgrounds (Mendler de Suarez et al. 2012), can be more challenging to develop. However, by working with more experienced game facilitators and colleagues with skills in different areas, such as experts in PEA science, others can develop these skills through experience and participation.
It is also necessary to consider how the game can be used alongside other activities so that players gain a more complete understanding of the science on which it is based. For example, before some sessions players were given a more traditional presentation on PEA to provide more detailed background.
Work has begun to compare how well players learn during games compared to more traditional methods such as slideshow presentations (Patt et al. 2010), but longer-term monitoring will be needed to assess the wider impact of games (Mendler de Suarez et al. 2012; Harteveld and Suarez 2015). This can be challenging and requires systematic assessment. Haug et al. (2011) suggest that, to collect large enough samples of robust data, evaluation could be embedded into games so it is not seen as a time-consuming extra for players. This could be incorporated as CAULDRON develops. We have not included surveys before playing as this is time-consuming and may deter players. We have instead tried to ensure that players with any level of previous understanding of probabilities can build on this during play; how they do so, and the insights from the decisions they make, can then be discussed in the debrief.
Nevertheless, it would also be interesting to collect quantitative data on players’ understandings. Asking players why they made particular decisions, together with short but in-depth surveys of their understandings, could improve evaluation. While this has not been our focus in developing the game so far, questions that could be investigated, for example focussing on decision making in the farming phase, include: What strategies do players use for planting? What do players do when they experience a drought? Who helps whom when drought occurs?
Promotion of Dialogue About Roles for PEA in L&D
The second key objective of the game was that players would have the opportunity to consider whether PEA has a role in addressing L&D, as this has been debated by academics. An earlier version of CAULDRON had an unstructured negotiation phase where players could discuss however they wished. Players tended mainly to address how they would respond to future losses without considering the science of whether drought risks had changed, and whether losses could be attributed to climate change. While an interesting finding, an aim of CAULDRON was to encourage consideration of uses for PEA, even if it was not then used, so a suggested structure for the negotiations was introduced. Players were encouraged (although they could choose not to do so) to consider how many (if any) of the beans they lost could be attributed to climate change, who was responsible, and then how to address this state of affairs.
Of those who filled in questionnaires after playing, most had no, or very little, knowledge of L&D or the UNFCCC negotiations beforehand (Development and Climate Days players were likely more knowledgeable). For these players, CAULDRON may have been a useful tool for providing a brief introduction to the L&D negotiations, as well as the opportunity to consider how PEA could be used, a question that may not be discussed in the real policy world. One player said the negotiation phase helped them improve their “understanding of how the science can be applied” and another “how the understanding of climate risk can be used as a negotiating implement.”
In the negotiation phase some groups did consider what losses could be attributed to climate change and produced deals to compensate these. Others decided against using scientific information, instead focussing on addressing past or future losses regardless of their causes. One player commented that attribution would lead to blame so could be useful for forcing an outcome, but was not important if developed countries chose to support developing countries. Some players did not think their drought probability had changed and therefore did not attribute losses to climate change; in other cases losses were attributed to players’ planting strategies.
Negotiations could be difficult, and from participant observations this was often one of the main messages players carried away from the game experience. They reported that some countries tried to pressure others into agreements, and it was difficult for countries in different circumstances and with different perspectives to agree. Often a deal depended on developed countries taking responsibility, as they generally held the power in negotiations. Players noted that “negotiations are hard because they are not determined only by science but by other factors as well (political etc.)” and participants also commented on the “difficulty of reaching a negotiated deal within a deadline, when working with incomplete and very uncertain information.”
Naturally, many players remarked on the similarities between their own behavior and the patterns that emerge in UNFCCC negotiations. In some circumstances, however, negotiations could be unrealistic and often led to simplistic solutions. These featured exaggeratedly fair distributions of resources, including common resource pools of beans for the future, redistributions of wealth that made countries more equally wealthy, and plans to donate beans in future crises, none of which are commonly seen in international negotiations. The negotiated texts from Senegal (Table 2) were much more detailed and less simplistic than others, but players still reflected that countries were willing to help each other in times of crisis more readily than in reality. CAULDRON could perhaps be improved by introducing greater political bias between countries to encourage less “fairness,” as has been suggested by players.
From the questionnaires, the majority of players thought PEA could be used in addressing L&D in real life, despite many not explicitly using it in the game. Reasons for its use were not explained in detail, but included preparing for future events, demonstrating climate change impacts, distinguishing anthropogenic from natural causes of events, and the fact that the effects of climate change are often caused by actors other than those who are affected. Some climate scientists and other players were concerned that PEA should only be used if robust enough, showing awareness of the uncertainties. Others questioned the use of PEA in L&D for reasons that included the difficulty of determining whether an event is attributable to climate change, limited data, the time needed to calculate results, and the concern that PEA could encourage a focus on blame rather than on reducing losses in developing countries. Other suggested uses for PEA included more general policy negotiations, risk analysis, insurance sector policies, investment planning, adaptation, and improved regional projections. Senegal stakeholders were interested in how PEA results could be implemented at national and local levels. These views suggest that players had the chance to consider some of the issues surrounding the use of PEA in policy.