Global Forest Watch is an online platform that can support monitoring and evaluation of conservation projects, international commitments to reduce deforestation, and private sector zero-deforestation plans. Like other partner-funded projects within the World Resources Institute (WRI), its own contributions must also be monitored and evaluated. Like other open-data platforms, it is built on the idea that transparent, publicly available data can serve the greater good; in this case, reducing deforestation. But by its very nature, the use of freely available data can be difficult to track, and its impact difficult to measure.

The team behind Global Forest Watch (GFW) explored several methods for monitoring and evaluating the impact of open-data tools for natural resource protection. This chapter explores four approaches for measuring the reach and impact of GFW, including quantitative and qualitative approaches for monitoring outcomes and measuring impact. This chapter aims to provide a framework for other open-data platforms to monitor outcomes and measure impact to learn and iterate on the most cost-effective strategies for natural resource protection. In the case of GFW, we have found that quantitative methods are capable of measuring outcomes and impact, although they require innovative approaches. We’ve also found that qualitative methods are necessary to understand the mechanisms of adoption and application and produce lessons for furthering the reach and impact of open data.

Background: What Is GFW?

Deforestation is a critically important challenge for the global community. Forests capture 30% of the carbon emissions released each year, playing a vital role in stemming climate change. They are also home to untold biodiversity and a resource for remote and indigenous communities. But the world is losing forests at an alarming rate: A football pitch’s worth of primary forest was lost every 6 seconds in 2019, according to WRI data, slightly more than in the previous year (Weisse & Dow Goldman, 2020).

Until very recently, those responsible for forests—including policymakers, protected area managers, and international commodity purchasers—had no way of knowing where deforestation was happening without visiting potential sites on foot. Tropical forests in places such as the Congo Basin are dense and inaccessible, and monitoring them can be extremely costly. Researchers recognized the lack of data as a concern and a barrier to improved management: “To enhance the efficiency of the protection, regeneration, and utilization of forest resources, information about these changes is required” (Suwanwerakamtorn et al., 2011, p. 169). In the mid-2000s, public data and advances in computing made such a system possible. When the U.S. government publicly released imagery from the Landsat satellite, the potential for such data became clear. “Satellite imagery offers an emerging source of data for analysis and a novel medium to attract greater government and public attention to domestic and international problems such as deforestation” (Baker & Williamson, 2006, p. 12). Researchers also began calling for unified and publicly available sources of data. Eventually, researchers at the University of Maryland developed an algorithm to process the vast trove of satellite imagery and discern locations of forest change (Hansen et al., 2013).

In 2014, WRI brought together the results of Hansen’s new forest change analysis, the computing power of Google Earth Engine, and stakeholders in forest protection for a partnership and a platform that made forest change data accessible to anyone with an internet connection. Global Forest Watch, as the platform was named, is a website that displays changes in forest cover as pink pixels on a map that can be overlaid with a wide range of contextual data, including protected areas or primary forest. Of the new features and data added since 2014, the most important are weekly deforestation alerts called “GLAD alerts.” GLAD alerts are the most spatially explicit forest change alert product publicly available, identifying areas of 30 by 30 m. GFW also carries Fires alerts, which use heat signatures to detect fires and are updated daily but do not detect other types of clearing, such as when trees are felled for timber or to clear a road. Users can also analyze areas of interest by selecting an existing area such as a state or park, or by uploading or drawing their own shapefiles. Global Forest Watch added a mobile phone application called Forest Watcher, which enables users to download data about forest change and take it offline and into the field, where internet access may not be available. The theory behind the data is that users from civil society, governments, and the private sector will use this data to better manage and protect forests (see Box 1 for GFW objectives).

All of the data on Global Forest Watch can be accessed freely by anyone with an internet connection, without even the need to identify themselves. The team behind the platform recognized at the outset that by providing the data in a completely open fashion, they would lack information about the reach of the platform, and the topic was discussed in depth by developers. Eventually, the goal of complete transparency won out. The limited knowledge about users was deemed a worthy tradeoff to avoid the possibility that entering private information might become a barrier to use. This aligned well with the thinking of other researchers who argued that public access to a geo-wiki—an open-access geospatial platform—would be critical for the success of remotely sensed data:

The philosophy is that as information is aggregated in public, discrepancies will arise, but so, too, will incentives to rectify them. The public nature of the information could induce a country to provide data to correct the record, for instance. Experience with public disclosure programs suggests that not all of them work, however, so identifying conditions most conducive to success is part of the geo-wiki experiment. (Macauley & Sedjo, 2011, p. 512)

The open-data nature of Global Forest Watch created a challenge for monitoring and evaluating the impact of the platform: If donors were to fund the continued development of the tools, they deserved to understand the platform’s reach. WRI leadership recognized that the institute has a responsibility to identify the most efficient use of donor funds to protect natural resources. The institute also has a mission of investigating and sharing guidance on effective methods of natural resource management. Therefore, over the past 5 years, the team explored new and evolving ways of monitoring program progress and measuring impact. These included using Google Analytics to track the reach of the platform, requesting data from users through a login feature, gathering user stories, and conducting a two-part evaluation.

Box 1: Objectives Outlined in the Global Forest Watch Theory of Change

1. Strengthened accountability for global commitments: Accountability for implementation of global forest commitments is strengthened by credible, independent information and analysis of forest and land-use dynamics.

2. Responsible supply chains: Actors trading or financing major forest-risk commodities use smart strategies and cutting-edge information and technology tools to reduce deforestation and illegal logging in their supply chains and investments.

3. Empowered forest defenders: Civil society and law enforcement actors are better equipped to expose and combat deforestation and illegal logging.

4. A broad-based restoration movement: Around the world, communities and commercial enterprises gain access to the knowledge, expertise, and finance they need to restore degraded lands.

5. Enabling conditions for sustainable landscapes: Sound forest and land management is enabled by governance reforms, new incentives, and improved geospatial monitoring and analysis in targeted countries and landscapes.

GFW Monitors Progress

Analytics

The first step in the Global Forest Watch theory of change is getting data into the hands of users who might act on it. To measure the reach of GFW, the team turned first to Google Analytics, a tool that provides a website’s back-end developers with data about the traffic the site receives. Analytics can report on a variety of performance indicators by breaking down website visitors into categories. The GFW team tracks the total number of visitors to the site, a figure that has risen steadily each year (since launch, more than three million people have visited GFW, from every country in the world). The team recognizes that many people may visit GFW once, without integrating the information into future action or advocacy, so we began tracking active users, defined as those who visit more than once in a year and stay for more than 2 minutes at a time. (Repeat and longer visits suggest that the user is reviewing information over time or delving deeper into an area or topic.) We also track the number of active users disaggregated by country, and the number of users who perform certain actions, such as turning the GLAD alerts layer on or off on the map. For example, in 2019 we wanted to know how easily users found the base map feature and which base map was most popular among our users, to understand what information they were seeking. We were able to track the number of clicks on these different features. We learned that the features were easy enough to find: about half of the users who clicked on them were new visitors, rather than returning users who had visited the site at least once before. We also learned that Planet satellite imagery was the most popular alternative base map, suggesting that users were looking for even greater detail behind each pixel.
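The active-user metric defined above can be computed from any exported session log. The sketch below uses a toy log with illustrative field names, not the actual Google Analytics export schema, and interprets the definition as requiring at least two visits in a year, each lasting over 2 minutes.

```python
from collections import defaultdict
from datetime import date

# Toy session log: (visitor_id, visit_date, seconds_on_site).
# Field names and values are invented for illustration.
sessions = [
    ("u1", date(2019, 1, 5), 300),
    ("u1", date(2019, 6, 2), 150),
    ("u2", date(2019, 3, 9), 30),   # one short visit: not active
    ("u3", date(2019, 2, 1), 200),  # only one long visit: not active
]

def active_users(sessions, year, min_seconds=120, min_visits=2):
    """Visitors with at least `min_visits` visits longer than `min_seconds` in `year`."""
    long_visits = defaultdict(int)
    for user, day, seconds in sessions:
        if day.year == year and seconds > min_seconds:
            long_visits[user] += 1
    return {u for u, n in long_visits.items() if n >= min_visits}

print(active_users(sessions, 2019))  # only "u1" qualifies
```

The thresholds are parameters precisely because the definition involves judgment calls; a team adopting this metric would tune them to its own theory of engagement.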

All this data is useful for performance monitoring, and continued trends of increasing data usage imply that the site is relevant for users (see Fig. 1). Rising numbers of active users in high-forest countries, increasing time on site, and clicks on map layers suggest that the growing audience is interested in the available data. The data can also tell us a bit about how to make the site more useful by highlighting which features are easiest to access, or whether users are finding new features, based on the number of clicks. Analytics data cannot, however, tell us who those users are or how, if at all, they apply the data. It certainly cannot tell us whether that application results in any action or policy that could shift the trend of deforestation.

Fig. 1: Annual user data for GFW’s main website (courtesy of WRI, Stephanie Jamilla)

To learn more about users and applications, the GFW team implemented one additional quantitative metric. We gathered user information through a login system, allowing all data to remain freely available while asking users who wanted additional features to provide further information. Based on conversations with users and partners, the team had identified a need for added features, particularly the ability to save areas of interest. In response, we developed the MyGFW tool, which enables users to create and save a custom set of areas and view them in a dashboard—and requires users to create a login. This provided an opportunity to gather data about users (notably, it also required the team to implement data security protocols to protect those users). Through the login, the team could ask anyone seeking to access the features a few questions about their role in forest monitoring and reason for using GFW. For example, we learned that, of more than 2000 subscribers who reported on their sector, the majority worked for nongovernmental organizations (NGOs), followed by researchers or students, government workers, and those in the private sector. A smaller number of users worked in journalism, community monitoring, or other categories that covered donors and United Nations (UN) agencies (see Fig. 2). We could then link that data to the locations that users were monitoring. Gathering data linking the type of user to the area they monitored would become critical to evaluating whether users applying GFW data influenced forest change.
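The linkage described here, between a subscriber’s self-reported sector and the areas they monitor, amounts to a simple join over login records. The records and field names below are hypothetical stand-ins, not MyGFW’s actual data model.

```python
from collections import Counter

# Hypothetical MyGFW-style records: the sector answer from the login
# form joined to each subscriber's saved areas of interest.
subscribers = [
    {"id": 1, "sector": "NGO",        "areas": ["park_A", "park_B"]},
    {"id": 2, "sector": "NGO",        "areas": ["park_A"]},
    {"id": 3, "sector": "research",   "areas": ["reserve_C"]},
    {"id": 4, "sector": "government", "areas": []},
]

# Tally of subscribers by sector, as summarized in Fig. 2.
by_sector = Counter(s["sector"] for s in subscribers)

# Invert the relation: who monitors a given area? This linkage of user
# type to monitored area is what later enabled the impact evaluation.
monitors_of = {}
for s in subscribers:
    for area in s["areas"]:
        monitors_of.setdefault(area, []).append(s["sector"])

print(by_sector.most_common(1))
print(monitors_of["park_A"])
```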

Fig. 2: Numbers of MyGFW subscribers by sector reported

Users and Stories

Even with the combination of analytics and MyGFW logins, the GFW team still had limited information about the way users applied data to real-world situations. This type of information could help the team, for example, make decisions about the balance between allocating resources to better data or to wider reach, or identify which type of user needed the most support, or which additional data would add the most value for users. So the team embarked on an ambitious project to gather, document, and categorize user stories for internal use. We created a user-friendly interface and a searchable database within WRI’s secure network, simply called the User Stories database. In the system, WRI staff could easily search, review, or add new user stories.

The database now contains more than 300 detailed and categorized stories about uses of GFW, painting a picture of the ways users achieve impact. Each story is categorized by variables including location, year of first use, and sector, but most important are the type of application of the data and the GFW objective to which the story contributes. We also captured stories of users who showed interest but ultimately discontinued their use of GFW. The aggregate data from the database is not representative of all users, but rather of those with whom GFW staff have the most contact or those who contact GFW with questions. For example, most of these (43%) come from local NGOs, which tend to be more likely to interact with GFW staff through requests for technical support or through applications to GFW’s Small Grants Fund.
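A categorized story database of this kind supports simple aggregate queries. The records below are invented stand-ins using the categorization fields named above, not actual database contents.

```python
# Invented User Stories records; "objective" maps to the Box 1 objectives
# and "result" is filled in only when a story reports a concrete outcome.
stories = [
    {"sector": "NGO",        "objective": "forest defenders", "result": "arrests made"},
    {"sector": "government", "objective": "forest defenders", "result": None},
    {"sector": "journalism", "objective": "accountability",   "result": "illegal activity exposed"},
]

# Example query: of stories tied to the "forest defenders" objective,
# how many report a concrete result?
defenders = [s for s in stories if s["objective"] == "forest defenders"]
with_results = [s for s in defenders if s["result"]]
print(len(defenders), len(with_results))
```

Queries of this shape produced the counts reported below (84 monitoring stories, 21 with concrete results).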

Overall, the stories provide a qualitative understanding of the variety of uses and some notion of which of those are most successful. They tell the stories of journalists who used GFW to uncover illegal activities, or of teachers improving students’ understanding of forests’ role in climate change, or of researchers advocating for greater transparency in countries such as China.

Stories from the database also provide substantial anecdotal evidence that forest defenders have turned to GFW’s near-real-time data—GLAD alerts and Fires alerts—to monitor and protect areas of interest. Data from the MyGFW system told us that users were subscribing to alerts, but stories in the database provided examples of specific successes; 84 stories came from groups tasked with monitoring and protecting forests, from government officials to civil society and even private sector users. Of those, 21 stories reported concrete results in stopping or preventing deforestation. Examples include a police chief from Brazil who manages a forest the size of Nepal with only 12 field staff. He told the team, “We effectively only started combating deforestation after Global Forest Watch. It is the principal tool of [our] police station.” The chief explained how the near-real-time alerts helped him more quickly identify areas of burning and clearing, leading to at least eight arrests, including company operators who were overseeing an illegal deforestation scheme. Another NGO used the data to follow up on alerts in Malaysia. They identified a logging contractor who had encroached on an area outside the legal logging boundary—allegedly accidentally. The contractor was reported to the authorities and now knows that they are being monitored and can be identified if they violate the rules again. Even private sector examples emerged: In Paraguay, an investment bank noticed an alert on a farm and telephoned the farmer, who had been unaware of the fire but was able to act quickly (Guthrie, 2018).

Quantitative Evaluation

When combined with GFW’s analytics data, the substantial number of anecdotes suggested that near-real-time data was having the intended impact, but could WRI provide definitive evidence? To answer this question, WRI’s Managing for Results team worked with the GFW team to complete a two-part, mixed-methods evaluation funded through an internal mechanism. The evaluation aimed to explore potential methods, recognizing that definitive evidence of impact might not be within reach just yet.

The quantitative team focused on the data most likely to create a short-term, measurable impact: the GLAD deforestation alerts. Because users receive these alerts within days or months of deforestation occurring, and the alerts are specific to a 30-by-30-m area, users can act quickly to intervene. Park rangers or local communities can conduct site visits to intercede or to deter the responsible parties. In Africa, the evaluation found that subscriptions to GLAD alerts decreased the likelihood of deforestation by 18%. This impact increased when a lag time of 1 year was added between the subscription and the forest-cover measurement, suggesting that as users learn and apply the tools, they become more effective at intervening in the causes of deforestation. Indeed, these results correspond to user stories, which suggested that areas where users had subscribed to receive email alerts might see reduced deforestation.

WRI’s team worked with an economist from the University of Wisconsin–Madison to design models to test the impact of the availability and use of GLAD alerts. Both models used time variation to create plausible counterfactuals. Two things changed over time: First, GLAD alerts were rolled out to different countries at different times, and second, once alerts were available, users began subscribing to alerts at different times. The research team explored a third option by attempting to build a model for receiving alerts. Since cloud cover can prevent the detection of deforestation, sometimes users don’t receive alerts until much later, and this could affect actions. This third model proved impractical for technical reasons, but the first two yielded credible results.

In the availability model, countries that had not yet gained access to alerts provided a counterfactual. This model explored the likelihood of deforestation in countries where alerts were available compared with countries where they were not, and found no discernible impact. We hypothesized that this was because adopting GLAD alerts may take time and adoption first occurs in limited areas of countries, while the sample covered many areas where no adoption occurred. The sample may also have included forest or mining concessions or other areas where monitoring was not intended to prevent clearing. The study noted that when a 1-year lag effect was applied, the correlation between availability and deforestation was negative, although not statistically significant (Moffette et al., 2021).

The model assessing use of GLAD alerts explored the impact of users holding a subscription to GLAD alerts for a specified area of forest. For the use model, researchers divided subscribers into two categories, those with intent to monitor for action, such as local rangers or remote supply chain managers, and those without intent to act on monitoring, such as researchers at universities. The unit of analysis was a 1-by-1-km grid cell of forest area; the sample was limited to those cells identified as falling within the subscription areas of users with intent to monitor. Other control variables included biophysical characteristics such as slope, distance to road or ports, time-variant features such as temperature and precipitation, and macroeconomic fixed effects.

The unique approach used in this analysis hinges on the gradual uptake of new data over time. A key source of bias for understanding the impact of monitoring is the covariation of monitoring and deforestation; people tend to look where deforestation is occurring. By limiting the sample to areas that were monitored through a subscription and incorporating the deforestation rates before and after monitoring began, the researchers could eliminate this bias.
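The design described in the last few paragraphs, staggered adoption of subscriptions combined with cell and year fixed effects, can be sketched on simulated data. Everything below is illustrative: the effect size, sample dimensions, and noise levels are invented, and the estimator is a plain two-way within transformation rather than the full published model with biophysical and macroeconomic controls.

```python
import numpy as np

# Simulated 1-by-1-km grid-cell panel with staggered subscription dates.
rng = np.random.default_rng(0)
n_cells, n_years, tau = 200, 6, -0.05  # tau: true effect of monitoring

# Each cell's subscription begins in a different (staggered) year;
# cells with a start index >= n_years are never treated in the sample,
# providing the not-yet-treated comparison group.
start = rng.integers(2, n_years + 2, size=n_cells)
cell_fe = rng.normal(0.15, 0.03, n_cells)  # time-invariant cell effects
year_fe = rng.normal(0.0, 0.01, n_years)   # common yearly shocks

years = np.arange(n_years)
D = (years[None, :] >= start[:, None]).astype(float)  # 1 once subscribed
y = cell_fe[:, None] + year_fe[None, :] + tau * D \
    + rng.normal(0, 0.01, (n_cells, n_years))         # deforestation prob.

# Two-way within transformation: demean by cell and by year, which
# absorbs both sets of fixed effects before estimating tau.
def demean(m):
    return m - m.mean(1, keepdims=True) - m.mean(0, keepdims=True) + m.mean()

Dd, yd = demean(D), demean(y)
tau_hat = (Dd * yd).sum() / (Dd * Dd).sum()
print(round(tau_hat, 3))  # recovers a value close to tau
```

Because monitoring here is switched on at different times for different cells, pre-subscription periods of later adopters serve as counterfactuals for early adopters, which is the bias-removal logic described above.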

The study results showed that “the average effect of subscriptions on deforestation [was] negative, but statistically insignificant and small compared with the average yearly 2011–2016 deforestation probability (0.18)” (Moffette et al., 2021, para. 13). The analysis detected significant effects in Africa, but not in Asia or Latin America. The researchers suggested that the effect in Africa is likely because of the limited alternatives for monitoring deforestation. In South America, Brazil has a national forest-monitoring system that excluded it from the sample. Peru, which was included, developed its own forest-monitoring system during the study, and other included South American countries gained access to GLAD alerts later than countries in Africa or Asia. In Asia, alternative forest-monitoring systems have emerged that are being used by private sector actors, and subscriptions to GFW by local civil society organizations are more limited, which collectively contributed to a weaker effect.

This research offers some additional clues to two mechanisms of impact: policy incentives and experience applying the data. When data on policy mechanisms was included in the analysis, the study found that “subscriptions have a stronger deterrent effect in protected areas and in forest concessions” (Moffette et al., 2021). Further, the effect of subscriptions increased when a time lag of 1 year was applied. This suggests that as users integrate the data into protocols and systems and gain experience using alerts, they become more effective at deterring deforestation. Given that the study covered only 2 years of potential usage of alerts, we can assume the potential for increased effect over a longer period, given both the lag effect and the steady increase in users.

Qualitative Evaluation

The quantitative analysis found solid evidence that GFW contributes to reducing deforestation, but questions remained about the mechanisms of adoption and the enabling factors that lead to greater success or failure in applying GFW data to forest monitoring. To answer these questions, WRI conducted a qualitative portion of the evaluation with case studies in Cameroon and Uganda. This research found evidence of impact and identified three categories of enabling factors: incentive structures, capacity, and behavior change factors. WRI staff (including this author) conducted the study with support from in-country consultants, and findings were published in a working paper by WRI (Shea & Coger, 2021).

To conduct the case studies, we chose two countries in the same region with high levels of GFW usage and subscriptions but different outreach strategies, so that we could detect contrasting results (Yin, 2014). We used the GFW theory of change to identify a set of anticipated causal pathways and targeted key informants that included users and potential users within the categories of GFW’s intended objectives (see Box 1). Data from the user stories database and from subscription information contributed to identification of potential informants. We then conducted interviews and focus groups with more than 100 key informants representing a range of potential users, from national government to NGOs to community leaders in high forest-cover areas, and analyzed these results.

The results of the case studies supported evidence from the quantitative evaluation by identifying specific user groups that were applying tools in ways that directly or indirectly reduced deforestation. We found evidence that the data contributed to awareness raising and decision making for improving enabling conditions. For example, in Cameroon, government officials used the data to make decisions regarding forest titles and management plans. In Uganda, the National Forestry Authority (NFA) occasionally reviewed the data to oversee forest management. We also found evidence of GFW empowering civil society and law enforcement to more efficiently and effectively expose and combat illegal deforestation. Staff at the Uganda Wildlife Authority (UWA) used GLAD alerts to plan routes for rangers going on forest patrols; private forest owner groups used it to monitor conservation for livelihood funds; and, in Cameroon, civil society used it to independently monitor and report on illegal activities.

The most important findings from the case studies were the enabling factors. First of these was that monitoring is most effective when it links to an incentive structure. Incentives can be positive: In the case of the UWA rangers, income from tourism incentivizes the more efficient protection of forested parks. Increased access to markets such as the EU, for Cameroonian exporters, drives efforts to control deforestation at the government level, and GFW was adopted in this context even in the face of technical hurdles. Conversely, the Ugandan NFA staff had also been trained in the tools, but that agency generates income from selectively logging forested land; more than one informant mentioned a perverse incentive not to stop illegal logging, and monitoring was not adopted on NFA lands. Financial incentives appeared to drive the largest-scale changes, such as shifts across multiple agencies of government, but incentives can also involve other resources, such as time or even reputation, both of which may have financial implications. As the case studies illustrated, community cohesion provides its own incentive in some forest communities, while in other cases the opportunity to express political agency and a sense of altruism may also motivate independent monitors.

Capacity was another key factor in the adoption and application of GFW—not only the technical capacity to receive the data, but also the institutional capacity to integrate data into process and act on that data. One significant element in successful capacity-building efforts was the role of intermediary actors, such as larger NGOs or in-country WRI staff that provided ongoing technical support and offered training sessions. Overall, key informants reported the ability to use the main functions of GFW tools at high levels, although some leaders from forest communities did report a lack of technical knowledge. Complaints about physical capacity limitations such as limited access to cell phones, computers, internet, or data plans were more common.

Institutional capacity to act on the available data is a familiar challenge for open-data initiatives. “Transparency is more likely to produce the intended effects when it fulfills both the condition of ‘publicity’—having relevant disclosed information actually reach the intended audience—and the condition of ‘political agency’—having mechanisms where citizens can take action in response to the disclosed information” (Ling & Roberts, 2014, p. 8). In this case, weaknesses in such mechanisms were barriers to the success of GFW. For example, although informants acknowledged the progress that Cameroon’s Ministry of Forests had made over the years of partnership with WRI, they also noted that institutional change is slow. Changes like hiring a new unit to process and handle digital data, instituting new roles for staff, and opening new reporting channels had taken years and are still ongoing. One NGO staff member complained that independent monitors still are not able to file electronic reports; reports must be printed and paper copies submitted. In his view, this slowed response times and possibly provided opportunities for special interests to intervene. Greater transparency in protocols may contribute to greater effectiveness for near-real-time data.

Behavior change factors, notably trust and ownership, are also vital to adoption of new technologies and data. Trust hinges first on the validity and credibility of the data, which may extend to the system for its delivery, and it is built over time through relationships. The case study revealed that in Cameroon, WRI was able to cultivate trust both through the independence of the data and through persistent engagement and collaboration with both NGOs and government. The National Forest Atlas is officially owned by the government, and this is part of its success. As Shea and Coger (2021) reported, “According to one NGO representative, the government’s buy-in to a transparent system enables civil society to hold government accountable by their own standards” (p. 20). This corroborates a finding from one of the few other known studies of the impact of near-real-time forest data: “It is imperative to establish collaborative relationships with government counterparts” (Musinsky et al., 2018, p. 18). That study found that systems are most effective when data providers work with governments, and when governments brand tools and claim ownership.

In Uganda, the case studies revealed a different situation from that in Cameroon. No national agency in Uganda had adopted GFW’s tools, even as field staff were using them regularly for monitoring. Little work had been done to integrate the available data and information into larger institutional contexts, or to coordinate between central and decentralized staff. This may be due to WRI’s limited engagement with local actors. In Uganda, WRI worked primarily through a local NGO and did not have direct relationships with officials. One of the challenges for WRI, then, is how to continue to foster trust and ownership in other contexts where it may not have staff on the ground.

The qualitative study aimed to reduce a knowledge gap identified through a literature review finding that research examining the impact of monitoring on deforestation is scarce (Shea & Coger, 2021). It did so by identifying several successful mechanisms for the impact of near-real-time data and by illuminating some enabling factors. It also produced recommendations for the GFW team, including a more complete assessment of existing incentive structures and capacity prior to entering future engagements with partners, and establishment of a strategy for addressing these factors through intermediary actors or WRI’s direct engagement. Most important, the qualitative study rounded out a picture of how GFW’s data can and does achieve impact and provided recommendations to expand that impact in the interest of more effective forest protection.

Discussion and Conclusion

When it comes to the management and safeguarding of natural resources, remotely sensed information and open-data tools can play critical roles in filling knowledge gaps, providing evidence that stakeholders need to make decisions and take action. Knowing the reach and impact of such data can help refine the tools of delivery and outreach. To do so, data providers need to monitor progress and measure impact, but the very nature of transparent, open data can make these tasks especially challenging. When anyone can access and use the tools, program managers have limited means to identify users and assess the adoption of tools or the impact of the work those users do.

Here we explored four methods—qualitative and quantitative methods for monitoring progress, and qualitative and quantitative approaches to evaluate impact—and their results. The lessons from these methods and results are broadly applicable to other platforms or tools presenting open data, especially data about natural resources.

Three lessons emerged with regard to monitoring. First, basic user data accessed through Google Analytics is useful for understanding reach and for improving basic functions and elements of the platform. Second, more specific user data is useful for understanding the potential mechanisms for impact. In particular, platform developers should explore options for two-tiered access when special features may prove useful; in the case of GFW this was a login that allowed users to save areas of interest. Logins of this type offer the opportunity to understand types of use and goals of users. More important, such data may prove critical to conducting rigorous evaluations of impact. Third, gathering qualitative stories about users’ experience and application of the data in a systematic way provided the necessary complexity for additional decision making, and may contribute to evaluability of the project.

The results of the evaluations also tell us three things. First, yes, GFW has been successful in reducing deforestation, implying that other open-data platforms can achieve impact. Second, the quantitative evaluation shows that measuring the impact of open-data platforms is possible as long as some data about the use of those platforms is available, but innovative methods may be needed. An experimental design would be ideal: a deliberately randomized rollout of the GLAD alerts, for example, could have facilitated a more informative evaluation, although ethical concerns about withholding access to data, as well as technical considerations, may prevent such a design. In this case, it was possible to create counterfactuals from quasi-random variation, such as the staggered uptake of GLAD alert subscriptions. Third, qualitative evidence that explains the mechanisms of impact is important for understanding the results. Although this is true for any analysis, it is especially important for data that is freely accessible to any user, where information about those users may be more limited.