Ten rules to increase the societal value of earth observations


Earth Observation (EO) data play an important role in our society today, but there is still tremendous opportunity to improve how these data are used to effect change. In this paper, we provide guidance to help data providers and intermediaries within the EO value chain (from data to applications) increase the societal value of the EO data, information, and data products that they work with. We first describe the EO value chain as a conceptual framework for how data are translated and applied for societal benefit. We then introduce three approaches that are often used to assess and improve the EO value chain. Finally, we present “10 rules” that can be implemented to increase the societal benefits of Earth science information. The 10 rules emphasize meeting user needs, problem-solving within interdisciplinary teams, and long-term sustainable solutions. Some rules focus on a specific segment of the value chain or phase in the problem-solving process, while others are relevant to the value chain or process as a whole. Each rule includes at least one case study example to illustrate the key points. The rules are loosely organized according to project management principles, with the initial rules focusing on defining problems, planning for data use, creating effective teams, and examining a diverse selection of solutions. The next set of rules is best applied throughout a project, and includes such concepts as evaluation, interoperability, trust, adoption, and documentation. Finally, the last rule addresses the challenge of determining when to close a project.


Earth system science provides important information for understanding our environment, informing management decisions, and monitoring the impacts of human actions on our planet. Earth Observation (EO) data collected by sources like satellites, surface stations, and human observation underpin societal benefits, such as the production of energy and clean water (Healy et al. 2015), agricultural decision-making (Ziolkowska 2018), more accurate prediction of weather (National Weather Service 2019), policy change to improve human quality of life and the environment (Bornmann 2013), and aesthetic, cultural, and existence aspects of our world (Jorda-Capdevila and Rodríguez-Labajos 2017). It is estimated that the sectors of society that use Earth Science data for their operations, products, and services are worth trillions of dollars (Hanson et al. 2017).

EO data play an important role in our society today, but there is still tremendous opportunity to construct solutions and improve how data are used for decision-making. Discourses surrounding big data, machine learning, the internet of things, and the cloud, for example, emphasize how more efficient use and integration of diverse datasets could fundamentally change our world (Waga and Rabah 2014). However, there are still many challenges for utilizing EO data to benefit society. Societal benefits are anchored in people’s “values”, which relate to a desirable outcome or way of doing things (De Wit and Notje 2014); these values are dynamic, subjective, and highly dependent on context (Higuera et al. 2019). Given such circumstances, translating data into information for decision-making is no easy task (Virapongse et al. 2018a). One problem, for example, is how ineffectively data are often presented and made available for the purpose of meeting social and economic goals, such as poverty alleviation (Taylor et al. 2014).

The goal of this paper is to provide guidance that helps data providers and intermediaries within the EO value chain increase the societal value of the EO data, information, and data products that they work with. We first describe the EO value chain, which provides a conceptual framework for how data are translated and applied for societal benefit. We then introduce three approaches that are often used to assess and improve the EO value chain. Finally, we present “10 rules” that can be implemented to increase the societal benefits of EO data. This paper builds upon a 2018 Earth Science Information Partners (ESIP) webinar series entitled, “The Socioeconomic Value of Earth Science Data, Information, and Applications,” where more detail for many of the case studies included in this paper can be found (ESIP 2018a).

The EO value chain

The application of EO data for societal benefit can be conceptualized as a value chain that is composed of data, information, knowledge, and wisdom (Ackoff 1989; Sharma 2008). This value chain is illustrated by the large triangle in Fig. 1. Starting from the bottom of the triangle and moving upwards, when data are given meaning, such as through models or analyses, they are transformed into information that people can use. Information is packaged into “products” (e.g., solutions, tools, and services) that address specific purposes. The application of these products can become integrated into a body of knowledge, which allows for people to apply information to new situations. Finally, based on an understanding of the fundamental principles embodied within knowledge (Bellinger et al. 2004), wisdom provides people with the ability to make decisions under circumstances with high uncertainty. Ultimately, value is created when the decisions lead to improved outcomes for society.

Fig. 1

Earth Observation (EO) value chain and approaches used to increase the societal benefit of EO data. The large blue triangle represents an EO value chain, which encompasses data, information, knowledge, and wisdom, as well as actors (people icon), processes for transforming data along the value chain (green triangles), and examples of sources of data and results from the processes (gears icon). Three main approaches often used to increase the societal benefit of EO data are (A) Seeking to recruit new users (data exist but the value chain does not); (B) Improving a current value chain (value chain exists); and (C) Identifying and/or developing data and data products based on user needs (users exist, but the value chain may or may not exist) (Figure adapted from Harshadeep 2018)

Different types of actors (people) work within the value chain to facilitate the contribution of data toward decision-making for societal benefit. In its simplest form, an EO value chain includes the following types of people: a) Data providers that collect, manage, generate, analyze, integrate, aggregate, and transform Earth Science data into information; b) Intermediaries that synthesize, translate, communicate, and help usher information and decision-support products toward an end use; c) End users that (should) understand a particular set of information so that they can make decisions (CCSDS 2012); and d) Citizens that can be impacted by said decisions. In reality, many EO value chains are actually “value networks”, which are systems of value-added processes with multiple exit and entry points for actors (Li and Whalley 2002). With such complex EO value chains, however, most projects focus on smaller, digestible pieces of the pathway, such as a value chain element that provides specific data products for scientists. Therefore, we use the term “user” to refer to any person who benefits from products at any point in a value chain, including end users. Users are experts in their own domain. They exercise their contextualized knowledge, wisdom, and values, leveraging EO information to address societal problems.

Approaches for increasing the societal benefit of EO data

To increase the societal benefit of EO data, we highlight three common approaches: 1) Seeking out new users for existing data and data products; 2) Improving an existing value chain that already has data and users; and 3) Developing a new value chain to meet specific user needs. These approaches (A, B, and C in Fig. 1) are defined by the existence of data, a value chain, and/or users, and how well these three elements are aligned.

Seeking new users

In this approach, data and data products exist, and data providers seek to identify new ways to apply and use data, as well as recruit new users. Such an approach is often used to add value to existing data or an existing method of collecting data. The main challenge with this approach is that the users may simply not exist, or the products are not usable by or useful for a targeted user group. Users are often sought from an “audience”, which includes all people and groups with a potential interest in the data products (Baker et al. 2015).

Building capacity among potential users has been proposed as one strategy to add value to existing data, such as satellite observations (Hossain 2015). As one example, Copernicus, which is the European Union’s Earth Observation program, generates data collected from satellites, air, ground, and seaborne stations, and sensors. Its data are used mostly by large organizations, and current efforts focus on stimulating user uptake of Copernicus data through the development of new services and skills development in the space geo-information sector, such as among Earth Science academics (case study presented by Vandenbroucke in Virapongse et al. 2018b & ESIP 2018c).

Value chain improvement

For this approach, an EO value chain exists, and specific actors in the chain seek to improve the value chain through increased efficiency or discovery of new opportunities, such as new products and services for users. Valuation techniques, like socioeconomic impact analysis, the Value of Information framework, and scenario storylines, are often used to assess and enhance the value and the Return on Investment (ROI) that EO data provide.

Value chain analysis is also helpful for identifying where the critical path to the greatest value lies. The analysis includes evaluating tradeoffs between value chain elements, assessing the socioeconomic factors, resource limitations/opportunities, and broader environmental context that affect decision-making in the value chain (Virapongse et al. 2014), and identifying products and benefits for value chain actors (case study presented by Coote in Virapongse et al. 2018b & ESIP 2018c). A challenge with this approach, however, is a tendency to focus on the most obvious high-value products and shortest paths to these products. While this can lead to positive cost-benefits and increased adoption, it can also result in overlooking new opportunities and innovative areas that may be more difficult to achieve but can result in high-value outcomes.

Use-driven approach

Knowledge production in modern society has often occurred by small homogenous (often elite) groups defining both the problem and solutions. Such knowledge production has been criticized for its poor integration of scientific knowledge with practical knowledge (Frost and Osterloh 2003), resulting in minimal benefits for users of the resulting products (Klocker 2012; Southby 2017). In contrast, for the use-driven approach (or user-centered design), users frame the solution-seeking process, aligning different needs and expectations of those involved while also managing preconceived notions of the project and team (Gibbons et al. 1994; Burns et al. 2006). “Dumping information” (i.e., releasing data, information and products that have not been specifically prepared for user consumption) is increasingly considered inadequate for addressing societal problems, such as water shortages and flooding of urban settlements (Patel et al. 2015).

A use-driven approach can lead to more efficient use of EO outputs over the long-term, because these outputs are created with input from users regarding specific problems that they face (Cook et al. 2013; Ziervogel et al. 2014). One challenge with this approach, however, is that it depends on how much users are willing to participate in, engage with, and commit to the process (Southby 2017). For example, citizen science projects like Nature’s Notebook, which records changes in plants, animals, and the environment, rely on different motivational strategies to encourage people to participate (case study presented by Shanley in Robinson et al. 2018 & ESIP 2018b). Hence, if users do not actively contribute to the process, then their needs, knowledge, and expectations cannot be integrated into broad knowledge systems.

10 rules

The 10 rules outlined here provide guidance for actions and best practices that data providers and intermediaries in the EO value chain can implement while planning and executing projects that seek to increase the societal value of EO data. In particular, the rules address the “value chain improvement” and “use-driven” approaches described previously. The rules focus on problem-solving that cannot be done by a solitary individual, while aiming to enhance the efficiency, effectiveness, and long-term sustainability of the solution. While some rules focus on a specific segment of the value chain or phase in the problem-solving process, other rules are relevant to the overall value chain or process. To help increase understanding and awareness of the perspective that is presented (Goodman 2018), we address what each rule entails, why it is important, who should be involved, and how to achieve the rule’s objectives.

The rules are loosely organized according to project management principles, with the initial rules focusing on defining problems, planning for data use, creating effective teams, and examining a diverse selection of solutions. The next set of rules is best applied throughout a project, and includes such concepts as evaluation, interoperability, trust, adoption, and documentation. Finally, the last rule addresses the challenge of determining when to close a project.

Identify the root causes of a problem

A well-defined problem helps direct problem-solving efforts toward addressing a problem’s underlying causes rather than just its symptoms

An EO value chain addresses a problem by mobilizing information, tools, and people to help support a solution. Such problems can be urgent, such as a forest fire, or an on-going issue that needs management, like stream flows. When defining the problem, it is important to identify its root causes, so that solutions focus on addressing underlying issues, rather than its symptoms. Root cause analysis can also uncover multiple potential solutions (Nagy 2018), helping to save time and resources over the long term by identifying the most appropriate solution(s). In-depth understanding of the problem helps support better selection of a team to address the issue, as well as more effective dialogue with users.

A root cause is often composed of individual human factors, like people’s knowledge, awareness, attitudes, and behavior, as well as social causes, such as cultural, economic, and political factors (Lopez 2018). For example, the Pressure and Release conceptual model emphasizes the importance of understanding root causes for preparedness and management of disasters and hazards (Wisner et al. 2004). In this case, economic, demographic, or politically-related root causes, like limited access to power and resources, play a key role in vulnerability and unsafe physical and social conditions.

Root cause analysis, which aims to isolate and understand the impacts of a root cause, begins with identifying what is known about the problem, and gathering any missing information (Nagy 2018). A causal loop diagram is a conceptual model that provides a visual representation for how a problem relates to other variables; it demonstrates the relationships and feedback processes between components within a complex system (Lannon 2018; Marketlinks Team 2019). A system is an interdependent group of items forming a unified pattern, and some examples include a community, city, or organization (Kirkwood 1998). A causal loop diagram promotes a shift in thinking about problem solving by moving from isolating a problem and its causes towards a systems approach for understanding how a problem interacts with other variables in the system. For example, the problem of water shortages in a community can result from a combination of changes in climatic variables (e.g., decline in rainfall), poor maintenance of bulk water infrastructure, and population growth. Variables have a positive causal link if a change in one component results in a similar change in another (i.e., both increase or both decrease). Variables have a negative causal link if a change in one component results in a change in the opposite direction for another component (Kirkwood 1998).
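This sign convention lends itself to a simple computational sketch. In the hypothetical Python snippet below (variable names and signs are illustrative, not drawn from any cited case study), each causal link carries a sign of +1 or -1, and the net effect along a causal path is the product of the signs along it:

```python
# Hypothetical causal loop structure for the water-shortage example:
# variables are nodes, and each signed link records whether two
# components change in the same direction (+1) or opposite (-1).
links = {
    ("rainfall", "water_supply"): +1,        # less rain -> less supply
    ("infrastructure_maintenance", "water_supply"): +1,
    ("population", "water_demand"): +1,
    ("water_supply", "water_shortage"): -1,  # more supply -> less shortage
    ("water_demand", "water_shortage"): +1,
}

def path_polarity(path):
    """Net effect along a causal path: the product of link signs.
    A positive product means the first variable pushes the last in
    the same direction; a negative product means the opposite."""
    sign = 1
    for src, dst in zip(path, path[1:]):
        sign *= links[(src, dst)]
    return sign

# Rainfall and shortage move in opposite directions: (+1) * (-1) = -1
print(path_polarity(["rainfall", "water_supply", "water_shortage"]))  # -1
```

The same product rule identifies feedback loops: a closed path whose signs multiply to -1 is a balancing loop, while +1 indicates a reinforcing loop.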

The initial problem statement can be subjected to further analysis to identify its root causes by using the “5 Whys”, which was initially developed within the Toyota Production System (Ohno 1988). It entails repeatedly asking “But why?” (Lopez 2018). For example, many people did not evacuate their homes before Hurricane Katrina made landfall in the US in 2005. But why? They didn’t have a place to go or a way to leave. But why? Shelters and emergency transportation prohibited companion animals (Fritz Institute 2006). Such analysis led to the passing of the PETS Act in 2006, which requires government entities to “account for the needs of individuals with household pets and service animals before, during, and following a major disaster or emergency.” The 5 Whys method can be strengthened by engaging in various iterations of the questioning process to overcome any simplistic, linear thinking (i.e., there is only one root cause) (Card 2017). It is important to include diverse participants and particularly people across the value chain (e.g., theorists, developers, users), because root cause exploration is limited by the knowledge base of those involved (Murugaiah et al. 2010).
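As a sketch of how the iterative questioning can be captured, the hypothetical Python snippet below walks a hand-built cause tree (the statements paraphrase the Hurricane Katrina example above). Allowing more than one answer per “why” guards against the simplistic, linear assumption that there is only one root cause:

```python
# Illustrative "5 Whys" walk over a hand-built cause tree; each
# statement maps to the answers given to "But why?". Leaves (statements
# with no recorded cause) are candidate root causes.
cause_tree = {
    "People did not evacuate before the hurricane":
        ["They had no place to go and no way to leave"],
    "They had no place to go and no way to leave":
        ["Shelters prohibited companion animals",
         "Emergency transportation prohibited companion animals"],
}

def five_whys(statement, depth=5):
    """Return every cause chain reachable by asking 'But why?' up to
    `depth` times, branching where multiple answers were recorded."""
    causes = cause_tree.get(statement)
    if not causes or depth == 0:
        return [[statement]]  # leaf: a candidate root cause
    chains = []
    for cause in causes:
        for chain in five_whys(cause, depth - 1):
            chains.append([statement] + chain)
    return chains

for chain in five_whys("People did not evacuate before the hurricane"):
    print(" -> because ".join(chain))
```

Each printed chain ends in a distinct candidate root cause, making explicit that a single problem statement can have several roots worth addressing.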

Consider how data are really used

Understanding how different people need and want to use data and information can help identify the best solutions

The difficulty of transforming data into useful information and products is often underestimated. Even more challenging is predicting how people will use this information and these products to make decisions and solve problems (Cook and Lewandowsky 2016). To gain the most societal benefit from EO value chains, it would be ideal if decisions were made based solely on scientific information. In reality, however, decisions are often made based on information sourced from multiple communities and non-scientific knowledge systems like traditional and place-based knowledge (Wenger 2000; Roux et al. 2006; Dunlop 2009), under different political motivations and levels of imposed use of information through hierarchical structures (Dunlop 2017), and within personal worldview and belief systems (Murambadoro and Mambo 2017a). For example, while Earth Science information can inform potential climate change responses, the corresponding decisions and actions taken do not always align with scientific recommendations (Adger et al. 2009).

Being aware of the complex circumstances under which decisions are made is helpful for planning and designing data products that are well-tuned to users’ needs and worldview. Different actors require and use different data products; one “size” does not fit all. Most end users seek summarized information and applications that help solve their problem, while caring less about the actual data and technical details. The Higg Index, for example, aggregates natural resource use data into a simplified assessment tool that the apparel industry uses to measure and communicate a company’s or product’s environmental sustainability performance (Sustainable Apparel Coalition 2019).

Different approaches are available to understand users’ needs and worldviews associated with data use. A stakeholder analysis is useful for identifying people’s behavior, intentions, interrelations, agendas, and interests (Brugha and Varvasovszky 2000). User experience (UX) research is often used by technology companies to understand how users experience their products, so that their perspectives may be integrated into the design and functionality of their products (Vermeeren et al. 2016). Overall, it must be kept in mind that user groups are often very diverse (Virapongse et al. 2014), necessitating specific requirements to design products that meet their varying needs (Baker et al. 2015).

As an example of how one solution has helped to transform federal agency data into an information source that decision-makers can use, GeoCollaborate (www.geocollaborate.com) is a NASA-funded mapping platform that enables better sharing, aggregation, and visualization of geospatial data between agencies (federal, state, and local) and with the private sector. GeoCollaborate has been used operationally since 2017 in the eastern and central U.S. to “up the tempo” of utility responses to tropical and winter storms, wildfires, heat waves, and pipeline incidents. The All Hazards Consortium, for example, has implemented GeoCollaborate to help mobilize and move resources, such as fleet vehicles, across state borders, past weigh stations, and into staging areas by sharing key datasets across multiple platforms and devices in real-time (case study presented by Jones in Moe et al. 2018 and ESIP 2018e).

Get the right people to the table

Building bridges between data and use is not easy, but identifying and accessing key intermediaries and users can help

The value of collaborating across disciplines and sectors is well established: such collaboration helps develop innovative approaches to complex challenges and improves the effectiveness, adoption rates, and reach of a solution (O'Leary et al. 2012). Much less attention, however, has been paid to identifying who to engage with, and how to gain and sustain access to them.

To identify the different disciplines and sectors that should be engaged, a value chain analysis can help break down a complex value chain into more discrete steps and roles. Relevant people range from those who help bring attention and accountability to a problem, to technical experts who address specific elements of the problem, to the ultimate users of the solution. To help identify the right people, a “snowball” technique is useful for identifying and reaching hidden populations (Faugier and Sargeant 1997): each individual is asked to refer others, and referrals are followed until people who meet specific criteria are identified.
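The snowball technique can be sketched as a breadth-first walk over a referral network. In the hypothetical Python example below, the names, referral links, and matching criterion are invented for illustration:

```python
# Toy snowball sampling: start from a seed contact and follow
# referrals outward until someone matching the target criterion
# is found. The referral network here is entirely fictional.
from collections import deque

referrals = {
    "seed_contact": ["analyst", "planner"],
    "analyst": ["hydrologist"],
    "planner": ["county_engineer"],
    "hydrologist": [],
    "county_engineer": [],
}

def snowball(seed, matches):
    """Breadth-first walk over referrals, returning the first person
    who satisfies the matching criterion (or None if nobody does)."""
    queue, seen = deque([seed]), {seed}
    while queue:
        person = queue.popleft()
        if matches(person):
            return person
        for referred in referrals.get(person, []):
            if referred not in seen:
                seen.add(referred)
                queue.append(referred)
    return None

print(snowball("seed_contact", lambda p: "engineer" in p))
```

Tracking who has already been asked (`seen`) mirrors the practical need to avoid re-contacting the same people as referral chains overlap.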

It is too much to hope for that all of the identified people and groups will be easy to access and work with, so intermediaries can be essential. Such intermediaries provide benefits like opening paths of communication, providing cultural translation, aligning interests, establishing trust between groups, and bringing problem solving to a more concise level. For example, different groups use different vocabularies and meanings, making it often difficult for groups to communicate and understand each other. Intermediaries are “fluent” and trusted in multiple cultures; their facilitation efforts can help prevent misunderstandings and speed up the process of coalescing groups so that they can work together.

Intermediaries can be both individuals and organizations. Individuals include “boundary spanners” (Williams 2002), communication specialists, community leaders, champions, and consultants. For example, climate champions are individuals designated by the United Nations to connect the work of governments with actions taken by cities, regions, businesses, and investors to develop innovative and practical solutions (United Nations Climate Change 2019). Boundary organizations, middle-out organizations, and community organizations are valuable for helping to identify appropriate individuals and groups for projects, and open doors for communication. Boundary organizations and middle-out organizations both help organize communities and identify topics of shared interests, but boundary organizations focus on lateral co-design/co-production of knowledge (Gustafsson and Lidskog 2018), while middle-out organizations help mediate between top-down (e.g., government) and bottom-up (e.g., community) directives (Cutcher-Gershenfeld et al. 2017).

Co-production of knowledge and co-implementation of projects are approaches that help to reach a high level of success for solutions. Studies show that social cohesion, trust, and social capital are key factors that enable individuals or collectives to organize and execute certain courses of action (Sampson 2004; Hipp 2016). Co-implementation of projects allows knowledge producers to assess how well the products, applications, and services are addressing an identified need/challenge. Implementing these approaches successfully relies on sustaining access to collaborators and participants through best practices that build trust and reward people fairly (Oettle et al. 2014). People have a limited amount of goodwill and patience, so it is important to consider what motivates them and what is the best strategic use of their time. For example, some tasks are more successfully achieved with financial compensation (e.g., staff positions), while others are appropriate as volunteer work (e.g., participating in surveys). Human resources needs also vary over the life of a project, affecting when people’s contributions are most valuable.

As an example of a middle-out organization, Earth Science Information Partners (ESIP) is framed by the interests of sponsoring federal agencies (NASA, NOAA, and USGS), while being composed of activities that are led by the broader science and data user community, including academics, NGOs, industry, and the private sector. ESIP clusters (groups of people that self-organize around a specific topic or goal) offer a way to efficiently identify a group of experts on topics ranging from semantics to data use for community resilience (ESIP 2019). By leveraging its over-20-year-old network, ESIP helped to align the interests of diverse stakeholders around data-sharing practices, leading to the “Enabling FAIR Data Project’s Commitment Statement in the Earth, Space, and Environmental Sciences” that over 100 repositories, communities, societies, institutions, infrastructures, individuals, and publishers committed to (Stall et al. 2019). Without ESIP’s network, establishing a FAIR data commitment statement that all stakeholders agreed with would have taken much more time and effort to achieve.

Investigate all alternative solutions carefully

Comprehensive identification and analysis of potential solutions helps determine which (if any) solutions should be advanced

Before pursuing specific solutions to a problem, the breadth of different options must be identified and explored. Ideation can be done at both the individual and group level. While the number of ideas that are identified may not differ greatly between these two approaches, it is notable that the qualitative depth of discussions is intensified by working in collaborative groups (McMahon et al. 2016). Such depth is useful when moving beyond identification of solutions and on to determining which solutions should be pursued.

Ideation is often conducted through different variations of brainstorming, which aim to inspire creative problem solving by encouraging people to share ideas while withholding criticism or judgment (Rudy 2017a). To ensure the best results from a brainstorming session, it is important to choose the right people to participate and facilitate the process, and to have a clear idea of what the outcome should be (Rudy 2017b). This process should be supported by all of the dominant actors in the value chain, as well as external advisors.

Once a set of potential solutions has been identified, a comprehensive analysis is conducted for each idea. The group should consider what they like and dislike about each idea, as well as its potential negative and positive side effects, practicality, and potential effectiveness. They should also question how easy/difficult it is to put into practice, if everyone involved will accept it, and if it is consistent with other things done by the group. The proposed solution may need to be modified based on suggestions (Nagy and Axner 2018).

With the reduced list, more detailed analyses are conducted to compare between variations of the proposed solution, as well as between different solutions. A first step in this comparison process is to identify the costs of implementing the solution and the benefits that are relevant to the decisions that will be made using the solution. A cost-benefit analysis can then be used to determine if the benefits greatly outweigh the costs incurred. With this analysis, the cost and benefit valuation methods should clearly link the use of data and methods to defined and quantifiable outcomes (Smart 2014). The analysis should address critical questions like: Are the needed resources (budget, equipment, team) available to support the solution? Who are the beneficiaries of the solution, and what benefits will they experience? Are there any potential negative impacts on other elements of society? Should one or more solutions be eliminated as a result of the cost-benefit analysis? A risk identification and analysis can also be conducted to consider and manage project and technical limitations (Lavanya and Malarvizhi 2008).
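As a minimal illustration of this screening step, the hypothetical Python snippet below drops candidate solutions whose benefit-cost ratio falls below a threshold and ranks the remainder by net benefit; all names and figures are invented, not taken from any case study:

```python
# Toy cost-benefit screening across candidate solutions. Costs and
# benefits are assumed to be in the same monetary units; the figures
# are illustrative only.
candidates = {
    "solution_a": {"cost": 120_000, "benefit": 300_000},
    "solution_b": {"cost": 80_000,  "benefit": 150_000},
    "solution_c": {"cost": 200_000, "benefit": 190_000},
}

def rank_by_net_benefit(options, min_ratio=1.0):
    """Eliminate options whose benefit-cost ratio falls below
    min_ratio, then rank the remainder by net benefit."""
    kept = {
        name: vals["benefit"] - vals["cost"]
        for name, vals in options.items()
        if vals["benefit"] / vals["cost"] >= min_ratio
    }
    return sorted(kept.items(), key=lambda kv: kv[1], reverse=True)

# solution_c is eliminated (ratio below 1); solution_a ranks first.
print(rank_by_net_benefit(candidates))
```

In practice the threshold and ranking criterion are themselves decisions: ranking by benefit-cost ratio instead of net benefit, for example, can favor smaller, cheaper solutions.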

This rule is demonstrated in practice through the process of selecting a solution that best informs culvert design. A culvert is an engineered structure (e.g., a pipe) that is partially buried to allow surface water to flow underneath a roadway; it must be built to an optimal size that is neither too small nor too large. The provision of stream gage data can help culvert builders understand the frequency of high-flow events, thereby informing their decision-making about the appropriate size for a culvert. A cost-benefit analysis compares the cost of implementing the solution (i.e., installing and operating the stream gages) to its potential benefits, which include a greater likelihood of building culverts of optimal size, as well as cost savings and reduced roadway flooding. Data collected from diverse stakeholders, such as the Transportation Engineering Community and Disaster Response entities, are essential for providing the type of information needed for better comparisons between different solutions (case study by Pindilli in Virapongse et al. 2018b and ESIP 2018c).

Evaluate, adapt, and iterate

Solutions must be strategically evaluated and refined to ensure their best possible fit within the context

An evaluation strategy should be developed early in the project, rather than after a solution has been implemented, to help inform the solution design and its associated EO value chain. An evaluation strategy requires identifying the societal outcomes that a solution seeks to influence, metrics for tracking changes in those outcomes, and an empirical strategy to assess whether the changes in outcomes (as monitored by the metrics) can be attributed to the solution. To help inform the evaluation strategy, the project’s theory of change should be identified. The theory of change “is a method that explains how a given intervention, or set of interventions, is expected to lead to specific development change, drawing on a causal analysis based on available evidence” (UNDG 2017).

Once the development of a solution is underway, the evaluation strategy is implemented to measure how well the project is reaching its goals. The evaluation may also reveal how the solution can be improved to potentially enhance its impact. Engaging with participants of the value chain is crucial during this phase to identify how to improve the presentation, accessibility, and utility of data and information. Different methods for usability testing are particularly useful for testing and evaluating user interactions with data products (Maramba et al. 2019). This process of evaluation and improvement can be repeated, forming an iterative process that continually improves the solution.

Different quantitative and qualitative methods are available for evaluating the added-value contribution of a solution. The choice of method depends on the problem environment and on whether the expected outcome is qualitative or quantitative. For example, quantitative methods like socioeconomic impact analysis (Adams et al. 2013) and the Value of Information (VOI) approach (Macauley 2006), as well as qualitative methods like scenario storylines (Rounsevell and Metzger 2010), are all useful for measuring the value of information used in a solution.

As an example of how one of these methods is used in practice, the VOI approach helps demonstrate the return on investment in satellites and data products, and provides Earth scientists with an effective tool to communicate the value of their work, make informed choices about how to invest limited resources, and increase the likelihood that a satellite or satellite data application produces socioeconomic benefits. The VOI approach has been applied to evaluate the impacts of EO data in several applications, including the benefits of improved frost prediction for Kenyan tea farmers, the role of Landsat imagery in the discovery of new gold deposits, and the human health benefits of using remotely sensed data for regulating air pollution (case study presented by Kuwayama in Pearlman et al. 2018 and ESIP 2018d).
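At its core, the VOI approach compares the expected value of decisions made with and without a piece of information. A minimal sketch, loosely modeled on the frost-prediction example (the probability and payoffs below are invented for illustration, not figures from the case study):

```python
# Value of Information (VOI) sketch: how much is a perfect frost
# forecast worth to a farmer? All numbers are hypothetical.

P_FROST = 0.3  # assumed probability of a frost event

# payoffs[action][event]: crop value after the event, net of a
# protection cost of 20 when the farmer protects
payoffs = {
    "protect":    {"frost": 80, "no_frost": 80},
    "do_nothing": {"frost": 0,  "no_frost": 100},
}

def expected_value(action):
    return (P_FROST * payoffs[action]["frost"]
            + (1 - P_FROST) * payoffs[action]["no_frost"])

# Without a forecast: commit to the single best action in expectation.
ev_without = max(expected_value(a) for a in payoffs)

# With a perfect forecast: choose the best action for each outcome.
ev_with = (P_FROST * max(payoffs[a]["frost"] for a in payoffs)
           + (1 - P_FROST) * max(payoffs[a]["no_frost"] for a in payoffs))

voi = ev_with - ev_without  # upper bound on what the forecast is worth
```

Real VOI studies must also account for imperfect forecasts and user behavior, but the underlying comparison of expected outcomes with and without the information is the same.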

Think globally

Data and data products should adhere to existing best practices, standards, and ethical considerations to increase their potential for interoperability and broad applicability

Most EO projects focus on context-specific problems that address a segment of a larger, more complex value chain; such projects can operate at a local, national, or global level. These projects must consider how they connect and interface with other value chain segments in order to attain interoperability across the broader value chain. While it is often difficult to anticipate just how products might be used in the future, the codification and use of best practices and standards can guide projects to desired outcomes. In the best case, a developed solution is applicable across a range of applications.

Sustainable Development Goal (SDG) indicators, which help to assess society’s progress on global sustainable development challenges like poverty and ocean stewardship (https://unstats.un.org/sdgs/indicators/indicators-list/), offer an example of how aggregation of discrete data collections leads to global metrics and analyses. SDG indicators are based on inputs from national statistical agencies for each country. To operate across the global spectrum needed for sustainability analyses, the sharing of data and information and the detailed definition of indicators are particularly important. Thus, thinking globally does not mean knowing and planning for everything; instead, it refers to the need to reach out beyond the local or national operating environment.

Using existing best practices and standards can facilitate collaboration across communities. A best practice is a “methodology that has repeatedly produced superior results relative to other methodologies with the same objective” (Pearlman et al. 2019). Whether best practices focus on data, information, products, or methods, they provide a consistent framework for operating in multiple, disparate places. Best practices that have broad adoption support more efficient sharing, interoperability, and sustainability. For example, the principles of “findable, accessible, interoperable, and reusable” offered under the FAIR approach (Wilkinson et al. 2016) are gaining broad acceptance in the Earth sciences, and should be considered when designing and implementing projects (Stall et al. 2019).

One example of adopting standards and best practices is addressing the portability of data used for natural disaster response. In 2017 alone, U.S. weather and climate disasters caused a record-breaking $306 billion in damages. Supply chains were disrupted, business was halted, repair and overtime costs skyrocketed, and lives were lost (Moe et al. 2018). The NASA Disasters Mapping Portal strives to present NASA and other EO data in geospatially enabled and compatible formats (https://maps.disasters.nasa.gov). The NASA portal is working to ensure that data are broadly accessible by adopting a set of standards and best practices that is documented and available to all users (https://disasters.nasa.gov/resources/portal). Examples of these data can be found on the portal and also in the case study presented by Glasscoe in Moe et al. 2018 and ESIP 2018e.

Trust is essential

Users must be able to easily determine that data and data products have been developed with transparency and scientific rigor

Data have become the underlying fabric of much of modern society, particularly as internet and technological advances support automated data collection and increased ease of data and information aggregation, interpretation, sharing, and re-use. This deluge of data is accompanied by the potential for misuse and misinformation, however, and both technical- and human-oriented strategies are needed to address these challenges (Murambadoro and Mambo 2017b; Farrell et al. 2019; Tavares et al. 2019). For information to be valuable and useful, users must be able to trust both the Earth science information itself and the people providing it. Users must have ways to identify which data, information, products, and data producers they can trust as a basis for informed decisions.

Trust, in part, comes from transparency in the creation and evolution of products through the value chain. Documentation of these processes is key, so that others can verify and replicate results, as well as understand where results come from. Maintaining data and methods in sustained repositories is emerging as the norm for scientific work, so that they can be cited consistently when questions are raised about product veracity and quality (Oettle et al. 2014). Elsevier, an academic publisher that encourages the publication of data, notes, for example, that “greater transparency boosts public faith in research” (Elsevier 2019).

A user’s confidence and trust in data and information are greatly affected by their understanding and perceptions of the uncertainties in data, analyses, and products (Sacha et al. 2015). Unfortunately, discussions of uncertainty often take a backseat to the dissemination of new findings, even though uncertainty is an important element in understanding, using, and advocating for applications. In the worst cases, a broad lack of understanding of scientific uncertainty has been used “to discredit undesirable results or postpone important policies” (Broomell and Kane 2017).

The development of indices is one approach used to make it easier for users to identify which data are reliable and can be trusted for decision-making. Operational Readiness Labels (ORLs), for example, were developed to aid users in planning, decision support, and risk reduction by creating a uniform standard by which stakeholders can evaluate and rank data for use. The ORL is a federated standard developed by the Sensitive Information Sharing Environment (SISE) working group in partnership with ESIP, NASA, NOAA, the All Hazards Consortium (AHC), and the private sector. A model decision tree is used to assess each data set and classify it as ORL 1–4; lower levels meet more criteria and can be considered more trusted (as related to the completeness of the dataset). Specifically, ORL 1 means that the data are available now for immediate decision-making and that people are available to contact with questions. ORL 2 data are available sporadically on an event-driven basis, and a point of contact is provided. ORL 3 data are nearly operational and in a testing phase, but the data are not guaranteed; such data could still improve situational awareness and decision-making, with target operations 6–12 months in the future. ORL 4, the lowest level, denotes data in the testing or validation phase that are still being evaluated for accuracy and validated (case study provided by Hicks in Moe et al. 2018 and ESIP 2018f).
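The decision-tree character of such an index can be illustrated with a small classification function. The criteria below are deliberately condensed from the four-level description above; the actual SISE/AHC standard applies a fuller set of checks:

```python
# Simplified Operational Readiness Label (ORL) classifier. The real
# standard uses a more detailed decision tree; this condenses the four
# levels described in the text.

def assign_orl(available_now, event_driven, near_operational, has_contact):
    """Return an ORL from 1 (most trusted) to 4 (testing/validation)."""
    if available_now and has_contact:
        return 1  # ready for immediate decision-making, support available
    if event_driven and has_contact:
        return 2  # sporadic, event-driven availability with a contact
    if near_operational:
        return 3  # in testing; target operations 6-12 months out
    return 4      # still being evaluated for accuracy and validated

level = assign_orl(available_now=False, event_driven=True,
                   near_operational=False, has_contact=True)
print(level)  # 2
```

The value of such a scheme lies less in the code than in the shared, documented criteria: any stakeholder can apply the same tree and arrive at the same label.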

Transparency, honesty, and openness are all important mechanisms for building trust among people and within groups. Trust must be built both externally (between the team and people outside the project) and internally (within the project team). To build external trust, information producers must be honest and open in acknowledging uncertainties, gaps in the data, and its limitations in order to manage user expectations, as well as users’ use of and satisfaction with the product (Petter 2008). This can be done, in part, by integrating strategies for transparency in data/knowledge quality (e.g., biases, uncertainties), in decision-making processes and methods, and in specific challenges and past failures. Users do not have the same background as data/information providers, so technical details and subtleties of the processes must be thoughtfully translated to make them more easily understood. Within the project team, appropriate interfaces between relevant value chain elements are needed, so that the hand-off between elements is efficient and does not introduce errors or increase undesired uncertainties. Good cohesion within the elements and across teams helps support co-creation of knowledge and solutions.

One approach that is helpful for building trust within a project team is the development of a team contract, which allows participants to share their values, concerns, and vision for the team, as well as challenge current norms and practices and agree upon a set of rules for engagement. The Bergrivier Climate Knowledge Network in the Western Cape of South Africa used this approach to help build trust among a group of climate change adaptation specialists, provincial government, local decision makers, residents, and other local stakeholders. The goal of the Network is to address climate challenges in the region through a collaborative framework. Building participants’ trust allowed the project to grasp the community’s level of understanding about climate change and its perceptions of the accuracy of forecasting data and the application of weather forecasts in day-to-day farming activities, and to bring in underrepresented community groups, like women. As a result, there is increased interaction and knowledge exchange between scientists, decision makers, traditional leaders, and community members, and the Network plays a key role in guiding local climate change adaptation and capacity building of key actors in the municipality. The success of such group processes, however, depends on skilled facilitators who can manage power dynamics while also allowing all voices to be heard, especially those of vulnerable and marginalized groups (case study from Oettle et al. 2014).

Lower barriers to entry

Adapting outcomes to user capabilities can help increase the uptake and success of data and information solutions

When EO data and information are disseminated, they are often subject to varying policy and socioeconomic conditions, and respond to development pressures that are embedded within complex contexts (Roberts 2008; Leck and Roberts 2015; Patel et al. 2015). When users fail to appreciate the value of the information, it is therefore often not because they lack epistemic knowledge but because they are uncertain of how to use the information (Dunlop 2009). To increase the societal benefit of EO information, it helps both to tailor information products to align with user capabilities and to build capacity among users to apply those products.

Working within user capability entails developing, translating, and communicating products that align as closely as possible with a user’s context, such as their existing workflows, habits, language, and culture. Leveraging platforms and infrastructure that people are already using minimizes how much they need to learn, or change their current workflows, to use a new product. Users prefer to receive information in formats that make it easier to apply; approaches like iterative dialogue can help determine what those formats should be (Bielak et al. 2008). Ethnographic study, such as through interviews and observation, has been used by technology companies like Intel (Anderson 2009) to identify and better understand user context and to determine what terminology is best for communicating the product to different users (Pelling et al. 2015). To design products that address the specific perspective and culture of different users, it can help to develop user personas that summarize observations of potential users into archetypes. Such a persona describes a fictional person, including their main occupation, demographics, a day in their life, goals, and fears (Wilshere 2017).
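A persona of the kind described above can be captured as a simple structured record. The fields follow the elements listed in the text; the example persona and all its values are invented for illustration:

```python
# A user persona as a structured record; the fields follow the elements
# described in the text, and the example values are invented.

from dataclasses import dataclass, field

@dataclass
class UserPersona:
    name: str
    occupation: str
    demographics: str
    day_in_the_life: str
    goals: list = field(default_factory=list)
    fears: list = field(default_factory=list)

extension_officer = UserPersona(
    name="Nomsa (fictional)",
    occupation="Agricultural extension officer",
    demographics="38, works across several rural districts",
    day_in_the_life="Advises farmers using whatever forecasts reach her phone",
    goals=["Give planting advice grounded in seasonal forecasts"],
    fears=["Losing farmers' trust after a bad forecast-based recommendation"],
)
```

Keeping personas in a shared, structured form like this makes it easy for a team to compare archetypes and check a proposed product against each persona's goals and fears.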

In the context of EO data use, capacity building is needed when there is limited access to EO data and processing tools, education and training materials, and best practices (Desconnets et al. 2017). Capacity building can apply to individuals, institutions, and infrastructure (GEO 2006); it often aims to increase user self-sufficiency, as well as to gain commitment, acceptance, and adoption of EO data and data products (Giuliani et al. 2015). Common capacity building tools and mechanisms include guiding documents, tutorials, workshops, and one-to-one expert support. CLEAN (the Climate Literacy and Energy Awareness Network), for example, provides webinars and workshops to help teachers learn new approaches and tools for teaching K-12 students about climate and energy issues (case study in Niepold et al. 2018 and ESIP 2018g).

As an example of how communication and use of data products can be improved by working within user capability, there have been efforts to support policy and decision makers in the Capricorn District Municipality of Limpopo, South Africa, through the provision of climate change data and products. Such products include a spatial portal that presents updated weather information, as well as climate databases and vulnerability and impact assessment tools to help support climate change response and disaster management. Despite having such products, however, local government users struggle to mainstream climate change information into municipal operations, because geospatial and scientific knowledge are presented in unfamiliar formats. Local-level forums and community meetings are currently used by officials to share this climate information, enabling them to implement rainwater harvesting and energy saving initiatives within the district. Successful mechanisms for better aligning data products and user capability include using participatory approaches to identify appropriate entry levels for climate change communication, developing sector plans that allow for integrated planning through high public participation, and creating social learning platforms (e.g., forums for municipal climate change and disaster advisory) that allow for long-term dialogue, collective action, and reflection (Harvey et al. 2012; Cundill et al. 2014) (case study by Murambadoro in Pearlman et al. 2018 and ESIP 2018d).

Document the process

Capturing lessons learned allows others to avoid repeating mistakes and to improve upon successes

If a tree falls in the forest and no one is around to hear it, did it make a sound? Without documentation and communication of the processes, failures, successes, and lessons learned in a project, a great portion of a project’s value is lost. While a project may provide solutions for a specific problem, without documentation its applicability to other solutions in a broader context is limited. It is also possible that the project’s solution fails; in that case, the documented lessons learned could become the project’s main product and contribution. Documenting the process also allows impediments to knowledge transfer and uptake to be highlighted, as these subtle barriers to the application of Earth observation data and products are easy to overlook (Williamson et al. 2002).

Documenting processes involves keeping careful records of both initial and evolving concepts, data and information workflows, governance and management of the project, methods, and all of the inputs, outputs and versions associated with the process and data management. Importantly, specific people, dates, and locations of where data and products are stored should be noted. A project management plan (Project Management Institute 2017) that is developed early in the project can be used to think about and guide how diverse aspects of the project, such as data and information, will be treated, stored, and shared. Documentation approaches should consider if a person in the future might be able to understand, find, and re-use the information provided. The goal of documentation is to create a record of presentable and useful information, rather than documenting for documentation’s sake, which can result in a project’s “paralysis through analysis” (Adomavicius 2016). Therefore, the project should plan early on what should be documented, how and when documentation should occur, how much documentation is needed, who should do it, and what the goals of the documentation are.

In addition to documenting project successes and best practices (see rule 6), it is also important to note and report project failures, while being aware that failures differ in why and how they occurred. “Intelligent failures”, for example, include experimentation to identify the best path forward in uncertain contexts, providing essential new knowledge that contributes to future successes. Less desirable failures include those that are preventable (a known mistake was made) and complex situations where a combination of needs, people, and problems align in just the wrong way (Edmonson 2011). Detailed descriptions of challenges and failures, such as how place-based communities are engaged and contribute to a project, are often not reported (Plowden 2008), although such records can be very valuable as guidance for other projects.

Beyond the documentation itself, the next step is to select how it will be archived (for future reference) and communicated. Consider which audiences are intended for different types of documentation and what purpose the documentation could serve in the future; this allows venues for communication to be properly selected. For example, scientific articles provide a level of summarized project detail that is typically appropriate for more technical audiences. Taking this communication further by translating project processes and results for a wider, non-expert audience, such as through blog posts and editorials, also contributes to societal benefits.

As an example of how documentation can be done, communicated, and used, the Climate Resilience Toolkit includes a collection of case studies that describe how people are building resilience for their businesses and in their communities. Sharing these case studies can inspire others to build climate resilience. The brief stories highlight examples of real people and communities who recognize climate-related issues and take action toward building resilience. Such stories can help translate complex science into easier-to-digest lessons learned. For example, an engaging narrative can help describe how a VOI approach can be used to study the impact of Landsat on agricultural land management (case study by Bernknopf in Hoebelheinrich et al. 2018 and ESIP 2018d).

Walk away when the solution has legs

Once the solution has been adopted, know when it’s time to let users take over

A great sign of success for a project is to develop solutions that users adopt, take leadership of, and adapt further in ways that were not originally planned by the developers. It can be difficult to know when users should be allowed (and encouraged) to lead the direction of any further development, while the development team moves on to another project or accepts new roles as advisors, support, or “leading from behind” (Hill 2010).

Once a satisfactory solution has been reached, the project team should take a step back and ask: Has the solution been adopted by users? Is it attaining the benefits that were envisioned? Has the training and skills development in the project been effective for users? Often it is necessary to give users time to test, use, and experiment with solutions before it becomes evident how the solutions will be applied in a real-world context. This should be part of the original project design developed under the process of co-design and transition planning.

To assess when it is time to let users take the lead, the project’s theory of change (UNDG 2017), established early in the project (as described in rule 5), can provide a helpful framework. A theory of change allows knowledge producers (and users) to define the change they want to make and how it will be achieved (Plimmer and Kail 2014). Articulating the outcomes of a solution allows progress to be monitored and evaluated, so that when outcomes have been met, the team can recognize that it is time to close the project. For example, once a specified number of people have adopted a solution, a target level of adoption has been achieved, and the project team can let users begin leading the process. This supports self- and group efficacy as users increase their confidence to perform tasks successfully, while also maintaining the skills and knowledge acquired (Bandura 1977, 1997; Pelling et al. 2015).

Coordinated networks are one way to build capacity and organize resources for users in order to add value to or help launch their projects. For example, the CLEAN framework provides cyberinfrastructure (to host group communication) and management personnel (staff that manage the network) to enable participants (e.g., teachers and educators) to take leadership and ownership of specific activities. As a result, while initiation of the network was science-driven, it has now become a platform that helps support community-driven activities, such as helping teachers create Earth Science-based lesson plans for K-12 classrooms (case study by Manning in Niepold et al. 2018 & ESIP 2018g).

The danger of not recognizing the right time for the project team to step back is that the solution may not attain sustainability and, worse, may eventually fail for lack of user ownership. Development studies highlight the risk of dependency when aid agencies fail to build the capacity of recipient communities to take their destiny into their own hands. Such a transition is essential to ensure that the benefits from solutions reach people as broadly and as sustainably as possible (Blair and Gross 2013).

It is also possible that a project is not taken far enough to achieve the intended success. SERVIR, a joint initiative between NASA and the United States Agency for International Development (USAID), offers an example of a project that, while initially successful, could have been carried further to ensure greater success. The Salvadoran National Red Tide Commission (CONAMAR), which is composed of different ministries of El Salvador, the Water Center for the Humid Tropics of Latin America and the Caribbean (CATHALAC), and SERVIR collaborated to develop a spatial tool to complement traditional water and shellfish tissue sample collection in order to improve monitoring of harmful algal blooms (HABs). The tool consists of processed maps of moderate-resolution ocean data (1 km spatial resolution) collected daily by the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument aboard NASA’s Aqua satellite. When a HAB event is detected, the El Salvador government issues a ban on selling shellfish to prevent consumption of toxins. A shortcoming often seen in development work is that use of, and capacity for, technical tools is centralized or concentrated at a ministerial level. Adoption therefore does not extend to local departments, extension services, or end users who might be able to apply the information in local-level contexts. For example, with access to better information, fishing cooperatives could alter their fishing routes to avoid algal blooms. This case study demonstrates how projects could achieve higher societal value from EO-based tools by investing in capacity building beyond the immediate users (i.e., CONAMAR) (case study from Management Systems International, A Tetra Tech Company & Development and Training Services, a Palladium company n.d.).


There are increasing demands to understand and monitor the value of EO data, information, and application products in order to maximize societal benefits. EO value is optimized when users can easily access and use data and information to improve decisions. To help data producers and intermediaries of the EO value chain increase the effectiveness of their work, this paper provides 10 rules that can be applied to the overall EO value chain and its segments.

Several themes recur across the rules, such as the benefits gained from co-design and collaboration with diverse participants, and the value of viewing a project and its intended results and participants as a complex system. Rather than a prescription, the rules offer best practices for addressing such challenges as developing and managing an EO data-based project, addressing end user needs, and developing project outcomes that scale up toward broader goals. The rules draw on concepts from such domains as biophysical science, science and technology studies, operations research, economics, and project management. While by no means a comprehensive set, we intend for the ten rules presented here to be a starting place for thinking about, planning, and increasing the societal benefit that can be gained from the wealth of EO data and information that exists and will be produced in the future.


  1. Ackoff RL (1989) From data to wisdom. J Appl Syst Anal 16(1989):3–9

  2. Adams V, Blankenship T, Burgess-Herbert S, Corley W, Coughlan J, Gelso B, Hinds E, Hurley E, Hutson M, Li J, Wilson D (2013) Measuring socioeconomic impacts of earth observations. National Aeronautics and Space Administration, Washington, D.C.

  3. Adger WN, Dessai S, Goulden M, Hulme GM, Lorenzoni I, Nelson DR, Naess LO, Wolf J, Wreford A (2009) Are there social limits to adaptation to climate change? Climatic Change 93(3):335–354

  4. Adomavicius A (2016) Documented Failure: Why Detailed Requirements Cost Twice as Much and Deliver Half the Value. Retrieved from https://www.devbridge.com/articles/failure-through-documentation/ Accessed on Sept 16, 2019

  5. Anderson K (2009) Ethnographic Research: A Key to Strategy. Retrieved from https://hbr.org/2009/03/ethnographic-research-a-key-to-strategy. Accessed on Sept 25, 2019

  6. Baker KS, Duerr RE, Parsons MA (2015) Scientific knowledge mobilization: co-evolution of data products and designated communities. Int J Digit Curation 10(2):110–135

  7. Bandura A (1977) Self-efficacy: toward a unifying theory of behavioral change. Psychol Rev 84(2):191–215

  8. Bandura A (1997) Self-efficacy: The exercise of control. W. H. Freeman, New York, NY

  9. Bellinger G, Castro D, Mills A (2004) Data, information, knowledge, and wisdom. Retrieved from http://www.systems-thinking.org/dikw/dikw.htm accessed on July 11, 2019

  10. Bielak AT, Campell A, Pope S, Schaefer K, Shaxson L (2008) From science communication to knowledge brokering: the shift from “science push” to “policy pull”. In: Cheng D et al (eds) Communicating science in social contexts. Springer science+ business media B.V., Berlin

  11. Blair T, Gross K (2013) From dependency to self-sufficiency. Stanford social innovation review. Retrieved from https://ssir.org/articles/entry/from_dependency_to_self_sufficiency#. Accessed on Sept 25, 2019

  12. Bornmann L (2013) What is societal impact of research and how can it be assessed? A literature survey. J Am Soc Inf Sci Technol 64(2):217–233

  13. Broomell SB, Kane PB (2017) Public perception and communication of scientific uncertainty. Journal of Experimental Psychology: General 146(2):286. https://doi.org/10.1037/xge0000260

  14. Brugha R, Varvasovszky Z (2000) Stakeholder analysis: a review. Health Policy Plan 15(3):239–246

  15. Burns M, Audouin M, Weaver A (2006) Advancing sustainability science in South Africa. Commentary. South African Journal of Science 102:379–384

  16. Card AJ (2017) The problem with ‘5 whys’. BMJ Qual Saf 26(8):671–677

  17. CCSDS (2012) Reference model for an Open Archival Information System (OAIS). Washington DC: CCSDS 650.0-M-2, Magenta Book. Issue 2. June 2012. Retrieved from the Consultative Committee for Space Data Systems website: http://public.ccsds.org/publications/RefModel.aspx. Accessed on Sept 16, 2019

  18. Cook J, Lewandowsky S (2016) Rational irrationality: modeling climate change belief polarization using Bayesian networks. Top Cogn Sci, 8(1), 160–179. Retrieved from: https://onlinelibrary.wiley.com/doi/full/10.1111/tops.12186. Accessed on Sept 16, 2019

  19. Cook CN, Mascia MB, Schwartz MW, Possingham HP, Fuller RA (2013) Achieving conservation science that bridges the knowledge–action boundary. Conserv Biol 27(4):669–678

  20. Cundill G, Shackleton S, Sisitka L, Nstshudu M, Lotz-Sisitka H, Kulundu I, Hamer N (2014) Social learning for adaptation: a descriptive handbook for practitioners and action researchers. IDRC/Rhodes University/Ruliv

  21. Cutcher-Gershenfeld J, Baker KS, Berente N, Flint C, Gershenfeld G, Grant B et al (2017) Five ways consortia can catalyse open science. Nature News 543(7647):615–617. https://doi.org/10.1038/543615a

  22. De Wit B, Notje K (2014) Value based decision making in planning and design of large capital projects. A reference guide for project managers. CSIR report ID: CSIR/NRE/GES/EXP/2014/0054/a

  23. Dery D (2000) Agenda setting and problem definition. Policy Studies 21(1):37–47

  24. Desconnets JC, Giuliani G, Guigoz Y, Lacroix P, Mlisa A, Noort M, Ray N, Searby ND (2017) GEOCAB portal: a gateway for discovering and accessing capacity building resources in earth observation. Int J Appl Earth Obs Geoinf 54:95–104

  25. Dunlop CA (2009) Policy transfer as learning- capturing variation in what decision makers learn from epistemic communities. Policy Studies 30(3):1–44

  26. Dunlop CA (2017) Pathologies of policy learning: what are they and how do they contribute to policy failure? Policy Polit 45(1):19–37

  27. Edmonson AC (2011) Strategies for Learning from Failure. https://hbr.org/2011/04/strategies-for-learning-from-failure Accessed on Sept 16, 2019

  28. Elsevier (2019) Sharing research data. Retrieved from: https://www.elsevier.com/authors/author-resources/research-data. Accessed on Sept 25, 2019

  29. ESIP (2018a) 2018 ESIP webinar series: socioeconomic value of earth science data, YouTube Channel. Retrieved from: https://youtu.be/TeM-Xlo8eh0. Accessed on Sept 25, 2019

  30. ESIP (2018b) Webinar #1: Introduction to the series on Socioeconomic value of Earth Science data, 5Jun2018. Retrieved from https://youtu.be/TeM-Xlo8eh0. Accessed on Sept 25, 2019

  31. ESIP (2018c) Webinar #2: the information pathway for earth science data: between supplier and user. Retrieved from https://youtu.be/qsNpxDAdzQM. Accessed on Sept 25, 2019

  32. ESIP (2018d) Webinar #3: Measuring and Assessing the Socioeconomic Value of Earth Science Data. Retrieved from: https://youtu.be/7-y2aCTARLU. Accessed on Sept 25, 2019

  33. ESIP (2018e) Webinar #4: The “pipeline” of Earth science data to climate resilience. Retrieved from: https://youtu.be/pP8RViwhPrE. Accessed on Sept 25, 2019

  34. ESIP (2018f) Webinar #5: Managing disasters through improved data-driven decision-making. Retrieved from: https://youtu.be/VcYetkr6L9w. Accessed on Sept 25, 2019

  35. ESIP (2018g) Webinar #6: Building Societal Capacity: The Educational Value of Earth System Science Data, Information, and Applications. Retrieved from: https://youtu.be/dMETDQ0jNlo. Accessed on Sept 25, 2019

  36. ESIP (2019) Collaboration areas. Retrieved from: https://www.esipfed.org/get-involved/collaborate. Accessed on Sept 25, 2019

  37. Farrell J, McConnell K, Brulle R (2019) Evidence-based strategies to combat scientific misinformation. Nat Clim Chang 9

  38. Faugier J, Sargeant M (1997) Sampling hard to reach populations. J Adv Nurs 26:790–797

  39. Fritz Institute (2006) Hurricane Katrina: perceptions of the affected. Fritz Institute. Retrieved from: http://www.fritzinstitute.org/PDFs/findings/Hurricanekatrina_Perceptions.pdf. Accessed on Sept 25, 2019

  40. Frost J, Osterloh M (2003) Dialogue devices: bridging between “Mode 1” and “Mode 2” knowledge production. In: Müller A, Kieser A (eds) Communication in organisations: structures and practices. Frankfurt a.M., pp 81–101

  41. GEO (2006) GEO Capacity building strategy - Document 13. November 2006. https://www.earthobservations.org/documents/geo_iii/13-Capacity_Building_Strategy.pdf [accessed on 16th September 2019]

  42. Gibbons M, Limoges C, Nowotny H, Schwartzman S, Scott P, Trow M (1994) The New production of knowledge: the dynamics of science and research in contemporary societies. SAGE, London

  43. Giuliani G, Papeschi F, Mlisa A, Lacroix P, Santoro M, Nonguierma A, Cools J, Guigoz Y (2015) Enabling discovery of African geospatial resources. South-Eastern European Journal Issue of Earth Observation and Geomatics 4(1S):1–16

  44. Goodman M (2018) Systems thinking: What, why, when where, how? Retrieved from https://thesystemsthinker.com/systems-thinking-what-why-when-where-and-how/. Accessed on Sept 25, 2019

  45. Gustafsson KM, Lidskog R (2018) Boundary organizations and environmental governance: performance, institutional design, and conceptual development. Clim Risk Manag 19:1–11

  46. Hanson B et al (2017) Eos. https://doi.org/10.1029/2018EO071991

  47. Harshadeep NR (2018) Innovations for Sustainable Planning and Management of Watersheds. Bulletin n° : Vol 67 (1). Retrieved from: https://public.wmo.int/en/resources/bulletin/innovations-sustainable-planning-and-management-of-watersheds. Accessed on Sept 25, 2019

  48. Harvey B, Ensor J, Carlile L, Garside B, Patterson Z, Naess LO (2012) Climate change communication and social learning–review and strategy development for CCAFS. CCAFS working paper no. 22. CGIAR research program on climate change, agriculture and food security (CCAFS), Copenhagen, Denmark. Available online at www.ccafs.cgiar.org accessed Sep 25, 2019

  49. Healy RW, Alley WM, Engle MA, McMahon PB, Bales JD (2015) The water-energy nexus: an earth science perspective (no. 1407). US Geological Survey

  50. Higuera P, Metcalf A, Miller C, Buma B, McWethy D, Metcalf E, Ratajczak Z, Nelson C, Chaffin B, Stedman R, McCaffrey S, Schoennagel T, Harvey B, Hood S, Schultz C, Black A, Taggerty J, Keane R, Krawchuk M, Kulig J, Rafferty R, Virapongse A (2019) Integrating subjective and objective dimensions of resilience in fire-prone landscapes. Bioscience 69(5):379–388, https://doi.org/10.1093/biosci/biz030

  51. Hill LA (2010) Leading from behind. Harvard business review. https://hbr.org/2010/05/leading-from-behind accessed on Sep 17, 2019

  52. Hipp JR (2016) Collective efficacy: how is it conceptualized, how is it measured, and does it really matter for understanding perceived neighborhood crime and disorder? J Crim Justice 46:32–44. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4824951/ Accessed Sep 25, 2019

  53. Hoebelheinrich N, Teng W, Bernknopf R, Wee B, Virapongse A, Pearlman F, Pearlman J, Robinson E (2018) ESIP webinar #4: the “pipeline” of earth science data to climate resilience and its value for real-world decision making. ESIP. Presentation. https://doi.org/10.6084/m9.figshare.7180562

  54. Hossain F (2015) Data for all: using satellite observations for social good. Eos 96

  55. Jorda-Capdevila D, Rodríguez-Labajos B (2017) Socioeconomic value (s) of restoring environmental flows: systematic review and guidance for assessment. River Res Appl 33(3):305–320

  56. Kirkwood CW (1998) System behavior and causal loop diagrams. In System Dynamics Methods: A Quick Introduction CC BY-NC 3.0 http://www.public.asu.edu/~kirkwood/sysdyn/SDIntro/ch-1.pdf

  57. Klocker N (2012) Doing participatory action research and doing a PhD: words of encouragement for prospective students. J Geogr High Educ 36(1):149–163

  58. Lannon CP (2018) Causal loop construction: the basics. Systems thinker. Retrieved from: https://thesystemsthinker.com/causal-loop-construction-the-basics/. Accessed Sep 25, 2019

  59. Lavanya N, Malarvizhi T (2008) Risk analysis and management: a vital key to effective project management. In: Paper presented at PMI® global congress 2008—Asia Pacific, Sydney, New South Wales, Australia. Project Management Institute, Newtown Square, PA

  60. Leck H, Roberts D (2015) What lies beneath: understanding the invisible aspects of municipal climate change governance. Curr Opin Environ Sustain 2015(13):61–67

  61. Li F, Whalley J (2002) Deconstruction of the telecommunications industry: from value chains to value networks. Telecommun Policy 26(9–10):451–472

  62. Longhorn R, Blakemore M (2007) Geographic information: value, pricing, production and consumption. Boca Raton, FL, CRC Press. https://doi.org/10.1201/9781420005172

  63. Lopez C (2018) Chapter 17, section 4: analyzing root causes of problems: the “but why?” technique. Community tool box, Center for Community Health and Development, University of Kansas. Retrieved from: https://ctb.ku.edu/en/table-of-contents/analyze/analyze-community-problems-and-solutions/root-causes/main. Accessed Sep 25, 2019

  64. Macauley MK (2006) The value of information: measuring the contribution of space-derived earth science data to resource management. Space Policy 22(4):274–282

  65. Management Systems International, A Tetra Tech Company; and Development and Training Services, a Palladium company (n.d.) Ocean Algal Bloom Monitoring for Mesoamerica. Evaluation Brief: SERVIR Products and Tools, USAID

  66. Maramba I, Chatterjee A, Newman C (2019) Methods of usability testing in the development of eHealth applications: a scoping review. Int J Med Inform

  67. Marketlinks team (2019) What is a Causal Loop Diagram and What is it Good For? Retrieved from https://www.marketlinks.org/post/what-causal-loop-diagram-and-what-it-good

  68. McMahon K, Ruggeri A, Kämmer JE, Katsikopoulos KV (2016) Beyond idea generation: the power of groups in developing ideas. Creat Res J 28(3):247–257

  69. Moe K, Moran T, Jones D, Hicks K, Glasscoe M, Virapongse A, Pearlman F, Jay P, Robinson E (2018) ESIP webinar #5: managing disasters through improved data-driven decision-making. ESIP Presentation. https://doi.org/10.6084/m9.figshare.7361327

  70. Murambadoro M, Mambo J (2017a) Lessons learnt regarding climate service needs for local government in South Africa. Fifth international conference on climate services (ICCS5) 27 February −3 march 2017 Cape Town South Africa

  71. Murambadoro M, Mambo J (2017b) Qualitative and other social science methods to assess the importance of geospatial information. Book chapter in Kruse, J.B., Crompvoets, J. and Pearlman, F. eds., 2017. GEOValue: the socioeconomic value of geospatial information. CRC Press

  72. Murugaiah U, Jebaraj Benjamin S, Srikamaladevi Marathamuthu M, Muthaiyah S (2010) Scrap loss reduction using the 5-whys analysis. International Journal of Quality & Reliability Management 27(5):527–540

  73. Nagy J (2018) “Chapter 17: section 3: defining and analyzing the problem.” Community tool box, Center for Community Health and Development, University of Kansas. https://ctb.ku.edu/en/table-of-contents/analyze/analyze-community-problems-and-solutions/define-analyze-problem/main

  74. Nagy J, Axner M (2018) Chapter 17: Section 6: Generating and Choosing Solutions. https://ctb.ku.edu/en/table-of-contents/analyze/analyze-community-problems-and-solutions/generate-solutions/main

  75. NASA (n.d.) EOSDIS Glossary. https://earthdata.nasa.gov/learn/user-resources/glossary. Accessed on Sep 17, 2019

  76. National Weather Service (2019) Forecasts and services. US Dept of Commerce, National Oceanic and Atmospheric Administration. https://www.weather.gov/about/forecastsandservice

  77. Niepold F, Fox S, Boyd K, Manning C, Chandler P, Virapongse A, Pearlman F, Jay P, Robinson E (2018) ESIP Webinar #6: Education Value of Earth Science Data.pptx. ESIP. Presentation. https://doi.org/10.6084/m9.figshare.7418075

  78. Oettle N, Koelle B, Law S, Parring S, Schmiedel U, Archer van Garderen E, Bekele T (2014) Participatory adaptation handbook: a practitioner’s guide for facilitating people centred adaptation. Indigo development and change Nieuwoudtville South Africa. Retrieved from https://www.researchgate.net/publication/275954218_Participatory_Adaptation_Handbook-_A_practitioner%27s_guide_for_facilitating_people_centred_adaptation. Accessed on Oct 8, 2019

  79. Ohno T (1988) Toyota production system: beyond large-scale production. Productivity Press, Portland, OR

  80. O'Leary R, Choi Y, Gerard CM (2012) The skill set of the successful collaborator. Public Adm Rev 72(s1):S70–S83

  81. Patel DJ (2012) Data jujitsu: the art of turning data into data product. Radar, O’Reilly. Retrieved from http://radar.oreilly.com/2012/07/data-jujitsu.htm

  82. Patel Z, Greyling S, Parnell S, Pirie G (2015) Co-producing urban knowledge: experimenting with alternatives to ‘best practice’ for Cape Town, South Africa. IDPR 37(2):187–203

  83. Pearlman J, Kuwayama Y, Downs R, Murambadoro M, Virapongse A, Pearlman F, Robinson E (2018) ESIP webinar #3: measuring and assessing the socioeconomic value of earth science data. ESIP. Presentation. https://doi.org/10.6084/m9.figshare.7096286

  84. Pearlman J, Bushnell M, Coppola L, Karstensen J, Buttigieg PL, Pearlman F, Simpson P, Whoriskey F (2019) Evolving and sustaining ocean best practices and standards for the next decade. Front Mar Sci 6:277. https://doi.org/10.3389/fmars.2019.00277

  85. Pelling M, Sharpe J, Pearson L, Abeling T, Swartling AG, Forrester J, Deeming H (2015) Social learning and resilience building in the emBRACE framework. Deliverable 4.3. emBRACE working paper series

  86. Petter S (2008) Managing user expectations on software projects: Lessons from the trenches. International Journal of Project Management 26(7):700–712

  87. Plimmer D, Kail A (2014) Theory of change for funders: Planning to make a difference. Retrieved from http://www.pointk.org/resources/files/Theory-of-change-for-funders2_1.pdf. Accessed Sep 25, 2019

  88. Plowden C (2008) Challenges and lessons studying non-timber forest products with traditional communities in the Amazon. Ethnobot Res Appl 6:023–028

  89. Project Management Institute (2017) A guide to the project management body of knowledge (PMBOK guide), 6th edn. Project Management Institute, Newtown Square, PA

  90. Roberts D (2008) Thinking globally, acting locally –institutionalizing climate change at the local government level in Durban, South Africa. International Institute for Environment and Development (IIED) Environment & Urbanization 20(2):521–537

  91. Robinson E, King JL, Pearlman F, Kruse J, Shanley L, Virapongse A, Pearlman J (2018) Webinar #1_Introduction_5JUN2018_for ESIP webinar series on socioeconomic value of earth science data. ESIP. Presentation. https://doi.org/10.6084/m9.figshare.6494882

  92. Rounsevell MD, Metzger MJ (2010) Developing qualitative scenario storylines for environmental change assessment. Wiley Interdiscip Rev Clim Chang 1(4):606–619

  93. Roux DJ, Rogers KH, Biggs HC, Ashton PJ, Sergeant A (2006) Bridging the science–management divide: moving from unidirectional knowledge transfer to knowledge interfacing and sharing. Ecol Soc 11(1):4

  94. Rudy LJ (2017a) What Is the Definition of Brainstorming? (For Groups & Individuals). The Ultimate Guide to Better Brainstorming Techniques, EnvatoTuts+. https://business.tutsplus.com/tutorials/what-is-the-definition-of-brainstorming%2D%2Dcms-27997. Accessed Sep 25, 2019

  95. Rudy LJ (2017b) How to Run an Effective Brainstorming Session. The Ultimate Guide to Better Brainstorming Techniques, EnvatoTuts+ https://business.tutsplus.com/tutorials/how-to-run-an-effective-brainstorming-session%2D%2Dcms-27145. Accessed Sep 25, 2019

  96. Sacha D, Senaratne H, Kwon BC, Ellis G, Keim DA (2015) The role of uncertainty, awareness, and trust in visual analytics. IEEE Trans Vis Comput Graph 22(1):240–249

  97. Sampson R (2004) Neighbourhood and community: collective efficacy and community safety. New Economy 11(2):106–113

  98. Sharma N (2008) The origin of data information knowledge wisdom (DIKW) hierarchy. https://www.researchgate.net/publication/292335202_The_Origin_of_Data_Information_Knowledge_Wisdom_DIKW_Hierarchy accessed on July 11, 2019

  99. Smart A (2014) POSITION PAPER - EVALUATION METHODS AND TECHNIQUES, ACIL Consulting. Retrieved from: http://www.geovalue.org/wp-content/uploads/2013/05/Background-Position-paper-Smart1.pdf. Accessed Sep 25, 2019

  100. Southby K (2017) Reflecting on (the challenge of) conducting participatory research as a research-degree student. Research for All 1(1):128–142. https://doi.org/10.18546/RFA.01.1.10

  101. Stall S, Yarmey L, Cutcher-Gershenfeld J, Hanson B, Lehnert K, Nosek B, Parsons M, Robinson E, Wyborn L (2019) Make scientific data FAIR. Nature 570:27–29

  102. Sustainable Apparel Coalition (2019) The Higg Index https://apparelcoalition.org/the-higg-index/. Accessed Sep 25, 2019

  103. Tavares B, Correia FF, Restivo A (2019, June) Trusted data transformation with Blockchain Technology in Open Data. In: International Symposium on Distributed Computing and Artificial Intelligence. Springer, Cham, pp 213–216

  104. Taylor A, Cartwright A, Sutherland C (2014) Institutional pathways for local climate adaptation: a comparison of three South African municipalities. FOCALES 18, Agence Française de Développement (AFD). Available at https://www.africancentreforcities.net/wp-content/uploads/2014/06/FocalesN18_GB_WEB.pdf. Accessed June 24, 2019

  105. UNDG (2017) Theory of change: UNDAF companion guidance (UNDG-UNDAF companion pieces no. 7). https://undg.org

  106. United Nations Climate Change (2019) Meet the Champions. https://unfccc.int/news/climate-champions-selected Accessed July 11, 2019

  107. Vermeeren AP, Roto V, Väänänen K (2016) Design-inclusive UX research: design as a part of doing user experience research. Behav Inform Technol 35(1):21–37

  108. Virapongse A, Schmink M, Larkin S (2014) Value chain dynamics of an emerging palm fiber handicraft market in Maranhão, Brazil. Forest, Trees, and Livelihoods 23(1–2). https://doi.org/10.1080/14728028.2013.868707

  109. Virapongse A, Duerr RE, Metcalf EC (2018a) Knowledge mobilization for community Resilience: Perspectives From Data, Informatics, And Information Science. Sustain Sci. https://doi.org/10.1007/s11625-018-0612-z

  110. Virapongse A, Coote A, Pindilli E, Vandenbroucke D, Pearlman F, Pearlman J, Robinson E (2018b) ESIP webinar #2: the information pathway for earth science data: between supplier and user. ESIP. Presentation. https://doi.org/10.6084/m9.figshare.6962654

  111. Waga D, Rabah K (2014) Environmental conditions’ big data management and cloud computing analytics for sustainable agriculture. World J Comput Appl Technol 2(3):73–81

  112. Wenger E (2000) Communities of practice and social learning systems. Organization 7(2):225–246

  113. Williams P (2002) The competent boundary spanner. Public Adm 80(1):103–124

  114. Williamson RA, Hertzfeld HR, Cordes J, Logsdon JM (2002) The socioeconomic benefits of earth science and applications research: reducing the risks and costs of natural disasters in the USA. Space Policy 18(1):57–65

  115. Wilkinson MD, Dumontier M, Aalbersberg IJ, Appleton G, Axton M, Baak A, Blomberg N, Boiten JW, da Silva Santos LB, Bourne PE, Bouwman J et al (2016) The FAIR guiding principles for scientific data management and stewardship. Sci Data 3:160018

  116. Wilshere A (2017) User Personas: What Are They And Why Use Them? https://trydesignlab.com/blog/user-personas-what-are-they-why-use-them/

  117. Wisner B, Blaikie P, Cannon T, Davis I (2004) At risk: natural hazards, People’s vulnerability and disasters, 2nd edn. Routledge, New York

  118. Ziervogel G, New M, Archer van Garderen E, Midgley G, Taylor A, Hamann R, Stuart-Hill S, Myers J, Warburton M (2014) Climate change impacts and adaptation in South Africa. WIREs Climate Change 5(5):605–620

  119. Ziolkowska JR (2018) Economic value of environmental and weather information for agricultural decisions–a case study for Oklahoma Mesonet. Agric Ecosyst Environ 265:503–512


Earth Science Information Partners (ESIP) provided funding to support the webinar series on which this paper is based, “The socioeconomic value of Earth Science data, information, and applications” (2018). Contributions from the presenters of the ESIP webinar series greatly helped to inform this paper. ESIP supported AV, provided the publication fees for open access of this article, and provided graphics support for the figure. Erin Robinson from ESIP was instrumental in developing the initial concept for this paper.

FP and JP acknowledge their collaboration with USGS Science and Decision Center through the USGS cooperative agreement G19AC00146. YK was supported in part through NASA cooperative agreement number NNX17AD26A with Resources for the Future to estimate the value of information obtained from satellite-based remote sensing. MG was funded in part by the NASA Applied Sciences Disasters program, and work was conducted at the Jet Propulsion Laboratory, California Institute of Technology.

Jared Berenter from Management System International helpfully provided the SERVIR use case. We also acknowledge an anonymous reviewer from the journal who contributed suggestions that helped improve the paper.


Audience: people and groups with an interest in specific data products (Baker et al. 2015).

Benefit: something that produces good or helpful results or effects, or promotes well-being of people and/or the environment.

Data: “measurements, values calculated therefrom, observations, or facts that can be represented by numbers, tables, graphs, models, text, or symbols which are used as a basis for reasoning and further calculation”; Earth science data can include “observation data, metadata, products, information, algorithms, including scientific source code, documentation, models, images, and research results” (NASA n.d.). Data can also result from local place-based observations made by citizen scientists, land and natural resource users, and the broader public (i.e., not necessarily resulting from scientific instruments, the scientific method, and/or scientists).

Data product: a tool, service, or package of data/information that “facilitates an end goal through the use of data” (Patel 2012).

Information: a product created from data that have been processed, structured, or presented according to a given context to make it meaningful and useful.

Knowledge: a collection of information with an intent to be useful (Bellinger et al. 2004).

NASA: National Aeronautics and Space Administration.

NOAA: National Oceanic and Atmospheric Administration.

Problems: analytical constructs that are also feasible to solve (Dery 2000).

Project: a temporary endeavor undertaken to create a unique product, service, or result; differing from operations and programs, a project has a definite beginning and end, and a defined scope and resources (Project Management Institute 2017).

Socioeconomic: Concerning the use of resources belonging to a group of people (Adams et al. 2013).

Solution: The EO data product, tool, or service that directly responds to a problem(s) identified by an actor in the EO value chain for the purpose of increasing the societal benefit of EO data.

Value chain: the set of value-adding activities that are performed to create and distribute goods and services (Longhorn and Blakemore 2007).

Case study: a particular instance of something used or analyzed in order to illustrate a thesis or principle.

User: people that benefit from products at any point in a value chain. A user is inclusive of end users, as well as data generators and intermediaries that may take on roles as users of data products in some contexts.

USGS: United States Geological Survey.

Author information



Corresponding author

Correspondence to Arika Virapongse.

Ethics declarations

Conflict of interest

We have no conflicts of interest in writing this article.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Communicated by: H. Babaie

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

Cite this article

Virapongse, A., Pearlman, F., Pearlman, J. et al. Ten rules to increase the societal value of earth observations. Earth Sci Inform 13, 233–247 (2020). https://doi.org/10.1007/s12145-020-00453-w


  • Earth observation
  • Data
  • Information
  • Value
  • Societal benefit
  • Value chain