Successfully understanding the methods presented in this book has given you some skills in model building and in obtaining solutions from those models. However, this alone will not necessarily help you apply and implement such tools in practice. In addition to modeling skills, systems analysts working within or for organizations that make decisions need to know how to effectively inform those who make or recommend decisions and who can therefore benefit from modeling designed to identify and evaluate possible alternatives. This requires building trust, as well as being aware of, and responsive to, the often-changing information needs of those who recommend or make decisions (Fig. 18.1).

Fig. 18.1 Informing the political process is itself a political process

Analysts, especially those engaged in informing policymakers, need to be good communicators. This involves making their results transparent: specifying the assumptions upon which the results are based, addressing uncertainties and alternatives openly, and taking into account the different interests, goals, and perspectives of stakeholders and policymakers. Part of being a good communicator is recognizing that many terms analysts use, such as the word “model,” can mean different things to others. Analysts attempting to communicate effectively should therefore speak the language their audiences understand.

What do policymakers expect from analysts? One might think they would like definitive advice on what to do: what plan or policy to choose, what action to take, and when, backed up by scientific evidence supporting that position. However, most know that models can, by definition, answer only ‘what if’ analytical questions, not normative ones. A push for definitive recommendations not only overlooks uncertainty but lies beyond what analysts can deliver under the label of “science.” Furthermore, analysts working on policy issues can discover “inconvenient truths,” i.e., model results that make an otherwise popular policy undesirable and therefore complicate a policy response or force a politically sensitive conclusion. Such a situation can cause two problems. One is the difficulty of communicating unexpected, disturbing results policymakers do not want to hear, thereby creating difficulties for them and possibly disrupting the relationship analysts have with them. The other is the dilemma of whether to make such results public (publish them), which can understandably be motivated by a sense of responsibility towards the public as well as by one’s career as an objective analyst.

Informing, i.e., knowing what to present, and how and when, is learned through collaboration that generates mutual understanding and trust between systems analysts and their clients. Far less effective are ad hoc modeling results ‘delivered by parachute’ by an outside expert or firm, either unsolicited or in a rush when policymakers suddenly ask for modeling results the analysts may or may not have. This especially applies when a sufficient level of trust has not yet developed between the analysts and their client policymakers. Useful evidence comes from collaborative, continuous, long-term relationships with policymakers and their staff throughout a policymaking process. This is one reason why policymaking agencies tend to select the same consulting firms to provide the scientific evidence they desire over time: they have learned to trust them.

To be relevant to, and embedded in, policymaking processes, analysts must build up that trust and be aware of, if not engaged with, the world in which alternative policies and stakeholder values are considered and debated, and where choices are made. This is a world where simple opinions and anecdotes coming from groups with different interests, perspectives, and degrees of power, and even false information, can influence final decisions.

Yet policies chosen without sufficient supporting scientific evidence are more likely to fall short of being as successful as they could be. An excellent example is the observation that measures taken to increase the efficiency of irrigation water use, so that the savings could be beneficially used elsewhere, often have just the opposite impact: they simply motivate enlarging the areas irrigated. One could argue that the policy to increase irrigation efficiency in order to provide more water for other uses might have been informed by analyses, but if so, the analyses were not sufficient; they considered neither the whole system nor human behavior. While any policy may produce surprising outcomes not foreseen when it was implemented, the scale and likelihood of adverse consequences are much higher for decisions informed by incomplete evidence, or by none at all.
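The irrigation rebound effect described above can be illustrated with a back-of-the-envelope calculation. The numbers below (crop water needs, areas, efficiencies) are purely hypothetical, chosen only to show how efficiency gains can be absorbed by area expansion rather than yielding savings:

```python
# Hypothetical numbers illustrating the irrigation-efficiency rebound effect.
# "Efficiency" here is the fraction of diverted water actually consumed by crops.

def diverted_water(area_ha, crop_need_m3_per_ha, efficiency):
    """Water (m3) that must be diverted to meet crop needs on a given area."""
    return area_ha * crop_need_m3_per_ha / efficiency

crop_need = 5000.0  # m3 consumed per hectare (assumed value)

# Before the policy: 100 ha irrigated at 50% efficiency.
before = diverted_water(100, crop_need, 0.5)      # ~1,000,000 m3 diverted

# Intended outcome: efficiency rises to 80%, area unchanged,
# so diversions drop and the difference is "saved" for other uses.
intended = diverted_water(100, crop_need, 0.8)    # ~625,000 m3 diverted

# Observed outcome: cheaper water per hectare motivates expanding
# the irrigated area to 160 ha, and total diversions do not fall at all.
observed = diverted_water(160, crop_need, 0.8)    # ~1,000,000 m3 diverted

print(f"before={before:,.0f}  intended={intended:,.0f}  observed={observed:,.0f}")
```

A whole-system analysis that anticipated farmers' responses to cheaper per-hectare water would have flagged this behavioral feedback before the policy was adopted.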

This irrigation story highlights the need for an iterative, adaptive modeling and decision-making process. Once analysts start working on identifying alternatives, they may realize that they forgot to include some important criteria or constraints, requiring them to go back, update their models and data, and continue through the process again, as illustrated in Fig. 1.1 in Chap. 1. Each of these steps should be carried out with the decision-maker(s) and the stakeholders, ideally in a shared, collaborative, and open process.

Part of the art of modeling is deciding what to model, and in what detail. There is no reason to think the first attempt will be the right one. Feedback from those being informed by the modeling exercise will almost always motivate modifications in any systems model. One can only hope that by the time a decision must be made, the modeling results have succeeded in promoting the understanding desired and needed by those responsible for making decisions (Fig. 18.2).

Fig. 18.2 Model outputs by themselves are rarely ready for prime time. Informing policymakers requires translating those outputs into what is desired by, understood by, and relevant to, them