Keeping the faith: policy sciences as the gatekeeper
So how did these expectations/aspirations fare? Not very well, in terms of the overall public policy field, but the journal has held fast to its principles in the face of the continued expansion of non-contextual, narrow analyses that are not terribly useful for developing sound policy.
…the battle to reassert the importance of contextual, interdisciplinary, problem-oriented inquiry has been quite successful. (Ascher 1987, 3)
We…need no further exercises in naming the various facets of the policy process, if that new nomenclature is simply an equivalent language map of existing vocabularies. This does not reflect any slavish adherence to the Lasswellian vocabulary, but rather a belief that various sound vocabularies exist, and can be improved upon only by introducing new concepts rather than by relabeling the old ones. (Ascher 1987, 4)
Unlike the exciting ferment and intellectual challenges of the early development of the policy sciences as outlined in Garry Brewer’s recent article in this journal (Brewer 2017), in the 1987–1990 period the challenge we (Associate Editors Daniel Durning and Joseph Lipscomb, and I) faced was to keep the elephants away, in a context of too few submissions, too many of which were not problem-oriented, contextual, process-sensitive, or interdisciplinary. In terms of the volume of submissions, although the late 1980s saw a proliferation of public policy departments and schools, and the conversion of many public administration programs into public policy programs, few PhDs in public policy had been granted to that point. Therefore, the pull of publishing in disciplinary journals was still strong, and few scholars were motivated to allow a problem orientation to dictate that they had to be interdisciplinary, process-sensitive, and contextual. These were certainly not the criteria for tenure, promotion, and reputation in most political science or economics departments.
What we rejected
In terms of the content of the submissions, besides the individual gems that we did publish (supplemented by some excellent special issues), four disappointing types of papers were submitted. Many simply ignored the admonition about the need to apply broad systems thinking to policy problems. Either they were reductive, single-explanation treatments of current policy issues, or they provided a narrative of how a policy was proposed, and then adopted or rejected, without any effort to provide insights as to how the successes overcame obstacles or how the failures were undermined by pitfalls of decision functions. The potential pitfalls could have been mapped using the policy science framework or any of the parallel process-decomposing alternatives; the submissions could have analyzed how and why the initiative was poorly formulated or targeted in the first place. For many initial submissions, our suggestions to elaborate on any of these points did not yield resubmissions.
Another type of submission consisted of efforts to “formalize” the political economy context of policy conflicts. By replacing narrative with logical or mathematical symbols, these efforts were based on the hope that some insights would pop out. These analyses proved to be sterile in almost all cases. Formalization for the sake of formalization does not get us anywhere.
Yet even more dispiriting were the submissions by scholars who believed that they had hit upon a marvelous new framework that they thought would be appropriate for policy sciences, which had the reputation for publishing broad, thoughtful articles. While the second quote above tried to preempt the submission of mere relabeling exercises, this warning was ignored, and our most time-consuming experience was wading through long-winded submissions that proposed “new models” or “new frameworks” that failed to specify how their use provided greater leverage for understanding or for developing better policy choices. Although we took pains not to require Lasswellian vocabulary, recognizing that many policy experts are “fellow travelers” in appreciating the importance of the policy sciences’ principles, many of these submissions failed the test of whether they really added to our capacity to understand. Our primary criterion was simply that a submission had to provide some value added for understanding real-world policy issues, whether it focused on quite specific policy challenges or a broader class of challenges.
The fourth set of disappointing submissions included those that reported on regression analyses based on cases that had little to do with one another, based on the dubious premise that variables “accounting for” outcomes across dissimilar cases somehow would be enlightening for the examination of a particular case. For some of the submissions, the only variables that proved to be statistically significant were those that cannot be influenced by policy; knowing that outcomes would have been better if a community had been wealthier does not do much. Regression models can be very useful if employed to identify outliers of “surprising success” or “surprising failure” when control variables (such as income levels or population density) permit fair performance comparisons. This works best when the control variables are not policy variables; the policy choices are then to be examined in depth to determine how they may account for the surprises.
The most egregious of these regression-analysis submissions were cross-national analyses that threw in cases so different in political structures, economic patterns, and histories that very different causal patterns were at play across the cases. Whatever robust relationships may have been relevant for the subsets of countries sharing relevant similarities would be washed out by the different relationships holding in other countries. And even if a policy-relevant variable was found to have a mildly positive or negative relationship with a desired outcome, any contextual knowledge about a case at the focus of the problem orientation would trump the finding that success or failure was modestly correlated with a given variable across disparate cases.
Even though we rejected the submissions that simply reported regression results without intensive analysis of outliers, they have become a dominant form of analysis that has been clogging up the other public policy and economic journals. Despite its limitations, technical econometric analysis has an advantage in the efficient production of publications that through sheer magnitude can dominate the focus of attention of policy analysts and policymakers. The advantage of econometric analysis is that “research programs” triggered by one or a few prominent publications can be pursued easily through rather narrow, routinized sets of analytic techniques. Perhaps the clearest example is the cottage industry that emerged from the infamous “resource curse” publications by Sachs and Warner (2001), which claimed to demonstrate that resource abundance is a detriment to economic growth. Despite scathing critiques persuasively demonstrating both the fundamental illogic of using resource-export dependence as a proxy for resource abundance and the impossibility of drawing conclusions as to whether countries would be better or worse off than the unknown counterfactuals,1 many published articles have accepted the flawed framework in order to use the same cross-country regression approach to report on variations on the choice of variables, time periods, sets of countries, and so on.2 For example, adding a variable of conflict levels, ethnic fragmentation, or different measures of institutional quality provides the opportunity for yet another regression model that flips the relationship between the measure of resource abundance/wealth/dependence and growth rates from mildly negative to mildly positive, or vice versa.
The criticisms have not deterred economists from citing Sachs and Warner, either to launch another tweak or to preface another lamentation that resource abundance is a disadvantage3; since 2010, Sachs and Warner’s (2001) European Economic Review article has been cited more than two thousand times.
So what did we publish?
Policy sciences resisted (and continues to resist) these four types of submissions of very limited relevance to understanding how to formulate and enact sound policies. Yet, the journal was (and still is) open to a fairly wide variety of articles with policy value added, including new analytic approaches to understand the appropriateness of policy options, insights as to how the decision process may be obstructing the enactment of sound policies, strategies for the enactment of sound policies or institutional changes, previously underappreciated conditioning factors, demonstrations of how multiple conditioning factors are configured together, and so on. If the submission was about how to address a particular policy challenge, we would assess whether the analysis took account of a reasonably broad set of factors.
Characterizing subjectivity can help policy analysts and policymakers develop more useful perspectives for identifying the most compelling challenges and policies. Policy sciences has published landmark articles on “case-based” analysis, pioneered by Brunner (1986), designed to induce stakeholders, analysts, and policymakers to break down misleading aggregate statistics into discrete categories of stakeholders, whether types of individuals (e.g., retirees dependent on Social Security) or regions (e.g., the rural South) that would be impacted differently by any given policy. This reflects appreciation of the importance of directing the focus of attention and overcoming the temptation to rely on overly broad statistics.
Identifying different perspectives held by stakeholders, analysts, and policymakers can improve the interactions among these policy actors. Again, the aggregate approach typical of attitude surveys destroys information about the variations in perspectives. The Q method, a staple in policy sciences pioneered in its policy applications by Steven R. Brown, is highly compatible with the policy sciences’ appreciation of the nuances of the relationships among identifications, demands, and expectations. Following Brown’s (1974) landmark article in policy sciences, dozens of later Q method applications by Brown and others have appeared in the journal.
The policy sciences’ emphasis on fleshing out mental models goes beyond the individual belief structures, to focus on how institutions, deliberately or not, propagate mental models that risk a major distorting impact on policy discourse. Boynton and Deissenberg (1987), assembling the causal connections expressed in mass media messages as to how economies work, revealed troubling discrepancies between these implicit models and the actual functioning of economies.
Another key hallmark of the policy sciences, the insistence on clarifying goals as an essential function of the problem orientation (Lasswell 1970), has generated a host of policy sciences articles exploring the complexity and difficulty of identifying constructive problem definitions. Under our tenure as editors, policy sciences published Judith de Neufville and Stephen Barton’s insightful “Myths and the Definition of Policy Problems,” focusing on how myths (in the sense of beliefs derived from tradition and culturally shared taken-for-granted knowledge, rather than false beliefs) skewed the problem definition of US housing policy through elevating home ownership as a sacrosanct objective (de Neufville and Barton 1987). Rosenman et al. (1988) demonstrated the huge difference in the expected costs of nuclear waste disposal if the problem definition includes a more appropriate recognition of changes in levels of uncertainty. This was the first of seven policy sciences articles addressing the problematic logic and analysis of the Yucca Mountain nuclear waste initiative. Weiss (1989) demonstrated the importance of the evolving problem definitions in the policy discourse over the 1980 US Paperwork Reduction Act and other efforts to balance the costs and benefits of information.
Another broad theme has been the importance of limited knowledge, misconceptions, and different perceptions by different actors. One outstanding example is the 1987 article by David Feldman on the failure of a private energy company, with strong support from federal agencies, to build a pumped-storage hydroelectric project in Virginia and North Carolina. Among much contextual detail, political analysis, and exposition about bureaucratic and legislative machinations, Feldman (1987, 235) offers a gem of an insight by pointing out “a growing vulnerability of such projects to legislative or juridical defeat if policy options are seen as too complex or esoteric to marshal significant popular support.”
The policy sciences movement has propagated its own memes, but without creating mechanistic cottage industries. The concept of “wicked problems,” first published in policy sciences by Rittel and Webber (1973), has more than ten thousand Google Scholar citations. Four hundred articles invoke Lasswell’s publications in their applications of the problem orientation concept. Yet the production of worthwhile policy sciences analysis is not by assembly line. Providing enough context, process understanding, and distinctive insight are tall tasks. What is there about a particular wicked problem, in a particular context, that one needs to know to make the best of it?
This does not mean that worthwhile policy sciences contributions must be pure invention starting with first principles. Although the policy sciences framework permits the analyst to examine a particular problem context starting from scratch, by mapping the relevant institutions and actors, identifying actors’ perspectives and resources, and going through the full set of potential decision-process malfunctions, the more efficient approach is to develop the repertoire of fairly common conditioning factors, particularly those that may escape casual attention, to begin the exploration of the case at hand. Thus, the excellent case-based work on the “resource curse” by Gelb (1986, 1988) and Auty (1990) puts the analyst on the lookout for exchange-rate problems, increased temptations for corruption, greater conflict over the capture of resource wealth, and so on.
Where does this leave us?
If my prediction about the conversion of the public policy field to the true religion of problem-oriented, contextual, process-sensitive, and interdisciplinary analysis was hopelessly naïve, at least we can take solace in the fact that policy sciences has remained faithful, as have many books, and articles in other journals, written by policy scientists and fellow travelers. The high impact factor of policy sciences demonstrates that the journal is shaping more useful perspectives on how to approach public policy challenges. That is, after all, the goal.
Brunnschweiler and Bulte (2008) note that resource abundance has different impacts than resource dependence, which is what Sachs and Warner use as their causal variable; Kropf (2010, 110) is explicit that Sachs and Warner use “an outright measure of resource dependence, while claiming to test concepts relying on resource abundance.” Davis (1998, 240) asserts that “[t]o state that resource-rich countries are made worse off for their resources relies on comparison with the unmeasurable counterfactual. It suggests that the Congo, Angola, and Nigeria would be doing just fine if natural resources were not found and extracted on their soils. I find this hard to believe….”
For example, Frankel (2010), well after enough critiques demonstrated the flaws of the Sachs and Warner analysis, nevertheless asserted: “It is striking how often countries with oil or other natural resource wealth have failed to grow more rapidly than those without. This is the phenomenon known as the Natural Resource Curse. The principle is not confined to individual anecdotes or case studies, but has been borne out in econometric tests of the determinants of economic performance across a comprehensive sample of countries.”