A Complexity Method for Assessing Counterterrorism Policies

Abstract

The incidence of terrorism is marked by uncertainty in time of onset, location, severity, and other attributes. This method applies the theory of political uncertainty and complexity theory to the assessment of counterterrorism (CT) policies. Results from this method provide new, potentially actionable insights on the effect of CT policies by examining changes in the time between terrorism incidents T and the severity of such events S (fatalities). Most terrorism incidence patterns lack the “normal” (bell-shaped) or Gaussian distribution that is characteristic of equilibrium systems. Instead, terrorism event distributions often show heavy tails, symptomatic of non-equilibrium dynamics, in some cases approximating a power law with critical or near-critical exponent value of 2. Empirical hazard force analysis comparing pre- and post-intervention periods can also provide insights on CT policy effectiveness and change in the threat environment. Selected policy implications are discussed, including the usefulness of real-time and anticipatory analytical strategies.

Assessing the performance of counterterrorism (CT) policies and interventions poses some special challenges beyond those normally encountered in assessing the impact of other policies. This is primarily because the incidence of terrorism events is marked by significant uncertainty along several dimensions, such as time of onset, location, intensity, and other incident-related attributes. The analysis presented here applies the theory of political uncertainty and complexity theory to assessment of counterterrorism (CT) policies. Results from this approach can provide new and potentially actionable insights on the effect of CT policies by examining changes in the time between terrorism incidents T and the severity of such events S (fatalities). Empirical hazard force analysis of pre- and post-CT interventions can also provide insights on event severity as well as dynamical change. Selected policy implications are discussed, including the usefulness of real-time and anticipatory analytical strategies.

This paper proceeds as follows. The first section provides motivation for the methods presented and a brief discussion of earlier relevant literature. The second section presents an integrated methodology for data analysis and model testing, based on the theory of political uncertainty and social complexity theory. The essence of these methods is to use terrorist incident data as signals for understanding patterns of occurrence, such as onset and severity, and, more importantly, the latent, underlying dynamics that are causally responsible for observed occurrences. Although technically these methods are statistical, mathematical, and computational, they are essentially information extraction methods for understanding terrorist incidence patterns and deriving new metrics for evaluation. The last section presents a discussion of the main results available through these methods and some general conclusions, including discussion of evidence-based CT policy evaluation. The discussion of policy implications draws on the integrated multidisciplinary methods used in this analysis, which combine political uncertainty theory with complexity modeling (complex systems theory).


Introduction

Motivation

Assessing the effectiveness of counterterrorism (CT) policies poses numerous challenges, not least of which is the fundamental uncertainty associated with the incidence of terrorism events. Uncertainty is a universal characteristic of terrorism events, not an artifact of measuring their incidence. Such uncertainty presents a multifaceted challenge for both basic science and applied policy analysis. First, the remote and proximate causes of terrorist attacks are multiple, and the interaction of such causes remains poorly understood. Second, some spatio-temporal features of terrorism events (e.g., associated statistics and distributions) may be measurable, but understanding and forecasting require a deeper theoretical approach. Third, interactive linkages between different types of events – such as, for instance, between bombings and hijackings, or between attacks across time and space – are not well understood.

Most terrorism incidence patterns (not necessarily CT interventions) lack the “normal” (bell-shaped) or Gaussian distribution that is characteristic of equilibrium systems. Instead, terrorism event distributions are often skewed, showing heavy tails, symptomatic of non-equilibrium dynamics, in some cases approximating a power law with critical or near-critical exponent value of 2. (Power laws arise in many domains, not just complex systems, where they play a special role.) Heavy tails in the incidence of terrorism are a significant non-equilibrium property, because they imply that extreme events occur with much greater than normal frequency (i.e., much higher probability density).

From an applied policy perspective, governmental and nongovernmental agencies involved in CT can benefit from scientific insights concerning the likely onset, severity, location, and other features of terrorist attacks. Whereas engineered systems are normally designed, built, and operated on the basis of reliable information concerning the operating environment and patterns of “calls on the system,” agencies in the CT domain seem to rely less on comparable information. For example, scientifically derived patterns of onset, severity, and location of attacks should be valuable for improving the design, implementation, and operation of CT agencies and organizations.

Earlier Literature

The quantitative literature on violence is vast, but studies on the statistical dynamics of terrorism event processes are scant, even after decades of measurement, data development, and research in this area. Although the literature on counterterrorism is also large, the quantitative contributions that are relevant to the methods described in this chapter are few. Most earlier work in this area has been either on quantitative measurement and hypothesis testing (how to generate data and test propositions on terrorist events) or based on statistical regression models. In the former category, the data sets of the START Center at the University of Maryland, especially the Global Terrorism Database (GTD), are among the most valuable (START, 2010). Additional quantitative research on insurgency is also instructive (e.g., Townsley, Johnson, & Ratcliffe, 2008), but less focused on specific features of terrorism. In the statistical modeling area, most studies have applied regression analysis to the ITERATE data (Mickolus, Sandler, Murdock, & Flemming, 2004) and more recently the START GTD (e.g., LaFree, Dugan, & Korte, 2009), or similar data sets.

While both bodies of literature are valuable, little of this work has been used to advance an empirically grounded, process-oriented theory of terrorism that would better explain the phenomenon, or to improve the design and implementation of CT interventions and policy evaluation. The claim in this chapter is that some potentially valuable progress is feasible through the application of concepts and principles from the theory of political uncertainty (Cioffi, 1998) combined with complexity theory (Cioffi & Romero, 2009).

Method

This section presents an analytical procedure for assessing the impact of CT policies, given a threat environment where terrorist attacks are occurring. The emphasis is on the computations and data treatments required in the analysis, rather than on any particular data set. Events drawn from ITERATE, the GTD, or other country- or region-specific data sets can serve as practice data for readers interested in applying this method.

Let X denote a terrorism incident-related random variable, such as the time interval between attacks T (measured in days), fatalities F produced by each incident (e.g., measured in deaths or another statistic of interest), distance D from the previous incident (or from a base reference location, such as a command center or logistics hub), or other variables associated with terrorist events. Formally, an event process of this kind \( (X_{\langle \tau \rangle })\) can be modeled as an n-tuple of random variables with realizations ordered in historical time τ (so-called “epochal time”; Feller, 1968), where each random variable is defined by its probability functions, p(x) and Φ(x), the probability density function (p.d.f.) and cumulative distribution function (c.d.f.), respectively.
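To make the notation concrete, the following minimal sketch (in Python, assuming a hypothetical incident file with columns named date and fatalities; the file and column names are illustrative placeholders, not ITERATE or GTD fields) shows how realizations of T and S could be constructed from raw incident records.

```python
# Minimal sketch: build realizations of T (days between attacks) and
# S (fatalities per attack) from raw incident records. The file name and
# column names are hypothetical placeholders.
import pandas as pd

events = pd.read_csv("incidents.csv", parse_dates=["date"])
events = events.sort_values("date").reset_index(drop=True)  # epochal-time order

T = events["date"].diff().dt.days.dropna().to_numpy()  # inter-attack times T
S = events["fatalities"].to_numpy()                    # severity S of each event
```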

Figure 7.1 illustrates the overall methodological procedure, as detailed in the following subsections. Empirically, this approach is applicable to terrorism incident data, such as the GTD archive or other comparable data sets. Following Cioffi (1998, chaps. 2–4), the procedure begins with data collection and preprocessing (top part of Fig. 7.1), continuing with the main computational event data analysis (bottom part of Fig. 7.1). Synchronic analysis is based on the entire population of data, whereas diachronic analyses are based on epochs as subsets (temporal samples) of the overall population (e.g., pre- and post-intervention epochs). Other comparative analyses of interesting subpopulations are also feasible (e.g., by geographic criteria, according to different CT policies, different resources, or other criteria).
Fig. 7.1

Overall computational data analysis procedure for obtaining process features on the incidence of attacks. Hazard force analysis and power law analysis are parallel computational data analysis processes that can use identical data for extracting a range of inferences. Source: Prepared by the author, adapted from Cioffi and Romero (2009)

The core procedures involving computational events data analysis consist of two distinct but interrelated types of quantitative methods (bottom parallel processes in Fig. 7.1): (1) hazard force analysis, founded on the theory of political uncertainty (Cioffi, 1998), and (2) power law analysis from complexity theory (Boccara, 2004). Although traditionally autonomous from each other, in this approach the synergy of the two methods is exploited to obtain new inferences that advance our understanding of terrorist incidents.

Data

As mentioned earlier, a specific study will use a data set such as ITERATE, the GTD, or a comparable source. For illustrative purposes, suppose that two coded variables used in this study were Date of Attack, used for computing the time elapsed between attacks (T), and event severity S (number of people affected). Other variables included in the data set might include, for instance: country or region affected, type of event, cumulative deaths up to a given year, dollar amount invested in CT operations, and a brief description of the type of CT interventions implemented by agencies. Some uses of these and other data are mentioned in the section “Discussion.”

Analyses

As shown in Fig. 7.1, the two analytical methods – hazard force analysis and power law analysis – are applied to the same data for time-between-events T and for severity S. In turn, each analysis is conducted synchronically and diachronically, as explained below.

Temporal Analysis

For each variable (say, time T and severity S) and type of analysis (hazard force and power law) an overall synchronic analysis is first conducted, based on all data for an entire period, followed by a more historically detailed diachronic analysis. The latter is based on several epochs marked by CT interventions:

  • Epoch 1: dd/mm/yyyy to dd/mm/yyyy. Pre-intervention phase.

  • Epoch 2: dd/mm/yyyy to dd/mm/yyyy. Intervention phase.

  • Epoch 3: dd/mm/yyyy to dd/mm/yyyy. Post-intervention phase.

The significance of these three “phases” (epochs, in quantitative analysis terminology) is defined in terms of CT interventions. The main theoretical motivation for these epochs – and an additional reason why epochs matter – is that terrorist dynamics, in terms of forces of onset \( F_T \) and forces of severity \( F_S \), which drive the onset and severity of attacks, undergo fundamental changes across epochs due to the presence or absence of CT interventions. Hazard force analysis and power law analysis aim at detecting fundamental change in such latent dynamics, as described in the next subsections. If CT interventions have an effect, then different epochs should reveal significant transitions in terms of forces affecting the event process.
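As a sketch of how this diachronic partition might be implemented, continuing the hypothetical events table from the earlier sketch (the boundary dates below are placeholders standing in for the dd/mm/yyyy bounds above):

```python
# Sketch: split the hypothetical events table into the three epochs.
# Boundary dates are illustrative placeholders for the actual CT
# intervention history.
import pandas as pd

start_cti = pd.Timestamp("2004-01-01")  # hypothetical Epoch 1 / Epoch 2 boundary
end_cti = pd.Timestamp("2006-01-01")    # hypothetical Epoch 2 / Epoch 3 boundary

epoch1 = events[events["date"] < start_cti]                                  # pre-intervention
epoch2 = events[(events["date"] >= start_cti) & (events["date"] < end_cti)]  # intervention
epoch3 = events[events["date"] >= end_cti]                                   # post-intervention
```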

Hazard Force Analysis

Hazard force analysis is a quantitative method for discovering or testing distinct qualitative patterns of unobservable but nonetheless real, causal forces that operate in a process consisting of behavioral events, such as attacks and related violence. This is a method particularly suitable for understanding uncertainty – more precisely, the pattern of uncertainty and its possible cause – and risk (Singpurwalla, 2006) as described by a time series of events data.

Definition 1 (Intensity Function)

The intensity function H(x) of a continuous random variable (c.r.v.) X is defined by the ratio of the value of the p.d.f. to the value of the complementary c.d.f. of X. Formally, H(x) is defined by the equation
$$ H(x)=\frac{p(x)}{[1-\Phi (x)]},$$
(7.1)
where p(x) and Φ(x) are the p.d.f. and c.d.f. of X, respectively.

Note that, although the intensity or hazard force H(x) is a latent (i.e., unobservable or not directly observable) variable, (7.1) renders H(x) measurable, because both p(x) and Φ(x) can be computed from a sufficiently large set of observed realizations \( {\widehat{x}}_{i}\in X\). The original interpretation of (7.1) as an intensity or force is probably due to Cox (1962), based on Bartholomew (1973, p. 138). Unfortunately, most of the standard social statistical and econometric literature (e.g., Greene, 2011) treats the estimation of \( \widehat{H}\)(x) as just another case of regression, ignoring the much deeper dynamical implications used in this study.
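A minimal sketch of such a direct empirical estimate, assuming NumPy and the inter-attack times T constructed earlier (the histogram bin count is an analyst’s choice; kernel-based density estimates are an equally valid alternative):

```python
# Sketch: empirical hazard force H-hat(x) of (7.1), using a histogram for
# p(x) and the empirical c.d.f. for Phi(x).
import numpy as np

def empirical_hazard(x, bins=30):
    x = np.sort(np.asarray(x, dtype=float))
    p_hat, edges = np.histogram(x, bins=bins, density=True)  # p.d.f. estimate
    mids = 0.5 * (edges[:-1] + edges[1:])
    # empirical survival 1 - Phi(x) evaluated at the bin midpoints
    survival = 1.0 - np.searchsorted(x, mids, side="right") / len(x)
    ok = survival > 0                                        # avoid division by zero
    return mids[ok], p_hat[ok] / survival[ok]

t_grid, H_t = empirical_hazard(T)  # hazard force for time between attacks
```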

Accordingly, by (7.1), the specific form of H(x) (i.e., constant, increasing, decreasing, non-monotonic) depends directly on the form of the associated probability functions (c.d.f. or p.d.f.). Specifically, four qualitative cases are fundamentally important. To illustrate, let X = T, the time interval between attacks, measured – for instance – in days.

Case 1. Constant Intensity: H(t) = k. In this special case, the propensity for the next attack to occur – i.e., the hazard rate or event intensity – does not change between realizations, consistent with the notion that escalating and mitigating forces of terrorism are in balance. This case corresponds to the Poisson process with simple negative exponential density \( p(t)=k\mathrm{e}^{-kt}\) and \( \overline{t}=1/\widehat{k}=\sigma (t)\). This case has arguably the strongest empirical support for many types of socio-political events, both domestic and international (Cioffi, 1998, pp. 113–116). In terms of epochs, we expected to detect a constant or perhaps increasing intensity during Epoch 1 (pre-intervention), consistent with earlier findings (e.g., Cioffi & Romero, 2009), because the attack process is running a “more-or-less-natural” course unmitigated by CT policy.

Case 2. Increasing Intensity: dH/dt > 0. In this case, the hazard force or event intensity increases between attacks, symptomatic of a fundamentally unstable situation where attacks occur under rising pressure or increasing propensity. This situation is akin to a driven threshold system (Rundle, Klein, Tiampo, & Gross, 2000). Following this interpretation, latent onset forces build up and occasionally trigger attack events (Cioffi & Romero, 2009). In terms of epochs, we expected to observe increasing force intensity prior to interventions.

Case 3. Decreasing Intensity: dH/dt < 0. In this case, the hazard force or event intensity decreases between attacks, symptomatic of a stable situation where attacks occur under diminishing pressure or decreasing propensity. This situation is akin to a leaky threshold system that dissipates forces as they build up (Rundle, Tiampo, Klein, & Sa Martins, 2002). For example, an effective CT policy may be responsible for dissipation and decreasing propensity for attacks. In terms of epochs, we expected to see this force pattern only in Epochs 2 and 3 (section “Temporal Analysis”).

The above three dynamic cases can be modeled by the two-parameter Weibull equation:
$$ H(t)=k{t}^{\beta -1},$$
(7.2)
where k and β are the scale and shape parameters, respectively. The Weibull model is appropriate for modeling terrorism data for several reasons. First, numerous empirical studies of terrorism data (beginning with ITERATE) have shown that the Poisson process with constant hazard is frequently (albeit not universally) observed, as in Case 1. Second, the Weibull function for (7.1) (i.e., (7.2)) captures a variety of common threat environments, as discussed in the next paragraph ((7.3)–(7.5) below). Third, the Weibull function for \( \widehat{\beta }\approx 3.5\) approximates a normal Gaussian process (equilibrium dynamics) and the Rayleigh process for \( \widehat{\beta }\approx 2\) (linear driven system). Finally, the model is formally parsimonious relative to the variety of implications it is capable of generating (Cioffi, 1998, pp. 117–124).
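A minimal sketch of estimating \( \widehat{\beta }\) in practice, assuming SciPy and the inter-attack times T from the earlier sketch; the regime labels anticipate the inferences formalized in (7.3)–(7.5) below:

```python
# Sketch: fit a Weibull to the positive inter-attack times with location
# fixed at zero; the fitted shape parameter is beta-hat of (7.2).
from scipy import stats

t_pos = T[T > 0]                                  # Weibull support requires t > 0
beta_hat, _, scale_hat = stats.weibull_min.fit(t_pos, floc=0)

if beta_hat < 1:
    regime = "decreasing force -> stable situation"    # Case 3
elif beta_hat > 1:
    regime = "increasing force -> unstable situation"  # Case 2
else:
    regime = "constant force -> borderline situation"  # Case 1
print(beta_hat, regime)
```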

Using the Weibull model, the estimate \( \widehat{\beta }\) computed directly from data supports the following inferences concerning terrorist dynamics:

$$\widehat{\beta }< 1:\ \text{decreasing terrorism force}\Rightarrow \text{stable situation}$$
(7.3)
$$\widehat{\beta }\approx 1:\ \text{constant terrorism force}\Rightarrow \text{borderline situation}$$
(7.4)
$$\widehat{\beta }>1:\ \text{increasing terrorism force}\Rightarrow \text{unstable situation}$$
(7.5)
Clearly, these three scenarios are qualitatively distinct, and from a policy perspective they correspond to desirable, indifferent, and undesirable conditions (threat environments), respectively. Interestingly, the mean or first moment of T is given by
$$ \overline{t}=k\Gamma (1+1/\beta ),$$
(7.6)
where Γ is the gamma function. Therefore, commonly used estimates based on mean values or other simple statistics are not as informative as they might seem and must be computed exactly, because \( \overline{t}\) is notoriously sensitive to \( \widehat{\beta }\) (by (7.6)). In other words, the relationship between \( \overline{t}\) and β in (7.6) implies that mean-based estimates will be highly unstable.
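A small numeric check of this sensitivity, holding the scale k fixed at an arbitrary illustrative value:

```python
# Sketch: with k fixed, the theoretical mean of (7.6) shifts substantially
# as beta varies, which is why mean-based summaries can mislead.
from math import gamma

k = 10.0  # hypothetical scale parameter (days)
for beta in (0.5, 0.8, 1.0, 1.5, 2.0):
    print(f"beta = {beta}: mean t = {k * gamma(1 + 1 / beta):.2f}")
```

For instance, moving β from 0.5 to 2.0 at fixed k = 10 shifts the mean from 20.0 down to about 8.9 days, even though the scale parameter never changes.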

Finally, a fourth policy-relevant case in the qualitative form of terrorism force is also interesting:

Case 4. Non-monotonic Intensity. After an attack occurs, terrorism force may rise as in Case 2, but then subside, as in a lognormal function. Alternatively, the terrorism force may subside following an attack and then begin to rise again sometime later, as in a so-called “bathtub” function (Singpurwalla, 2006). These non-monotonic situations are also considered in this analysis, given their plausibility and policy relevance for planning purposes. In terms of epochs, such cases follow a nonlinear logic that rules out monotonic forces.

Empirical examples related to these cases are reported elsewhere (Cioffi, 1998, 2006; Cioffi & Romero, 2009). Summarizing the hazard force analysis procedure, terrorism events data on the time interval between attacks (T) and severity (fatalities) caused by each attack (S) are used to compute the corresponding empirical hazard functions \( \widehat{H}\)(t) and \( \widehat{H}\)(s). These empirical functions are then closely examined to determine their qualitative shape and draw inferences concerning the threat environment. This procedure is repeated for the entire population of data, as well as for each pertinent epoch. The initial expectation is that these estimates would yield mostly Case 1 (constant force), consistent with most earlier studies, but with rising value of k as the epochs progress and CT interventions are implemented.

Power Law Analysis

Power law analysis is a complexity-theoretic method that can be used for drawing additional unique inferences when applied to a set of terrorism data. This method is particularly useful for drawing inferences about the criticality of a process based on an observed time series of events, combined with some size or intensity measure, such as attack severity. Here we use the so-called type IV power law (Cioffi-Revilla, 2003, chap. 2; Definition 2 below), because in the case of terrorist attacks it provides more powerful complexity-theoretic inferences than the rank-size (Zipfian) law, various algebraic forms, and other types of power laws (Kleiber & Kotz, 2003).

Definition 2 (Power Law).

A power law of a terrorism event process \( (X_{\langle \tau \rangle })\) is a parametric distribution model in which increasing values \( x_i\in X\) of a given terrorism variable X occur with decreasing frequency, \( f(x)\propto {x}^{-b}\), with b > 0. Formally, the p.d.f. is given by

$$ p(x)=\frac{a(b-1)}{{x}^{b}},$$
(7.7)

where a and b are scale and shape parameters, respectively.

From this two-parameter hyperbolic equation for the p.d.f., it is easily shown that the complementary cumulative distribution function (c.c.d.f.), defined as \( 1-\Phi (x)\equiv \Pr (X>x)\) (also known as the survival function when X = T, denoted by S(x)), has the following form in log–log space:

$$ \mathrm{log}[1-\Phi (x)]={a}^{\prime }-(b-1)\mathrm{log}x.$$
(7.8)

Exponentiating (7.8) yields

$$ \Phi (x)=1-\frac{a}{{x}^{(b-1)}}=1-a{x}^{1-b}.$$
(7.9)

Equation (7.8) is commonly used for empirical analysis, because it can be obtained directly from the set of observed values\( {\widehat{x}}_{i}\).
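A minimal sketch of this log–log estimation applied to attack severity, continuing the earlier sketches (a least-squares fit of the empirical c.c.d.f. is the simple textbook route; the maximum-likelihood estimator sketched further below is more robust):

```python
# Sketch: regress the log empirical c.c.d.f. on log severity; by (7.8) the
# slope equals -(b - 1), so b-hat = 1 - slope.
import numpy as np

s = np.sort(S[S > 0])                           # positive severities, ascending
ccdf = 1.0 - np.arange(1, len(s) + 1) / len(s)  # empirical 1 - Phi(s)
ok = ccdf > 0                                   # drop the terminal zero point
slope, a_prime = np.polyfit(np.log(s[ok]), np.log(ccdf[ok]), 1)
b_hat = 1.0 - slope
```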

The empirical estimate \( \widehat{b}\) is of major analytical interest, because the first moment of a power law is given by
$$ E(x)={\int }_{\min \{x\}}^{\infty }x\,p(x)\,\mathrm{d}x=a(b-1){\int }_{\min \{x\}}^{\infty }{x}^{1-b}\,\mathrm{d}x={\left.\frac{a(b-1)}{2-b}\,{x}^{2-b}\right|}_{\min \{x\}}^{\infty }, $$
(7.10)

which goes to infinity as b → 2. In other words, there is no theoretical mean size (no expected value E(x) exists) for the terrorism variable X (such as onset times T or fatalities F) when X is governed by a power law with exponent b approaching the critical value of 2, or (b − 1) < 1 (below unit elasticity). This is an insightful theoretical result for terrorism patterns such as organizational sizes, fatalities (Cioffi et al., 2004; Richardson, 1945), and terrorist attacks (Cioffi, 2003, chap. 16). The critical threshold \( b_{\mathrm{critical}}=2\) marks the dynamical boundary between terror regimes or threat environments that have a finite average and computable size (b > 2) and a highly volatile regime that lacks an expected value or mean size (b ≤ 2). This is a theoretical insight derived directly from the empirically estimated value of the power law exponent b – by no means an observable property for any ongoing terrorism process.

In practice, any data set will yield a complete set of moments, so it is easy to miss the significance of special values (such as 2.0) in the power law exponent. While it is true that only a finite number of people can die from a terrorist attack, finding that \( \widehat{b}\) has been drifting toward the critical value of 2 is a clear signal that the threat environment is deteriorating.
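One way to watch for such drift, sketched below under the assumption that the epoch subsets from the earlier sketch are available, is to compute a maximum-likelihood (Hill-type) estimate of b for each epoch; the tail threshold of one fatality is an illustrative choice, not a recommendation:

```python
# Sketch: per-epoch MLE for a continuous power-law tail,
# b = 1 + n / sum(ln(x_i / x_min)), flagging drift toward b = 2.
import numpy as np

def b_mle(x, x_min=1.0):
    tail = np.asarray(x, dtype=float)
    tail = tail[tail >= x_min]
    return 1.0 + len(tail) / np.sum(np.log(tail / x_min))

for label, epoch in (("pre", epoch1), ("during", epoch2), ("post", epoch3)):
    b = b_mle(epoch["fatalities"])
    flag = "  <-- at or below criticality (no finite mean)" if b <= 2 else ""
    print(f"{label}: b-hat = {b:.2f}{flag}")
```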

Based on previous studies (Cioffi, 1998, p. 52, table 2.1), time-between-attacks T is expected to obey the simple (one parameter) negative exponential p.d.f. of a Poisson process,

$$ p(t)=\lambda {\text{e}}^{-\lambda t},$$
(7.11)
where \( \widehat{\lambda }=1/\overline{t}\); and attack severity S is expected to obey a power law. Moreover, with respect to the diachronic epochs of CT interventions discussed earlier, we would expect \( \overline{t}\) to increase across periods (epochal time) and \( \widehat{b}\) to recede from criticality.
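These expectations are directly checkable, as in the following sketch (continuing the hypothetical epoch subsets from earlier):

```python
# Sketch: lambda-hat = 1 / t-bar per epoch, per (7.11). Effective CT
# interventions should show t-bar rising (lambda-hat falling) across epochs.
for label, epoch in (("pre", epoch1), ("during", epoch2), ("post", epoch3)):
    t = epoch["date"].diff().dt.days.dropna()
    if len(t) > 0 and t.mean() > 0:
        print(f"{label}: t-bar = {t.mean():.1f} days, "
              f"lambda-hat = {1 / t.mean():.4f} attacks/day")
```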

Summarizing the power law analysis procedure, terrorism events data on time interval between attacks (T) and fatalities produced by each attack (S) are used to compute the corresponding empirical power law functions \( \mathrm{log}[1-\widehat{\Phi }(t)]\) and \( \mathrm{log}[1-\widehat{\Phi }(s)]\), for onsets and fatalities, respectively. These empirical functions are then closely examined to determine their qualitative shape and draw inferences concerning security conditions. One can also examine the p.d.f.s directly using kernel estimation. This procedure is repeated for the entire population of data, as well as for each epoch. The initial expectation is that these estimates will yield mostly a poor fit of the power law for the overall synchronic analysis. As epochs pass and CT policies become more effective, any power law pattern should move away from criticality (decreasing variance).

Discussion

Results from the procedures described above can suggest new insights and implications for CT research and policy. The following discussion focuses on the main findings and selected policy implications in reference to issues raised in the Introduction.

Empirical Findings

Results for the onset of attacks T (time between events) in the analysis of overall synchronic patterns often show a non-normal distribution with a heavy right tail, as pointed out earlier (Cioffi, 1998, p. 52, Table 2.1, 2006; Cioffi & Romero, 2009). Formal normality tests (e.g., Shapiro–Wilk applied to log-transformed data) can reject the null hypothesis of log-normality. The empirical c.d.f. and p.d.f. both allow us to visualize this non-normal pattern in the distribution of T. These statistical properties suggest a high degree of political uncertainty, meaning conditions far from the equilibrium conditions of normality, which feature marked central tendency and unlikely-to-highly-improbable extreme events. Non-normality alone does not establish non-equilibrium; but non-equilibrium conditions are implied by a non-normal process generating a heavy upper tail, because extreme events (terrorist attacks with fatalities in the thousands, i.e., several orders of magnitude above the mean) have much higher probability density than in a normal, Gaussian process. The Kaplan–Meier estimate of the survival function \( \widehat{S}\)(t) can demonstrate that T has a higher probability of realizing very short time spans between attacks, with rapidly increasing cumulative probability (much faster than Poisson). In addition, the empirical hazard force function can show that the intensity of the force for attacks to take place can decrease, after which such intensity can fluctuate below some value. The average hazard rate for the complete period can also vary across epochs, indicative of the effectiveness of CT policies.
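For the uncensored case considered here, the Kaplan–Meier estimate reduces to one minus the empirical c.d.f., so a minimal sketch (assuming the inter-attack times T from earlier) suffices to quantify the mass of very short inter-attack gaps; censored observations would call for a dedicated estimator:

```python
# Sketch: Kaplan-Meier survival S-hat(t) without censoring, plus the
# empirical probability of a new attack within a week. The 7-day window is
# an illustrative choice.
import numpy as np

t = np.sort(T[T > 0])
km_survival = 1.0 - np.arange(1, len(t) + 1) / len(t)  # S-hat at each observed t
print("Pr(T <= 7 days) =", np.mean(t <= 7))
```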

These and other findings are not available through plain observation or even field visits to areas of CT interventions. The two methods based on complexity theory and uncertainty theory should be viewed as separate scientific instruments for assessing CT interventions, each providing different readings on CT performance. While more traditional methods provide significant information of a different nature, these analytical results provide insights concerning terrorist dynamics that are latent albeit measurable. Such insights can shed new light on terrorist activity and underlying processes. As such, these insights can help inform policy-makers on the effectiveness of CT policies implemented or under consideration.

Policy Implications

The following discussion of policy implications moves from some basic aspects of theoretical science in applied CT domains to more practical institutional issues. Throughout, the science–policy nexus dominates the discussion, but several important themes are only summarized due to space limitations.

To begin, the scientific principle according to which “there is nothing more practical than a good theory” (Lewin, 1952, p. 169) is or should be as valid for CT policy analysis as it has been for social psychology – a science that evolved from humanistic origins dating back to Aristotle. In fact, as Vansteenkiste and Sheldon (2006) have noted, Lewin intended to convey a two-way relationship between scientists and practitioners, such that the two would gain from each other’s insights and specialized familiarity with information, issues, and methods – as well as toolkits. Whereas computational terrorism scientists could and should develop research that yields more actionable results, practitioners could and should make greater use of available scientific advances, including viable areas of social science and CT research. The difficulties for each are many but the potential payoff is significant.

Another way to appreciate the power of scientific approaches to CT analysis is by recalling a thesis formulated by the late mathematician Morris Kline (1985): scientists do not learn mathematics for its own sake, but because mathematics provides a unique and powerful method for discovering fundamental features of the real empirical world that are not accessible through other methods – including direct observation, measurement, or experience. Gravity, pressure, and radiation are among the many natural phenomena that we understand through the exclusive medium of mathematics, even when we can observe their effects. Kline’s thesis is as true for terrorism analysts as it is for physicists – some of whom, such as L. F. Richardson (founder of scientific conflict analysis), have made contributions to the science of conflict. Much the same is true of the terrorism features revealed by the medium of theories such as those applied in this study. Hazard rates (the latent intensity for attacks), half-life (the greater-than-even-odds tipping point for attacks to occur), and criticality (the phase transition to an extreme threat environment) are specific features of terrorist attacks that we know exclusively through the medium of mathematics, not through direct experience or observation.

Within a CT context, the situational awareness dashboard of analysts and policymakers could be significantly enriched by adding new panels for viewing computational indicators, such as those applied in this analysis or others with comparable theoretical foundation. For example, application of these methods soon after Epoch 1 (the pre-intervention phase) may reveal the gathering momentum for upcoming attacks, perhaps in time to avoid the entrenchment and maturation of effective radical networks by reformulating an appropriate policy.

While this study hypothesized a post hoc situation, by necessity, real-time or near-real-time analysis of uncertainty and complexity indicators is becoming increasingly feasible. This is also significant within a CT context. Already increased interest in open-source data and analysis on the part of the intelligence community is stimulating a new generation of information processing tools that will one day provide real-time capabilities in events analysis and related methodologies (Cioffi & O’Brien, 2007). In addition, the merging of real-time facilities with advanced data visualization and cartographic tools (e.g., social GIS, spatial social science models) – combined with Moore’s Law on exponentially increasing computing power – will soon render feasible information awareness environments that would have been close to unthinkable just a few years ago. Real-time events data analysis will provide significant support not just for CT analysts but also for planners, decision-makers, and others who can benefit from input and feedback – even if Moore’s Law may begin to encounter some limitations.

Besides these improvements, sequential event process modeling – such as for suicide bombings or IED attacks – could prove helpful for practitioners, as well as challenging from a scientific perspective. For instance, a detailed, empirically based event process model (sometimes known as a “business model” in organizational theory or “workflow analysis” in management science) of IED attacks could shed significant light on the attackers’ vulnerabilities, by revealing actionable information that a defender could exploit to prevent attacks or mitigate their effects. Models like this already exist for weapons of mass destruction (Allison, 2004, chaps. 1–5); they should be developed for a broad variety of terrorist attacks. More specifically, event process models should focus on the phases in the overall life cycle of an attack:

  1. Decision-Making: Attackers deciding to act, including cognitive processes and alternative choice mechanisms
  2. Planning: Attackers organizing a schedule for implementing an attack, including operational security
  3. Preparation: Attackers coordinating tasks deemed necessary to execute the attack
  4. Execution: Attackers carrying out the attack that causes undesirable effects for the defender
  5. Effects: Consequences affecting the defender
  6. Recovery: Defenders partially or fully restoring their preattack condition, including socio-psychological aspects
  7. Investigation: Defenders engaging in a fact-finding campaign to apprehend attackers and their confederates
  8. Prosecution: Defenders apprehending and processing attackers through the criminal justice system
  9. Learning: Defenders harvesting lessons learned and using such knowledge to improve preparedness

The simple fact that the operational causal structure of an attack process is serialized – not parallelized – holds fundamental and inescapable policy and practical implications: all serialized behavior is vulnerable to disruption by elimination of one or more necessary conjuncts. Effective defenders must therefore learn how to exploit the inescapable serialization of an attacker’s process – by making the life of terrorists impossible, or at least as difficult as possible.
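A toy calculation under purely hypothetical stage probabilities illustrates the point: treating the stages as a serial conjunction, the probability that the attack process completes is the product of the stage probabilities, and zeroing any one necessary stage defeats the whole chain.

```python
# Toy illustration of serial conjunction; stage probabilities are
# purely hypothetical.
from math import prod

stages = {"decision": 0.9, "planning": 0.8, "preparation": 0.7, "execution": 0.6}
print(prod(stages.values()))   # ~0.302: chance the serialized process completes

stages["preparation"] = 0.0    # a CT intervention removes one necessary conjunct
print(prod(stages.values()))   # 0.0: the attack cannot complete
```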

Of course, when it comes to the complex conflict dynamics of terrorism and asymmetric warfare, another important consideration within a CT context is that not all the necessary science is known – not even for selected regions of the world or for subsets of actors – and much will remain unknown for a long time, even as better data and better theories are developed and become available to the policy community. But this situation in the CT domain is not different from what occurs in medicine, engineering, or economics; and yet, public policy in these areas does attempt to draw on the best existing scientific understanding. Understanding what we do not know is as important as mastering what we do know.

It is important to increase the availability and desirability of scientific knowledge on terrorism. The main contribution of developing and applying methods such as these – summarized in the previous section – is to offer new, actionable insights that are worth considering in the domain of policy analysis and planning. This science-based strategy – and others like it that apply computational social science approaches to the analysis of real-world conflict events (Cioffi, 1990; King & Lowe, 2003; O’Brien, 2002; Schrodt, 1989; Tsvetovat & Carley, 2005) – should become increasingly available to policy analysts and practitioners. Much remains to be demonstrated, but some evidence of increasing relevance is already available.

Finally, the main emphasis in this chapter has been on the assessment of CT policies, with a rather narrow focus on terrorist attacks. This was intended as a way to provide focus and specificity. A more generalized perspective should cover the broader life cycle of terrorism in its various stages, including prevention (e.g., counter-radicalization policies), preparedness, mitigation, recovery, prosecution, and learning. The methods described in this chapter can be applied at each stage, assuming a proper identification of the relevant event processes.

Acknowledgements

Thanks to two anonymous reviewers who offered comments and suggestions, and to Pedro Romero for initial testing of these ideas in the context of counterinsurgency analysis. Funding for this study was provided by the Center for Social Complexity of George Mason University and by the Office of Naval Research (ONR) under grant no. N000140810378 (Mason Baseera Project). Opinions, findings, conclusions, and recommendations expressed in this work are those of the author and do not necessarily reflect the views of the funding agencies.

References

  1. Allison, G. (2004). Nuclear terrorism: The ultimate preventable catastrophe. New York: Henry Holt and Company.
  2. Bartholomew, D. J. (1973, 1982). Stochastic models for social processes. New York: John Wiley and Sons.
  3. Boccara, N. (2004). Modeling complex systems. New York: Springer.
  4. Cioffi-Revilla, C. A. (1990). The scientific measurement of international conflict: Handbook of datasets on crises and wars, 1495–1988. Boulder, CO: Lynne Rienner.
  5. Cioffi-Revilla, C. (1998). Politics and uncertainty: Theory, models and applications. Cambridge and New York: Cambridge University Press.
  6. Cioffi-Revilla, C. (2003). Power laws of conflict: Scaling in warfare and terrorism. In C. Cioffi-Revilla (Ed.), Power laws and non-equilibrium distributions in the social sciences. Fairfax, VA: Mason Center for Social Complexity. Book manuscript.
  7. Cioffi-Revilla, C., Paus, S., Luke, S., Olds, J. L., & Thomas, J. (2004). Mnemonic structure and sociality: A computational agent-based simulation model. In D. Sallach & C. Macal (Eds.), Proceedings of the Agent 2004 Conference on Social Dynamics: Interaction, Reflexivity and Emergence. Chicago, IL: Argonne National Laboratory and University of Chicago.
  8. Cioffi-Revilla, C. (2006). Power laws of conflict: Scaling in warfare and terrorism. In C. Cioffi-Revilla (Ed.), Power laws and non-equilibrium dynamics in the social sciences. Unpublished edited volume.
  9. Cioffi-Revilla, C., & O’Brien, S. P. (2007). Computational analysis in US foreign and defense policy. In D. Nau & J. Wilkenfeld (Eds.), Proceedings of the First International Conference on Computational Cultural Dynamics, University of Maryland, College Park, MD, 27–28 August 2007. Available online.
  10. Cioffi-Revilla, C., & Romero, P. P. (2009). Modeling uncertainty in adversary behavior: Attacks in Diyala Province, Iraq, 2002–2006. Studies in Conflict & Terrorism, 32(3), 253–276.
  11. Cox, D. R. (1962). Renewal theory. London: Methuen.
  12. Feller, W. (1968). An introduction to probability theory and its applications (3rd ed.). New York: John Wiley and Sons.
  13. Greene, W. H. (2011). Econometric analysis (7th ed.). New York: Prentice Hall.
  14. King, G., & Lowe, W. (2003). An automated information extraction tool for international conflict data with performance as good as human coders: A rare events evaluation design. International Organization, 57, 617–642.
  15. Kleiber, C., & Kotz, S. (2003). Statistical size distributions in economics and actuarial sciences. New York: Wiley Inter-Science.
  16. Kline, M. (1985). Mathematics and the search for knowledge. Oxford: Oxford University Press.
  17. LaFree, G., Dugan, L., & Korte, R. (2009). The impact of British counterterrorist strategies on political violence in Northern Ireland: Comparing deterrence and backlash models. Criminology, 47(1), 17–45.
  18. Lewin, K. (1952). Field theory in social science: Selected theoretical papers. Chicago and London: University of Chicago Press.
  19. Mickolus, E. F., Sandler, T., Murdock, J. M., & Flemming, P. (2004). International terrorism: Attributes of terrorist events, 1968–2003 (ITERATE 5). Dunn Loring, VA: Vineyard Software.
  20. O’Brien, S. P. (2002). Anticipating the good, the bad, and the ugly: An early warning approach to conflict and instability analysis. Journal of Conflict Resolution, 46(6), 808–828.
  21. Richardson, L. F. (1945). The distribution of wars in time. Journal of the Royal Statistical Society (Series A), 107(3–4), 242–250.
  22. Rundle, J. B., Klein, W., Tiampo, K. F., & Gross, S. (2000). Linear pattern dynamics in nonlinear threshold systems. Physical Review E, 61(3), 2418–2431.
  23. Rundle, J. B., Tiampo, K. F., Klein, W., & Sa Martins, J. S. (2002). Self-organization in leaky threshold systems: The influence of near-mean field dynamics and its implications for earthquakes, neurobiology, and forecasting. Proceedings of the National Academy of Sciences of the United States of America, 99(Suppl. 1), 2514–2521.
  24. Schrodt, P. A. (1989). Short term prediction of international events using a Holland classifier. Mathematical and Computer Modelling, 12, 589–600.
  25. Singpurwalla, N. D. (2006). Reliability and risk: A Bayesian perspective. New York: Wiley.
  26. START [National Consortium for the Study of Terrorism and Responses to Terrorism]. (2010). Global terrorism database: GTD variables & inclusion criteria. College Park, MD: START Center, University of Maryland. May 2010. Available online.
  27. Townsley, M., Johnson, S. D., & Ratcliffe, J. H. (2008). Space time dynamics of insurgency activity in Iraq. Security Journal, 21, 139–146.
  28. Tsvetovat, M., & Carley, K. (2005). Structural knowledge and success of anti-terrorist activity: The downside of structural equivalence. Journal of Social Structure, 6(2), 23–28.
  29. Vansteenkiste, M., & Sheldon, K. M. (2006). There’s nothing more practical than a good theory: Integrating motivational interviewing and self-determination theory. British Journal of Clinical Psychology, 45(1), 63–82.

Copyright information

© Springer Science+Business Media, LLC 2012

Authors and Affiliations

  1. Center for Social Complexity and Department of Computational Social Science, Krasnow Institute for Advanced Study, George Mason University, Fairfax, VA, USA
