Models can be of many types – verbal, mathematical, visual – and there can be many models dealing with the same topic or subject area, some large and complex, some relatively simple. A collection of related models may be termed a theory (Giere 1999). Commenting on the many migration models in the literature, Le Bras (2008) evokes an architectural metaphor: ‘…models are viewpoints on migration. They are not mutually exclusive but combine to form a whole, in the way that architectural drawings made to show plan, section, and elevation complement each other’ (p. 316). This accords with Giere’s notion of perspectival realism (1999, 2006): all models are incomplete, but good models are realistic representations of a limited portion of the real world. There is ‘realism without truth.’

From the model-based perspective, demography has always had an abundance of theoretical models, ranging from simple mathematical models – the basic demographic equation, the exponential growth function – to broad verbal models dealing with large-scale population dynamics over long periods and in many areas – demographic transition theory. But these were often not recognized as theoretical models, or were questioned because they did not agree with empirical observations of one or more concrete cases.
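
The two simple mathematical models just mentioned are easily written down. As an illustration only – all numbers below are invented – a minimal Python sketch:

```python
import math

def balancing_equation(p, births, deaths, in_mig, out_mig):
    """Basic demographic (balancing) equation:
    P(t+1) = P(t) + B - D + I - O."""
    return p + births - deaths + in_mig - out_mig

def exponential_growth(p0, r, t):
    """Exponential growth: P(t) = P(0) * exp(r * t)."""
    return p0 * math.exp(r * t)

# Illustrative numbers only.
next_pop = balancing_equation(1000, 30, 10, 5, 3)  # 1022
doubled = exponential_growth(1000, 0.02, 35)       # roughly 2014: ~doubling at r = 2% over ~35 years
```

Even at this trivial scale, the point holds: each function is an abstract, incomplete, but realistic representation of one aspect of population dynamics.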

As early as 1958, Coale and Hoover studied the interrelations of population growth and economic development by linking the cohort-component projection model to a standard economic growth model, in an exercise we would now call macrosimulation. But it was not viewed as theory; Notestein in his foreword would refer to it as ‘a careful factual analysis’ (1958, pp. v–vi). Nor did it inspire replications or similar modeling exercises by other demographers. Other economists, however, used time-series and international regression analyses of data on population growth and economic development to question the view that slowing population growth could enhance economic growth. The model was dismissed because some data were found that did not support its conclusions.

In the 1980s, Wachter and Hammel developed Socsim, a powerful microsimulation model of population, household, and kinship dynamics (Hammel et al. 1990), which they used to great effect to study a variety of issues, from future kin relations of the elderly to the demography of incest. Socsim was not widely adopted by other demographers to become part of their workaday toolkit (but see Murphy 2003).

In the 1990s, the economist/demographer Warren Sanderson and his colleagues constructed a substantial systems dynamics model of population, economic and environmental interrelations, using a kind of software that for many demographers had been discredited by its use in The Limits to Growth studies beginning in the 1970s (see Sanderson 1994; Milik et al. 1996). The model was described as ‘…allowing economists, policy analysts and environmentalists to study the interactions between the economic, demographic and anthropogenic sectors of an idealized world, thereby enabling them to obtain insights transferable to the real world,’ a perfect statement of the spirit and purpose of abstract modeling. But again, not many other demographers adopted their approach or worked to replicate or refine it. And systems dynamics software remains outside the ken of the mainstream demographer.
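
The mechanics of the systems dynamics style are themselves simple to sketch. The following toy stock-and-flow model – two coupled stocks with invented parameters, not drawn from Sanderson's model – shows the basic pattern of Euler-stepped simulation:

```python
# A minimal systems-dynamics sketch: two coupled stocks (population and
# an environmental resource) advanced by Euler steps. All parameters are
# illustrative inventions, not estimates.

def simulate(steps=100, dt=0.1):
    pop, resource = 100.0, 1000.0
    for _ in range(steps):
        births = 0.03 * pop * (resource / 1000.0)  # fertility tied to resource level
        deaths = 0.01 * pop
        depletion = 0.002 * pop                    # population draws down the resource
        regen = 0.01 * (1000.0 - resource)         # resource regenerates toward capacity
        pop += dt * (births - deaths)
        resource += dt * (regen - depletion)
    return pop, resource

final_pop, final_resource = simulate()
```

The feedback loops, not the particular numbers, are the point: each sector's flow depends on the other sector's stock.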

One could find many more examples of early efforts at modeling human populations – mathematical models, microsimulation, macrosimulation – that never became mainstream. Traditional demographic analysis and multivariate statistics have continued to dominate research and training.

Demography has been ambivalent about computer modeling and about its theoretical heritage, preferring the imagined safety of detailed empirical analyses of data. I can think of no major effort to collect and systematize the vast array of demographic models and theories, and few if any books or monographs with titles such as Theories of Human Population Dynamics or Demographic Theory: An Overview. Coleman and Schofield's edited volume The State of Population Theory: Forward from Malthus (1986) is now over 30 years old.

But it is theory that summarizes knowledge in a field, and provides a reasoned approach to further research, explanation, prediction and policy guidance. Without well-organized theory, demography is not a full-fledged scientific discipline. And computer modeling is an essential tool of theoretical work in the twenty-first century.

Recent developments give reasons for hope. Le Bras's The Nature of Demography (2008), for example, argues for a new approach to demography very much in keeping with the model-based view. A focus on process is central: the processes of individual behavior that give rise to macro-demographic period observations. Demographic measurement is important (‘a sort of land surveying applied to populations’), but secondary to the development of demography as an independent scientific discipline. Macro-demographic observations are often unreliable guides to the underlying processes, because of censoring. Le Bras works primarily with mathematical models, which are solved analytically if possible, otherwise by microsimulation. A model is considered interesting or useful if it provides insight into the workings of the demographic system.

Another example is the increasing use of agent-based modeling to study demographic processes. A pioneering work, which documented early work and encouraged more to follow, is Agent-Based Computational Demography: Using Simulation to Improve our Understanding of Demographic Behaviour (Billari and Prskawetz 2003). But agent-based modeling and other forms of microsimulation are still used by a small minority of demographers.

A major impetus to new approaches in demography has come from the Methodos Project, led by Daniel Courgeau and Robert Franck, and an associated series of monographs. The first of these, The Explanatory Power of Models (edited by Franck 2002), explored the role of modeling in a variety of disciplines, including demography. As context, Franck develops at length the idea of classical induction, which was replaced in twentieth-century social science by Hume's idea of induction and by logical empiricism, to the detriment of social science.

The covering law approach hinders social science research and leads to a pessimistic view of the explanatory capacities of the social sciences… To hold law-like generalizations necessary for true scientific explanation is to sacrifice any possibility of the social sciences deserving such scientific status…. [It] deprives the social sciences of the advantages which the natural sciences enjoy, since they never stopped using the method of classical induction (Franck 2002, p. 4).

Courgeau edited the second Methodos volume (2003a), entitled Methodology and Epistemology of Multilevel Analysis. This was followed in 2004 by his own monograph, Du Groupe à l'Individu: Synthèse Multiniveau. In both publications, multilevel analysis is considered as it applies to traditional statistical analysis and to more recent techniques of computer modeling.

Courgeau and colleagues outline an historical progression of ‘paradigms’ in demography, from period analysis, to multilevel analysis, to agent-based models. As a next step, they see the use of ABMs as leading to ‘…a broader model-based research program, which would rely more on computer simulation as a tool of analysis’ (Courgeau et al. 2017). Computer simulation provides powerful tools for the integration of micro- and macro-demographic phenomena.
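
That micro-to-macro logic can be conveyed in a few lines. In this deliberately minimal sketch – the agent rule, rates, and population are invented for illustration – an individual-level fertility rule generates an aggregate birth count:

```python
# A minimal agent-based sketch: an individual-level rule (an agent's yearly
# chance of a birth depends on her age) produces a macro-level outcome
# (total births over a decade). All rates are illustrative.
import random

random.seed(1)

class Agent:
    def __init__(self, age):
        self.age = age

    def step(self):
        """Age one year; return True if a birth occurs (invented rule)."""
        self.age += 1
        return 20 <= self.age < 40 and random.random() < 0.1

agents = [Agent(random.randint(0, 60)) for _ in range(500)]
total_births = 0
for year in range(10):
    total_births += sum(agent.step() for agent in agents)
```

The aggregate series is not specified anywhere in the code; it emerges from the individual-level rule, which is precisely the integration of micro and macro that simulation makes tractable.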

So, there are many signs of methodological progress in contemporary demography. But looking to the future, what else is needed? A few concrete suggestions:

  1.

    Every working demographer might commit to adding a computer modeling tool to her or his everyday toolkit, to supplement traditional demographic methods and multivariate statistics. For the mathematically proficient, the focus might remain on mathematical modeling and analytic solutions. Those with more programming skills will build models from scratch using powerful and versatile software such as Mathematica or R (see, for example, Willekens 2011). Many will adopt less demanding tools: software for systems dynamics, microsimulation, or agent-based modeling.

  2.

    Similarly, students of demography should be introduced to these tools, even if at an elementary level. In addition to teaching the more traditional skills in statistics and demographic techniques – the modeling of data – demographic training will require students to acquire some facility at modeling ideas and theories.

  3.

    Existing demographic theories and models need to be systematically collected and codified. This is especially the case for behavioral and substantive models not generally covered in monographs on techniques or methods. Given the scope of the task, this would better be done as a collaborative project. It would lead to a compendium or handbook of demographic models, a reference work for everyday use.

  4.

    Rather than dismissing otherwise promising models that don’t agree with some data, there should be more emphasis on refinement and replication. This will result in collections of several good models of a phenomenon, each giving a different perspective.

  5.

    Demographic models need to be used routinely in scientific research, analysis of population problems, and policy formation – really used, not just mentioned in opening and closing sections of publications. Purely descriptive research will not be abandoned, especially in government statistical agencies. But academic and scientific research will become more theory-driven – testing the adequacy of theoretical models and using them for rigorous explanation.
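
As one concrete entry point for the first suggestion above, a toy cohort-component projection – three broad age groups, with invented survival and fertility rates – takes only a few lines of Python:

```python
# A toy cohort-component projection: three broad age groups, one
# projection step per generation. Rates are invented for illustration.

def project(pop, survival, fertility):
    """pop: counts by age group [young, adult, old].
    Returns the population one generation ahead."""
    young, adult, old = pop
    births = fertility[0] * young + fertility[1] * adult
    return [births,
            survival[0] * young,   # young survive into adulthood
            survival[1] * adult]   # adults survive into old age

pop = [300.0, 400.0, 200.0]
for _ in range(3):
    pop = project(pop, survival=[0.98, 0.90], fertility=[0.0, 0.9])
# after three steps, pop is roughly [317.5, 259.3, 317.5]
```

A few hours with a sketch like this, extended to realistic age groups and rates, is exactly the kind of modest addition to the workaday toolkit envisioned here.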

These and other concrete steps to implement the model-based view of science could lead to a more complete and mature discipline of demography, an autonomous discipline with strong theory as well as strong data and technique.