1 Introduction

Hans Reichenbach’s early view of coordination and its neo-Kantian features have become an increasingly popular topic of academic research, both with respect to their historical relevance (De Boer, 2011; Eberhardt, 2011; Friedman, 1994; Holland, 1992; Klein, 2003; Padovani, 2011) and as a source of inspiration for systematic theoretical elaboration (Friedman, 2001; Ryckman, 2005; van Fraassen, 2008). According to Reichenbach (1920/1965), certain principles relate the conceptual part of our physical theories to empirical reality, since they coordinate the mathematical structures and the theoretical terms by which we express our theories with their physical correlates. However, as has recently been emphasised (Padovani, 2015a, b, 2017), Reichenbach’s early view of coordination does not simply boil down to a one-to-one mapping between theory and phenomena established ‘top-down’ by means of certain abstract principles. This insight has prompted novel developments inspired by Reichenbach’s own view, with the aim of providing an understanding of coordination that goes beyond the assumption of only two epistemic levels of analysis—that of theory and that of phenomena—and that may be applied to scientific inquiries beyond the domain of physics (Luchetti, 2018).

Along similar lines, in this paper I will argue that a liberalised understanding of coordination, still inspired by Reichenbach’s own view, provides us with a systematic approach for analysing some conceptual and epistemic consequences of a widespread theorising strategy in evolutionary biology, which has recently been discussed by Okasha (2018) under the name of ‘endogenization’. This strategy is based on the attempt by evolutionary biologists to explain, in terms of natural selection theory, some variables that, at an earlier stage of theorising, constituted the background presuppositions of natural selection theory itself. On the one hand, I will show that the tools of coordination analysis, if properly adapted, can enhance our understanding of conceptual change in the life sciences as well. On the other, I will develop Okasha’s own characterisation of endogenization in two ways. First, I will identify three stages of the process of endogenization: quasi-axiomatisation, functional extension, and semantic extension. Second, I will show that the functional extension of the core abstract principles of natural selection theory has important consequences for the meaning of certain concepts of the theory, thus impacting the coordination of these principles with the phenomena that they are supposed to represent.

To do this, I will discuss one specific case of endogenization, namely, the endogenization of selective environment by niche construction theorists. More specifically, I will focus on the functional extension of the abstract principle of heritability in natural selection theory and on its relationship with the semantic extension of the notion of inheritance. I will argue that the functional extension of the principle of heritability leads to the requirement of a semantic extension of ‘inheritance’ and, thus, requires a new form of coordination between the principle of heritability and the extended domain of phenomena that is endogenized. Finally, I will discuss how niche construction theorists provided their own semantic extension of ‘inheritance’, and I will suggest that it results from their novel form of coordination between theory and phenomena.

In Sect. 2, I will focus on contemporary reassessments of Reichenbach’s early view of coordination. First, I will introduce Friedman’s (2001) influential reappraisal of Reichenbach’s ‘axioms of coordination’. Then, I will discuss van Fraassen’s (2008) critique of Reichenbach, Padovani’s (2015a, b, 2017) reassessment of Reichenbach’s view of coordination, and my own previous attempt to develop a Reichenbach-inspired view of coordination applicable to the life sciences. In Sect. 3, I will introduce the notion of ‘theory extension’ and the distinction between functional and semantic extension. Then, I will outline Okasha’s view of endogenization and the case of the endogenization of selective environment. After discussing endogenization as a strategy of theory extension, I will identify the quasi-axiomatisation and the functional extension of certain core principles of natural selection theory as the first two stages of endogenization. In Sect. 4, I will focus on the effects of functional extension on coordination. First, I will clarify the connection between functional extension and semantic extension, by going back to the case of the endogenization of selective environment. Then, I will analyse the impact of the functional extension of the principle of heritability on the semantic extension of the concept of inheritance. Finally, I will discuss the way in which niche construction theorists provided a new form of coordination between theory and phenomena through their own semantic extension of ‘inheritance’. In Sect. 5, I will summarise my conclusions.

2 Reichenbach’s early view of coordination and its contemporary legacy

2.1 Reichenbach’s axioms of coordination

The issue of coordination was discussed by many empiricist philosophers between the end of the nineteenth century and the first decades of the twentieth, particularly by Ernst Mach, Moritz Schlick, and Hans Reichenbach. These scientists and philosophers were concerned with the problem of how mathematical representations of physical structures, usually presented in the form of equations, could be ‘coordinated’ with the empirical phenomena that those equations were supposed to describe. In other words, since mathematical structures are themselves devoid of empirical content, they wondered how the theoretical terms that figure as parameters in the equations could be given empirical content and thus successfully represent concrete phenomena.

In his Habilitationsschrift, Reichenbach (1920/1965) argues that some principles provide the coordination between mathematical representations and their physical correlates, since they enable the application of those mathematical tools to empirical reality. For instance, the principle of genidentity (or identity over time) “indicates how physical concepts are to be connected in sequences in order to define ‘the same thing remaining identical with itself in time’” (Reichenbach, 1920/1965, p. 55). These ‘axioms of coordination’ can change along with the development of the mathematical and physical sciences, but they have a constitutive character, which was one of the distinctive features of the Kantian synthetic a priori. According to Reichenbach, we should reject the universality and necessity of the Kantian a priori but retain the feature by which we constitute the concept of the object, since the object of scientific knowledge is not immediately given: “Perceptions do not give the object, only the material of which it is constructed. Such constructions are achieved by an act of judgment. The judgment is the synthesis constructing the object from the manifold of the perception” (Reichenbach, 1920/1965, p. 48). Therefore, in Reichenbach’s view the Kantian notion of the synthetic a priori can still be relevant to contemporary science, inasmuch as it identifies those preconditions, expressed as principles or functions of thought, that are presupposed by our knowledge claims about the empirical world. These preconditions prescribe norms that enable the representation of our sense-experiences in terms of abstract mathematical concepts and must be understood as dynamic, rather than fixed and universal as in the Kantian system, since they can change over time.Footnote 1

2.2 Reichenbachian coordination reconsidered: Friedman’s relativized a priori principles

In Dynamics of Reason (2001), Friedman takes inspiration from Reichenbach’s axioms of coordination to develop his own systematic approach based on ‘relativized a priori principles’. According to Friedman’s interpretation of Reichenbach’s work, certain theoretical principles internal to a physical theory provide the coordination between mathematical representations and physical correlates. More precisely, these coordinating principles supply fundamental elements of the conceptual framework within which the theory can be formulated and empirically tested. In Friedman’s view, physical theoretical frameworks comprise a mathematical, a mechanical, and an empirical component, where the mechanical part provides the coordination—that is, it establishes and justifies the referential relationship—between the mathematical and the empirical parts. For instance, in Newtonian mechanics, the three laws of motion constitute the mechanical part, because they coordinate the infinitesimal calculus, viz., the mathematical part, with the domain of concrete phenomena described by the law of gravitation, viz., the empirical part (Friedman, 2001, pp. 71–82).
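To give this example a concrete handle (a minimal schematic sketch of my own, not Friedman’s formal presentation): the second law of motion, $\mathbf{F} = m\ddot{\mathbf{x}}$, tells us that the second derivative of position supplied by the calculus is to be read as the acceleration produced by an impressed force; only against this background can the law of gravitation, $F = GmM/r^{2}$, function as an empirical law describing the motions of concrete bodies. Without the coordinating role of the laws of motion, the differential equations of the calculus would remain uninterpreted mathematical structures with no determinate physical correlate.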

Friedman’s work stimulated further research on how coordination is obtained—within different physical theories—by means of certain theoretical principles, such as the light principle, the principle of equivalence, and the principle of least action (e.g., Ryckman, 2005; Stöltzner, 2009). According to the view of coordination shared by these authors, such principles have “a fundamental structural significance for the theory” (Padovani, 2015a, p. 123), since they have an essential epistemic role as components internal to the theory itself. It follows that episodes of radical theoretical change usually result in a change of the coordinating principles. In Friedman’s view, this occurs during profound conceptual revolutions that involve Kuhnian paradigm shifts. For example, the transition from the paradigm of Newtonian mechanics to that of relativistic physics was marked by a major conceptual shift involving a change of the coordinating principles (Friedman, 2001, pp. 25–46). The conceptual shift characterising the transition to the relativistic framework is essentially a change in the “network of inferential evidential relationships” that must be presupposed in order for an observation or measurement to count as evidence for a theoretical claim (Friedman, 2001, p. 85). In other words, the principles constitutive of the new framework, i.e., the logical-mathematical principles and the coordinating (mechanical) principles, themselves provide the justification for the inferential ‘jump’ from a certain piece of evidence to a theoretical claim within that framework.

2.3 Towards a ‘layered’ approach to coordination

Van Fraassen (2008) tackles the debate on coordination from a different starting point, in connection with the role of measurement in scientific inquiry. Our measurement practices presuppose certain classifications provided by a theory of reference, such as the salient parameters to be measured and the relationships between them. However, how can we trust a theory before it has some empirical support, which is usually obtained via measurement?Footnote 2 According to van Fraassen (2008, p. 121), the main function of coordination is to “determine how measurement can establish a value for what is measured”. On these grounds, he criticises Reichenbach’s view of coordination by arguing that the axioms of coordination alone cannot account for the historical process through which a certain measured parameter and the procedure to measure it are coordinated. Van Fraassen interprets Reichenbach’s axioms as a-historical and too abstract and, therefore, as inadequate to relate mathematical structures to empirical phenomena. According to van Fraassen, a new form of coordination cannot be obtained in isolation from its historically prior form, and the concurrent historical development of theory and measurement procedures must be analysed to understand how coordination is established.

Although she shares van Fraassen’s focus on measurement, Padovani (2015a, 2017) rejects his negative assessment of Reichenbach’s view of coordination as a-historical and overly abstract. On the contrary, she argues that in Reichenbach’s early writings we can find several ideas that could be developed into a less idealised account of coordination, compared to some of his contemporary reappraisals (Padovani, 2011, 2015a). For example, Reichenbach stressed the mutuality of coordination, meaning that coordination cannot be achieved simply by choosing axiomatic principles that establish the referential relationship between abstract concepts and concrete phenomena. There is, in fact, some sense in which “the undefined side […] prescribes the order of the defined side” (Reichenbach, 1920/1965, p. 42), where the ‘undefined side’ is that of perceptual experience. In other words, the level of experience is the ultimate source of empirical confirmation, because only through experience can we establish whether the chosen axioms lead to a consistent form of coordination.Footnote 3 Therefore, although overlooked by both Friedman and van Fraassen, Reichenbach’s view involved another, bottom-up direction of coordination, in which empirical evidence, usually obtained via measurement, has a crucial role in assessing whether the axioms of coordination lead to univocal coordination (Padovani, 2015a).Footnote 4 Most importantly, Padovani (2011) argues that not all the coordinating principles listed by Reichenbach are at the same level of generality, since some of them, like the principle of probability, represent more fundamental presuppositions that are constitutive of other, less fundamental coordinating principles. Therefore, physical coordination should be viewed as comprising different ‘layers’, in that it involves several epistemic dimensions, including theorising and measurement, in which various presuppositions have a coordinating function (Padovani, 2015a, 2017).

In a recent paper, I took up Padovani’s suggestion and outlined a ‘layered’ view of coordination as an alternative to Friedman’s (Luchetti, 2018). As I discussed above, Friedman (2001) develops his account of relativized a priori principles against the backdrop of his interpretation of Reichenbach’s work and corroborates it with examples from space–time physics, most notably Newtonian mechanics and relativistic physics. In that paper, I argue that Friedman’s view of coordination is virtually inseparable from his analysis of the structure of scientific theories as composed of three epistemic layers: the empirical, the mechanical, and the mathematical. However, even granting that this is an adequate representation of theory structure in space–time physics, in other scientific domains—and especially outside of the physical sciences—the relationship of theoretical and mathematical elements to the concrete phenomena that they are supposed to represent is difficult to fit into a three-level structure such as the one Friedman identifies. Yet, this is not a good reason to abandon the perspective of coordination as a tool for understanding how justification for the representational relationship between abstract representations and concrete phenomena is obtained in other scientific domains. Starting from this consideration, I develop a more flexible, ‘layered’ view of coordination, which I then apply to the case of the Hardy–Weinberg principle in population genetics.

In this paper, I do not reapply the same view, but I do share its underlying assumption: the analysis of coordination, understood in a more liberal sense than Friedman’s yet still in the Reichenbachian spirit, can be usefully deployed for different purposes: some more general, others more local; some concerning epistemic elements such as scientific laws, others concerning theoretical terms, yet others concerning empirical data (e.g., measurement outcomes). The common core of these applications is that coordination is to be understood as a process through which the epistemic subjects working in a scientific framework obtain sound and non-circular justification for the representational relationship between an abstract representational tool and the concrete phenomena that it represents. Crucially, the extent to which a representation is abstract and the represented is concrete—or closer to the empirical world—is a matter of degree and context. Therefore, coordination can involve several ‘layers’ at different levels of abstraction, rather than only one. Depending on the specific scientific inquiry under analysis, including the epistemic tools available (mathematical, conceptual, material, etc.), the degree of maturity of the discipline, and the complexity of the phenomena investigated, the number of layers comprising the coordination between concrete phenomena and their abstract representation can vary.

Developing Reichenbach’s notion of coordination into such a nuanced perspective is useful for epistemic analyses of scientific inquiry for two reasons. First, a more flexible view of coordination—compared, for instance, to Friedman’s own view—can be fruitfully applied to the life sciences, which are often characterised by less regimented theoretical structures than those of the physical sciences. Second, this approach need not restrict its focus to episodes of radical theory change, in which relatively abstract mathematical principles are replaced to establish novel referential relationships. Rather, it provides a rationale for investigating also instances of non-revolutionary theoretical change, of non-theoretical scientific change, and of theory extension, in which the scope of a theory is extended to novel domains of phenomena.

In the rest of this paper, I will show how a more flexible understanding of coordination can provide us with a systematic approach for analysing how scientific concepts develop from the interaction between evidential and theoretical advancements in a context of theory extension. More specifically, I will show how it can be usefully applied to better understand some epistemic and conceptual intricacies resulting from the process of endogenization in evolutionary biology.

3 Theory extension in evolutionary biology: analysing the strategy of endogenization

3.1 Theory extension and its impact on coordination

When a scientific theory extends its range of application to phenomena beyond those for which it was originally conceived and developed, we clearly do not witness a case of theory replacement but, rather, one of theory extension. With theory extension, some phenomena—whether they were previously accounted for by another theory or not—are internalised by a pre-existing theory. In other words, the scope of certain theoretical principles or concepts of the pre-existing theory is extended to a domain of phenomena larger than the one for which they were previously used. In some cases, as I will suggest, such an extension has an impact on the coordination between theory and phenomena. This means that theory extension involves a change in how scientists justify the referential relationship between the phenomena and the abstract conceptual resources used to represent them.

Indeed, some cases of theory extension seem to fit a rather straightforward narrative of scientific growth, by which scientists discover that certain already-established empirical regularities also hold in novel domains, thus leading to an extension of the explanatory scope of the theory. In other cases, theory extension appears to involve a process resulting in the extension of the referential power of some principles or concepts.Footnote 5 In this second case, it seems that the meaning of certain terms used in the theory changes, leading to what Chang (2004, p. 150) has labelled semantic extension:

I will use the phrase semantic extension to indicate any situation in which a concept takes on any sort of meaning in a new domain. We start with a concept with a secure net of uses giving it stable meaning in a restricted domain of circumstances. The extension of such a concept consists in giving it a secure net of uses credibly linked to the earlier net, in an adjacent domain. Semantic extension can happen in various ways: operationally, metaphysically, theoretically, or most likely in some combination of all those ways in any given case.Footnote 6

Chang’s notion of semantic extension indicates that the referential scope of a term has been extended, by stretching its applicability to empirical phenomena different from the one(s) to which it was initially supposed to refer.Footnote 7 The increased referential power of this term can be justified by means of metaphysical assumptions, by relying on a set of physical operations (such as measurement procedures), by relying on theoretical justifications, or by different combinations of these.

However, as I will discuss in more depth in Sect. 4, a theory can be extended for certain modelling, explanatory or, in general, theorising purposes, while this extension per se may fail to provide necessary and sufficient conditions for the semantic extension of some terms that are embedded in the theory. In these cases, the semantic extension of a theoretical term could be better conceived as a final, mature stage of a process of theory extension, where the latter starts with the functional extension of some abstract theoretical resources, i.e., their extension for modelling, explanatory or, in general, theorising purposes. In the rest of this section, I will discuss a case of theory extension in evolutionary biology to analyse the relationship between functional and semantic extension and their impact on the coordination of the theory with the phenomena that it describes.

3.2 Okasha’s view of endogenization in evolutionary biology

Okasha (2018) has recently provided an interesting analysis of how biologists, when constructing evolutionary models, treat certain variables as endogenous, since they represent the evolutionary forces deemed to be active, while assuming other variables as exogenous, in that they represent background conditions of evolution by natural selection. Okasha identifies a broad trend in evolutionary biology over many decades by which exogenous variables, usually treated as background presuppositions of evolution by natural selection, are ‘endogenized’ into the domain of possible evolutionary theorising. In a nutshell, endogenization consists in taking phenomena so far regarded as not directly accountable for in terms of natural selection theory, but rather idealised away as variables constituting its background conditions, and describing them as products of evolution by natural selection. This theoretical strategy is so pervasive in evolutionary biology that the historical development of this discipline can be viewed as a “successive endogenization of variables that previous theorists had treated as exogenous” (Okasha, 2018, p. 3).

One particularly striking aspect of endogenization is that the same theoretical resources deployed at later stages to account for these background presuppositions required, at an earlier stage, the backgrounding of those very presuppositions in order for the core theoretical principles to be identified and developed in the first place. In other words, while these background variables had initially been assumed as fixed, so that the theory of natural selection could explain a host of other phenomena, such as adaptation, speciation, and phylogenesis, at a later stage the very same theory supplied an explanation of these variables, which turned from fixed idealised presuppositions into endogenized phenomena. To clarify how endogenization works, I analyse one case that will also provide the material for my discussion of the impact of theory extension on coordination: the endogenization of the (selective) environment.

Within the neo-Darwinian framework, the environment is considered as the factor that exerts selective pressures on organisms.Footnote 8 Organisms, in response to these pressures, evolve adaptations via natural selection to ‘fit’ the environment. The environment is therefore treated as an exogenous variable: it is considered an external factor of evolution by natural selection. This assumption has been challenged by researchers working in the field of niche construction theory. These scientists emphasise that organisms can fit the environment not only by evolving adaptations through natural selection, but also by transforming the environment during their lifespan. They do so, for instance, by regularly modifying local resource distributions, by choosing and changing habitats, or by constructing artefacts. These activities are widespread in the animal kingdom, among both vertebrates and invertebrates, from beavers building dams and badgers constructing burrow systems to birds building nests and spiders weaving webs. Niche construction is typical of plants, too, which change numerous features of their habitats, including temperature, humidity, chemical composition, acidity, patterns of light and shade, etc. Most organisms contribute to the construction of their environmental niches.

In an influential book, Odling-Smee et al. (2003) argue that these niche-constructing activities emerge as responses to external selective pressures. At the same time, by transforming the environment, these activities contribute to changing the selective pressures which, in turn, prompt further adaptive responses from the organisms and, therefore, affect their fitness.Footnote 9 In other words, niche construction is not simply a modification of the environment per se: it is a modification of the selective environment induced by organisms themselves. As a result of this change in selective pressures, members of many species inherit the cumulative environmental changes that previous generations have induced. The effects of niche construction can override external sources of selective pressure and give rise to unusual evolutionary trajectories and equilibria, for instance by driving otherwise deleterious alleles to fixation.

The interactions between genetic and ecological inheritance can reverberate in macroevolutionary patterns, as in the case of the evolution of photosynthesis in early bacteria, which led to an increase of oxygen in the atmosphere and, consequently, to the evolution of organisms capable of aerobic respiration (Danchin et al., 2011; Erwin, 2008). These effects are particularly striking in the case of human evolution. Laland et al. (2010) summarise decades of studies on how gene-culture interactions have shaped human evolution. As they highlight, anthropological studies and data from the human genome systematically converge in showing that numerous genes have been subject to positive selection as a response to cultural practices.Footnote 10 Therefore, human evolution cannot simply be understood as a process of adaptation to changes in the environment caused by events beyond human control. On the contrary, gene-culture dynamics can have fast and sizeable evolutionary consequences, to the point that they can be considered one of the most relevant patterns of human evolution.

In sum, according to niche construction theorists, the organism-environment interactions typical of a species cannot be ignored when analysing evolutionary dynamics, since they fundamentally affect the strength and direction of the selective pressures on that species and others. Including the process of niche construction in the domain of phenomena accounted for by natural selection allows environmental components, and their interactions with the biological realm, to be incorporated into evolutionary modelling. In epistemological terms, this move is prompted by recognising the idealising character of the assumption that the only way for organisms to respond to selective pressures from the environment is by evolving adaptations. This idealisation is not universally justifiable, since niche construction demonstrates that organisms can fit the environment by transforming it, and this variable should be taken into consideration in evolutionary modelling when relevant to the modelling goal. Incorporating niche construction represents an attempt at endogenizing the environment, that is, at bringing the formerly exogenous variable of the environment within the scope of evolutionary theorising. What is endogenized, in this case, is the selective environment, which can be understood as a variable co-evolving with the traits (i.e., the niche-constructing activities) evolved by the organism to interact with the environment itself.
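To make this feedback structure vivid, the following toy simulation (a minimal sketch in Python, not a model drawn from the niche construction literature; all functional forms, parameter values, and names are illustrative assumptions of mine) tracks the frequency of a niche-constructing type together with an environmental resource variable that the constructors build up. The resource persists across generations with slow decay, standing in for ecological inheritance, and feeds back on the fitnesses of both types, so that the selective environment co-evolves with the trait that modifies it.

def simulate_niche_construction(generations=200, p0=0.2, e0=0.0,
                                decay=0.95, construction=0.2,
                                benefit_constructor=0.4, benefit_other=0.1,
                                cost=0.02):
    """Toy haploid model: a niche-constructing type competes with a
    non-constructing type.  The constructed resource e persists across
    generations (ecological inheritance) and feeds back on fitness."""
    p, e = p0, e0
    history = []
    for _ in range(generations):
        # Fitnesses depend on the current state of the constructed environment.
        # Functional forms and parameter values are purely illustrative.
        w_constructor = 1.0 + benefit_constructor * e - cost
        w_other = 1.0 + benefit_other * e
        mean_w = p * w_constructor + (1.0 - p) * w_other
        # Selection: standard replicator update for the constructor frequency.
        p = p * w_constructor / mean_w
        # Ecological inheritance: the constructed resource decays slowly and is
        # replenished in proportion to the frequency of constructors, so its
        # effects accumulate and persist across generations.
        e = decay * e + construction * p
        history.append((p, e))
    return history

if __name__ == "__main__":
    for gen, (p, e) in enumerate(simulate_niche_construction()):
        if gen % 40 == 0:
            print(f"generation {gen:3d}: freq(constructor) = {p:.3f}, environment = {e:.3f}")

With these assumed parameter values, the constructing type initially declines because of the cost of construction, but spreads once the resource it has accumulated, and transmitted to subsequent generations, tilts selection in its favour; the point of the sketch is only the feedback loop between trait and selective environment, not any particular outcome.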

3.3 Endogenization as theory extension

The case of niche construction is just one representative example of how endogenization works. Other variables that had been taken as background conditions of evolution by natural selection, including hierarchical organisation, individuality, mutation rate, sex, genetic recombination, fair meiosis, population structure, etc., have been subject to endogenization.Footnote 11 This process did not unfold in exactly the same way in all cases, nor did it bring identical results. For instance, some variables, such as the environment, have only been partially endogenized, whereas others have been completely explained away, as in the case of mutation rate. However, all these lines of theorising share a common underlying epistemic mechanism. With the notion of endogenization, Okasha captures a specific type of theory extension in evolutionary biology, whereby phenomena that were idealised away as background presuppositions in previous formulations of Darwinian natural selection theory could later be accounted for by that same theory.

At a surface level, the progressive endogenization of background variables in the history of evolutionary theory fits a narrative of cumulative progress, according to which “evolutionary biologists have responded to various crises by augmenting the preexisting framework, building on what was already there” (Pigliucci, 2007, p. 2743). In other words, Okasha’s notion of endogenization seems fully consistent with the view that evolutionary biologists react to the discovery of new phenomena or to the emergence of conceptual issues by extending the application of already-established theoretical tools, rather than by developing alternative theoretical frameworks or suggesting radical conceptual shifts. In addition, his analysis of endogenization complements previous scholarly work focusing on how evolutionary theorising was deployed beyond its original domain, as in the study of animal behaviour (Avital & Jablonka, 2000; Jablonka & Lamb, 2005), in certain areas of psychology (Campbell, 1960), and in the study of culture, including language, morality, and science itself (Boyd & Richerson, 1985; Cavalli-Sforza & Feldman, 1981; Hull, 1988). By showing how endogenization worked as a strategy of theory extension directed at certain presuppositions of the theory itself, Okasha provides further systematic evidence of the historical process during which the theoretical resources of evolution by natural selection progressively expanded their scope.

From an epistemological perspective, Okasha (2018, p. 2) characterises endogenization as “a particular way in which the generality of evolutionary theory has been increased over time”. More specifically, he seems to suggest that the generality of the core principles of Darwinian evolution, that is, the possibility of providing an abstract formulation of these principles, enabled endogenization to flourish as a strategy of theorising.Footnote 12 Although he does not elaborate further on this aspect, Okasha’s point echoes Wimsatt’s (1987) epistemological analysis of how evolutionary theorising could be applied beyond the life sciences. Wimsatt discusses the example of the theory of evolution by natural selection to suggest that certain theoretical principles can be abstracted away from their context of origin, so that they become less tied to the empirical details of the domain within which they were initially formulated. This process of abstraction makes these resources ‘portable’ from the old domain to new ones, in which they can serve some explanatory or representational role. This is the case, for example, of evolutionary epistemology, which describes the development of science according to the explanatory scheme of evolution by natural selection.

In the rest of this section, I will build on Okasha’s and Wimsatt’s considerations to systematically analyse endogenization as a process that unfolds through different stages. My goal is to carefully distinguish among the effects of endogenization at the level of abstract theoretical principles, those at the level of the phenomena accounted for by the theory, and those concerning the relationship between these two levels, which involves several epistemic components. Although I will only focus on one of these components, namely, the concept of ‘inheritance’ and its relationship to the theoretical principle of heritability, I will show that a ‘layered’ perspective on coordination can be effectively deployed to analyse some of the epistemic and conceptual effects of endogenization as a widespread modelling strategy in evolutionary biology. Not only will this analysis provide a deeper explanation of these effects than the one offered by Okasha in terms of the greater generality of the theory; it will also show that the notion of coordination, if suitably interpreted, can be applied to reconstruct cases of theory extension in the life sciences.

3.4 Stages of endogenization: quasi-axiomatisation and functional extension

In the context of the heated debate on the levels of selection, Lewontin (1970) formulated a distilled version of Darwin’s (1859) fundamental explanatory structure in the form of three principles that underpin the theory of evolution by natural selection.Footnote 13 These three principles can be applied to any of the different units of selection (molecule, cell, organism, group of organisms, species), and, therefore, have been considered as a sort of ‘recipe’ for evolutionary change (Godfrey-Smith, 2007, 2009a). The principles, as worded by Lewontin (1970, p. 1), are the following:

(i) Different individuals in a population have different morphologies, physiologies, and behaviors (phenotypic variation)

(ii) Different phenotypes have different rates of survival and reproduction in different environments (differential fitness)

(iii) There is a correlation between parents and offspring in the contribution of each to future generations (fitness is heritable).

What Lewontin provides is a quasi-axiomatisation of evolution by natural selection in terms of the principles of variation (V), differential fitness (DF), and heritability (H).Footnote 14 On the one hand, this is the result of the identification and extraction of three principles from the informal Darwinian formulation of the theory (Griesemer, 2013, pp. 306–307). This extraction also required an operation of abstraction from the large body of empirical observations from which Darwin originally inferred the structure of the mechanism of evolution by natural selection (Wimsatt, 1987).Footnote 15 These three principles evidently involve abstract mathematical and statistical concepts (variation as a statistical distribution, fitness as rates of survival and reproduction, heritability as intergenerational correlation), even though they are not expressed in mathematical form.Footnote 16

On the other hand, this quasi-axiomatisation also resorts to idealisation,Footnote 17 since it leaves out variables deemed crucial by Darwin himself, such as the famous notion of the struggle for existence. More precisely, idealisation contributes to fixing the epistemic function of these three core Darwinian principles so as to make them the most relevant explanatory variables within the theory, in comparison both to previous formulations and to other variables that were fixed as background presuppositions of natural selection, such as individuality, the environment, etc. By ‘fixing the epistemic function’, I mean that some component of a theory is held fixed with respect to its epistemic role. For instance, the more rigorously a concept is defined, the more unambiguously it can be deployed for a certain epistemic purpose, such as quantifying the effects of a phenomenon across a wide variety of empirical inquiries. If we take the concept of ‘fitness’, the more rigorously it is defined as a measure of reproductive success, the more it can be used to test the effects of selection. This does not mean that mathematisation is the only way to fix some epistemic components, nor that quantification is the only epistemic function of interest. This short example is just meant to clarify that fixing the epistemic function of a component endows it with a quasi-definitional status, thus (temporarily) bracketing the possibility of modifying or replacing it, so that other parts of the theory comprising it can be developed, or empirical claims based on it can be tested. The fixing of the epistemic function must not be confused with the fixing of the meaning of a concept or principle.
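To illustrate the kind of statistical content involved (a schematic rendering of my own, not Lewontin’s formalisation), the three principles can be glossed as requiring, for some phenotypic character $z$ with fitness $w$ in a population, that $\mathrm{Var}(z) > 0$ (variation), that $\mathrm{Cov}(w, z) \neq 0$ (differential fitness), and that $\mathrm{Cov}(z_{\mathrm{offspring}}, z_{\mathrm{parent}}) > 0$ (heritability). Quantities of exactly this kind figure in standard formal treatments of selection, such as the covariance term of the Price equation, $\Delta \bar{z} = \mathrm{Cov}(w, z)/\bar{w} + \mathrm{E}(w \Delta z)/\bar{w}$, which is one way in which a rigorously fixed notion of fitness is put to quantitative use.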

Lewontin’s quasi-axiomatisation is not the only form of regimentation of the theory of evolution by natural selection that has been developed in the history of evolutionary biology.Footnote 18 However, it was a highly influential one, especially in the context of the levels of selection debate, since it showed that evolutionary change does not necessarily happen only at the level of individual organisms, as was implied by the neo-Darwinian framework. For instance, his three principles abstract and idealise away from the assumption that natural selection operates only on individual organisms, understood as genetically discrete and homogeneous units. Variation, differential fitness, and heritability represent the enabling conditions for natural selection to operate on populations of individuals, where ‘individual’ does not refer to any privileged type of biological entity but can be applied to all the different hierarchical levels (molecules, cells, organisms, groups). Therefore, Lewontin’s principles became a powerful tool to argue for group-level selection and to show that the assumption of hierarchical organisation of the biological realm could not be considered merely a given background presupposition of the theory of natural selection, but could itself be explained in terms of those very principles.Footnote 19

In a nutshell, fixing the epistemic function of the three core Darwinian principles and abstracting them from specific empirical domains was essential to extend their applicability so as to model phenomena that had previously been idealised away as background variables, so that these could be represented as products of natural selection. Therefore, the quasi-axiomatisation of the core Darwinian principles can be understood as the first stage of endogenization, since abstraction and idealisation, as Wimsatt (1987) argues, increase the number of conceptual dependencies that the principles can sustain, thus enhancing their ‘portability’, i.e., the possibility of applying them to novel empirical domains. Consequently, the second stage of endogenization is the functional extension of the core Darwinian principles, such that they are used to account for phenomena formerly represented as idealised background variables (Table 1). As an upshot of the functional extension of the principles of variation, differential fitness, and heritability, the formerly idealised background variables, such as that of hierarchical organisation, become endogenized phenomena.

Table 1 Stages and outcomes of endogenization from quasi-axiomatisation to functional extension

In the case of niche construction theory, discussed in Sect. 3.2, the variable of selective environment is endogenized, since it is accounted for as a product of natural selection. This can also be viewed as resulting from the quasi-axiomatisation and functional extension of the three core Darwinian principles. According to niche construction theorists, phenotypic differences in the modes of interaction between organisms and environment (variation), which are considered to be heritable (heritability), although not only through biological reproduction, provide a selective advantage (differential fitness) to the organism-environment niche. This means that the theoretical scope of the core Darwinian principles is extended to describe a phenomenon that was not previously considered as influencing evolutionary dynamics, namely, niche-constructing activities. Since niche-constructing activities change the environment, and these changes produce further transformations both in the distribution of phenotypic variations and in the selective pressures cumulatively inherited by future generations, the variable of selective environment is the endogenized component. In the next section, I will consider how the functional extension of theoretical principles impacts the coordination between theory and phenomena.

4 Semantic extension and coordination in the endogenization of selective environment

4.1 From functional extension to semantic extension

In this section, I will argue that the functional extension of the core Darwinian principles has an impact on certain fundamental concepts involved in the coordination between theory and phenomena. More specifically, I will argue that the functional extension of these abstract principles requires the semantic extension of some concepts that enable the concrete applicability of the principles to the empirical reality. Therefore, semantic extension can be viewed as resulting from a change of the coordination between the principles and the phenomena that they are supposed to represent. To do that, I will zoom in on the case of the endogenization of selective environment by niche construction theorists that I presented in Sect. 3.2.

More precisely, I will focus on the functional extension of the principle of heritability and on how it leads to the requirement of a semantic extension of the concept ‘inheritance’. In this context, I will consider heritability as an abstract theoretical principle like the one formulated by Lewontin, that is, as a general precondition of natural selection that refers to the presence of an intergenerational correlation in phenotypic variation distributions, rather than as any of the specific measures of heritability as they are conceived in the neo-Darwinian framework.Footnote 20 The notion of inheritance, on the other hand, identifies the concrete mechanism(s) of intergenerational transmission and retention of variation and it can be viewed as providing a key piece of the coordination between the abstract principle of heritability and its physical correlate, namely, the actual patterns of phenotypic similarity across populations due to the retention and transmission of variation. In other words, the scope extension of the principle of heritability to model and, eventually, endogenize certain variables calls for an extension of the referential scope of ‘inheritance’, without, however, delimiting its precise scope. This can be viewed as a gap in the coordination between theory and phenomena resulting from endogenization, although plausibly not the only one, and it is the source of the current conceptual controversies over the correct meaning of ‘inheritance’.

As we have seen, in niche construction theory the endogenized variable is that of selective environment. Yet, the endogenization of selective environment seems to rely on a meaning of ‘inheritance’ different from the one assumed by the neo-Darwinian framework, namely, genetic inheritance: the transmission of genetic material from parents to offspring during reproduction, taken to be responsible for phenotypic heritability. In fact, niche construction theorists assume another dimension of inheritance, on top of the genetic one: “Selected habitats, modified habitats, and modified sources of natural selection in those habitats are also transmitted by the same organisms to their descendants, as a consequence of their niche-constructing activities, through a second general inheritance system in evolution, ecological inheritance” (Odling-Smee, 2007, p. 278).

Ecological inheritance is an indirect type of inheritance, in contrast to the direct transmission of genes from parents to offspring, since it does not directly involve a generational event but happens through the medium of the environment. Why then call it inheritance? What is the justification for extending the referential scope of this term? If we look at the debates on the meaning of ‘inheritance’, we quickly realise that this is far from being a settled issue in evolutionary biology. In what follows, I outline the gist of the controversy over the correct semantic extension of ‘inheritance’ and discuss how it relates both to the process of endogenization and to the topic of coordination.

4.2 When coordination collapses: conceptual controversies over ‘extended inheritance’

As I mentioned above, Lewontin’s principle of heritability merely implies a statistical correlation in the retention of changes in phenotypic variation across generations.Footnote 21 What Lewontin had in mind was an abstract, descriptive characterisation of a correlation in terms of the statistical measures that biologists use to model evolutionary dynamics, rather than a causal notion of inheritance as a set of physical systems or processes responsible for that correlation. Within the neo-Darwinian framework, heritability has different mathematical definitions aimed at quantifying the amount of phenotypic variation of a trait in a population that is due to genetic variation. Among them, we have heritability as parent–offspring covariance, which is the one presupposed by Lewontin’s abstract principle. Indeed, Lewontin’s principle does imply that a certain amount of phenotypic variation is due to variation in genetic factors, but its abstract characterisation allows the intergenerational correlation in variation distributions to include other factors that influence heritability via non-genetic channels.Footnote 22
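To give a concrete example of the kind of measure at stake (a standard quantitative-genetic formulation, reported here only for illustration): under common simplifying assumptions, such as random mating and no shared environmental effects, narrow-sense heritability can be estimated as the slope of the regression of offspring phenotype $z_{o}$ on midparent phenotype $z_{mp}$, that is, $h^{2} \approx \mathrm{Cov}(z_{o}, z_{mp})/\mathrm{Var}(z_{mp})$. This definition makes the parent–offspring covariance explicit, but it says nothing, by itself, about which physical mechanism produces that covariance.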

However, the crucial point is that, within the neo-Darwinian framework, one single mechanism is responsible for this statistical correlation: genetic inheritance, that is, the transmission of stretches of DNA between parents and offspring via reproduction. In this sense, it may be said that, within the neo-Darwinian framework, the term ‘inheritance’ refers only to the mechanism of genetic inheritance. This referential relationship justifies the possibility of representing actual patterns of phenotypic similarity due to the retention and transmission of variation in terms of the statistical correlation implied by the principle of heritability. This is possible because, given the neo-Darwinian definition of ‘inheritance’, one single physical process realises the retention and transmission of variation, viz., genetic inheritance. In other words, the identification of inheritance with the causal mechanism of genetic inheritance is a central piece that, within the neo-Darwinian framework, enables the univocal coordination between physical states and their abstract representation as a statistical correlation, since it prescribes that the only phenotypic variation that is heritable is that rooted in transmission via the genetic channel.

This identification has been challenged by the rise of new subdisciplines of evolutionary biology, such as developmental systems theory, gene-culture co-evolution theories, and epigenetics. Scientists working in these fields use the label ‘non-genetic inheritance’ to refer to other causal factors (variously conceptualised as systems, channels, mechanisms, or processes) that influence the retention of changes in variation distributions within and across generations, such as epigenetic, behavioural, ecological, and cultural factors. Consequently, many proposals for an extended concept of inheritance have been put forward (Danchin et al., 2011; Griffiths & Gray, 1994; Jablonka & Lamb, 2005; Odling-Smee, 2007; Sterelny et al., 1996).

However, the proliferation of putative inheritance systems, as a reaction to the restrictive neo-Darwinian identification of inheritance with the genetic mechanism, has raised the worry that the result will be an over-inclusive concept of inheritance. This worry concerns two possible deleterious effects. First, an overly extended notion of inheritance might create problems when it comes to the formal modelling of evolutionary dynamics (Odling-Smee, 2007). The neo-Darwinian identification of genetic inheritance as the only mechanism preserving the retention of changes in variation distributions across generations is certainly a strong idealisation. Yet, this idealisation served the modelling purpose well, as attested by the great achievements of population genetics in the twentieth century. The second issue arises because most of the proposals to extend ‘inheritance’ assume that every form of transmission is a form of inheritance (Merlin, 2017).

In my view, the need to reconceptualise ‘inheritance’ arises partly as a direct consequence of the functional extension of the principle of heritability. This functional extension—as I have shown in the case of niche construction theory—is aimed at modelling phenotypic similarity that was previously idealised away. Niche construction theorists have extended the domain of application of the principle of heritability, together with that of variation and differential fitness, to model in evolutionary terms the changes in the distribution of phenotypic traits that result from niche-constructing activities. However, due to this functional extension, the coordination between the principle of heritability and the parent–offspring covariance in phenotypic similarity, as it is conceived within neo-Darwinian theory, can no longer hold.

More precisely, the functional extension of the principle of heritability breaks the identification of ‘inheritance’ with genetic inheritance, which was a crucial piece of that coordination. As I discussed above, this is the case because that identification prescribes that the only phenotypic variation that is heritable is the one rooted in transmission via the genetic channel. In addition, the functional extension of heritability makes it necessary to reconceptualise, or semantically extend, the term ‘inheritance’ so as to include more processes responsible for phenotypic similarity, albeit without providing clear boundaries for its extension (Fig. 1). This ‘collapse’ of the representational relationship between the principle of heritability and its physical correlates can be understood through the lens of coordination analysis. In fact, a central component that, until the functional extension, justified the representational relationship between the principle of heritability (together with the other principles underlying natural selection theory) and its concrete domain of phenomena, namely, an agreed-upon notion of inheritance, is now missing (Fig. 2).

Fig. 1 Unfolding of the effects of the functional extension of the principle of heritability on the concept of inheritance. The arrows indicate the succession of stages

Fig. 2 Representation of the effects of functional extension with respect to the layer of theoretical abstraction of the epistemic components and of their relationships

As I mentioned above, Merlin (2017) argues that current proposals for extending ‘inheritance’ fail to do so properly, since they include types of transmission that are not in line with what she calls the ‘theoretical role of inheritance’ in natural selection theory. For instance, forms of horizontal and oblique transmission, such as epigenetic lateral gene transfer or cultural inheritance, which bypass intergenerational transmission via reproduction, are merely ways of acquiring new variations, and not of cumulating already existing variations and preserving their continuity across generations, as the theoretical role of inheritance would prescribe.

Merlin’s point is worth dwelling on. The problem she raises is how to reconceptualise ‘inheritance’ in the light of empirical evidence of new causally relevant factors, other than genetic ones, influencing the retention of changes in variation distributions across generations. This reconceptualisation should cover all those physical realisers that contribute to inheritance, while leaving out those that merely contribute to the acquisition of variation. To do so, she appeals to the theoretical role of inheritance and refers to Lewontin’s formulation of the principle of heritability. Lewontin’s formulation states that a statistical correlation holds between parents and offspring and, therefore, according to Merlin, it requires that transmission occur in a parent–offspring lineage, and not, for instance, within the same generation. In addition, these correlations concern the contribution of parents and offspring to future generations, therefore implying continuity across generations and the cumulation of variation across time. In Merlin’s view, if we assume that a mechanism of transmission is sufficient for inheritance, without requiring the preservation of continuity and the cumulation of variation, we make the mistake of including physical mechanisms or processes that only realise the acquisition of variation rather than inheritance.

Merlin’s argument seems to presuppose a specific reading of the concept of ‘inheritance’ derived from Lewontin’s abstract formulation of the principle of heritability. Yet, the requirements that make up Merlin’s ‘theoretical role of inheritance’ do not seem to be uncontroversial, and different interpretations of the principle of heritability seem to guide different choices concerning the extension of the term ‘inheritance’.

For example, Charbonneau (2014) argues that parent–offspring lineages are not necessary for a population to undergo Darwinian natural selection, as Lewontin’s principle of heritability would seem to require according to Merlin. Therefore, in Charbonneau’s view, ‘inheritance’ should not refer only to vertical transmission, viz., transmission between parents and offspring, as the only system that preserves the transgenerational retention of changes in variation distributions. Rather, generation and memory, which Charbonneau (2014, p. 739) defines “without reference to a specific biological ontology of entities and processes”, suffice to guarantee transmission, continuity, and cumulation of changes in variation distributions. Generation and memory can be physically realised by mechanisms other than genetic transmission, as in the case of diffused ecological inheritance. Although Charbonneau emphasises the contrast between his view and Lewontin’s formulation of the principle of heritability, his view could rather be considered a development of Lewontin’s quasi-axiomatisation. In fact, Charbonneau explicitly abstracts away from parent–offspring lineages, while preserving the functional role of the principle of heritability as an abstract precondition for natural selection.

4.3 Semantic extension as a result of novel coordination: the case of ecological inheritance

Let us now go back to niche construction theory and the endogenization of selective environment. As I mentioned in Sect. 4.1, niche construction theorists extend the notion of inheritance to encompass ecological inheritance in addition to genetic inheritance. Ecological inheritance is an indirect system of inheritance, since it does not require vertical transmission or reproduction. Rather, it happens through the medium of the environment, which is modified by the niche-constructing activities of the organisms. The cumulative effects of these modifications influence future generations in that they directly produce durable changes in the distribution of certain phenotypic variations or, more profoundly, modify selective pressures.Footnote 23

However, one might wonder, along the lines of Merlin’s argument, whether this counts as inheritance rather than as the acquisition of new variations. According to Odling-Smee (2007, p. 280), ecological inheritance is indeed inheritance “because some of the environmental consequences caused by the repeated niche-constructing activities of multiple generations of organisms in their environments accumulate or persist in environments across generations”. The function of cumulation and continuity in the transmission of changes in variation distributions is thus satisfied by ecological inheritance, even though the transmission has high fidelity only in its interaction with genetic inheritance (Odling-Smee & Laland, 2011). The quote above illustrates that niche construction theorists have reconceptualised the notion of inheritance by extending it in such a way that it refers also to the system of ecological inheritance. At the same time, their semantic extension of ‘inheritance’ is in line with what they assume to be its core epistemic role, namely, enabling the identification of those systems that contribute to the retention of changes in variation distributions by means of transmission, cumulation, and continuity.

The reconceptualization of ‘inheritance’ proposed by niche construction theorists should be viewed, as with Merlin’s and Charbonneau’s proposals, in connection with their interpretation of the principle of heritability. As I discussed above, niche construction theorists functionally extended the core Darwinian principles to model niche-constructing activities in terms of evolutionary dynamics, thus leading to the endogenization of the variable of selective environment. Yet, the functional extension of the principle of heritability, implemented to account for the (kinds of) phenotypic similarity that could not be causally explained in terms of genetic inheritance, undermined the identification of ‘inheritance’ with the mechanism of genetic inheritance. Within the neo-Darwinian framework, this identification is crucial to coordinating the principle of heritability, understood as one of the general theoretical preconditions for natural selection, with concrete patterns of phenotypic similarity, since it posits that only phenotypic variation produced by genetic inheritance is heritable. Abandoning this identification requires the search for a new, more abstract, and more inclusive concept of inheritance that mirrors the functional extension of the principle of heritability to account for more (kinds of) phenotypic variation than before. Hence the conceptual controversies over extended inheritance that I discussed above.

Within this context, niche construction theorists, in the same way as Merlin and Charbonneau, base their extended notion of inheritance on their interpretation of the principle of heritability. More specifically, their semantic extension of ‘inheritance’, such that it also encompasses ecological inheritance, is motivated by their functional extension of the principle of heritability to model the variable of selective environment. As I discussed in Sect. 3.4, functional extension is preceded by a stage of quasi-axiomatisation, in which the principles become more abstract and idealised. This is also the case for the endogenization of selective environment. Niche construction theorists adopt a highly abstract version of the principle of heritability, one that allows for the modelling of phenotypic traits that are not inherited via genetic inheritance, thus abstracting away from parent–offspring lineages. Therefore, their semantic extension of ‘inheritance’ is not just motivated by their functional extension of the principle of heritability, but is also justified by their interpretation of its theoretical role.

In terms of coordination analysis, we can see how niche construction theorists defined a key component of the coordination between the functionally extended principle of heritability and the domain of phenomena that it refers to, namely, their extended notion of inheritance, by holding fixed the epistemic function of an abstract theoretical principle in a novel empirical domain. However, their semantic extension of ‘inheritance’ is not guided only by theoretical considerations. As I discussed in Sect. 4.2, they identify causal factors that influence the retention of changes in variation distributions within and across generations. Therefore, their reconceptualization of inheritance is also based on the recognition that the transmission and retention of traits via niche-constructing activities are akin, in some relevant respects, to the transmission and retention of traits via reproduction. This is because this non-genetic form of transmission, cumulation, and retention is causally responsible for a certain amount of phenotypic similarity that previous theorising had excluded from the scope of inheritance.

5 Conclusion

The adequate semantic extension of the term ‘inheritance’ is still hotly debated. I have argued that the controversies surrounding it should be viewed in connection with the functional extension of the principle of heritability, understood as a general epistemic precondition of the theory of evolution by natural selection. This extension, together with that of the principles of variation and differential fitness, is crucial to the process of endogenization of phenomena previously idealised away as background presuppositions of natural selection theory, such as the variable of the environment. As I showed above, different interpretations of the principle of heritability prescribe different ways of working out the referential scope of ‘inheritance’, by including within it the physical transmission mechanisms that are thought to best fit both evidence and theory. In current discussions on the possibility of an extended evolutionary synthesis, different attitudes towards restricting or expanding the referential scope of ‘inheritance’ seem to reflect different modelling and explanatory purposes and, thus, may be subject to a pluralistic interpretation.

However, I have shown how the referential scope of ‘inheritance’ has been extended, within niche construction theory, to include the mechanism of ecological inheritance. This was achieved in the light both of the modelling purpose that resulted in the endogenization of selective environment, and of the identification of the relevant similarity, at the level of phenomena, between phenotypic variation due to genetic inheritance and phenotypic variation due to ecological inheritance, as ways of transmitting and retaining variation. I have argued that these two pieces of justification for the semantic extension of ‘inheritance’ by niche construction theorists represent a novel form of coordination between the level of theory, in this case the principle of heritability, and the level of phenomena, which here refers to the patterns of phenotypic variation that they aimed to model in evolutionary terms. In other words, niche construction theorists could develop their extended notion of ‘inheritance’, required in the light of their functional extension of the principles of natural selection to model selective environment, only by providing some form of justification for the extended representational relationship between the abstract theoretical principles underpinning their theory and the concrete phenomena that they aimed to model, i.e., by providing a new form of coordination. Therefore, this novel coordination is a precondition for the semantic extension of ‘inheritance’ and, thus, a crucial component of endogenization as a process of theory extension.

Clearly, the understanding of coordination that I have assumed in this analysis is not in terms of a single principle that provides a one-to-one mapping between a mathematically expressed theory or concept and a set of physical phenomena, as per Friedman’s view. Rather, coordination must be understood in a more flexible and ‘layered’ sense, especially in sciences beyond the physical ones, since several epistemic dimensions, such as those of measurement, modelling, and theorising, are highly entangled when it comes to analysing the representational relationship between theory and phenomena.

In this paper, I have focused only on one key piece of the coordination between the theory of natural selection and empirical phenomena, that is, the concept of inheritance, and in one specific context of theory extension, namely, the endogenization of selective environment. By discussing the relationship between the principle of heritability as an abstract theoretical precondition of natural selection, the concept of inheritance, and the extended domain of phenotypic variation that niche construction theorists aimed to model, I did not mean to offer a complete analysis of the coordination between neo-Darwinian theory, or the extended evolutionary synthesis, and empirical reality. Further work is certainly required to understand, for instance, the relationship between the epistemic elements considered above and different statistical measures of heritability, as well as the role of such measures in the general context of endogenization and, more specifically, in the endogenization of selective environment. However, I have shown that the analysis of coordination, if suitably interpreted, gives us crucial insight into some epistemic and conceptual consequences of the strategy of endogenization in evolutionary biology that are not captured by Okasha’s own account. More generally, I have shown how developing Reichenbach’s original perspective on coordination into a more flexible and ‘layered’ notion can provide us with a systematic tool to analyse conceptual change even outside of the physical sciences and in cases of theory extension.