The main point of this paper has now been made—the current evidence does not (yet) justify selective semantic realism about dark ‘matter’, and a fortiori no full-blown scientific realism about dark matter either. One may wish to retort, however, that we know for certain that luminous matter plus Einstein’s laws of gravity cannot be the full or correct story. We know that something is missing. Do we not have good reason to believe that physics will narrow down the common core concept of dark ‘matter’ in the foreseeable future? And if so, does this guarantee not allow us to be scientific realists, including semantic realists, about dark matter right now? Before responding to these questions, let us continue the history of genes up to the present day [37, 39, 51]. Be prepared for a messy story—too messy to vindicate contemporary realism about genes, and arguably even messy enough to justify contemporary anti-realism about genes.
Genetics has advanced immensely since Mendel, initially corroborating the simplicity of Mendel’s factors but eventually revealing a concept that is much more complex than expected. After Mendel, genes could be accessed more directly, not merely through breeding experiments. Classical genetics focused on the chemical nature of the gene, conceiving of each gene as being responsible for the production of a single enzyme—a protein that acts as a biological catalyst for chemical reactions in the cell. Classical genetics transitioned to molecular genetics in the 1950s, with the 1953 DNA model of Watson and Crick as its highlight, followed by the uncovering, in the 1960s, of the genetic code: the mapping from stretches of DNA, read in triplets of bases, to protein structure. The concept of a “gene” was eventually identified with Benzer’s concept of the “cistron” [52], the stretch of DNA that codes for the structure of a single protein molecule. At this point it may seem that we have a vindication of Mendel’s factors of inheritance: a tangible and semantically thick concept of a gene that is a precisification, a filling in, of the original idea.
However, the cistron is merely a ‘unit of function’ of DNA, a ‘unit’ that is, in various senses, neither neatly localised, nor necessarily temporally stable, nor defined uniquely or without vagueness, nor a complete, context-independent explanation of observed breeding patterns. It is not clear that we should talk about a unified concept, a unit, at all. Consider first locality and stability. A single cistron need not be contiguous—it can be interspersed with non-coding stretches of DNA called introns [51, 53, 54], or it can even correspond to stretches of code from two discrete organisational collections of DNA called chromosomes (via a process referred to as RNA trans-splicing). Conversely, multiple cistrons need not be (fully) separated, as their associated DNA stretches can partially overlap [51, 55], or one coding sequence can even be embedded within another. Even more radical than RNA trans-splicing—where the stretches of code from two chromosomes are used but the chromosomes are left intact—is the recombination of genes. During meiosis chromosomes can cross over; they break and exchange genetic material in a way that does not respect cistron boundaries. The unit of recombination that is “passed on intact is not the same as the [functional] unit that makes a protein” [37, p. 84]! This led Benzer to conclude by the mid-1950s that the word ‘gene’ had become a “dirty word” [43, p. 285] [39]. For similar reasons it is also not clear that the cistron is the correct evolutionary unit, the unit that is selected for with the cross-generational stability that Mendel expected. Although evolution is often even defined in terms of gene frequency, with the gene being passed on as a discrete unit, this is arguably an idealisation [37, §6.3]. A gene is arguably both too large—cistrons are broken when gene recombination occurs via sex or meiosis—and too small, as larger stretches of genetic code (of various sizes!) are usually selected for; that is, genes tend to be evolutionarily relevant in the context of other genes rather than by themselves. There is no unique length of DNA stretch that is passed on intact across generations, contra the spirit of Mendel’s original ideas.
Consider now the ambiguities in the definition of the concept of a gene, as well as its merely partial, context-dependent role in accounting for observables. The complexity of modern genetics has led some authors to simultaneously recognise multiple gene concepts. Moss [56] modestly starts off with two concepts: gene-P (an instrumentalist definition—anything in the genome that relates to predicting the observable phenotype, without requiring a one-to-one relation between genes and traits—somewhat resembling early genes) and gene-D (a specific molecular sequence). Griffiths and Stotz [57] employ three or four concepts (Footnote 18). Before that, Sterelny and Griffiths [60, p. 133] even went so far as to claim that the concept of ‘gene’ is context-dependent, it being used as a “floating label” for any bit of DNA that is of interest.
Moreover, the gene-D is by itself indeterminate with respect to phenotype; its interaction with a plethora of other developmental resources is what determines the observable traits [39, 51]. For instance, molecules binding to ‘promoters’ located near the cistron can determine whether that stretch of code is or is not transcribed into RNA, and raw RNA transcripts can then be alternatively spliced, by trans-acting repressors and activators, into various finished RNA transcripts for protein synthesis. These two factors, and a myriad of others, determine whether and how the gene contributes to the synthesis of proteins and is eventually responsible for traits. Interestingly, the white-eye ‘gene’ in fruit flies that made Morgan’s group (see above) famous turned out to be a mutation in a promoter instead [37]. The explanation of cross-generational patterns of a trait has become context-dependent, a holistic story, rather than being reducible to a single gene, an ontologically privileged “atom of inheritance” [37, p. 85].
Add to this multiplicity of gene concepts and the holistic, complex nature of modern genetics the fact that there does not seem to be a unique genetic unit of evolution, and we realise that ‘genes’ are, at the very least, “more indefinite and blurry entities than had been supposed” [37, p. 97]. More strongly put, “[g]enomes are more organized objects, and their partition into genes more artificial, than the classic models suppose” [37, p. 99]—a deliberate simplification. “There is a fact of the matter about the structure of DNA, but there is no single fact of the matter about what the gene is” (Footnote 19). This Mendel-inspired view of a hereditary atom, despite being the driver of success in the early 20th century, might now have become “a hindrance to our [further] understanding”, “a concept past its time” (Footnote 20) that is “no longer useful” (Footnote 21).
I am tempted to conclude that we should, at the current time, be anti-realists about genes. Even if one does not agree with this, it is fair to say that the current understanding of the concept of a gene has not (just) filled in Mendel’s vague concept, but has at least significantly augmented it and to an extent replaced it [37, 64]. We may wonder whether we would talk about genes at all, if not for the route via Mendel and Morgan [37]. Would Mendel recognise the current concept as being in the spirit of what he had in mind, or consider it an unconceived alternative? Even if the former, if one were a realist about Mendel’s factors in the early days, one would not have known that anything like this modern notion of ‘gene’ was what one was believing in. The Mendelian promise of a unified concept has not been borne out; ‘gene’ is still, or perhaps again, a semantically thin concept, at best.
Let us return to the questions at the start of this section. Several lessons can be learnt from the history of genes, all revolving around the cautionary theme that we should not count our chickens before they hatch. We may be in situations where we are certain that the ontology of our current theory or theories is not the whole story, or at least not the correct story, but without sufficient knowledge of what is missing we cannot (yet) be realists about the missing entity. To be a realist about the unknown is not to be a realist. This is most obvious in cases where our reasons for believing that the current story is incomplete are purely theoretical. That the standard model of particle physics has what many consider to be internal explanatory gaps, such as a lack of explanation of why there are three generations of quark pairs and lepton pairs, does not by itself give any knowledge of what is supposedly missing. Knowing that quantum theory and general relativity do not mesh well together is not sufficient to know what is missing or what needs to be changed. The cases of dark matter and genes are slightly different, slightly better, in that we have some empirical patterns to go on: cross-generational traits in breeding experiments, galaxy rotation curves, etc. However, we have seen that these initial hints from nature might falsely suggest a simple solution, a simple concept, realism about which would seem to solve the problem. Semantic thinness should not be mistaken for definite simplicity.
There is no guarantee that we will be able to make the semantic common core concept of dark ‘matter’ more precise, more definite, less blurry, less vague. Even if we do manage to obtain more knowledge, that knowledge may reveal a concept that is more complex, less material, and more context-dependent than many of our current models suggest. Similarly to the case for genes, the locality of dark matter may not be as traditionally expected—the location of fuzzy dark matter [50], with its de Broglie wavelength of the order of 1 kpc, is highly indefinite; Verlinde’s entropic gravity, designed to mimic dark matter effects, is non-local due to its holographic nature [65]. A cistron is in the first instance defined functionally, rather than materially as an entity occupying a well-defined, contiguous region of space/DNA. The MOND formalism is likewise sometimes viewed, in the first instance, as a functional role, an algorithm, rather than as a modification of gravity or a dark matter particle.
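To give a rough sense of the scale at stake, consider a back-of-the-envelope sketch of the de Broglie wavelength (the boson mass \(m \sim 10^{-22}\,\mathrm{eV}/c^2 \approx 1.8 \times 10^{-58}\,\mathrm{kg}\) and galactic velocity \(v \sim 100\,\mathrm{km\,s^{-1}}\) are illustrative values commonly quoted in the fuzzy dark matter literature, not figures taken from [50]):

\[
\lambda_{\mathrm{dB}} = \frac{h}{mv} \approx \frac{6.6 \times 10^{-34}\,\mathrm{J\,s}}{\left(1.8 \times 10^{-58}\,\mathrm{kg}\right)\left(10^{5}\,\mathrm{m\,s^{-1}}\right)} \approx 4 \times 10^{19}\,\mathrm{m} \approx 1\,\mathrm{kpc},
\]

that is, the particle’s position is smeared out over a region comparable to an entire galactic core. The algorithmic reading of MOND can be made similarly concrete: Milgrom’s law, \(\mu(a/a_0)\,a = a_N\), with \(a_N\) the Newtonian acceleration due to luminous matter, \(a_0 \approx 1.2 \times 10^{-10}\,\mathrm{m\,s^{-2}}\), and \(\mu\) an interpolating function, specifies an input-output relation between luminous matter and observed dynamics while remaining silent on whether matter or gravity underwrites it.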
Similarly to the case for genes, we may require multiple types of dark matter. After all, the luminous matter sector contains a whole zoo of particle types, so why would the energetically dominant dark matter sector of the universe not consist of multiple components as well?
Similarly to the case for genes, advocates of dark ‘matter’ indeed being matter, rather than a modification of gravity, tend to expect that the galactic correlations that MOND is known to be able to explain, as well as several ‘small-scale challenges’ [31, 32], are in fact due to a complex, messy interaction of dark matter and luminous/baryonic matter. Context matters.
Similarly to the case for genes, hybrid theories (see Footnote 11) suggest, at best, that there are entities that are both matter and spacetime, or that are not (conceptually) stable over time (but switch, say, from being an aspect of gravity/spacetime when in galaxies to being dark matter when outside of galaxies) [28]. At worst, they may indicate a blurring of these traditional concepts, or even their inapplicability and need for replacement [30]. It is not guaranteed that once we find dark ‘matter’ it will be recognisable as such by, say, Zwicky (who first considered dark matter in the 1930s, in the context of galaxy clusters) or by the observers of galaxy rotation curves in the 1970s.
All these similarities concern merely conceived alternatives to mainstream dark matter candidates. Unconceived alternatives might make matters worse, just as the intricacies of the current concept of a gene were very much unconceived of by Mendel.
Finally, even if the future of dark matter research vindicates a simple dark matter model that is close to the concept as originally envisaged, this still does not imply that we are currently justified in being (semantic) realists about dark matter.