…there are two factors: namely, the nature of the organism and the nature of the conditions. The former seems to be much more the important; for nearly similar variations sometimes arise under, as far as we can judge, dissimilar conditions; and, on the other hand, dissimilar variations arise under conditions which appear to be nearly uniform. (Darwin 1872, p. 32)

It is generally acknowledged that all organic beings have been formed on two great laws [my italics]—unity of type and the conditions of existence…

On my theory, unity of type is explained by unity of descent. The expression [my italics] of conditions of existence…is fully embraced by the principle of natural selection…“Hence in fact the law [my italics] of the Conditions of Existence is the higher law; as it includes, through the inheritance of former adaptations, that of Unity of Type.” (Darwin 1872, pp. 194–195)

These passages provide a robust foundation for the Extended Synthesis (Brooks 2011a, b). With the first, Darwin stated that the historical narrative of life must take precedence over perceived fit to current surroundings in evolutionary explanations. With the second, he implicated a higher law of the “conditions of existence” without naming it. The key to knowing the higher law lies in deeper understanding of the nature of the organism.

What Are Organisms?

Darwin’s dualistic view of the nature of the organism is reflected in his use of adaptation to mean both “function” and “evolutionary response to environmental change.” Organisms need adaptations sufficient for survival and reproduction in the environment of the moment. During periods of relative environmental stasis, variations on those adaptations conferring non-zero fitness accumulate in a population. Organisms whose adaptations are particularly well suited to local conditions dominate numerically at those times, but all variants with non-zero fitness survive in a species. If a single best-adapted variant replaced all others in a species, that species would go extinct with the next environmental change. Adaptations as “evolutionary process in changing environments” are a subset of adaptations as “functions” (Brooks and McLennan 2002). Darwin made this a cornerstone of his theory because the same adaptation often has non-zero fitness in different environments (e.g., deciduousness in trees originated in association with drought but also functions well in cold winters), and because organisms found in the same surroundings often exhibit many different adaptations for dealing with those surroundings. During periods of environmental stasis, Darwinian evolution is “survival of the adequate with the fittest dominating numerically,” while during periods of environmental change, Darwinian evolution is “survival of the adequate with the fittest going extinct, replaced by a variant that had lower fitness in the previous environment.” This is the Gambler’s Ruin: no matter how fit you are (no matter how much of the money at the poker table you hold) at any given place and time, you can still go extinct if conditions change (someone with almost no money can have a run of good cards and take all of yours). Just ask the non-avian dinosaurs.
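
The Gambler’s Ruin dynamic can be made concrete with a minimal simulation sketch (all fitnesses, population sizes, and timings below are invented for illustration): a best-adapted specialist dominates numerically while the environment holds, then goes extinct when conditions flip, leaving the merely adequate variant as the survivor.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical fitnesses: a specialist that excels in environment A but cannot
# survive in B, and a generalist that is merely adequate in both.
fitness = {"A": np.array([1.0, 0.95]),   # [specialist, generalist]
           "B": np.array([0.0, 0.95])}

N = 1_000                        # constant population size
counts = np.array([500, 500])    # start with equal numbers of each variant

for gen in range(100):
    env = "A" if gen < 50 else "B"            # the environment changes at generation 50
    w = counts * fitness[env]                 # variant weights = numbers x fitness
    if w.sum() == 0:                          # nothing left with non-zero fitness
        print(f"gen {gen}: total extinction")
        break
    counts = rng.multinomial(N, w / w.sum())  # Wright-Fisher resampling with selection
    if gen in (49, 50, 99):
        print(f"gen {gen:2d} (env {env}): specialist={counts[0]:4d}  generalist={counts[1]:4d}")
```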

Maynard Smith and Szathmary (1995) realized that solving the fundamental mystery of the nature of the organism lay in understanding the link between the origin of life and its subsequent evolution. That understanding lies in recognizing the duality of organisms as metabolic and information systems. The earliest major evolutionary transitions involved trade-offs between the need to exist, a metabolic problem, and the need to produce new generations, an informational problem. The resulting division of labor allowed more efficiency for metabolic and informational functions, but at the cost of greater integration and mutual dependence.

Darwin’s Law of the Conditions of Existence must therefore accommodate “Organisms as Metabolic Systems” and “Organisms as Information Systems” within a single conceptual framework.

Organisms as Metabolic Systems

Everything that happens has a cost. Physics treats this reality as a gigantic accounting system governed by the laws of thermodynamics. The most important of these laws is the second, which states that in all causal transformations, energy passes irreversibly from a higher-grade to a lower-grade state, that is, from a greater to a lesser ability to do work. Irreversibility means energy-transforming systems have a sense of time, and can be said to be “making time” or “buying time.” The transactions can be measured in terms of the net transformation of energy (most easily seen as heat loss) or in terms of the movements of particles in the system affected by the transformation of energy (the approach called statistical mechanics). No matter how these transformations manifest themselves, there is a net cost called “entropy.” So, another way to characterize the second law is to say that in every spontaneously occurring causal activity, entropy increases.
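
For reference, the standard textbook (Clausius) statement of this accounting, which the paragraph above paraphrases:

```latex
% Standard textbook statement of the second law (not specific to this article):
\Delta S_{\mathrm{universe}} = \Delta S_{\mathrm{system}} + \Delta S_{\mathrm{surroundings}} \ge 0,
\qquad dS \ge \frac{\delta Q}{T},
% with equality only for idealized reversible transformations; every spontaneous
% ("causal") transformation produces a net entropy increase.
```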

Traditional thermodynamics placed a fixed limit on the energy account of a system. When energy levels inside a given system reached the same state as those outside the system, there could not be any more transformations. This is called “equilibrium.” It is the point at which the energy bank account has been exhausted, the system having achieved maximum entropy. Only if new energy of a higher grade than the surroundings is transferred into the system can it function again. Recognition of the implications of this version of the laws of thermodynamics caused panic and denial among many late nineteenth-century physicists. Just as the Darwinian reality inspired “Nature red in tooth and claw,” thermodynamics inspired “Heat death of the universe.” To complete the circle, Ludwig Boltzmann, a pioneer in statistical mechanics, gave a public lecture in Leipzig in 1905, proclaiming that Darwinian evolution was a statistical mechanical manifestation of the second law of thermodynamics.

Less than a decade later, Lotka (1913, 1925) became one of the first twentieth-century authors to formally characterize biological systems as metabolic systems, maintaining themselves in highly organized states with respect to their environments by exchanging matter and energy irreversibly with their surroundings, taking in relatively high-grade energy and using it to perform useful work within the system. He suggested that the inevitable structural decay that accompanies such transactions could be delayed, although not reversed, by the system’s accumulation of energy from outside. Even here, organisms have a dual nature. Organisms undergo heat-generating transformations, which involve a net loss of energy from the system, usually in the form of heat, and conservative transformations, which involve changing free energy into states that can be stored and utilized in subsequent transformations. All conservative transformations in biological systems are coupled with heat-generating transformations, but the reverse is not true; there is a heavy energetic cost to maintaining structure (Brooks et al. 1989; Brooks and McLennan 1990; Maurer and Brooks 1991).

Closed, or equilibrium, thermodynamic systems are WYSIWYG (what you see is what you get) systems. Given a certain amount of matter and energy, the energy will be transformed to a lower state and the matter dispersed in its container as a result. Once the energy levels inside match the surroundings, and the matter inside is dispersed maximally given the boundaries of its container, equilibrium is reached and all work ceases. The bank account is empty; maximum entropy has been achieved. Equilibrium systems show no duality in energy use, so that framework clearly is inadequate for understanding biological systems.

Open, or nonequilibrium, systems allow new energy and matter to flow through, and so long as the flow continues the system functions. Total entropy changes (dS) in open systems (called entropy production because there is no a priori entropy maximum) are subdivided into exchanges between the system and its surroundings (deS: heat-generating transformations) and production internal to the system (diS: conservative transformations). Exchanges between organisms and their surroundings cost a lot and are accompanied by much waste dissipated into the surroundings; hence, deS is large compared with diS. However, open systems can maintain their structural integrity only by producing entropy internally (diS > 0). Or,

dS = deS + diS, with deS >> diS > 0

Organismal production (diS), manifested as information production, storage and transmission (biomass and inheritance), is critically important, even though it represents a tiny portion of an organism’s energy budget.

Biological systems maintain themselves in highly organized states far from thermodynamic equilibrium with respect to their environments through causal engagement with the surroundings, mediated by a “phase separation” (Prigogine 1980). That is, there is an “inside” and an “outside,” delineated by a physical boundary. For all organisms, this boundary is provided by cell membranes, which are simultaneously physical barriers between the inside and outside of the organism and highly selective mechanisms for modulating the exchange of matter and energy between the organism and its surroundings. For multicellular organisms, this barrier is a complex of cell membranes.

Production rules govern internal processes for which there is an energetic “cost” or “allocation.” Following Zotin and Zotina (1978), Brooks and Wiley (1988) used ψ to denote energy dissipation within the system. The function includes two major classes of processes: (1) the external dissipation function (Φ), mostly heat generated by production within the organism and lost to the surroundings, adding to the energy lost as a result of bringing matter and usable energy into the system from the surroundings, and (2) the bound dissipation function (μ), all structure maintained within the organism. In organisms, μ can be further subdivided into allocations for accumulating biomass (μb) and allocations for accumulating information that can be passed on by inheritance (μi). Thus, diS can be viewed heuristically as

diS = ψ = Φ + μb + μi

Heat-generating processes, deS and Φ, occur when energy and entropy flow in opposite directions, moving the system toward disordered states. Organisms slow these effects by “exporting” entropy to the surroundings; if all the heat generated by processes associated with bringing matter and energy into an organism stayed in the organism, it would rapidly die. Conservative transformations are characterized by energy and entropy flowing in the same direction, entropy production being retained within the system and tending to move the system toward more structured states. As entropy and energy flow through biological systems at different rates, structure accumulates at different levels of organization; furthermore, the structure at any given level is constrained by energy and entropy flows at other levels.
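
To make the bookkeeping concrete, here is a minimal sketch with invented magnitudes; only the partition of diS into Φ, μb, and μi and the expectation deS >> diS > 0 come from the text, and every number is hypothetical.

```python
# Hypothetical entropy budget for one organism over some interval (arbitrary units).
budget = {
    "deS":  100.0,   # exchanges with the surroundings (heat-generating transformations)
    "phi":    4.0,   # external dissipation: heat from internal production, lost outward
    "mu_b":   1.5,   # bound dissipation allocated to accumulating biomass
    "mu_i":   0.5,   # bound dissipation allocated to heritable information
}

diS = budget["phi"] + budget["mu_b"] + budget["mu_i"]   # diS = phi + mu_b + mu_i
dS  = budget["deS"] + diS                               # dS  = deS + diS

assert budget["deS"] > diS > 0, "open-system expectation: deS >> diS > 0"
print(f"dS = {dS}, of which deS = {budget['deS']} and diS = {diS}")
```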

Organisms maintain themselves through time by exploiting “resource gradients” in the surroundings (Ulanowicz 1997), determined by interactions between abiotic and biotic factors. Abiotic factors can be structured in part by the metabolic components of biological production (Φ). For example, both the capture of incoming solar energy by organisms and their mass re-radiation of heat affect the thermal profile of the earth. Likewise, the production of oxygen as a byproduct of photosynthesis or of carbon dioxide as a byproduct of aerobic metabolism affects the composition of the earth’s atmosphere. More simply, production (diS) can influence exchanges (deS). Biotic factors are also subject to the influences of the structural portion of biological production (μ). Metabolism tends to move biological systems in the direction of minimizing energy gradients in the environment, to the extent permitted by the inherited capabilities (and limitations) of the organisms involved (Ulanowicz 1997; Brooks and McLennan 2000). In other words, accumulated genetic information constrains the patterns of energy flow within organisms and between organisms and their surroundings (deS).

Biological systems produce entropy at different rates because energy stored by conservative transformations is degraded at different rates. At the lowest organizational levels, shortest time intervals, and smallest spatial scales, the greatest contribution to ψ is Φ. If we examine cellular or sub-cellular structure, metabolic processes dominate explanations of observed structure. Most entropy production is dissipated into heat loss. At intermediate levels of organization, space, or time, μb predominates. Most entropy production at this scale is dissipated into accumulating and maintaining biomass. Finally, on the largest and longest scales, μi predominates, and the patterns relevant to biological explanations represent accumulation and maintenance of genetic diversity. From the perspective of the surroundings, these patterns are correlated with energy gradients, whereas from the perspective of the genealogical system, they are correlated with phylogenetic relationships and patterns of geographical distribution mirroring geological evolution occurring on similar temporal and spatial scales.

The Controversy

Theoretical studies (Prigogine and Wiame 1946) and popular texts (e.g., Schrödinger 1945; Blum 1968; Prigogine 1980) laid the groundwork for a view which, ironically, links life and its evolution to the second law mostly through life’s presumptive ability to circumvent that law. Schrödinger, Blum and Prigogine argued that life was physically improbable, demanding an explanation involving rare events. Following the New Synthesis, they accepted the progressive nature of evolution, also contrary to the expectations of the second law, at least in its nineteenth-century formulation. Prigogine and colleagues developed a heuristic model in which life originated as an improbable event and evolved into increasingly improbable states. The model suggested that life could not have originated “on its own”; it must have had “outside help,” by which they meant thermodynamic flows from the surroundings into the system, deS. And they discovered something exciting—near-equilibrium, random fluctuations in the exchanges between the system and surroundings could theoretically produce states of lowered entropy. They reasoned that if such fluctuations were, on rare occasions, “captured” in a stable state, they could move themselves farther and farther away from thermodynamic equilibrium, “feeding on negentropy” (Schrödinger 1945). This view became so widespread that when Broda (1983) discussed Boltzmann’s 1905 lecture, he inserted “[negentropy]” after “entropy” throughout the text.

The metabolic duality of organisms provided presumptive support. In the process of exchanging matter and energy with their surroundings, organisms degrade their surroundings more than themselves, remaining in a low-entropy state relative to their surroundings [this is the meaning of negentropy, not that entropy has decreased]. This view informed two general concepts in biology—a form of self-organization (Depew and Weber 1995) and the principle of maximum entropy production (Swenson 1989). Although it superficially resembles the Darwinian duality (the nature of the organism and the nature of the conditions), “self-organization” in this context means the tendency for the system to organize itself according to the nature of the surroundings. The principle of maximum entropy production asserts that systems utilize resources from the surroundings as rapidly as possible, construed as a kind of selection in which whoever sequesters the most energy fastest wins, starving out slower competitors. Both concepts ascribe causality to the surroundings; significantly absent is thermodynamic production, diS. Thus, there is no “nature of the organism.”

Nor do biological systems behave as the principle of maximum entropy production suggests. Zotin and Zotina (1978) documented the general pattern. Early in ontogeny, organisms exhibit high metabolic rates, approximating maximum entropy production. This “immature” stage, however, is always replaced by a “mature” phase, characterized by reduced metabolic rate. Finally, all organisms enter a “senescent” stage in which metabolic rate decreases to the point that the organism no longer functions. This same dynamic occurs during ecosystem succession (Ulanowicz 1997). Decreasing rates of entropy production are determined by interactions between the surroundings and the “sense of self” the organism inherits from its parent(s).

In contrast with Darwin, Boltzmann felt life was a struggle for entropy, not for survival. I think both were correct. In one sense, organisms struggle to stay alive, and do so by processing matter and energy from their surroundings. They must find the necessary forms of matter and energy to sustain their lives. Organisms finding themselves in a place and time where such resources are available are thus able to “survive” in Boltzmann’s sense. His viewpoint is easier to see if we consider the source rather than the fate of the matter and energy organisms use to sustain life. Living systems must find usable energy. Plants find abundant “free energy” in the form of photons of light coming from the sun. The source of those photons is thermonuclear reactions in the sun involving states of matter and energy that no terrestrial life can use. Being relatively low-energy products of the sun’s thermonuclear reactions “exported from the system to the surroundings,” photons are part of the sun’s entropy production. Plant biomass built using photonic energy is part of the entropy production of the plant. When an herbivore eats plant biomass, it is feeding on entropy.

Evolution is more than just living, however; it is descent with modification. Biological systems, from organisms to ecosystems, exist in a low-entropy state relative to their surroundings, but not relative to their own previous states. This is the result of producing and maintaining structure that is complex and organized relative to the surroundings, according to inherited information specifying internal production rules, which are largely insensitive to the details of environmental conditions (Darwin’s Necessary Misfit).

Organisms as Information Systems

Information theory has developed two general perspectives, “communications theory” and “measurement theory.” Both agree that information (1) is anything transmitted from a “source” through a “channel” to a “receiver” and (2) is an abstraction rather than a material part of the system.

In communications theory, the amount of information sent from a source is calculated using a statistical entropy function. Transmission errors result from poor encoding at the source or from noise in the transmission channel. Meaningful information is that subset of transmitted information actually recorded by the receiver (there may or may not be a separate decoder). All processes affecting transmission and reception of the information decrease the entropy of the message from its maximal value at the source. Physical entropies are expected to increase as a result of work done on the system, so either information transmission is not a physical process or the communications view of entropy is non-physical.
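
For concreteness, here is a minimal sketch of the statistical entropy function used in communications theory (Shannon’s H, in bits per symbol); the example strings are invented.

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Statistical entropy (bits per symbol) of a message's symbol frequencies."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy("AAAAAAAA"))   # 0.0 bits: a maximally redundant source
print(shannon_entropy("ACGTACGT"))   # 2.0 bits: four equiprobable symbols
```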

Measurement theory provides a second formalism. Brillouin (1962) distinguished “free information,” an abstraction involved in descriptive exercises, and “bound information,” referring to material properties of the system (but not requiring that information be a material part of the system). Bound information is determined with respect to the “complexions” (microstates) of the system. It is calculated using a statistical entropy function but, contrary to communications theory, is expected to exist only in systems for which there is a non-arbitrary microstate/macrostate distinction. Bound information is defined as

I = Hmax − Hobs

where Hmax refers to the totally relaxed state of the system (usually estimated by randomizing the observed components of the system). Brillouin defined I as “negentropy,” which is converted into bound information by measurement (measuring devices are receivers), so negentropy = information.

There must be an additional conception of information for biology because biological information (based on nucleic acids) is a communications system with a material basis. What is needed is a formalism in which (1) information is a material part of a system rather than just an abstract representation and (2) there is an objective difference between macrostates and microstates in calculations of informational entropies.

The Brooks/Wiley Proposal

Brooks and Wiley (1988) used the informational “entropy function” to examine changes in information over time in biologically realistic situations. The function summarizes changes in the number of parts, the number of kinds of parts, and the relative frequency of the different kinds of parts. This has four illuminating formulations: (1) the “actual” entropy (Hobs; information content or expressed information), calculated on the basis of the observed distribution of components of the system; (2) the “maximum possible” entropy (Hmax; information capacity), estimated by calculating the entropy value for the components of the system at any given time if they were all randomized; (3) an absolute difference (Hmax − Hobs; information: Gatlin 1972; macroscopic information: Landsberg 1984a,b); and (4) two conceptually related relative differences (Hobs/Hmax; Order: Landsberg 1984a,b) and Id/Hmax (Id = information density; redundancy: Gatlin 1972). Simple heuristic simulations emulating biological processes associated with the storage and transmission of information (e.g., reproduction, ontogeny, and speciation) produced three generalities (for illustrations, see Brooks and Wiley 1988): (1) Hobs increases over time; (2) Hobs is a “concave function” of time, as historical constraints retard the rate of entropy increase; and (3) the difference between Hmax and Hobs increases over time, permitting the growth of structure and organization (Collier and Hooker 1999). Point 3 thus identifies an informational duality in the nature of the organism; entropy and information/organization/order (redundancy) both increase over time. If Hmax is a function of the capacity, or potential, of a system and Hobs is a function of the content, or expression, of some of that potential, the difference between total information capacity and information content is proportional to the constraints, inherent and extrinsic, on the system. For example, additive genetic variance could be construed as an indication of population-level entropy, while genetic correlations would be an indication of organizing principles constraining that variance.
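
A minimal sketch of these quantities for a toy collection of parts follows; the sample and the equiprobable estimate of Hmax are assumptions made for illustration, while the definitions of Hobs, Hmax, Hmax − Hobs, and Hobs/Hmax follow the formulations above.

```python
import math
from collections import Counter

def entropy(freqs):
    """Statistical entropy (bits) of a frequency distribution."""
    n = sum(freqs)
    return -sum((f / n) * math.log2(f / n) for f in freqs if f > 0)

def brooks_wiley_indices(parts):
    """Hobs, Hmax, and the derived differences for a collection of parts (e.g., alleles)."""
    counts = Counter(parts)
    hobs = entropy(counts.values())    # "actual" entropy of the observed distribution
    hmax = math.log2(len(counts))      # capacity if all observed kinds were equiprobable
    return {
        "Hobs": hobs,
        "Hmax": hmax,
        "information (Hmax - Hobs)": hmax - hobs,
        "order (Hobs / Hmax)": hobs / hmax if hmax > 0 else 0.0,
    }

print(brooks_wiley_indices(list("AAAABBCD")))   # hypothetical sample: 8 parts, 4 kinds
```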

There is a physical basis for the Brooks–Wiley formalism, which shows increasing rather than decreasing informational entropy because the calculations account for historical correlations among the parts of the system. Frautschi (1982, 1988; see also Landsberg 1984a,b) contrasted two classes of processes that generate entropy. The first is equilibrating temperatures between system and surroundings. Biological systems exhibit this behavior through processes that result in heat loss. The second is expansion of the phase space (the realm of possibilities) in which the system resides, increasing its number of accessible microstates (possible configurations). Organization increases so long as equilibration (equiprobable distribution of the system over its microstates) occurs at a slower rate than the expansion of the phase space, allowing a lag between the increase in realized entropy (Hobs) and the increase in the maximum possible entropy of the system (Hmax), which is a linear function of the logarithm of the number of states or size of the phase space. If the phase space expands faster than the system can fill it up, increasing entropy can be accompanied by the emergence of organized structure. In cosmology, this argument explains the spontaneous and irreversible emergence of stars, solar systems, galaxies, and other organized structures, in which fundamental forces linking material bodies, like gravity, slow down the entropic diffusion of matter in the universe to such an extent that organized structures emerge as a result of, and not at the expense of, increasing entropy. In biological systems, mutations (as well as higher order genotypic and epigenetic phenomena) expand the genetic phase space, while their inheritance systems (Jablonka and Lamb 1995; Maynard Smith and Szathmary 1995; 1999; Szathmary 2000), as well as the environments in which they exist, play roles analogous to fundamental forces like gravity (Brooks and Wiley 1988).
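
The lag between expansion and equilibration is easy to illustrate numerically. In the toy model below (the growth factor and “leak” rate are invented), the number of accessible microstates doubles each step while the system spreads into the new states more slowly, so Hobs and the gap Hmax − Hobs both grow, which is the condition under which organization can increase alongside entropy.

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def simulate(steps=8, growth=2, leak=0.3, n0=2):
    """Toy model: accessible microstates multiply by `growth` each step, but only a
    fraction `leak` of the probability mass spreads into the newly opened states."""
    probs = [1.0 / n0] * n0
    for t in range(steps + 1):
        hobs, hmax = entropy(probs), math.log2(len(probs))
        print(f"t={t}:  Hobs={hobs:5.2f}  Hmax={hmax:5.2f}  gap={hmax - hobs:5.2f}")
        new = len(probs) * (growth - 1)                      # newly accessible microstates
        probs = [p * (1 - leak) for p in probs] + [leak / new] * new

simulate()
```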

Collier (e.g., 1998, 2000) related biological information to the causal capacity of a system, its ability to impose distinctions on its surroundings. In a way, the emphasis is on the system that produces effects on measuring devices rather than on the measuring devices that register them. Collier proposed that physical (=material) information systems occur as arrays, or multi-dimensional messages, in which macrostates and microstates are distinguished non-arbitrarily, and that in order for this information to be related to physical concepts there must be (1) a physical (material) basis for the information, (2) an energetic cost in producing the information, and (3) a real (non-arbitrary) macrostate/microstate distinction. Since the discovery of the chemical structure and function of DNA, there has been a material basis for biological information, satisfying (1) above (see also Collier and Hooker 1999; Brooks et al. 1989; Smith 1988, 1998, 2000).

Energy dissipated within the system as a result of work done by the system (the Φ component of diS) is intropy, meaning internal thermal entropy (the “overhead” of Ulanowicz 1997). Energy that is converted into structure (the μ component of diS) is enformation, meaning intrinsic information (or structural entropy; these distinctions originated with Collier 1990). Conservative processes within biological systems are coupled with heat-generating processes, so there is an energetic cost associated with the production and maintenance of biological information. Intropy and enformation are interconvertible (e.g., energy brought in from the surroundings can be converted into structure, say glycogen, which can then be converted into heat). Intropy is converted into enformation by cohesive properties of the system. Cohesive properties, ranging from molecular affinities to cell–cell adhesion, to genetic compatibility, mate recognition, and genealogy, also provide resistance to fluctuations from lower levels, allowing macroscopic properties to emerge. Cohesion is thus analogous to inertia. The major transitions in evolution (Maynard Smith and Szathmary 1995, 1999) are all associated with the emergence of new forms of cohesion, which permit enformation to be stored and transmitted more efficiently.

Cohesive properties are also the key to understanding microstate/macrostate distinctions in biological systems. According to Collier, macrostate/microstate distinctions are determined objectively by part/whole associations. The number of accessible microstates is increased by the production of new components, either at a given level or through the opening up of new levels of organization. Biological systems accomplish this by conservative transformations. For example, auto-catalytic processes producing monomers make “monomer space” available for molecular evolution. Some monomers have high chemical affinities for each other and will spontaneously clump into dimers and polymers. Once polymers begin to form, “polymer space” becomes available to the evolving system. At this level, polymers are macrostates, and monomer and dimer distributions are microstates. Causal interactions among polymers create new levels of organization in which polymer distributions are the microstates and the new levels are the macrostates, and so on. Each new functional level adds to a hierarchy of increasing structural intricacy, manifested by increasing allocation of entropy production to structure. Therefore, the allocation of diS to μ might be proportional to entropy increases due to the expansion of phase space resulting from the creation of new possible microstates. A protein-coding unit might be considered a macrostate, while all the actual sequences that code for that protein would be the microstates; a locus could be a macrostate, and all alleles corresponding to that locus the microstates; phenotypes could be macrostates, and all genotypes corresponding to a given phenotype would be microstates.
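
The protein-coding example can be made concrete with a toy count: the peptide below is invented, but the synonymous-codon (degeneracy) figures are those of the standard genetic code, so a single protein “macrostate” corresponds to many coding-sequence “microstates.”

```python
import math

# Synonymous-codon counts from the standard genetic code (subset of amino acids).
degeneracy = {"M": 1, "W": 1, "F": 2, "A": 4, "G": 4, "L": 6, "R": 6, "S": 6}

peptide = "MARS"   # hypothetical four-residue protein "macrostate"
microstates = math.prod(degeneracy[aa] for aa in peptide)   # 1 * 4 * 6 * 6 = 144

print(f"{microstates} synonymous coding sequences realize the same protein macrostate "
      f"(up to {math.log2(microstates):.1f} bits of microstate capacity within one macrostate)")
```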

That organisms are digital replicators is a key element in understanding their essentially unlimited capacity for variation (Maynard Smith and Szathmary 1995, 1999; Szathmary 2000) and also accords with the Brooks–Wiley formalism. Consider a replicator comprising a string of DNA 1,000 bases long. According to traditional calculations, the sequence contains a maximum of 2,000 bits of potential information (this fully defines its structure at all levels). Such calculations have been used to suggest that there is not enough information in the DNA of a cell to specify an entire organism, and therefore information from the surroundings must be incorporated during development. This parallels interpretations of self-organization as systems organizing themselves with respect to the surroundings—the organism’s information system is cheaply produced because its only function is allowing the system to conform to the much larger amount of information in the surroundings (and for which the energetic cost, deS, is much greater). If the string were a holistic replicator, it would function strictly as a single unit. If that string functioned as a digital replicator, however, it could be read at multiple levels (in this case, 1,000 levels from single bases to the entire sequence) to produce a diversity of information. Each of these readings would yield a maximum information capacity of 2,000 bits. Thus, if each reading of a given sequence is equivalent to all others informationally, and if these readings are not interactive (i.e., not cohesive), the total possible readings would have a maximum information capacity of 2,002,000 bits. If the bases are interactive, then these self-interactions will constrain the total information capacity. The total amount of information that could be expressed at any one point in time is highly constrained by the fact that bases (a similar argument holds for genes, tissues, and organisms) are causally linked, so accessing some information will limit (or eliminate) expression of other information—cohesion therefore puts an upper limit on the amount of information potential/capacity that can be expressed at any one time. Accessing the same system in different ways sequentially through time permits the same constrained quantity of information to be additive, since at each point in time a different 2,000 bits is expressed. The information system is cheap to produce but has more than enough potential information to specify an organism—in fact, much of that potential must be dissipated (diS) in order to distil out an organized organism. Ontogeny is that distillation process—it is an energy-efficient algorithm for converting digital information into analog output, using matter and energy from the surroundings (deS) to accomplish the task, paid for by dissipating potential information.
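
The capacity arithmetic behind this argument, written out for a string of L = 1,000 bases read at R independent levels (the exact total depends on how the reading levels are counted):

```latex
% One reading of the full string (4 possible bases = 2 bits per base):
C_{1} = 2\,\text{bits/base} \times 1000\,\text{bases} = 2000\,\text{bits}
% R non-interacting ("non-cohesive") readings are additive in capacity:
C_{\text{total}} = R \, C_{1} = 2000\,R\,\text{bits}
% Cohesion caps what can be expressed at any single instant at C_{1} = 2000 bits;
% the larger total is accessible only sequentially through time.
```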

A final aspect of the Brooks–Wiley formalism pertains to the meaning of information. Many information theorists define information as meaningful only if a receiver converts it. A neo-Darwinian perspective is that replicators are the source of biological information, and the environment in which they live is the receiver, the nature of the conditions determining what is meaningful (Gatlin 1972). The Darwinian interpretation is that replicators are both source and receiver, transmitting information from themselves at t0 to themselves at some future tn. The nature of the organism thus determines what is meaningful information (Csanyi 1989; Kampis 1991, 1998; Brooks and McLennan 1997). This perspective has parallels in self-correcting computer programs and self-correcting capacities of DNA. Information is meaningful because it is organized in such a way that an organism develops and transmits the information to the next generation (Collier and Hooker 1999). The environment “selects” among varying forms of meaningful information; it does not give them meaning.

Summary

“We’re Just Recycled History Machines” (Jimmy Buffett, Don’t Chu-Know)

A unified theory of biology must incorporate naturalistic explanations for, but cannot be solely a theory of, the origin of life. The same general mechanisms that allowed life to originate must also explain its subsequent evolution within a single narrative flow. Maynard Smith and Szathmary (1995) understood this, so their framework is the best candidate for the Extended Synthesis (Brooks 2011a,b). Boltzmann believed Darwin’s Law of the Conditions of Existence was the Second Law of Thermodynamics. Advances in the thermodynamics of open systems have allowed us to find confirmation of that insight in the duality of organisms as metabolic and information systems (Vasas et al. 2010).

Associating evolutionary dynamics with thermodynamic production underscores the importance of time and history in biological explanations. All genomes are historically conservative, making evolution affordable; they are copies of templates, and the templates do not have to be reconstructed in every generation. History lowers the cost of innovation because innovations are modifications of pre-existing information. History lowers the cost of ecological specialization because specialists on widespread resources have many options (Brooks and McLennan 2002; Agosta et al. 2010). Adaptability occurs virtually for free because adaptability is a synonym for retained history of what worked in the past. Finally, history lowers the cost of community organization if colonizers bring traits that allow them to use resources not being used by residents, thereby avoiding competition (Brooks and McLennan 2002).

We now have a thumbnail of a Theory of Biology. The Meta-Game of Life is “Persist as long as possible by integrating information flow and functional engagement with the surroundings.” The strategy for accomplishing the meta-game is “Increasing the efficiency of the information flow (the nature of the organism) enhances self-stability, which creates various forms of selection, which enhance mutual stability between the system and its surroundings (the nature of the conditions).” Even more simply, paraphrasing Maynard Smith and Szathmary (1995), “So long as the information flows, adaptation will take care of itself.”