A growing number of studies have begun to investigate the impact of environmentally mediated mortality threats on life history strategies. For example, research indicates that children growing up in contexts characterized by socioecological stressors, such as father absence and economic unpredictability, reach puberty earlier and are more sexually precocious than those growing up in contexts lacking such stressors (Belsky et al. 1991; Belsky et al. 2010; Draper and Harpending 1982; Ellis 2004, 2005; Ellis and Essex 2007; Moffitt et al. 1992). Despite the increased interest in the role that one’s ecological conditions play in shaping life history strategies, much less is known about the role that one’s vulnerability to lifespan-shortening diseases plays in these resource allocation decisions (for notable exceptions, see Jones et al. 2008; Rickard et al. 2014; Waynforth 2012). In this paper, we give an overview of research conducted by ourselves and others suggesting that one’s vulnerability to disease – stemming both from external sources and from one’s own immunocompetence – has important implications for the resource allocation decisions that comprise one’s life history strategy. In particular, we highlight the role that a person’s immunocompetence plays in calibrating life history strategies. We then highlight important issues and challenges in this emerging area of inquiry, pointing to promising avenues for future research. Together, this research suggests that vulnerability to illness may play a key role in modulating life history strategies.

Life History Theory

Life history theory is a well-established, biologically based theoretical framework used to predict how and when organisms – including humans – will allocate effort among the various tasks required for survival and reproduction (Charnov 1993; Kaplan and Gangestad 2005; Roff 1992; Stearns 1992). Because somatic resources are inherently limited, organisms often face important trade-offs in how they distribute these resources among competing life-sustaining components – growth, maintenance, and reproduction – at any given moment in time. For example, energy allocated toward immune function cannot be used to attract a mate, and vice versa. Accordingly, throughout development, individual organisms must ‘choose’ how to divide somatic resources among the various sub-goals required for successful survival and reproduction (Ellis et al. 2009; Kaplan and Gangestad 2005). How and when an individual resolves these trade-offs constitutes that individual’s life history strategy (Del Giudice et al. 2011; Ellis et al. 2009).
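This budget logic can be stated formally. The following is a minimal illustrative formalization of our own; the notation is not drawn from the cited sources:

```latex
% Minimal illustrative formalization (notation ours, not from the cited
% sources). Let E be the somatic energy budget available at a moment in
% time, divided among growth, maintenance, and reproduction:
\[ E = E_{g} + E_{m} + E_{r}, \qquad E_{g},\, E_{m},\, E_{r} \ge 0 . \]
% Because E is fixed, any increment to one component must come at the
% expense of the others; e.g., energy diverted to immune defense (part
% of E_m) is unavailable for mating effort (part of E_r). A life history
% strategy is then an allocation rule over the lifespan,
\[ t \mapsto \big(E_{g}(t),\, E_{m}(t),\, E_{r}(t)\big), \]
% specifying how these trade-offs are resolved at each age t.
```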

Although organisms’ life history strategies are multidimensional and complex, they are often described as falling along a continuum of fast to slow. Those adopting faster strategies tend to make trade-offs that prioritize mating effort. They mature relatively rapidly, begin reproducing early, and have a greater number of offspring, in whom they tend to invest relatively little. Those who follow a slower strategy, on the other hand, make energy allocation decisions that prioritize prolonged development and delayed reproduction. They reach sexual maturation later, have a later age at first reproduction, and have fewer offspring, in whom they invest a great deal. Although first developed to yield insight into life course differences observed between species, this theory also provides a framework for understanding such variation within species. For instance, although a slow life history strategy is characteristic of humans in general, individual men and women differ in their life history strategies (Daan and Tinbergen 1997). Some individuals mature quickly, have a relatively large number of children, and invest little beyond what is absolutely necessary in each child. Others develop more slowly, have fewer children, and invest heavily in each child. Such variation suggests that humans may possess mechanisms for adjusting their life history strategies based on local socioecological conditions (Belsky et al. 1991; Ellis 2004; Hill and Kaplan 1999; Stearns 1992; Stearns and Koella 1986; West-Eberhard 2003).

Ecological Factors that Influence the Formation of Life History Strategies

What are the factors that influence an individual’s life history strategy? Although there are many, research indicates that the risk of mortality in one’s environment plays a key role in modulating life history pathways. Mortality risks can be characterized as extrinsic or intrinsic (Ellis et al. 2009). Each type of mortality risk is predicted to have unique effects on life history strategies, and is described below.

Extrinsic mortality threats are those that cannot be avoided by changes in behavior or resource allocation decisions made by the organism. In other words, these are mortality risks over which one has little control. Key environmental factors that impact one’s extrinsic mortality risk are the relative degrees of harshness and unpredictability in an environment (Daly and Wilson 2005; Ellis et al. 2009; Simpson et al. 2012; Stearns 2000). Environments that are characterized by high levels of harshness (factors that increase the risk of morbidity / mortality, such as scarce resources, high homicide rates, or poor maternal care) or by high levels of unpredictability (such as changing familial status or fluctuating resource availability and parental care) tend to encourage adoption of faster life history strategies (Del Giudice 2009; Ellis et al. 2009; Kaplan and Gangestad 2005). Such resource allocation decisions make good adaptive sense in these contexts because they decrease the risk of an organism perishing without first having had the opportunity to reproduce. Consistent with this hypothesis, research finds that women growing up in neighborhoods in which the life expectancy is lower tend to have their first child at a significantly younger age than those living in neighborhoods in which the life expectancy is higher – a pattern that repeats itself when making comparisons within nations and across time periods (Griskevicius et al. 2011a; Low et al. 2008; Nettle and Cockerill 2010; Wilson and Daly 1997).
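The adaptive logic of this prediction can be illustrated with a simple expected-fitness calculation. The sketch below is our own illustration, assuming a constant extrinsic mortality hazard; it is not a model taken from the studies cited above:

```latex
% Illustrative sketch (our simplifying assumption of a constant hazard,
% not a model from the cited studies). Under extrinsic mortality rate
% \mu, the probability of surviving to reproduce at age t is
\[ S(t) = e^{-\mu t} . \]
% If delaying reproduction to age t yields a larger expected brood b(t)
% (e.g., through added growth), expected fitness is
\[ W(t) = e^{-\mu t}\, b(t) , \]
% and setting dW/dt = 0 gives the optimal age at first reproduction t*:
\[ \frac{b'(t^{*})}{b(t^{*})} = \mu . \]
% Because the proportional gain b'/b typically declines with age, a
% larger \mu pushes t* earlier: harsher, uncontrollable environments
% favor earlier reproduction, i.e., a faster strategy.
```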

An intrinsic mortality threat is one from which death can be delayed or avoided by increased energy allocation to one’s own somatic development / maintenance (Rickard et al. 2014; Stearns 1992, 2000; Waynforth 2012). Life history theory predicts that intrinsic mortality threats should prompt individuals to adopt resource allocation strategies that prioritize somatic rather than reproductive effort. For example, if a person is sick with the flu, the progression of the illness (and the heightened chances of resulting mortality) can potentially be forestalled or reversed by allocation of effort toward immune function, fever, and behaviors that foster recovery. Indeed, numerous studies have documented the well-known phenomenon of “sickness behavior”, which is characterized by many of these changes (e.g., diminished motor / social / sexual behavior, diminished foraging / eating, general anhedonia, increases in slow-wave sleep, and fever). Although such behaviors were originally believed to be a maladaptive byproduct of an organism being too ravaged by sickness to cope, this response is now understood to reflect a well-orchestrated constellation of behavioral adaptations that help a given organism remain safely at home, diminish odds of predation, obtain adequate rest, and ward against the intrinsic threat of pathogen replication (Aubert et al. 1995; Dantzer and Kelley 2007). Therefore, whereas extrinsic risks promote more immediate reproductive effort, intrinsic threats can increase effort directed toward one’s own somatic maintenance, provided one has the bodily resources to do so.

Vulnerability to Disease as an Extrinsic Mortality Risk

To date, much of the research examining the impact of extrinsic mortality threats on life history strategies has focused almost exclusively on external threats, such as the homicide rate or famine risk in the environment. However, a growing number of researchers have begun to note that one’s life history strategy should also be influenced in important ways by variation in factors that shape one’s internal state, such as bodily robustness, immune competence, and the burden of deadly disease in one’s environment (Charlesworth 1990; Gluckman et al. 2007; Jones et al. 2008). From this view – although life history theory predicts that intrinsic threats such as manageable infections (e.g., colds or flu) will favor increased somatic effort (e.g., sickness behavior) – the extrinsic threat posed by lifespan-shortening disease over which one has little control (e.g., due to generally poor immune competence, chronic illness, or, potentially, living in an environment rife with virulent pathogens) may speed up one’s life history strategy (McNamara et al. 1999; Van Noordwijk and de Jong 1986). Such a shift in bodily resource allocation would decrease the likelihood that one would succumb to disease before first having had the chance to reproduce.

A growing body of research has found support for the hypothesis that the burden of lifespan-limiting disease – both internal and external to the organism – may encourage the adoption of faster life history strategies. For example, Jones et al. (2008) found that Tasmanian devils living in areas with high rates of a deadly facial tumor disease show a 16-fold increase in precocious sexual maturity. Others have found that mosquitoes in poor condition respond to parasitic infection by laying eggs sooner than do more able-bodied mosquitoes or those not infected (Vézilier et al. 2015). In humans, researchers have found that individuals experiencing chronic health conditions such as cancer, epilepsy, and diabetes experience earlier pubertal timing (Park et al. 2012; Widen et al. 2012), engage in riskier sexual activity (Erickson et al. 2005; Suris et al. 2008; Suris and Parera 2005; Valencia and Cromer 2000), and have an earlier age at first reproduction (Waynforth 2012) than those without these illnesses. For instance, a large longitudinal study in Britain recently revealed that having a serious, life-expectancy-reducing chronic illness diagnosed in childhood predicts an earlier age at first reproduction (Waynforth 2012): individuals with chronic disease were 1.6 times more likely to have had a first child by age 30 than their healthier counterparts.

Recently, research in our laboratories has begun to build on this body of work by examining whether one’s immunocompetence may play a key role in modulating life history strategies. The competence / efficacy of one’s immune system plays a key role in modulating one’s risk of death, and its function is largely outside one’s direct control (McDade 2003). In addition to providing a key line of defense against external pathogenic threats, one often-overlooked facet of immune function involves the ability of select white blood cells (NK cells and T cells) to recognize and destroy nascent cancer cells (Lanier 2004; Swann and Smyth 2007). Deficiencies in immune function – whether arising from stress, from effects of early life experience, or simply from carrying suboptimal alleles linked to immune function – are therefore predicted to be associated with a heightened risk of infection and disease, as well as diminished longevity (Kaiko et al. 2010; Lowin et al. 1994; Pike et al. 1997; N. Sakaguchi et al. 2003; Shavit et al. 1984). Accordingly, we have recently hypothesized that the functionality of an individual’s immune system should play an important role in modulating life history strategies.

To test whether there is a relationship between immunocompetence and life history strategies, we have examined the relationship between measures of vulnerability to illness – using both self-report measures of health as a proxy for immune function and actual biological indicators of immune system reactivity – and behavioral and cognitive correlates of faster life history strategies. We predicted that individuals who reported (or displayed biologically) greater vulnerability to illness would report greater difficulty delaying gratification, exhibit a stronger preference for immediately available rewards, exhibit riskier sexual behavior, and score lower on measures of slower life history strategies than would those with less vulnerability to illness.

We first conducted two surveys that examined the relationship between self-reports of vulnerability to illness and factors that measure decision-making approaches characteristic of faster or slower life history strategies (Prokosch and Hill 2015). In these studies, we examined the relationship between health history, the ability to delay gratification, and the cognitive and behavioral proxies of one’s life history strategy measured using the short form of the Arizona Life History Battery (the Mini-K; Figueredo et al. 2006). We chose to measure the ability to delay gratification because faster life history strategies are characterized by a preference for more immediately available rewards (Griskevicius et al. 2011b). We used the Arizona Life History Battery because it has been validated as a measure of behaviors that characterize different life history pathways (Figueredo et al. 2004, 2007).

In the first survey, 102 undergraduate participants (51 women, 51 men; M age = 19.75, SD age = 3.21) completed a survey measuring their (a) history of vulnerability to illness (e.g., “As a child, I missed school frequently due to illness”), (b) childhood SES (Griskevicius et al. 2011b), (c) ability to delay gratification (measured using the Delay of Gratification Inventory [DGI]; Hoerger et al. 2011; e.g., “I have an easy time delaying gratification”), and (d) cognitive and behavioral indicators of life history strategy (measured using the Mini-K from the Arizona Life History Battery; Figueredo et al. 2006; e.g., “I can often tell how things will turn out”). As predicted, our results revealed that individuals higher in vulnerability to illness reported more difficulty delaying gratification and exhibited fewer cognitive and behavioral indicators of a slower life history strategy on the Mini-K (see Table 1). Importantly, greater vulnerability to illness was negatively predictive of the ability to delay gratification even after controlling for childhood SES, scores on the Mini-K, or both simultaneously.
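To make the structure of this control analysis concrete, the sketch below illustrates it in Python with simulated data. The variable names and effect sizes are hypothetical, and this is not our actual analysis code; the point is the model structure of regressing delay of gratification on illness vulnerability with covariates entered:

```python
# Illustrative sketch of the control analysis described above (not the
# original analysis code). Variable names and simulated effect sizes
# are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 102  # sample size matching Study 1

# Simulated stand-ins for the self-report measures (z-scored).
df = pd.DataFrame({
    "illness_vulnerability": rng.standard_normal(n),
    "childhood_ses": rng.standard_normal(n),
})
df["mini_k"] = -0.3 * df["illness_vulnerability"] + rng.standard_normal(n)
df["dgi"] = (-0.4 * df["illness_vulnerability"]
             + 0.2 * df["childhood_ses"] + rng.standard_normal(n))

# Zero-order relationship, then the same predictor with both covariates
# entered simultaneously, mirroring the reported robustness check.
zero_order = smf.ols("dgi ~ illness_vulnerability", data=df).fit()
controlled = smf.ols(
    "dgi ~ illness_vulnerability + childhood_ses + mini_k", data=df
).fit()

print(zero_order.params["illness_vulnerability"])
print(controlled.summary())
```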

Table 1 Study 1 correlations between measures of impulsiveness and vulnerability to illness, using developmental and genetic sickness history as a marker

In a follow-up survey, a separate sample of participants (N = 57; 40 women; M age = 19.68, SD age = 1.18) again filled out (a) the DGI (Hoerger et al. 2011), (b) the Barratt Impulsiveness Scale (BIS-11; e.g., “I often act without thinking”; Patton et al. 1995), and (c) a task assessing preferences for immediate versus delayed monetary gains (Griskevicius et al. 2013). With each of these measures, a higher score corresponded to a greater desire for an immediate gain, which is characteristic of faster life history strategies (Griskevicius et al. 2011b). Next, participants reported on (d) an expanded measure of vulnerability to illness and (e) the same measure of childhood SES. Results again revealed a negative relationship between vulnerability to disease and decision-making characteristic of faster life history strategies (see Table 2). As in our first study, the results remained significant even after controlling for the effects of childhood SES. Together, the results of these studies are consistent with the hypothesis that immune function – here measured by proxy via self-reported health vulnerability – may play a role in modulating the cognitions and behaviors that comprise one’s life history strategy.

Table 2 Study 2 correlations between measures of impulsiveness and vulnerability to illness, using overall sickness history as a marker

More recently, we have expanded on this research by examining biological markers of immune function. Specifically, we have been examining the relationship between decision-making correlates of faster life history strategies and (a) the functioning of one’s natural killer (NK) cells and (b) the proliferation of one’s white blood cells in response to stimulation. NK cells are large, granular lymphocytes notable for their ability to kill virus-infected cells, as well as cells that have turned cancerous, without prior sensitization (Butterfield and Whiteside 2014). NK cells play a key role in the first line of defense against viral infections and the development of tumors, and are therefore predicted to play an important role in modulating one’s extrinsic mortality risk. We have also examined the proliferation of peripheral blood mononuclear cells (PBMCs) when challenged in vitro with different classes of mitogens. Although a much coarser measure of immune-mediated extrinsic mortality risk, the ability of T cells and B cells to undergo clonal expansion and proliferate when stimulated represents a key feature of the adaptive immune response to challenge (Murphy 2011; Stone et al. 2009). We therefore also measured proliferation in response to both a T cell mitogen and a B cell mitogen as a secondary measure of immune reactivity.
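For readers unfamiliar with how NK cell activity is quantified in radioisotope-release cytotoxicity assays of the kind described below, the sketch shows the standard percent-specific-lysis calculation. The counts are hypothetical, and the exact scoring pipeline in any given study may differ:

```python
# Standard specific-lysis calculation for a radioisotope-release
# cytotoxicity assay (a generic illustration; the counts below are
# hypothetical, and any given study's scoring pipeline may differ).
def percent_specific_lysis(experimental: float,
                           spontaneous: float,
                           maximum: float) -> float:
    """Percent of labeled target cells lysed by effector (NK) cells.

    experimental: isotope release with effector cells present
    spontaneous:  release from target cells incubated alone
    maximum:      release after fully lysing targets (e.g., detergent)
    """
    return 100.0 * (experimental - spontaneous) / (maximum - spontaneous)

# Hypothetical counts (cpm) at one effector-to-target ratio:
print(percent_specific_lysis(experimental=1800.0,
                             spontaneous=400.0,
                             maximum=4400.0))  # -> 35.0
```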

The results of this new research have begun to reveal that a key aspect of immunocompetence – the reactivity of one’s NK cells in response to a live tumor challenge – is predictive of cognitive and behavioral correlates of faster life history strategies. In particular, we found NK cell activity to be positively associated with scores on the short form of the Arizona Life History Battery (higher scores reflect slower life history strategies), negatively associated with risky sexual behaviors, and negatively associated with the ability to delay gratification (Hill et al. 2016). As in the self-report studies, these associations held even after controlling for childhood SES. No such associations were found for either of our proliferation measures (i.e., proliferation in response to LPS or PHA, at any of the three time-points measured). We are, however, in the process of measuring the mitogen-induced cytokine release that occurred in tandem with proliferation, which may help uncover a fuller picture of participants’ immunocompetence in this domain.

The results of our pilot work, as hypothesized, offer potential corroboration of the prior research conducted in our laboratory using questionnaire measures of perceived infectability, and indicate that diminished immunocompetence, whether assessed via biological measures or via self-report, predicts the adoption of a fast life history strategy. The additional measures currently being added to the study (both immunological and cognitive / behavioral) will help determine which facets of immune function are most predictive of life history strategy, impulsivity, and risky sexual behavior, and will also help elucidate what kinds of behavioral and decision-making processes correlate with diminished immune responsiveness.

Immune Function: More than an Intermediary between Early Life Stress and Faster Life History Strategies

Here we have proposed that immunocompetence plays a key role in calibrating life history strategies. According to the perspective we develop here, immunocompetence should play a key predictive role in the development of one’s life history pathway, regardless of the source of one’s immune-based vulnerabilities. This point is important to address because a growing body of research has found evidence that stress, particularly when encountered early in life, may shape varied facets of immune function (for a review, see Slavich and Cole 2013). Given that one’s developmental conditions play an important role in predicting both immune function and faster life history strategies, some have hypothesized that bodily health and robustness may be key mediators of the relationship between early life stress and faster life history strategies (see, e.g., Rickard et al. 2014).

Although strong evidence supports the idea that one’s childhood circumstances impact both immune function and life history strategies, we argue that the impact of immune function on life history pathways should occur above and beyond its role as a mediator between these two variables. Indeed, we hypothesize that less robust immune function will predict the adoption of faster life history strategies regardless of the cause of one’s immune-based vulnerabilities (e.g., sub-optimal alleles or chronic stress). Because the activity of one’s immune system involves many distinct yet cooperative facets, each of which converges to impact one’s risk of extrinsic mortality, even those growing up in safe, benign environments with few external mortality threats are predicted to adopt faster life history strategies if they lack immune competence. Indeed, it has long been known that various genetic problems – from genetic anomalies writ large, such as the presence of extra chromosomes [e.g., trisomy 21 (Heston 1977; Zampieri et al. 2013)], to more subtle single-gene / single-nucleotide mutations – factor significantly into a large number of immunological problems. As is true for the genetics of height (Allen et al. 2010; Wood et al. 2014), but to a much greater extent given the tremendously multifaceted nature of immune function, a large number of small variations across numerous relevant alleles may shape immune competence, sometimes with suboptimal outcomes. Such instances range from defective complement opsonization (Fremeaux-Bacchi et al. 2008; Sharma and Pangburn 1996), which increases one’s susceptibility to bacterial infections, to the anomalous “friendly fire” casualties incurred in autoimmune conditions (Aaltonen et al. 1997; Sakaguchi et al. 2006), or even unusual cases in which lymphocytes lack the ability to rearrange the genes encoding antigen receptor diversity, leading to severe immune deficiency (Schuler et al. 1986).

Although some of the cases cited above are uncommon or dramatic, others are not, and little doubt exists that allelic variation has affected immune function – and the ability to combat a wide variety of pathogens, whether viral, bacterial, protozoan, or helminthic – throughout human history. Therefore, if poor immune function renders an individual vulnerable to disease threat, or if the individual merely perceives heightened vulnerability, the adoption of a faster life history strategy would be predicted. Indeed, findings in the field of immune genetics provide solid evidence that even healthy individuals often carry specific alleles of immune-related genes that may render them more or less vulnerable to certain kinds of immune deficiencies, autoimmune diseases, or cancers (see, e.g., Medzhitov and Janeway 1997). There is tremendous between-individual variability in immune function, and only a portion of this variability can be accounted for by one’s exposure to developmental or adulthood stress. We hasten to add that we do not discount or dismiss the impact of stress on immune function, but rather underscore that variation in immune competence has a number of causes. This variability – regardless of its origin – is hypothesized to play a role in modulating life history strategies, owing to its recurrently important role in determining mortality risk over evolutionary time.

Mechanisms of Immunocompetence-based Life History Calibration

The idea that immunocompetence might predict differences in life history strategies implies the necessity of a mechanism or mechanisms linking these two variables. Although little is currently known about what such mechanisms might be, we offer some possibilities here. One possibility is that this relationship is simply mediated by one’s internal model / perception of one’s own immunocompetence, which grows out of one’s personal health history. From this perspective, one may develop a working model of perceived immune competence (something that could be assessed using a measure like the perceived infectability subscale of the Perceived Vulnerability to Disease [PVD] scale; Duncan et al. 2009), based on personal experiences with illness and the clearance of pathogens.

An alternative route by which a person may become cognizant of her own immune competence / status (other than simple perceptions of prior illness) is through immune-nervous system signaling of which one is mostly not consciously aware. Indeed, considerable evidence indicates numerous avenues of bidirectional communication between the immune system and the nervous system. For example, it is widely appreciated that psychological stress or other neural activity may alter immune function via neuroendocrine routes (e.g., HPA axis activity; Steptoe et al. 2007) or via autonomic fibers that release neurotransmitters / neurohormones directly onto immune cells replete with varied neurotransmitter receptors (Felten et al. 1991; Felten and Felten 1994). A multitude of studies has demonstrated such effects on a variety of immunological outcomes and activities. Rather less well appreciated are the ways by which immune system activity may signal the central nervous system (i.e., moving the other direction on this two-way street). Afferent signals informing the CNS of peripheral immune events come primarily via immune cell release of small-molecular-weight signaling proteins known as cytokines (Ader 2014; Dantzer et al. 2014; Licinio and Wong 1997; Wohleb and Godbout 2013).

In addition to the neural-immune interaction pathways described above, immune challenges themselves trigger cytokine production in the periphery that signals the brain, both directly and indirectly. Some have even gone so far as to compare the immune system itself to a kind of dispersed interoceptive sensory organ that – in addition to its primary role in discriminating self from non-self and eliminating non-self – reports on one’s internal state of infection / sickness directly to the brain (Blalock 1984). Although the precise means by which such peripheral immune events are transduced continue to be elucidated, proinflammatory cytokines produced in the periphery following infection or other immune events undoubtedly trigger numerous changes in the CNS (Raghavendra et al. 2004). Although cytokines were initially believed to lack access to the brain due to the blood–brain barrier, research demonstrates that cytokines may gain access to the CNS via active transport across the blood–brain barrier (Licinio and Wong 1997), may enter the brain via circumventricular organs (areas of the brain that lack definitive blood–brain barrier properties; Konsman et al. 2002), or may signal the brain via vagal afferents (Maier et al. 1998), leading to de novo synthesis of proinflammatory cytokines inside the brain itself (often by the brain’s resident macrophages / immune cells, the microglia; Nayak et al. 2014). Further, and intriguingly, Kipnis and colleagues have demonstrated that circulating T cells play a supportive and protective role in brain function and cognition: mice depleted of T cells have marked deficiencies in hippocampus-dependent learning, deficiencies that can be reversed with adoptive transfer of brain antigen-reactive T cells into a host otherwise bereft of T cells (Radjavi et al. 2014; Yirmiya and Goshen 2011). These provocative data strongly indicate that diverse and often unexpected routes of communication between immune system and brain are at play in both sickness and health.

Some of the most extensively investigated instances of peripherally produced proinflammatory cytokines influencing brain and behavior include numerous studies showing that cytokines (e.g., during bouts of sickness or cytokine-based treatment therapies) trigger depressive behaviors (Gershenfeld et al. 2005; Kiecolt-Glaser et al. 2002; Raison et al. 2006), that cytokines may induce the constellation of biologically adaptive “sickness behaviors” that enforce rest, keep infected animals safe from predation, and maximize recovery (Aubert et al. 1995; Dantzer and Kelley 2007; Konsman et al. 2002; McLinden et al. 2012; Shattuck and Muehlenbein 2015), and that cytokines significantly impact numerous aspects of learning / memory / plasticity, such as acquisition (Sparkman et al. 2005), memory consolidation (Barrientos et al. 2002; Kranjac et al. 2012; Pugh et al. 1998), and memory reconsolidation (Kranjac et al. 2012). Therefore, given the strong evidence that peripheral immune activity signals the CNS and produces distinct changes in neural, neuroendocrine, and behavioral parameters, it is entirely plausible that an individual’s physiology is not only subconsciously “aware” of ongoing internal infectious events, but also may construct, throughout the course of life, a general awareness or internal model of his or her own immune competence – a model that may influence life history strategy. Future research is needed to better understand which, if any, of these mechanisms contribute to the observed relationship between vulnerability to disease and life history calibration.

General Discussion

Life history theory offers an elegant and useful theoretical framework for understanding key trade-offs made in effort and somatic energy utilization during the lifespan (Charnov 1993; Kaplan and Gangestad 2005; Stearns 1992). An organism’s life history strategy is shaped by a number of factors, including a variety of distinct mortality threats. A portion of these threats are intrinsic threats, or threats that can be mitigated or forestalled by changes in behavior or increased investment in somatic capital. For example, an individual can reduce the likelihood of succumbing to an illness like the flu by shunting bodily energy resources toward immune function and recovery, even at the expense of immediate reproduction. On the other hand, a significant number of threats to survival and reproductive success are extrinsic threats, or threats that cannot be avoided or mitigated by forces within one’s control (e.g., growing up in a neighborhood with a high rate of violent crime, or enduring a chronic illness). Given that extrinsic mortality threats cannot be controlled, they trigger a very different kind of response than do intrinsic threats, including shunting energy resources toward reproductive rather than somatic effort. The chronic shunting of energy capital toward reproductive effort, and away from somatic effort and growth, constitutes the adoption of a faster life history strategy and makes adaptive sense: failure to do so may compromise an organism’s likelihood of passing along its genes while it still can, before possibly succumbing to the extrinsic threat.

Although the majority of the extrinsic threats studied to date in the context of life history theory are external extrinsic threats, a relatively small number of studies have begun to examine the effects of internal extrinsic threats, or threats related to the internal state of the organism, such as somatic condition, general health, and chronic illness (e.g., Nettle et al. 2013; Rickard et al. 2014; Waynforth 2012). Such studies uniformly indicate that internal extrinsic threats – threats to survival and reproduction involving suboptimal aspects of one’s own internal state – trigger, much as external extrinsic threats do, an altered developmental trajectory that favors the adoption of a faster life history strategy and accelerated / augmented reproductive effort. These effects include earlier pubertal timing (Park et al. 2012; Rickard et al. 2014), an earlier age at first reproduction (Waynforth 2012), and increased risky sexual behavior (Valencia and Cromer 2000). Building on this research, we hypothesize that immune function – which plays a key role in modulating one’s extrinsic morbidity / mortality risk – will also play a key role in modulating and maintaining life history strategies.

Emerging research from our labs finds that less robust immune function – measured both by proxy, using self-reports of vulnerability to illness, and more directly, through biological measures of immune function – is predictive of cognitions and behaviors associated with faster life history strategies. In particular, our research finds that natural killer cell activity (i.e., NK cell killing of radioisotope-labeled tumor cells) was inversely correlated with faster life history strategies (based upon Mini-K measures), negatively correlated with the ability to delay gratification, and negatively associated with risky sexual behaviors (Hill et al. 2016). As in the prior studies, these effects were observed even when controlling for SES, another factor known to modulate both health and life history strategy. Although no significant effects were observed for the two measures of mitogen-induced lymphocyte (i.e., T and B cell) proliferation utilized, our study is still ongoing, and the preliminary pilot analyses may prove to be underpowered. Alternatively, it would not be surprising to find that some aspects of immune function are associated with life history strategy while others are not; indeed, given the tremendously multifaceted and complex nature of immune function, we would hypothesize exactly that. It may be that proliferation responses to mitogens / pathogens are simply too coarse a measure to be useful predictors in our regression analyses. Although our study utilizing varied biological assays of immune function is ongoing, and the number of immunological assays is being expanded, these promising results, along with those obtained from studies utilizing self-reports of immune competence, indicate that the central nervous system is in tune with the nature of an organism’s immune function – not only in terms of nervous system modulation of ongoing immune events (Blalock 1984), but also in terms of the relative competence of the organism’s ability to defend against infectious pathogens and malignant cells – and can modulate life history strategy based upon this information. These data are consistent with the hypothesis that immune function may play an important role in calibrating life history strategies.

The interesting findings summarized above bring numerous new questions and challenges to the fore. For example, although we have predicted that poor immune competence will favor faster life history strategies, under some conditions one might predict the opposite. That is, it is reasonable to predict that those whose poor immunity stems from genetic variation may, under certain circumstances, experience shifts in energy budgets that favor increased investment in somatic effort as a means of improving survivability. Future research will need to address which conditions favor one type of investment strategy over the other.

It will also be important for future studies to better determine which facets of immune function most impact the development of life history strategies, to identify the mechanisms through which they impact behavior, and to isolate the direction of causality in the relationship between immune function and the cognitions / behaviors associated with faster life history strategies. Given the multifaceted complexity of immune function and the myriad measures one could examine – albeit within the limits of the relatively small number of cells that can be obtained non-invasively from a peripheral blood draw – choosing the “right” measures of immune function is no simple task. Further, even some of the “gold standard” functional immune assays are not without caveats, as most rely upon in vitro methodology, in which white blood cells are removed from the complex milieu of the body and tested under artificial conditions.

In addition to these challenges, future work will need to determine the extent to which, if at all, one’s internal psychological model of immune function (i.e., perceived vulnerability to illness) correlates with different biological measures of in vitro and in vivo immune function. This determination is an important one, given the key studies cited in this review that appear to indicate the validity of self-report measures of general immune competence. Although our biological measures of immune function offer some preliminary evidence that perceived infectability measures may sometimes coincide with more objective and specific measures of immune function (at least for a portion of the immune measures garnered), we do not yet have enough data to perform these correlational analyses. Moreover, it is important to note that whether or not correlations between perceived immunocompetence and biological measures of immune competence exist, several studies have demonstrated that perceived infectability reliably predicts life history strategy. Mere perception, then, may be enough to calibrate life history strategy, whether via identical, parallel, or alternate pathways used to communicate immune activity / competence to the brain.

Another potentially fruitful avenue of exploration, if current trends hold, will involve elucidating and disentangling the biological and psychological mechanisms by which the CNS is informed of one’s immune competence (e.g., which aspects of immune function or lifelong sickness history factor into one’s adulthood perception of vulnerability to illness / infection, and which personality traits might affect this perception), along with elucidating the mechanisms by which the CNS modulates life history strategy (e.g., do these mechanisms share commonality with those outlined above that operate in depression, sickness behavior, or learning?). This latter aspect is, of course, a long-standing need for life history theory more generally.

Another relevant topic to examine might be whether growing up in a pathogen-dense ecology (e.g., closer to the equator or near a malarial swamp) can alter one’s perceived vulnerability to illness, potentially interacting with one’s perceptions of one’s immune function or biological vulnerability to impact behavior in important ways. Indeed, preliminary evidence for such effects exists in humans (see, e.g., Hill et al. 2015). As an interesting parallel, could the nature of one’s commensal microflora – the host of bacteria that inhabits our GI tract, skin, and mucosal membranes, a true indicator of one’s internal “ecology” known to affect not only digestive and immune function but also a number of neurobiological and psychological endpoints – also act to inform life history strategy and decision-making?

Lastly, a promising future direction would be to examine whether public health policies aimed at improving immune function in children (e.g., improved nutrition, sufficient sleep, or moderate physical exercise) might not only confer obvious health benefits in terms of diminished infectious disease and improved cardiovascular health, but also shift an otherwise vulnerable individual away from adopting a faster life history strategy – encouraging investment in somatic effort and development while avoiding some of the pitfalls associated with a fast life history strategy. We are hopeful that the research being conducted in our laboratories will provide a useful starting point for examining some of these important questions.

To date, much of the research examining the impact of extrinsic threats on life history strategies has focused almost exclusively on external extrinsic threats. Here, we propose that the functionality of one’s immune system – which plays an important role in one’s mortality risk – may also play a key role in calibrating life history strategies. Although the results of the research testing these hypotheses are still emerging, they are beginning to paint a rich picture wherein one’s internal condition, including the functionality of one’s immune system, has important implications for a variety of cognitions, behaviors, and developmental outcomes.