Introduction

Urbanization, commonly defined as the change in size, density and heterogeneity of cities, is a rapidly occurring global phenomenon. Projections suggest that urbanization will continue in both developed and developing nations over the coming decades, such that by 2050, 86% of the population of developed nations and 64% of that of developing nations will be urban residents. More immediately, it is estimated that by 2030 there will be an additional 1.35 billion people living in the world’s cities [1, 2].

Urbanicity, a term that more specifically refers to the presence of conditions that are particular to urban vs nonurban areas, is ultimately defined as the impact of living in urban areas at a given time [3]. We acknowledge that urbanization and urbanicity can bring many potential benefits, including, but not limited to, those involving commerce, trade, education, communications and the efficient delivery of government and health-care services; however, the problems associated with rapid urbanization are many. In particular, urbanization is increasingly linked to changes in behavior that contribute to noncommunicable diseases (NCDs) or so-called diseases of civilization. For example, urbanization has been associated with diminished levels of physical activity, compromised sleep and unhealthy dietary choices [3–8].

A related issue is the globalization of the food industry, reflected in substantial changes in efficiencies of production, marketing, transport, advertising and sale of food; this has prompted profound shifts in the composition of diets globally and has been one of the major drivers of changes in the distribution and burden of the NCDs over the second half of the 20th century [9, 10]. These changes may particularly affect those residing in deprived urban and suburban areas where there is a far higher density of retail venues that facilitate the consumption of unhealthy food commodities [11].

Among the NCDs, mental disorders—major depression and anxiety in particular—have been described as an impending global epidemic. Although less is known concerning the prevalence of these disorders in developing nations than elsewhere [12, 13], all data indicate that the burden of disease attributable to mental disorders will continue to rise globally over the coming decades [14]. Living in an urban environment [15–17], global shifts associated with urbanicity [18–21], the food supply and food industry [22], climate change [23], food insecurity [24, 25], overnutrition and underactivity [26] and an overall transition from traditional lifestyles [27–29] have each been linked to increases in depression and other mental disorders. It is understood that depression and other mental disorders are not exclusive to urban areas; however, higher rates in urban areas have been reported with a degree of consistency [30, 31]. Moreover, looking specifically at neighborhood disadvantage, the adjusted odds of experiencing an emotional disorder are 59% higher in urban youth vs those in rural areas [32].

Regardless of the urban–rural divide, within developed nations, up to one-third of all visits to primary care clinicians involve patients with emotional disorders, most notably anxiety and/or depressive conditions [33]. In both developed nations and those in the midst of an epidemiological transition, depression is rapidly rising within the burden of disease rankings [34–36]. Moreover, depression is already part of an epidemic of comorbidity and will likely become even more so. Specifically, obesity, type 2 diabetes and other chronic diseases strongly associated with lifestyle behaviors are highly comorbid with depression [37, 38].

Because diagnosable depression and anxiety [39–41] (and their subclinical and subthreshold variants that may otherwise escape detection [42]) are also outcomes associated with, and drivers of, subsequent cardiovascular and other chronic disease states (that is, they are both upstream and downstream of “physical” illness), a multidisciplinary approach to prevention and treatment is necessary [43]. The results of studies involving adults with depression indicate that treatment outcomes from standard first-line antidepressant or psychotherapy interventions (for example, cognitive-behavioral therapy) are far from being universally adequate. Notwithstanding the debates concerning the placebo effect vs antidepressant efficacy [44, 45] and their potential adverse effects [46] (as well as publication biases concerning cognitive-behavioral therapy [47–49]), treatment resistance, relapse and nonresponse remain a reality [50, 51]. Moreover, even with optimal care and access to services, the results of modeling studies suggest that only a proportion of the burden of mental disorders can be averted, pointing to a clear need to focus on prevention to reduce the incidence of mental disorders [52]. From a research perspective, there is an urgent need to examine various environmental or lifestyle variables more thoroughly, including those wherein there are scientifically plausible mechanisms that can explain risk.

With this background in mind, we explore recent developments within an emerging discipline known as nutritional psychiatry. The relationship between dietary patterns and risk of mental health disorders has been the subject of intense research in recent years. There seems little doubt that dietary patterns and mental health, as well as other NCDs, are intertwined with socioeconomic circumstances [53]. We examine some of the potential socioeconomic and environmental challenges that detract from a dietary pattern or optimal nutrient intake that might otherwise support positive mental health.

As discussed below, in the context of urbanization, cultural and technological changes and the global industrialization and ultraprocessing of food, the findings related to nutrition and mental health are connected to some of the most pressing issues of our time. They are also of relevance to matters of biophysiological anthropology and a potential evolutionary mismatch between our ancestral past and the contemporary nutritional environment. At the outset, we acknowledge that much of our focus here is on depression. This is largely because depression has received the lion’s share of international research attention as it relates to some of the nutritional and environmental variables; however, it should be understood that these discussions are of relevance to diverse topics ranging from reductions in violent behavior [54] to overall positive mental outlook in those who are otherwise well-adjusted [55, 56].

Nutritional psychiatry research

Among the variables that might afford resiliency against mental disorders or, on the other hand, contribute to risk, diet has emerged as a strong candidate [57]. Viewed superficially, nutrition is a variable that needs little explanation in terms of its mechanistic potential. The brain operates at a very high metabolic rate, utilizing a substantial proportion of the body’s energy and nutrient intake. The very structure and function (intra- and intercellular communication) of the brain are dependent upon amino acids, fats, vitamins, minerals and trace elements. The antioxidant defense system operates with the support of nutrient cofactors and phytochemicals and is of particular relevance to psychiatry [58]. Similarly, the functioning of the immune system is of substantial importance to psychiatric disorders and profoundly influenced by diet and other lifestyle factors [59]. Neuronal development and repair mechanisms (neurotrophic factors) throughout life are also highly influenced by nutritional factors [60]. In short, a theoretical framework of biological mechanisms whereby nutrition could exert its influence along a continuum ranging from general mental well-being to neuropsychiatric disorders is highly plausible. However, despite its role as the foundation of physiological processes, nutrition as a factor in promoting positive mental health (or working against it) has traditionally suffered from scientific neglect and/or poorly designed studies that compromise interpretation.

Encouragingly, there are signposts indicating a major shift related to nutritional matters in mental health. Given the aforementioned background on the burden of mental disorders, this development is timely. A variety of epidemiological studies, including high-quality prospective studies, have linked adherence to healthful dietary patterns with lowered risk of anxiety and/or depression [61–67]. The results of these studies indicate that nutrition may provide a very meaningful layer of resiliency. The effect sizes suggest clinical relevance, not simply minimal statistical significance. Healthy dietary patterns characterized by a high intake of vegetables, fruits, potatoes, soy products, mushrooms, seaweed and fish have recently been associated with a decreased risk of suicide [68]. Specific elements of the diet—green tea and coffee, for example—have also been linked to a decreased risk of depressive symptoms [69]. Moreover, nutrition is emerging as a critical factor within the developmental origins of health and disease (DOHaD) construct. Early nutrition is now being linked, convincingly, to later mental health outcomes [70]. At the other end of the lifespan, adherence to a Mediterranean diet has been associated with better cognitive outcomes and a reduced risk of dementia [71, 72].

In support of these large population studies, at the opposite end of the research continuum, sophisticated bench studies are determining the pathways whereby individual elements of diet—polyphenols, for example—can influence brain function [73–75]. Dietary fats are yet another area of intense scrutiny, one where high-fat, obesogenic diets consumed during the pre- and perinatal period have been associated with long-term changes in neurotransmission, brain plasticity and behavior in offspring [76–78]. Moreover, beyond the detrimental effects of excessive fat per se, recent experimental evidence suggests that fats, and even fats within the same class (for example, various saturated fat sources), cannot all be painted with the same brush in the context of brain health [79–82]. It is also worth noting that the results of numerous international studies have linked low blood cholesterol levels with suicide risk [83], a finding that might relate more specifically to low levels of high-density lipoprotein, which is otherwise promoted through healthy dietary practices [84].

On the one hand, in between the epidemiological and experimental evidence, the results of a variety of clinical studies suggest the value of ω-3-rich oils in disorders including, but not limited to, bipolar depression [85], posttraumatic stress disorder [86], major depression [87] and the indicated prevention of psychosis [88]. On the other hand, given the brain’s continuous use of such a wide variety of nutrients for its structure and function, and the potential errors in metabolism that may be at play in mental disorders, the traditional focus on a single-nutrient resolution [89] may mask the potential value of multinutrient interventions [90]. Zinc provides a good example. The results of a number of preclinical and clinical studies have indicated that zinc supplementation may be helpful for people with major depression [91], and low dietary zinc has not only been correlated with depression [92], but has also been linked to abnormalities in the response to antidepressant medication [93]. However, zinc is well-established to play key roles in the metabolism of ω-3 and other essential fatty acids [94]. Indeed, there are complex ways in which ω-3 and zinc work together in the epigenetic control of human neuronal cells [95]. It is likely, therefore, that in the clinical setting, a combination of nutrients may provide better outcomes [96]. In the process of translational medicine, further examination of multinutrient bench and intervention studies will be necessary [97, 98].

For better or for worse, large percentages of adults in developed nations are consumers of dietary supplements [99], and this seems especially true of those with mental disorders. For example, two-thirds of all patients with mood disorders in one recent Canadian study were found to be supplement consumers, and 58% were taking supplements in combination with psychotropic medications [100]. Such findings only add to the urgency of evaluating the role and importance of nutritional variables, ranging from overall dietary patterns to the epigenetic controls exerted by isolated nutrients. Moreover, there is a clear need to explicate in detail the ways in which dietary supplements may interact with medications, particularly the widely prescribed classes of antidepressants and anxiolytics.

There is also an imperative to examine more closely the ways in which the nonnutritional elements of the modern diet may contribute to various brain-related disorders. Such research should include, but is certainly not limited to, the possible role of food-derived residual pesticides in brain health [101] and the ways in which synthetic food dyes, artificial sweeteners and flavor enhancers may influence behavior [102–104]. The results of preliminary randomized, placebo-controlled research suggest that short-term exposure to gluten can induce depressive symptoms in people with nonceliac gluten sensitivity [105].

Although it is commonly held that there is “no health without mental health” [106], the emerging evidence suggests that the opposite is also true: Good physical health and aerobic fitness are vitally important to mental health [107, 108]. We would argue that if a healthy diet influences mood, it might also increase the likelihood that an individual will remain physically active. This might be especially true in people with depression or obesity, in whom mood-related barriers diminish the desire to perform physical activity or minimize its postexertion appeal [109–111]. A positive emotional state encourages future participation in physical activity [112, 113]. Supplementation with individual components of traditional diets, such as vitamin C, has been shown to reduce heart rate, perceived exertion and general fatigue postexercise in overweight adults [114]. Adoption of the Mediterranean diet for 10 days has been found to result in significantly increased vigor, alertness and contentment among participants vs controls [115]. Adherence to traditional diets, such as the Mediterranean diet, has been linked to objectively measured physical performance [116, 117]. More research is required to tease apart the ways in which nutrition can be used to tackle the motivational barriers to initiating and maintaining a routine of physical activity.

In people with mental disorders (and even acute diminished mental outlook), there appears to be involvement of chronic low-grade inflammation [118], oxidative stress [119], impaired metabolism [120], alterations to microbiota [121] and numerous other pathways that can be mediated by nutrition. In particular, there is currently much enthusiasm surrounding the potential of probiotics and prebiotics (or any agents capable of causing favorable shifts in the intestinal microbiota) as a means to promote positive mental health [122]. However, there has been little attention paid to the central role of diet as a variable working for or against the potential value, particularly over the long term, of these so-called psychobiotics [123, 124]. The same high-fat Westernized dietary factors that may negatively alter intestinal microbiota and promote intestinal permeability (provoking low-grade systemic inflammation via lipopolysaccharide endotoxin [121]) have also been shown to disrupt the critically important blood–brain barrier [125–127]. There may be untold consequences of diet-induced assaults to the integrity of both the intestinal and blood–brain barriers as they relate to mental health.

Clearly, there have been exciting advances in nutritional psychiatry research. The overall findings make it clear that nutrition matters in mental health. However, the degree to which nutrition matters remains unknown; the existing research, beyond validating nutrition as a variable of importance, has also served to generate a myriad of research questions that will require evidence-based answers. This in itself supports our call for broad collaboration to move this field of investigation forward and to explicate mechanistic pathways in order to identify targets for intervention. Many of the obstacles to learning how and the extent to which nutrition can work for or against positive mental health are related to the complex ways in which nutrition is intertwined with social, cultural and economic factors. In the next section, we address some of those complexities.

Environmental challenges, social determinants

The available evidence suggests that traditional dietary habits are beneficial for positive mental health and that unhealthy dietary choices are a related (but independent) detractor from a state of good mental health. Assuming for a moment that the evidence continues to grow more robust (an assumption generally supported by current meta-analysis results [66, 128]), the critical components of next-phase research will include further examination of the environmental factors that push toward (or pull away from) healthy, traditional dietary patterns. The environmental forces that facilitate a pull away from traditional diets appear to be extraordinarily strong.

Before examining some of the environmental factors that may work against nutrition for mental health, it is important to consider two contextual facts. First, mental health disparities within nations are commonplace; for example, depression does not occur randomly, and its rates, along with other mental disorders, are far higher in the disadvantaged social strata [129–131]. Second, the volumes of international research concerning obesity are likely of major relevance to the global mental health crisis. There seems little dispute that obesity can contribute to subsequent mental health problems. However, there are convincing prospective studies showing just the opposite—that depression and/or anxiety in children and adults predict subsequent weight gain over time [39, 132–137]. Therefore, scientific discussions concerning environmental factors related to obesity are highly relevant to the topic of mental health [138]. Also, much like depression, obesity is not randomly distributed within nations; it, too, is slanted toward neighborhood deprivation and lower socioeconomic strata (SES) [139].

Global urbanization, highlighted in the Introduction, is clearly linked to higher consumption of energy-dense, nutrient-poor foods and beverages. The results of epidemiological and experimental research suggest that it is the highly palatable combination of sugar, fat and sodium that plays a key role in the attractiveness of such foods [140–142]. From an evolutionary perspective, the pleasure associated with energy-dense food consumption in the ancestral environment could motivate intake as a means of offsetting the scarcity of foods. Moreover, contemporary economic factors magnify the allure of inexpensive, widely available, energy-dense and nutrient-poor foods. Research indicates that higher costs are associated with nutrient-rich fruits and vegetables and quality protein sources such as fish and lean meat [143–145].

Animal experiments with the so-called cafeteria (Westernized) diet back up the contention that the availability and variety of palatable energy-dense foods drives long-term overconsumption of those foods, coincident with alterations to gene expression in the brain’s reward system [146, 147]. Indeed, experimental models of early-life stress show that the cafeteria diet can dampen the stress response, indicating that its consumption may be a form of self-medication [148]. This is supported by human research on the relationship between psychological distress and the increased consumption of so-called comfort foods [149–151]. Indeed, research shows that, at least in healthy adults, the direct infusion of fatty acids into the stomach (bypassing visual, gustatory and olfactory cues) can rapidly attenuate a laboratory-induced negative mood state [152].

Taken together, the physiological responses to the consumption of energy-dense comfort foods are likely to be behaviorally reinforced in people with mental health disorders. If we factor in economics and convenience in people who experience financial stress and physical fatigue, the likelihood of these foods’ being preferentially consumed increases. Moreover, the urban environment of people who live in low-SES neighborhoods, as it relates to food, is much different from that in affluent areas. Not only are fast-food outlets and convenience stores commonplace in disadvantaged neighborhoods [11, 153–155], but outdoor advertising of high-energy, low-nutrient foods and beverages is more prevalent there [156–158]. Not surprisingly, because there is little dispute that well-financed, targeted marketing is effective [159], brand name logo recognition of major fast-food outlets is high among children in low-SES neighborhoods [160].

Research also shows, on the one hand, that people in a lowered mood state are less likely to focus on the future health consequences of dietary decisions. Positive mood, on the other hand, increases the likelihood that an individual will not discount the future implications of what they are eating in the here and now [161]. This type of research, under the umbrella term of delayed (or temporal) discounting, is important in the context of the urban nutritional environment. Humans are prone to discount the value of future rewards and prioritize smaller but more immediate rewards. The strength of discounting, however, is associated with impulsivity, depression, obesity and unhealthy lifestyle habits [153, 162–165]. Minimal discounting of future considerations (that is, placing a high value on the future) is associated with a healthier diet and higher physical activity levels [166]. The overall cognitive load within an urban environment may magnify delayed discounting [167]. Subtle aspects of the urban environment, such as subconscious exposure to fast-food logos or even answering surveys in the general proximity of an urban fast-food outlet (vs full-service restaurant), increase impatience and encourage the discounting of future rewards for immediate gain. The immediate gain in this case might be the pleasure associated with low-nutrient, highly palatable foods and beverages. Moreover, individuals who reside in neighborhoods with high concentrations of fast-food outlets are more likely to prefer smaller immediate rewards over larger future gains [155].
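
To make the discounting concept concrete, the sketch below uses the standard hyperbolic model from the behavioral-economics literature (subjective value V = A / (1 + kD)); the model choice, function names and dollar amounts are illustrative assumptions on our part, not methods drawn from the studies cited above.

    # Minimal sketch of hyperbolic temporal discounting: the subjective value
    # of a delayed reward is V = A / (1 + k * D), where A is the amount,
    # D the delay and k the individual's discounting rate (illustrative only).

    def discounted_value(amount: float, delay_days: float, k: float) -> float:
        """Subjective present value of a reward received after a delay."""
        return amount / (1.0 + k * delay_days)

    def prefers_immediate(small_now: float, large_later: float,
                          delay_days: float, k: float) -> bool:
        """True if the smaller immediate reward is valued above the delayed one."""
        return small_now > discounted_value(large_later, delay_days, k)

    # A steep discounter (high k) takes $20 now over $50 in a month;
    # a shallow discounter (low k) waits for the larger reward.
    print(prefers_immediate(20, 50, delay_days=30, k=0.10))  # True
    print(prefers_immediate(20, 50, delay_days=30, k=0.01))  # False

In this framing, the environmental exposures described above (fast-food logos, outlet density, cognitive load) can be thought of as factors that push the effective k upward, steepening the preference for immediate, palatable rewards.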

Moving along this continuum, emerging research shows that food advertising as a driver of unhealthy food consumption is more pronounced when cognitive load is high (that is, distracting mental demands) and that those with lower SES backgrounds are more susceptible to the effects of advertising under cognitive load [168]. New research also confirms that excessive fast-food consumption is indeed attributable to the neighborhood food environment, with as much as 31% of the variance in excessive consumption explained by living in urban areas with a moderate or high density of fast-food outlets [169]. Conversely, fruit and vegetable consumption is markedly diminished in neighborhoods with a high clustering of fast-food outlets [170, 171]. Surrogate physiological markers of fruit and vegetable intake (for example, serum carotenoids) support the notion that adults who reside in deprived or lower SES neighborhoods are less likely to consume these foods or may displace them with unhealthy choices [172, 173].

Connected to the marketing forces influencing a pull away from traditional dietary habits is screen-based media consumption. Excess screen-based media consumption and so-called technostress have recently been linked to poor psychological health [174–182], and research shows that in neighborhoods where walkability is less than optimal, screen time is higher [183, 184]. Moreover, children who reside in urban environments [185] and deprived neighborhoods [186] have higher daily screen time than their rural or affluent counterparts. Higher screen time and depression could merely be a matter of association with less physical activity [187]; however, research shows that screen time (independently of physical activity) is significantly associated with the consumption of high-energy, low-nutrient foods and beverages [188]. Prospective research shows that baseline screen time predicts higher consumption of sugar-rich beverages when queried 20 to 24 months later [189, 190]. Similar results have been reported concerning baseline screen time and subsequent dietary habits in high school students and young adults: More screen time was found to predict consumption of fast food, snacks and high-energy foods and beverages 5 years later [191]. The ways in which screen media can influence nutritional imbalances by encouraging high-energy, low-nutrient food consumption—via distraction during the activity itself and marketing of such foods during the activity [192–195]—is an area of research that requires expansion.

Evening screen media consumption, and even low levels of light at night (LAN), may suppress melatonin levels and compromise sleep [196]. Compromised sleep, too, has not only been linked to depression but is also associated with a prioritization of unhealthy food choices [197, 198], and, at least in experimental models, LAN results in a more pronounced inflammatory response to high-fat foods [199]. Indeed, circadian rhythm disruption appears to be a distinct biological stressor that interacts with Westernized dietary patterns in the promotion of intestinal permeability and the loss of normal intestinal microbiota [200, 201].

Despite the research indicating that the consumption of energy-dense, high-fat, high-sugar and sodium-rich foods may be a highly palatable form of self-medication, the results of dietary records and mood scores collected over a 1-week period suggest that, at least in healthy university students, consumption of such foods increases negative mood 48 hours later [202]. Additionally, food and beverage consumption is often a social activity. A greater understanding of the ways in which social isolation can minimize dietary diversity [203] or social support can facilitate healthy dietary patterns [204] is an essential area of research in nutritional psychiatry.

As a segue to our discussion regarding the evolutionary mismatch and nutritional anthropology, it is important to point out the cultural aspects of traditional dietary patterns as exemplified by the Mediterranean and Japanese models. It is not by chance that the Japanese and Mediterranean diets are each officially listed by the United Nations Educational, Scientific and Cultural Organization (UNESCO) on its Lists of Intangible Cultural Heritage in Need of Urgent Safeguarding and Register of Best Safeguarding Practices [205]. Both of these dietary patterns are steeped in tradition—nutritionally and otherwise. The listing by UNESCO specifically makes reference to the shared family and community experience in the consumption of these respective diets, including the ways in which dietary practices have been passed down from generation to generation. In other words, the cultural aspects of the diets are integral to adherence.

Cultural influences remain a driver of traditional dietary patterns; however, the Westernization of Japanese and Mediterranean diets appears to be corroding the culture of dietary tradition in both regions [206–208]. Lower adherence to the Mediterranean diet has been reported among urban youth in Italy and Greece compared with their rural counterparts [209, 210]. Lower amounts of screen time are associated with higher adherence to the Mediterranean diet in Italian children [211]. The extent to which individual interest in traditional dietary patterns is associated with both social capital and psychological distress is a subject of ongoing research [212].

The relationships between healthy dietary habits and mental health are obviously complex. They weave their way through many aspects of modern technology and culture. The extent to which these nutritionally related environmental challenges ultimately contribute to the causation of mental disorders or present a barrier to effective treatment is a question in clear need of exploration. We hope that this will be the legacy of the field of nutritional psychiatry. In the meantime, it seems safe to say that the environmental challenges discussed above, those that might accelerate the adoption of Western dietary patterns, are colliding with an ancestral past that is still dictating human physiology [213].

Evolutionary mismatch?

Nutritional anthropology has taught us much concerning the makeup of ancestral dietary patterns. Undoubtedly, the ancestral diets were regionally varied, and it seems clear that there was no archetypal “Stone Age cuisine.” However, despite their differences, ancestral diets largely included a high intake of plant foods (rich in fiber and phytochemicals and facilitating a net base production), animal foods and a diverse range of commensal bacteria. Ancestral diets were also united by their absence of highly processed foods containing high sodium, added fats and refined sugar. In other words, the anthropological evidence concerning ancestral diets indicates that their regional and seasonal differences were minor compared to the ways in which they differ from the modern nutritional landscape [214, 215]. With this in mind, we can move toward discussions of evolutionary perspectives and epigenetic opportunities. After some background and historical framing, we next focus on three components of ancestral and traditional dietary practices: cooking techniques, dietary acid load and microbiota.

Evolutionary explanations for contemporary increases in psychopathologies include the mismatch theory. Broadly, this proposal suggests that there are gaps between the physiological and psychological requirements of individuals, as determined through millennia of adaptation to natural and social environments, and the ability of the modern environment to help fulfill those brain or mental health–related needs [29, 216]. The likelihood of mismatch is increased, according to theory, when culture and technology change more rapidly than human biological evolution [217]. Some of the lifestyle modifications associated with rapid technological and cultural changes—diminished physical activity, daily hassles and low-grade psychological stress, increased consumption of low-nutrient and energy-dense foods, loss of biodiversity in microbial contact—could chronically and unnecessarily stoke an immune defense system that is stuck in its ancestral past [218]. The ancestral immune system remains biologically well-adapted to preventing pathogen invasion, limiting acute infections, working with commensal microbes and assisting in programming the development of the nervous system [219]. However, its ability to perform these and other functions, as it has done for millennia, may now involve collateral damage in the form of allergies, asthma, autoimmune conditions and increased risk of neuropsychiatric disorders [220–222].

The notion that modern dietary intake is at odds with our ancestral past, and that this disconnect has health-related consequences, is not new. In the early part of the 20th century, a small minority of medical writers were sounding alarms concerning the implications of a shift away from traditional diets. In 1922, the editors of the Medical Standard journal issued a stern warning concerning the increased consumption of the “output of the candy shop, soda fountain or pastry shop,” or, as they called it, “denatured, devitalized and demoralized foodstuffs.” More specifically, the editors’ argument was that such foods and beverages were at odds with optimal physiological functioning: “In other words, any kind of food is unfit to eat when its composition, through some artificial form of kitchen treatment, has become hostile to physiological life.… What is not for me is against me.… Never before in the world’s history of individual evolution were so many temptations put before him in the field of appetite and sensuous appeal as today” [223]. Also at this time, some experts were examining the dietary practices of people maintaining an isolated or indigenous existence [224], and others, such as noted nutritional scientist Frances Stern (1873–1947; later she would have a clinic at Tufts Medical Center (Boston, MA, USA) dedicated in her name), were assisting disadvantaged immigrants with ways in which to maintain their traditional dietary habits in an affordable way [225].

Stern, for her part, addressed the leaders of North American psychiatry at an annual conference in 1929, “hoping that the leaders there may guide the nutritionist to greater wisdom in the treatment of her patient.… In the field of dietotherapy, however, there is need of further light from psychiatry with regard to the interpretation of the mental life of patients, and the nutritionist is looking to the mental hygiene movement for research into the nature of the relationships between food and disturbances of the emotional life” [226]. Only today, the better part of a century later, has a response to Stern’s plea begun to take shape. From the perspective of nutritional anthropology, it was not until the 1980s that a scientific framework began to emerge, one in which shifts away from traditional dietary habits were at the core of the mismatch discussions and connections to chronic degenerative diseases [227].

Dietary patterns are more than simply reflections of macronutrients. In addition to the cultural aspects alluded to earlier, they involve methods of preparation. Experts in anthropology have noted that cooking techniques involving water—steaming and boiling—were widespread in our ancestral past. Indeed, although the precise dating of so-called Stone Age soups and stews remains unknown, they appear to be a minimum of 20,000 years old [228]. Some experts suggest that the use of steam to release necessary fat from animal bones may have been in practice 50,000 years ago or more [229, 230]. Suffice it to say that water-based cooking is an ancient practice. Those investigating the marked shift away from traditional dietary patterns in China and elsewhere have noted that the shifts toward processed foods and less potassium-rich foods are associated with changes in cooking technique. Specifically, there has been a decline in the consumption of foods prepared by steaming or boiling [4]. The traditional Japanese diet, one currently undergoing erosion, has included steamed and boiled foods since its early origins [231]. Indeed, the dashi (soup stock) is said to be the backbone of Japanese cuisine. Dashi made with flakes of dried bonito has been shown to improve mental outlook in randomized, controlled human trials [232] and to reduce anxiety-like behavior in experimental rodent studies [233].

One of the consequences of a shift away from water-based cooking toward high heat in the absence of water (that is, baking, roasting, grilling, frying) is that it encourages the formation of advanced glycation end products (AGEs). Not to be confused with the heterocyclic amines (HCAs) and polycyclic aromatic hydrocarbons (PAHs) produced by grilling meats (both of which have been experimentally linked to cancer), AGEs are highly oxidant compounds formed through the nonenzymatic reaction between reducing sugars and free amino acids. AGEs form in the body under normal metabolic circumstances; however, oxidative stress and inflammation are well-known consequences when AGE levels rise beyond normal limits. The results of recent investigations have shown that the preformed AGEs in foods also add to the endogenous AGE burden. When considering the Westernized dietary pattern, with its reliance on foods thermally processed with high (dry) heat to enhance flavor, appearance and color, it should not be surprising that many of its foods are very high in AGEs [234]. The results of human studies have shown that a shift toward stewing, steaming, poaching and boiling of foods (lowering the AGE burden by approximately 50%) significantly reduces systemic inflammation and oxidative stress [235, 236]. Most of the research has been focused on type 2 diabetes and obesity, with improvements in insulin sensitivity noted in both populations after a switch to a low-AGE diet [237, 238]. Interestingly, people with depression and schizophrenia have low blood levels of soluble receptor for AGEs, a receptor tasked with binding and essentially halting some of the destructive consequences of AGEs [239, 240]. The emerging research related to AGEs forces us to look at not only dietary patterns but also the ways in which the foods within these patterns are prepared.

Another understudied aspect of the relevance of ancestral diets in contemporary mental health relates to the dietary acid load. Acid–base homeostasis is an example of a tightly controlled set of physiological mechanisms, checks and balances that are almost certainly a by-product of the Paleolithic nutritional environment [241]. The best available evidence suggests that the diets of our early East African ancestors were predominantly net base-producing [242]. The Westernized diet, with its heavy protein (amino acid) load, sodium intake and relative absence of fruits and vegetables (typically rich in buffering potassium, bicarbonate precursors, magnesium and calcium), has been linked to what has been described as a low-grade systemic metabolic acidosis in otherwise healthy adults [243]. Notwithstanding the pseudoscientific extrapolation of the low-grade acidosis proposal into so-called alkaline diets by lay writers, intriguing prospective research has shown that dietary acid load increases subsequent risk of type 2 diabetes [244]. Acid load has also been linked to cardiovascular health [245], elevated body weight, increased waist circumference and a lower percentage of lean body mass [246–248].
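
As an indication of how dietary acid load is commonly operationalized in this literature, the sketch below estimates potential renal acid load (PRAL) using the widely quoted rounded Remer-Manz coefficients; the formula choice and the example intake values are our own illustrative assumptions rather than the method of any study cited above.

    # Rough sketch: potential renal acid load (PRAL), a common estimate of
    # dietary acid load. Positive values indicate a net acid-forming diet,
    # negative values a net base-forming diet. Coefficients are the widely
    # quoted rounded values (an assumption here); intakes are hypothetical.

    def pral_meq_per_day(protein_g: float, phosphorus_mg: float,
                         potassium_mg: float, magnesium_mg: float,
                         calcium_mg: float) -> float:
        return (0.49 * protein_g
                + 0.037 * phosphorus_mg
                - 0.021 * potassium_mg
                - 0.026 * magnesium_mg
                - 0.013 * calcium_mg)

    western_style = pral_meq_per_day(protein_g=110, phosphorus_mg=1500,
                                     potassium_mg=2400, magnesium_mg=250,
                                     calcium_mg=800)
    vegetable_rich = pral_meq_per_day(protein_g=70, phosphorus_mg=1200,
                                      potassium_mg=4500, magnesium_mg=450,
                                      calcium_mg=1000)
    print(round(western_style, 1))   # positive: acid-forming
    print(round(vegetable_rich, 1))  # negative: base-forming

The point of the sketch is simply that the potassium-, magnesium- and calcium-rich plant foods emphasized in traditional diets pull the estimate toward net base production, whereas a protein- and phosphorus-heavy Westernized pattern pulls it the other way.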

In our current context of mental health, it is noteworthy that a potassium-rich diet (compared with the DASH (Dietary Approaches to Stop Hypertension) diet and a separate high-calcium diet group) was found to improve depression, tension and energy scores, as well as the Profile of Mood States global score, in otherwise healthy adults over the course of 1 month [249]. Also relevant to mental health is a specific physiological change induced by a high-acid-load, fast-food-type diet. In otherwise healthy adults who consumed an acidic, Western-type diet for 9 days, researchers neutralized the diet with bicarbonate supplements (while subjects maintained the high-acid-load diet) and found that cortisol levels were significantly reduced compared with those of controls who continued the diet without neutralization [250]. This result suggests that the high acid load itself is a stressor because cortisol elevation can contribute to inflammation, oxidative stress and lowered mood states [251].

Because the acid–base balance in human blood and other tissue is so tightly regulated, with the exceptions of kidney disease and acute, overt metabolic acidosis, it would be tempting to dismiss the relevancy of dietary factors in regard to their ability to influence pH in and around tissue. However, despite unaltered blood pH, the chronic administration of oral bicarbonate has been shown to increase the extracellular pH of tumors, subsequently reducing the in vivo number and size of tumor metastases. Ultimately in those studies, the reductions in metastases after oral bicarbonate led to increased survival rates of the animals with tumors [252, 253]. In the general population, dietary acid load is associated with lower serum bicarbonate levels [254]. Returning to mental health, it has been shown that the ability of inhaled carbon dioxide to induce fear can be diminished by systemically administered bicarbonate. The amygdala is a highly sensitive chemosensor and contains within it the acid-sensing ion channel 1a (ASIC1a). Direct reductions of pH within the amygdala can evoke fear behavior, whereas inhibition of the ASIC1a receptor has been shown to have antidepressant-like effects in various stress models [255, 256].
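
For readers who want the textbook quantitative link between serum bicarbonate, carbon dioxide and pH that underlies these observations (a standard physiology relationship, not an analysis taken from the cited studies), the bicarbonate buffer system is described by the Henderson-Hasselbalch equation:

    \mathrm{pH} = \mathrm{p}K_a + \log_{10}\!\left(\frac{[\mathrm{HCO}_3^-]}{0.03 \times P_{\mathrm{CO}_2}}\right), \qquad \mathrm{p}K_a \approx 6.1

With a typical arterial PCO2 of 40 mm Hg, a serum bicarbonate of 24 mEq/l gives a pH of approximately 7.40, whereas a bicarbonate of roughly 22 mEq/l moves the value toward the lower end of the normal range, near the 7.36 to 7.38 band discussed in the next paragraph.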

Could there be connections between behavior, dietary acid load (and blood pH at the low end of the normal physiological range, that is, roughly 7.36 to 7.38), ASIC1a sensitivity and the low serum bicarbonate levels associated with dietary acid load? Perhaps. In the meantime, we do know that potassium-rich diets are less frequently consumed by people with depression [257] and those in lower SES groups [258]. This is an area ripe for research. Related to this topic is the emerging research showing that even mild states of dehydration can negatively influence mood and cognition [259].

An aspect of evolutionary medicine that has ignited a spark in the international research community is a possible connection between the microbiome and mental health. This area of research, at least in its contemporary tone, has many of its original roots in the 1980s hygiene hypothesis, which suggested that the global rise in allergic disease could be related to diminished opportunity for early-life exposure to pathogenic microbes as a result of increased hygiene and use of antibiotics, as well as smaller family size [260]. A decade later, after a landmark 1998 paper by Swedish physician Agnes Wold, the focus shifted toward a more general gut microbial deprivation hypothesis, one that included “low exposure to bacteria via food or the environment in general. All this results in an ‘abnormally’ stable microflora” in Westernized nations [261]. The traditional focus on harmful microbes shifted toward lactic acid bacteria and commensals, with novel scientific frameworks bridging the immune system to emotional health via nonpathogenic microbes [262, 263].

The shift toward an abnormally stable (inflammatory) gut microbiota in Westernized nations became more evident when sophisticated 16S rRNA sequencing techniques allowed detailed stool analyses to be conducted. The results of recent studies have consistently shown that, compared to Westernized urbanites, rural dwellers and hunter-gatherers in Africa and South America have higher levels of microbial richness and diversity. The available evidence suggests that diet, rather than hygiene per se, is the key component to microbial diversity [264–267]. The results of analysis of the microbiome within ancient human coprolite (paleostool) samples indicate more alignment with the modern hunter-gatherer than with the urban dweller in a developed nation [268]. Because ancestral diets seem to set up a gut microbial diversity that is distinct from that currently being associated with obesity and type 2 diabetes [269], we once again arrive at a list of further questions pertaining to the ways in which traditional diets influence mental health.
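
As an aside on methods, microbial richness and diversity in these 16S rRNA surveys is typically summarized with indices such as the Shannon index; the minimal sketch below shows that calculation with invented genus-level read counts (the index choice and the numbers are our assumptions, not data from the cited studies).

    # Minimal sketch: Shannon diversity (H') from taxon read counts, one common
    # summary of microbial diversity in 16S rRNA surveys. Counts are invented
    # for illustration and do not come from the studies cited in the text.
    import math
    from typing import Dict

    def shannon_diversity(taxon_counts: Dict[str, int]) -> float:
        """H' = -sum(p_i * ln(p_i)) over taxa with nonzero counts."""
        total = sum(taxon_counts.values())
        return -sum((c / total) * math.log(c / total)
                    for c in taxon_counts.values() if c > 0)

    # An even, many-taxon community yields a higher H' than one dominated
    # by a few taxa (hypothetical read counts per genus):
    even_profile = {"Prevotella": 220, "Treponema": 180, "Succinivibrio": 160,
                    "Ruminococcus": 150, "Faecalibacterium": 140, "Other": 150}
    skewed_profile = {"Bacteroides": 700, "Faecalibacterium": 150,
                      "Ruminococcus": 100, "Other": 50}
    print(round(shannon_diversity(even_profile), 2))    # higher
    print(round(shannon_diversity(skewed_profile), 2))  # lower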

Scientific interest in lifestyle medicine in general, and nutritional influences related to DOHaD in particular, has grown rapidly in the past several years [270, 271]. These discussions have often focused on obesity, cardiovascular disease, type 2 diabetes and allergic diseases. Yet, there are still only limited discussions related to parental and early-life nutrition, epigenetic changes, microevolution, life-course and even transgenerational mental health [272–274]. Children born preterm (vs peers delivered at full term) are at significantly higher risk for developing mental disorders [275]; therefore, studies showing that healthy diets, at the time of conception [276] or during pregnancy [277], are associated with reduced risk of preterm delivery may have enormous implications for mental health. Although allergy is most often the first NCD to present itself in the clinical realm, subtle and even not-so-subtle behavioral alterations may predict subsequent allergic disease risk [278, 279].

It is not our desire to privilege mental health in the DOHaD discussions; rather, we wish to highlight the notion that a broader view is required, one that avoids silos and encourages collaboration. It seems very likely that the same evolutionary mismatch implicated in the global rise in allergic and autoimmune diseases is not distinct from that which seems to be at play in mental disorders [280]. Early-life immune programming via diet and microbes appears to have broad application across the NCDs. The recent evidence showing that unhealthy maternal dietary choices predict subsequent childhood behavioral problems [70] is both groundbreaking and a source of optimism. It indicates that, despite the environmental challenges discussed earlier, things can be fixed. Indeed, there is an increasing sense of optimism regarding the opportunities for the prevention of mental disorders globally [281, 282].

Summary

Over the past several years, there has been a rapid growth in high-quality research related to nutrition and mental health. An area that has suffered neglect is finally starting to take shape, albeit in a preliminary fashion. Given the global changes described above—urbanicity, climate change and the globalization of the food industry resulting in profound shifts from traditional dietary patterns—there is a sense of urgency to bringing more efficiency and strength to the process of determining the ways in which overall dietary patterns, specific nutritional elements and/or multinutrient interventions can influence mental health.

Compromised mental health in both the broad sense (quality of life) and the narrower clinical sense (disorders diagnosable based on the definitions published in the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition) is highly comorbid with NCDs, and poor physical health is a particularly strong predictor of poor mental health [283]. Moreover, many of the high-prevalence NCDs share a multitude of underlying pathophysiological pathways with mental illnesses, including immune dysfunction and oxidative stress [59]. One need only pick a medical specialty, from cardiology and dermatology to gastroenterology and rheumatology, and it is plain to see how mental health presents itself as a variable of importance. However, nutrition is also emerging as a variable that uniquely interfaces with both mental health and some of the more notable conditions within these disciplines. Put another way, emerging research suggests that nutrition not only matters directly with regard to the conditions treated within various medical disciplines but also has the potential to influence mental outlook and mental disorders (improving or compromising, depending on intake) that can otherwise contribute to the ultimate outcomes of what are viewed as physical illnesses [284]. We cannot ignore this, particularly as it is becoming increasingly clear that diminished mental outlook and elevated perceptions of stress are drivers of unhealthy eating habits [285–287].

It is not the purpose of our discussion here concerning the interface between nutritional psychiatry and physiological anthropology to systematically review the outcomes and/or potential pathophysiological pathways in minute detail, nor is it to provide a critical assessment of the state of the nutritional psychiatry science in the context of current evidence-based mental health interventions. Our primary message is that the field of nutritional psychiatry is rapidly developing and that, although for the moment it primarily involves global researchers in the fields of nutrition, mental health, population health and epidemiology, this list will surely expand.

Emerging groups such as the International Society for Nutritional Psychiatry Research aim to foster awareness, collaboration and ultimately meaningful clinical translation. Nutrition does not stand alone; it is intimately woven throughout social, cultural, economic, technological, behavioral and other (ancestral past vs present) environmental fabrics with which physiological anthropology concerns itself. As we have tried to make clear in this review, nutritional psychiatry is not a psychiatric specialty per se; on the contrary, the emerging research on the topic has been driven by an eclectic group of specialists in a wide variety of disciplines.

Time is no longer on the side of researchers. Global urbanization and an impending mental health crisis are a tandem juggernaut moving at rapid speed. Its destructive force will not be slowed down by additional silo-style research. Although growing in quality and providing cause for optimism, the current research remains scattered in various disciplines and lacks the collective push that is required in the process of translational medicine. Research studies published in isolation, no matter how elegant, ultimately make the job of translation and policy-making more difficult. We can surely conclude that things do not get better, in the scientific sense, by further neglect and disorganization of approach.