Introduction

Alzheimer’s Disease: Facts and Figures

Alzheimer’s disease (AD) is the most common form of dementia in late life, accounting for 60–80 % of cases, and the sixth leading cause of death in the United States [1]. AD is a neurodegenerative disease characterized by progressive deterioration and death of neurons, which give rise to deficits in memory, language, and reasoning, and to impairment of daily living activities. Currently, an estimated 5.2 million Americans of all ages have AD. A recent report on the economic implications of the impending AD epidemic indicates that, as the “baby boomer” generation ages and the population continues to grow, the number of people affected by AD in the United States alone will reach a staggering 13.8 million by 2050 [2, 3]. Age-related mild cognitive impairments may affect two to three times as many individuals [1]. Delaying symptom onset by as little as 1 year could potentially lower AD prevalence by more than 9 million cases over the next 40 years [4].

Much effort is being devoted to the development of treatments for AD, an effort now strongly supported by the use of biological markers (biomarkers) of disease. Biomarker studies have led to a major reconceptualization of AD, from diagnostic criteria to treatment strategies. The provisional diagnosis of AD is based on clinical history, neurological examination, cognitive testing, and structural neuroimaging [5]. The definitive diagnosis, however, rests on the postmortem detection of pathological lesions: amyloid-beta (Aβ) plaques in the extracellular space and blood vessels, intracellular neurofibrillary tangles, and neuronal loss in selectively vulnerable brain regions [5]. Brain imaging and other biomarkers of AD pathology are now widely used to support the clinical diagnosis of AD by providing evidence for AD pathology in vivo [2, 3, 5].

For many years, the general understanding was that AD was a disease of old age. Recent breakthroughs in brain imaging techniques have reversed this paradigm by showing that the brain changes that lead to AD can be detected in predisposed individuals 20–30 years before clinical manifestations of disease become evident [6]. Brain changes in some at-risk individuals can be seen as early as infancy [7]. The early appearance of pathological lesions and the progressive nature of cognitive deterioration in AD led several working groups to revise the diagnostic criteria for AD to include in vivo biological markers of disease. This resulted in defining three stages of AD: (1) a preclinical stage of AD (no impairment in cognition; biomarker evidence for AD), which may encompass the lifespan of the individual up until the senior years; (2) mild cognitive impairment (MCI) due to AD (impairment of memory or other cognitive domains; biomarker evidence for AD); and (3) dementia due to AD (dementia; biomarker evidence for AD) [2, 3, 5]. As discussed by Sperling et al. [3], the concept of a preclinical phase of AD should not seem too foreign: it is widely acknowledged that several illnesses, such as cancer or heart disease, can be diagnosed prior to symptom onset by laboratory tests or medical indicators of future disease, just as hypercholesterolemia and atherosclerosis can produce a narrowing of coronary arteries that is detectable before myocardial infarction. Advances in the early detection of chronic disease are largely responsible for a true postponement of mortality and a marked change in the demographics of aging, nearly doubling the average life expectancy over the past century [8]. A similar “early detection” framework is ideally applicable to AD, because AD now has the potential to be diagnosed preclinically by the use of biomarkers.

Treating Alzheimer’s Disease

Currently, pharmacological treatments for AD are limited. Available medications only lessen or stabilize symptoms for a limited time, and disease-modifying treatments targeting Aβ removal or hindering its deposition are still under development [9]. Moreover, laboratory work and recent disappointing clinical trials raise the possibility that therapeutic interventions applied earlier in the course of AD would be more likely to achieve disease modification [9]. By the time patients come in for diagnosis, too much irreversible brain damage has likely already occurred for treatments to be effective. As such, the preclinical phase of AD provides a critical opportunity for interventions instituted before irreversible neuronal loss, when the potential for preserving brain function is at its greatest.

AD is divided into two major forms: early-onset and late-onset. The rare early-onset form of AD (~1 % of AD cases) is characterized by autosomal-dominant genetic mutations with almost complete penetrance, symptom onset before age 60 years, and genetically induced Aβ overproduction [10]. Two major AD prevention initiatives are underway to test active and passive Aβ immunotherapy in mutation carriers [11].

On the other hand, the most common, late-onset form of AD (onset after age 60 years), which accounts for >99 % of the total AD population, is a multifactorial disease of unknown origin that most likely develops from a complex interplay between genetic and environmental factors [10]. Late-onset AD has been associated with several risk factors, including demographics (e.g., older age, female gender, lower education), inheritance (e.g., first-degree family history, the Apolipoprotein E, APOE, ε4 allele), medical status (e.g., cardiovascular disease, hypertension), cognitive status (e.g., subjective and objective cognitive decline), environment (e.g., pollutants, environmental toxins), and lifestyle (e.g., poor diet, lack of physical or social activity, chronic stress; Table 1). The predictive value of such risk factors remains to be established, and their presence may not be enough to justify the potential risks of pharmaceutical intervention in asymptomatic individuals. There has thus been growing interest in identifying alternative treatment strategies that rely not on pharmacological intervention but on ameliorating individual medical and lifestyle risk factors as a way of delaying or preventing the onset of AD symptoms. Increasing evidence suggests that diet, a major modifiable lifestyle factor, may play a significant role in preventing or delaying cognitive decline and risk for dementia.

Table 1 Known risk factors for memory decline and Alzheimer’s disease

Despite studies in favor of single nutrients or small sets of nutrients in the prevention of AD, translation to formal clinical trials has been largely disappointing [12–17]. For instance, prospective studies overall showed an association between higher intakes of vitamin E or α-tocopherol equivalents and reduced incidence of AD [18]. However, some clinical trials showed no clinical benefit of vitamin E supplementation [15], whereas a recent double-blind, placebo-controlled, parallel-group, randomized clinical trial showed that, among patients with mild to moderate AD, 2000 IU/d of α-tocopherol resulted in slower functional decline compared with placebo [19]. Clinical benefits were independent of treatment with memantine [19]. Likewise, despite the well-established association between B vitamin intake and lower risk of dementia [20], B vitamin supplementation did not slow cognitive decline in clinical trials of mild to moderate AD patients [12]. However, recent clinical trials that used biomarkers as the primary outcome showed that high-dose B vitamin supplements slowed brain atrophy rates in MCI [21, 22].

We offer that biomarkers of AD are the most useful measures in the short term to detect associations between dietary nutrients and AD risk and to monitor treatment efficacy, well in advance of any clinical or cognitive symptom development. Longer-term studies will be needed to determine whether the effects of dietary nutrients on AD biomarkers result in actual disease prevention. We describe the latest research on diet, nutrition, and risk for AD, with a particular emphasis on biomarker findings. (This article does not contain any studies with human or animal subjects performed by any of the authors.)

Biomarkers of Alzheimer’s Disease

Brain imaging techniques have been developed to visualize and quantify the progressive accumulation of Aβ plaques, neurofibrillary tangles, and neuronal loss that occurs during the preclinical stages of AD, leading to changes in brain structure and function. These methods include in vivo positron emission tomography (PET) of brain Aβ load and neuronal glucose metabolism, and magnetic resonance imaging (MRI) measures of structural brain volume (i.e., atrophy), perfusion, resting-state connectivity, and others. Among these, structural MRI and amyloid and metabolic PET have an established early detection potential [6] and have been used to examine diet and nutrition effects on brain AD biomarkers.

Less than 10 years ago, PET tracers for fibrillar Aβ (a major constituent of senile plaques) were developed, making the in vivo detection of AD pathology a reality. Several Aβ tracers are available, including the most widely utilized N-methyl-[11C]2-(4’-methylaminophenyl)-6-hydroxybenzothiazole (Pittsburgh compound-B, PiB) and the U.S. Food and Drug Administration-approved 18F-Flutemetamol, 18F-Florbetapir, and 18F-Florbetaben [23]. These tracers bind fibrillar Aβ with high affinity, showing consistently increased tracer retention in the parieto-temporal, frontal, and posterior cingulate cortex of AD and MCI patients versus age-matched healthy controls [24–26]. Tracer retention in these regions is consistent with the distribution of Aβ plaques observed postmortem [27] and predicts future decline from normal cognition to AD [28–30].

PET imaging with 2-[18F]fluoro-2-deoxy-D-glucose (FDG) measures the resting-state cerebral metabolic rate of glucose (CMRglc), a proxy for neuronal activity [31]. AD-related synaptic dysfunction and loss induce a reduction in neuronal energy demand that results in decreased CMRglc [31]. Structural MRI is an integral part of the clinical assessment of AD, and MRI-defined atrophy of the medial temporal lobes (including hippocampus and entorhinal cortex) is now considered a valid diagnostic marker of early AD [32]. FDG-PET and MRI have long been used to visualize neuronal loss in AD, which is known to originate in the medial temporal lobes while cognition is still normal and to spread to posterior cingulate, parieto-temporal, and frontal cortices as AD progresses [33]. Several studies have shown progressive CMRglc reductions and atrophic changes in AD-vulnerable regions several years prior to dementia onset [34–40].

Of great relevance to the early detection of AD, brain biomarker abnormalities have been detected in young to old individuals with risk factors for late-onset AD, such as subjective memory decline [41, 42], a first-degree family history [43–45], and the APOE ε4 genotype [7, 46–48]. These biomarkers are therefore ideally suited to serve as indicators of AD risk when testing for associations between diet and AD risk during the preclinical phase of AD.

Diet, Nutrition, and Risk of Alzheimer’s Disease

“We are what we eat”: the nutritional content of what we eat determines the composition of our cell membranes, bone marrow, blood, and hormones and therefore is the foundation upon which our body and brain are built. Whereas the importance of nutrition in health is well understood, the specific effects of nutrition on brain aging are less so, as the biological mechanisms underlying the relationship between dietary nutrients, brain aging, and AD are largely unexplored. Understanding how diet and nutrition promote healthy brain aging in people at increased risk for AD is critical prior to implementing dietary recommendations for prevention and treatment.

A distinction must be made between diet and nutrition; diet refers to patterns of foods eaten, whereas nutrition refers to the components of the foods that one may absorb. Most studies that looked at the associations between diet, nutrition, and AD risk have focused either on dietary patterns (i.e., food combinations) or specific dietary nutrients. Studies that reported associations between dietary nutrients and AD risk with the use of biomarkers are the main focus of this review and are discussed below (for other reviews see [20, 4951]).

Dietary Patterns

The majority of epidemiological studies of diet and AD have focused on detecting associations between adherence to specific dietary patterns and risk of decline to AD. There is consensus that higher adherence to a Mediterranean diet (MeDi)-type pattern is associated with slower cognitive decline, reduced risk of progression from MCI to AD, and reduced mortality in AD patients [52–56]. These effects were independent of other risk factors, including physical activity [57] and vascular comorbidity [58].

However, the biological mechanisms underlying the hypothesized brain-protective effects of the MeDi have not been clearly established. A few MRI studies reported associations between lower MeDi adherence and increased cerebrovascular disease burden (i.e., white matter lesions) in the elderly [59, 60]. A structural MRI study of cognitively normal adults showed an association between lower MeDi adherence and increased cortical thinning (i.e., atrophy) in key AD regions, such as posterior cingulate, entorhinal, and orbitofrontal cortex [61] (Fig. 1A). Results were independent of possible risk factors for late-onset AD, such as age, gender, education, APOE genotype, and family history, as well as body mass index (BMI), insulin resistance, and hypertension [61]. Overall, biomarker studies provide a pathophysiological substrate for clinical data and suggest that the MeDi may modulate AD risk through its effects on neuronal integrity. Clinical trials are needed to formally test whether MeDi interventions would reduce atrophy rates or improve other biomarkers, as well as significantly reduce, or prevent, symptoms of cognitive decline.

Other studies used statistical models to derive food combinations associated with specific outcomes (decline to AD vs. stable normal cognition), without the a priori selection of food groups used in the MeDi [49]. Despite differences in analytic approaches, dietary patterns characterized by higher intakes of fruits, vegetables, fish, nuts, and legumes, and lower intakes of meat, high-fat dairy products, and sweets have been consistently associated with reduced risk for AD [49]. These findings remain to be validated by the use of biomarkers.

Nutrient Patterns

A large body of prospective studies measured dietary nutrients by means of self-reported, semiquantitative food frequency questionnaires (SFFQ) and provided evidence for associations between increased intake of omega-3 (ω3) polyunsaturated fatty acids (PUFA) [62–64], B vitamins [13, 65–67], vitamins A, C, and E [18, 68–70], and various trace elements [71, 72], and better cognitive functioning or lower AD risk in the elderly. These clinical studies provided the basis for extending the investigation to biomarkers of AD, to better clarify the protective effects of these nutrients on the brain.

A community-based study of healthy elderly (age >65 years) examined plasma Aβ40 and Aβ42 levels and dietary intake of 10 nutrients associated with cognitive aging and AD, including saturated fats, monounsaturated fats, ω-3 and ω-6 PUFA, β-carotene, vitamin B12, vitamin C, vitamin D, vitamin E, and folate [73]. After adjusting for age, gender, ethnicity, education, caloric intake, and APOE genotype, higher intake of ω3-PUFA was associated with lower levels of plasma Aβ42 [73]. None of the other nutrients was associated with Aβ measures. However, peripheral Aβ measures may not be accurate indicators of Aβ concentration in brain.
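Associations like the one above are typically estimated with covariate-adjusted regression: the covariates (age, gender, education, etc.) enter the model as additional regressors, so the nutrient coefficient reflects the association after accounting for them. A minimal sketch on simulated data (all numbers and variable names are hypothetical illustrations, not study data):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hypothetical, simulated cohort (illustration only; not study data):
age = rng.uniform(65, 90, n)                     # years
omega3 = rng.normal(1.5, 0.5, n)                 # g/day omega-3 PUFA intake
# Simulate plasma Abeta42 with a true omega-3 effect of -2.0 plus an age effect:
abeta42 = 40.0 - 2.0 * omega3 + 0.1 * age + rng.normal(0, 1.0, n)

# Covariate-adjusted association: regress Abeta42 on omega-3 intake plus
# covariates (here just age); ordinary least squares via lstsq.
X = np.column_stack([np.ones(n), omega3, age])
beta, *_ = np.linalg.lstsq(X, abeta42, rcond=None)

print(f"adjusted omega-3 coefficient: {beta[1]:.2f}")  # close to the true -2.0
```

In the study itself, the adjustment set also included gender, ethnicity, education, caloric intake, and APOE genotype; the mechanics are the same, with one column per covariate.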

A brain imaging PiB- and FDG-PET study examined the associations between the same 10 nutrients as in [73], brain Aβ load, and CMRglc in cognitively normal adults (mean age 54 years) with and without risk factors for AD. On PiB-PET, higher intake of vitamin B12, vitamin D, and ω3-PUFA was associated with lower Aβ load in posterior cingulate, parieto-temporal, and frontal regions [74••]. On FDG-PET, higher intake of β-carotene and folate was associated with higher CMRglc in AD-vulnerable regions, whereas higher consumption of saturated fats was associated with reduced CMRglc [74••]. A significant impact of risk factors, such as gender, APOE, and family history, was noted on the associations between nutrients and CMRglc. Specifically, women, individuals with a positive family history, and APOE ε4 carriers showed stronger associations between CMRglc and nutrient intake than their risk-free counterparts [74••]. These results suggest that unhealthy eating habits may potentiate genetic predisposition to AD. On the other hand, the association between Aβ load and dietary nutrients was not exacerbated in the presence of these risk factors, although this may vary in older populations with more substantial Aβ deposition.

Given the interactive nature of nutrient action and cellular metabolism, principal component analysis (PCA) has been used to generate nutrient patterns (NPs), which capture the interactive effect of nutrients in combination. Multimodality brain imaging studies tested PCA-derived NPs for associations with Aβ deposition on PiB-PET, CMRglc on FDG-PET, and gray matter volumes (GMV) on MRI in 25- to 72-year-old cognitively normal adults [75]. Five distinct NPs were extracted from a panel of 35 nutrients related to AD or cognitive function. A first NP, characterized by intake of vitamin B12, vitamin D, and zinc, was favorably associated with all biomarkers: the higher the intake of these nutrients, the lower the Aβ load and the higher the CMRglc and GMV in AD-vulnerable regions (Fig. 1B and C). Additionally, CMRglc and GMV were negatively associated with a second NP characterized by intake of saturated fats, trans fats, cholesterol, and sodium. Finally, CMRglc was positively associated with two additional NPs: one characterized by intake of vitamin E, ω-3, and ω-6 PUFA, and the other by intake of vitamin A, vitamin C, several carotenoids (α- and β-carotene, β-cryptoxanthin, lutein, zeaxanthin), and dietary fibers [75]. The identified “brain-protective” NPs were correlated with higher intake of vegetables, fruit, whole grains, fish, low-fat dairy products, and nuts, and lower intake of sweets, fried potatoes, processed meat, high-fat dairy products, and butter [75] (Fig. 2).
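The PCA step described above reduces a subjects-by-nutrients intake matrix to a few uncorrelated patterns, each a weighted nutrient combination; per-subject pattern scores can then be tested against biomarkers. A minimal sketch on simulated data (dimensions reduced for brevity; the study used 35 nutrients plus covariate adjustment, and all values here are invented):

```python
import numpy as np

rng = np.random.default_rng(42)
n_subjects, n_nutrients = 300, 8   # hypothetical; the study used 35 nutrients

# Simulated nutrient-intake matrix with correlated columns, mimicking
# nutrients that tend to be consumed together (illustration only):
base = rng.normal(size=(n_subjects, 2))
loadings = rng.normal(size=(2, n_nutrients))
intake = base @ loadings + rng.normal(scale=0.5, size=(n_subjects, n_nutrients))

# PCA: standardize each nutrient, then eigendecompose the correlation matrix.
z = (intake - intake.mean(0)) / intake.std(0)
corr = np.corrcoef(z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]          # eigh returns ascending order
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Each column of eigvecs is a nutrient pattern (NP); per-subject NP scores
# would then be tested for association with Abeta load, CMRglc, or GMV.
np_scores = z @ eigvecs[:, :2]
print("variance explained by first two NPs:",
      round(eigvals[:2].sum() / eigvals.sum(), 2))
```

The nutrient weights (eigenvector entries) are what give each NP its interpretation, e.g., a pattern loading heavily on vitamin B12, vitamin D, and zinc.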

Fig. 1

Associations between dietary patterns, nutrient patterns and brain biomarkers of Alzheimer’s disease. A. Left side of panel: Reduced gray matter volumes in cognitively normal (NL) individuals showing lower vs. higher adherence to a Mediterranean-diet (MeDi) pattern (MeDi- vs. MeDi+), in orbitofrontal gyrus (OFG), entorhinal cortex (EC), and posterior cingulate gyrus (PCG) [61]. Right side of panel: Three-dimensional representations of the three regions showing gray matter volume reductions in NL MeDi- compared with MeDi+ (OFG, EC, PCG). Regions of interest are visible in the medial view of the gray matter surface. Only the right hemisphere is shown. B. Statistical parametric maps showing beneficial associations between intake of Vitamin B12, Vitamin D, and zinc, and three major AD biomarkers: brain glucose metabolism on FDG-PET (top), gray matter volumes on MRI (middle), and Aβ load on PiB-PET (bottom). Results are represented on a color-coded scale at p < 0.001 and displayed onto a standardized MRI. Results are adjusted for age, gender, education, BMI, APOE, family history, and total caloric intake [75]. C. Correlations between a nutrient pattern characterized by intake of Vitamin B12, Vitamin D, and zinc and Aβ load in the posterior cingulate cortex of NL individuals aged 25–72 years [75]

Fig. 2

Neuronutrition. Hypothetical model of brain-protective nutritional recommendations derived from available research studies linking dietary patterns, nutrients, and biomarkers of AD in cognitively normal adults

These findings are consistent with previous epidemiological and animal studies showing brain-supporting effects of the NPs identified by the use of biomarkers. The NP associated with all biomarkers included neuroprotective vitamin B12 [13, 65–67], vitamin D [76, 77], and zinc, one of the most important transition metals for human metabolism, which is involved in Aβ adhesiveness and amyloid precursor protein synthesis [78]. Higher intake of saturated and trans fats was associated with reduced CMRglc and GMV, consistent with the notion that “bad fats” have negative effects on cognitive function [62, 79]. CMRglc was positively associated with intake of vitamin E, an antioxidant, and of ω-3 and ω-6 PUFA, which are known for their neuroprotective properties via anti-inflammatory, antioxidant, and energy metabolism pathways [80]. CMRglc was also positively associated with intake of vitamins A and C, several carotenes, and dietary fibers. These nutrients are known to have beneficial effects via their antioxidant and anti-Aβ-oligomerization actions [81, 82], and dietary fibers help to regulate glucose levels [83].

A limitation of the above studies is that nutrient intake was estimated from SFFQs. SFFQs have become a well-accepted method for assessing usual food choices and nutrient intake and have been validated for this purpose in both the elderly and young adults [84–86]. However, this method may not provide the precision needed to accurately estimate the intake of specific nutrients, as it is subject to faulty recall of dietary intake and portion size, often estimated over a long period, such as the prior year. Direct, quantitative measurements of plasma nutrients are better suited to provide reliable measures of bioavailable macro- and micronutrient levels. For instance, vitamin B12 intake may be challenging to characterize by dietary intake assessment, as its serum level can be affected by gastrointestinal malabsorption syndromes, such as those seen with pernicious anemia, Crohn’s disease, or gastric bypass surgery; by excessive alcohol intake; or by common medications, including antacids and antidiabetic agents. Likewise, serum vitamin D levels reflect sunlight exposure as well as dietary intake, and vitamin D bioavailability may be affected by malabsorption syndromes as well as by conditions, such as obesity or renal insufficiency, that interfere with vitamin D absorption or conversion to its active form.

Plasma Nutrients

Very few studies have examined the relationships between plasma nutrients and brain biomarkers of AD in cognitively normal (NL) individuals. A cross-sectional MRI study looked at plasma nutrient patterns in NL elderly of fairly advanced age (>85 years) [87•]. Two plasma PCA-derived NPs associated with more favorable cognitive and MRI measures were identified: one NP high in plasma B vitamins (B1, B2, B6, folate, and B12), vitamin C, vitamin D, and vitamin E; and another NP high in plasma ω3-PUFA. Higher levels of these nutrients were associated with better executive and visuospatial function, larger brain volumes, and lower cerebrovascular burden [87•]. A third NP, characterized by high plasma trans-fat levels, was associated with less favorable cognitive function and reduced brain volume [87•].

A few longitudinal MRI studies investigated the associations between plasma ω3-PUFA and brain volumes in nondemented elderly, showing a strong correlation between higher baseline ω3-PUFA levels and lower atrophy rates up to 8 years later, especially in the medial temporal lobes [88–90].

Given the strong associations between plasma nutrients and MRI changes, a recent randomized, double-blind, controlled trial of high-dose B vitamin supplementation (vitamins B6, B9, and B12) in MCI patients focused on the change in MRI volumes as the primary outcome [21, 22]. The treated group showed lower rates of whole-brain atrophy over 2 years compared with placebo [22]. The treatment response was largely confined to participants with high homocysteine (>13 μmol/L), who showed 53 % lower atrophy rates compared with placebo [22]. A follow-up study demonstrated that the B-vitamin treatment reduced, by as much as sevenfold, the cerebral atrophy in brain regions specifically vulnerable to AD, such as the medial temporal lobe [21]. A causal Bayesian network analysis indicated the following chain of events: B vitamins lowered homocysteine, which directly led to a decrease in atrophy rates over 2 years, thereby slowing cognitive decline [21]. It remains to be determined whether B vitamins would be beneficial to patients with normal homocysteine levels, and whether lowering homocysteine would reduce brain atrophy rates in cognitively normal elderly, thus prior to the relatively late MCI stage of AD. Nutritional supplementation may have stronger effects on Aβ deposition or metabolic activity, which are altered prior to structural changes in AD [6].
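Figures like the "53 % lower atrophy rate" above are relative reductions: the between-arm difference in annual atrophy rate divided by the placebo rate. A sketch with invented numbers chosen only to reproduce the arithmetic (these are not the trial's actual rates):

```python
# Hypothetical atrophy rates (% brain volume lost per year); the values are
# invented for illustration, not taken from the trial.
placebo_rate = 1.5
treated_rate = 0.7

# Relative reduction = (placebo - treated) / placebo
reduction = (placebo_rate - treated_rate) / placebo_rate
print(f"relative reduction: {reduction:.0%}")  # prints "relative reduction: 53%"
```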

A 4-week randomized, controlled, clinical trial tested the effects of a high-saturated-fat/high-glycemic-index (HIGH) versus a low-saturated-fat/low-glycemic-index (LOW) diet on cerebrospinal fluid (CSF) Aβ levels and cognition in healthy elderly and MCI patients [91]. The LOW diet improved delayed visual memory and increased CSF Aβ42 concentration in MCI [91], contrary to the pathologic pattern of lowered CSF Aβ42 typically observed in AD. For both MCI patients and controls, the HIGH diet increased, and the LOW diet decreased, plasma lipids, insulin, and CSF F2-isoprostane concentrations (i.e., a marker of oxidative stress) [91].

Conclusions

AD has been reconceptualized as a potentially preventable illness. Because pharmacological treatments are limited, there is increasing interest in implementing brain-protective lifestyle changes during the preclinical phase of AD. AD biomarkers are useful tools to examine the effects of lifestyle on AD-risk prior to irreversible neuronal loss and subsequent onset of clinical symptoms, at a time when the potential for disease modification is the greatest.

A growing body of literature provides evidence for dietary and nutrient patterns associated with biomarkers of AD risk, indicating that dietary nutrients may modulate AD risk and cognitive performance through their effects on Aβ deposition and associated neuronal injury several years prior to possible symptom onset. Longitudinal studies with larger samples are needed to elucidate the molecular mechanisms through which dietary nutrients modulate AD risk, as well as to provide preliminary data for much-needed clinical trials that will capitalize on the use of biomarkers. While early clinical trial results were largely disappointing, recent trials in MCI patients provided encouraging evidence that dietary interventions can affect AD pathology. Homocysteine-lowering B vitamins attenuated AD-related brain atrophy [21, 22], and a low-saturated-fat/low-glycemic-index diet improved CSF Aβ composition in MCI patients at high risk for AD [91]. Inflammatory markers offer another interesting target for clinical trials, because there is increasing recognition that the low-grade systemic inflammation associated with negative lifestyle habits promotes AD pathology and cognitive decline [92–94].

Future clinical trials will need to take into account the participants’ intellectual and physical activity levels, because these lifestyle factors have been consistently associated with increased physical and mental health throughout life and reduced risk of AD [95–97].

Finally, biochemical individuality is another concept of great relevance [98]. Because we are all genetically and biologically unique, no one treatment or intervention may work for everybody. Personalized medicine will likely be most useful to assess specific individual risk factors for AD, especially modifiable ones, such as nutrition and other lifestyle factors. Addressing these risk factors early may go a long way towards disease delay or prevention, thus greatly reducing the cost of AD, both in terms of health care spending and in human suffering.

In conclusion, current studies support further exploration of dietary behaviors and nutritional status for the prevention of AD. Evidence for an association between poor nutritional status and increased neuronal vulnerability during aging could open up new avenues for future investigations of nutritional interventions decades prior to possible AD symptoms, with immediate impact on national health and clinical practice. A major goal in AD research will be to define dietary and nutritional patterns that promote cognitive health [99, 100•], in the same manner that dietary approaches for hypertension have been derived [101]. We offer that developing brain-healthy eating patterns is key to successful aging. Nutritional recommendations that are focused on supporting brain and cognitive function are sorely needed for the prevention of age-related diseases like AD.