Introduction

A steady shift towards integrating evidence-based tools to enhance traditional methods of dietary assessment is currently emerging in nutrition science. This is noteworthy because much of our understanding of dietary associations with Alzheimer’s disease (AD) and other forms of late onset dementia (LOD) has derived from self-administered dietary assessments, which are prone to biases and reporting errors.

Specifically, the integration of -omics technologies into the evaluation of dietary exposures heralds a new era of more robust research, with the potential to improve reliability and offer novel insights into pre-disease indicators and preventive targets in cognitive aging and dementia.

Recent years have seen a plethora of publications from non-pharmacological interventional studies, such as the FINGER randomized controlled trial (RCT), providing evidence that diet, as part of a multi-domain lifestyle intervention, can significantly mitigate decline in cognitive performance and potentially delay the onset or progression of AD and other types of dementia (1). These promising results were based on the simultaneous management of several vascular and lifestyle-related risk factors: the intervention group received a multidomain programme of diet, exercise, cognitive training and vascular risk monitoring, while the control group received general health advice. After a 2-year period, the intervention improved cognitive performance in at-risk but cognitively unimpaired or mildly impaired individuals (2). This was pivotal because it affirmed interest in lifestyle modifications to reduce disease prevalence and made personalised strategies and optimal timing points of discussion.

The increasing awareness that pathological changes in AD and other LODs develop many years prior to clinical disease onset, together with the results of several observational and interventional studies, has led the field to acknowledge that the pre-clinical or early clinical stages are the optimal time points for intervention. Attention has therefore shifted towards health-promoting behaviours such as a healthy diet, maintaining an optimal weight, physical and mental activity and social interaction, in addition to potential pharmacological therapies, if and when novel disease-modifying medicines of proven efficacy and cost-effectiveness become available.

Diet and nutrition are viable targets for strategies aimed at preserving cognitive health in older adults and have become a focus in dementia related studies. Moreover, there has been a gradual shift from single nutrient analyses towards dietary pattern analyses, reflecting trends in nutritional epidemiology, in which synergistic effects of food combinations and possible nutrient interactions are deemed more informative (3).

Despite accruing evidence from a wealth of epidemiological studies showing that adherence to healthy dietary patterns, such as the Mediterranean, Nordic, DASH, MIND or anti-inflammatory diets, may confer neuroprotective effects, and more recently that the ketogenic diet may alter brain metabolism (4–7), these findings have not translated uniformly into dietary guidelines for improving cognitive health and reducing disease burden in older adults, owing to inconsistencies in the literature.

Figure 1

From urine collection to dietary assessment pipeline

1. Urine collection with timepoint
2. Sample analysis
3. Generation of one metabolic profile per urine sample
4. Mathematical modelling generates a score classifying the dietary profile in relation to adherence to dietary recommendations

In order to understand the precise effects of dietary intake on cognitive function and their mechanisms of action, dietary assessment methodologies must measure dietary intake as accurately as possible. Current methods of dietary assessment typically collect self-reported data through food-frequency questionnaires (FFQs), 24-hour recalls or food diaries, all of which depend on subject recall and cognitive functioning. All self-reported methodologies have considerable scope for measurement error and inherent bias, with under-reporting biased towards unhealthy foods and dietary energy intake and over-reporting towards healthier foods (8).

There is also potential error in estimating portion sizes, and only a limited number of foods and dishes have been directly analysed. Reporting is known to deteriorate further in individuals with obesity and is likely to include significant inaccuracies in older populations. The prevalence of misreporting has been estimated at between 30 and 88% in epidemiological surveys (9). FFQs are commonly applied in large-scale studies, but since they are tailored to suit the population they serve, there is wide variation in the number and range of items appearing on food lists.

Another limiting factor of traditional dietary assessment models is the limited number of nutrients that have been measured accurately in food. Evidence suggests that bioactive molecules in food, such as polyphenols, may play a role, but these molecules are not measured in most nutritional data sets. This limits our understanding of the relationship between the chemical composition of food and cognitive outcomes.

Reliability of dietary assessment is further compromised by the analytical methods applied. A priori methodologies use constructed scores based on an underlying hypothesis and dietary guidelines, which do not reflect total dietary intake. A posteriori methods apply data reduction techniques, such as principal component analysis, factor analysis or cluster analysis, to categorise intakes on the basis of their intercorrelations; this provides insight into shared characteristics within a population but has limited comparability and reproducibility in other population samples. Both approaches are limited by pre-defined selections of food and nutrient groupings, resulting in varying interpretations of dietary exposures and disease risk.
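The a posteriori approach can be sketched as follows: standardised food-group intakes from an FFQ are submitted to principal component analysis, and each component's loadings define a data-driven "dietary pattern". The food groups and intake data below are hypothetical, chosen only to show the mechanics.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Hypothetical FFQ data: 200 participants x 8 food-group intakes (servings/week)
foods = ["vegetables", "fruit", "fish", "wholegrains",
         "red_meat", "processed_meat", "sweets", "refined_grains"]
X = rng.poisson(lam=5, size=(200, len(foods))).astype(float)

# A posteriori approach: standardise intakes, then extract principal components
Z = StandardScaler().fit_transform(X)
pca = PCA(n_components=2).fit(Z)

# Each component's loadings define a data-driven "dietary pattern";
# a participant's score on a component reflects adherence to that pattern
pattern_scores = pca.transform(Z)
print(pattern_scores.shape)
```

Because the loadings are estimated from the correlations within one sample, the derived patterns need not replicate in another population, which is the comparability problem noted above.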

In recent years, the advent of online dietary assessment tools has made data collection and analysis much easier, although it has not mitigated the biases outlined above. Technological competence in older adults has also recently been noted as a possible confounder for reporting accuracy: a feasibility study on the use of online dietary recalls among older adults indicated that participants who completed multiple recalls reported higher self-confidence with technology and a higher technology readiness score than those who did not complete any recalls (10). There is still a need for evidence-based tools to be evaluated for validity and reliability in study outcomes, and for clarity on the mechanisms that might be protective against dementia. We therefore propose a focus on applying metabolomics as a validation tool and framework for investigating the immediate or cumulative effects of diet on cognitive status and decline.

Metabolomics as a validation tool

The implementation of high-throughput -omics technologies in dietary assessments holds promise for evidence-based data by providing objective measures of dietary intake through targeted or untargeted analyses, thus mitigating the risks of bias and subjectivity inherent in self-administered data collection methods.

Such technologies include Nuclear Magnetic Resonance (NMR) spectroscopy and Mass Spectrometry (MS)-based techniques coupled with Liquid Chromatography (LC-MS) or Gas Chromatography (GC-MS) for improved separation.

A novel approach to assessing dietary intake against metabolic profiles has been the adoption of NMR spectroscopy in urinary analyses to detect concentrations of small-molecule metabolites, which reflect “actual” rather than “estimated” measures of food intake. In this way, distinguishable metabolites can be used to validate the dietary intakes and dietary patterns reported in FFQs (8).

For example, higher reported fish or protein intake in an FFQ can be validated by protein-related metabolites in the metabolomic profile, or by other metabolites such as proline betaine, a marker of citrus fruit intake (11). Established dietary biomarkers, such as urinary sodium or nitrogen balance, can track intake of specific nutrients; however, metabolic profiling goes further by capturing the metabolic response to overall dietary intake following digestion, absorption and metabolism, providing insight into functional relationships between diet and health outcomes (12).
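A minimal sketch of this kind of validation, using entirely hypothetical paired data: FFQ-reported citrus intake is compared against urinary proline betaine concentration with a rank correlation, the sort of agreement statistic typically reported in biomarker validation studies.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(2)

# Hypothetical paired data for 150 participants: FFQ-reported citrus intake
# (servings/week) and urinary proline betaine concentration (arbitrary units)
ffq_citrus = rng.integers(0, 15, size=150).astype(float)
proline_betaine = 2.0 * ffq_citrus + rng.normal(scale=3.0, size=150)

# Rank correlation between self-report and the objective biomarker;
# reasonable agreement supports the validity of the FFQ item
rho, p = spearmanr(ffq_citrus, proline_betaine)
print(rho, p)
```

A strong, significant correlation would support the FFQ item; a weak one would flag it as unreliable for that food group.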

Table 1 NMR Metabolomics validation studies of self-reported dietary pattern intake

In addition to the feasibility of using metabolic profiles to validate dietary patterns, merging -omics with dietary assessments provides an opportunity to monitor and objectively assess dietary intake against healthy eating targets using urine composition, thus enabling the quantification and monitoring of the potential effects of adherence to, or changes in, dietary patterns in response to risk reduction strategies.

Several nutrition studies have explored how metabolomics can establish accurate associations between diet and disease risk to predict health outcomes. In a randomised controlled trial, four dietary patterns administered under controlled feeding conditions revealed diet-discriminatory metabolomic profiles associated with different degrees of non-communicable disease risk, based on compliance with WHO-recommended healthy diets (8). The model was validated using internal and external cohort data with 24-hour recalls, and the dietary scores predicted from the urinary metabolic profiles associated with the reported dietary profiles.

Applying the same methodology, a recent study demonstrated agreement between urinary metabolic biomarkers and self-reported data: a model of 46 urinary metabolites, paired with 24-hour dietary recalls from 1,848 US individuals, accurately differentiated between healthy and unhealthy dietary patterns (13). Additionally, in an RCT of subjects with hypertension, the presence of specific metabolic markers suggested a mediating response to pre-disease markers in the gut microbiome (14).

Furthermore, an integrated metabolomic approach across platforms (NMR, LC-MS and GC-MS) may allow accurate assessment of thousands of food intake-associated molecules that are not captured in current food tables.

The integration of nutritional metabolomics into dementia research invites new possibilities; improved reliability in dietary assessments might prove valuable but may only be the tip of the iceberg. If the ‘functional nutriome’, a term describing “chemically-defined diet-derived molecular species” (13) and the expression of the metabolic phenotype, affects diet and disease risk, this has relevance in the context of AD risk and precision nutrition. We know that multiple processes leading to cognitive decline and dementia begin many years prior to clinical onset; determining whether specific metabolomic markers present in the earliest stage of cognitive decline can be observed in the food-derived metabolome would be an important step towards identifying predictive signatures of disease risk for earlier diagnosis of AD and other LODs. Furthermore, since metabolite profiles can differentiate dietary patterns, diet can in theory be interrogated without dependence on self-reported dietary assessments.

Biofluid used for metabolomic dietary assessment

The vast majority of current metabolomics-based dietary assessments use either plasma or urine; however, evidence on the comparability of the two biofluids in relation to diet is limited. For urine, protocols are beginning to emerge that give guidance on the use of spot samples, and evidence suggests that first-void urine gives the most relevant information (15). Moreover, the application of metabolomics to measuring the molecular profile of food allows molecules to be traced from consumption to their metabolic endpoints.

However, a major limitation of current metabolomic dietary assessments is that carbohydrate intake cannot be assessed fully, owing to the lack of relevant biomarkers. Given that carbohydrates are a major source of energy in most diets, this is an important limitation and an area of active research. Another limitation relates to the timeframe captured by the biofluid sample, since first-void urine has been suggested to reflect only the previous day’s dietary intake (16).

Dementia-related metabolomic studies

The National Institute on Aging-Alzheimer’s Association (NIA-AA) research guidelines for AD and for cognitive decline due to AD pathology emphasize the need for the implementation of biochemical markers and validation measures to unify findings across diverse methodologies (17). The revised NIA-AA Research Framework defines biomarker-based AD criteria for research purposes according to abnormal accumulation of amyloid and tau and neurodegeneration, the AT(N) scheme (18).

The advancement of AD biomarkers for determining AT(N) status in vivo has gained momentum in recent years, with positron emission tomography (PET), cerebrospinal fluid (CSF), plasma-based assays and magnetic resonance imaging (MRI) studies serving as core indicators of disease pathology (19). Moreover, metabolites in different biofluids (serum, plasma, CSF, urine) are proving to be promising indicators of alterations in lipids, amino acids, hormones and other circulating metabolites and of their potential associations with changes in cognitive performance (20, 21), and can serve as precision medicine tools contributing to the classification of patients into cognitive status subgroups based on metabolite signatures (22).

However, the literature lacks examples of diet-dementia studies and reference models applying nutritional metabolomics to associations between cognition and dietary patterns. Results from one case-control study linked a baseline serum signature of 22 metabolites with subsequent cognitive decline over 12 years and suggested that specific foods (coffee, cocoa and fish) may be protective (23). Despite the novelty of these findings, the study was not validated in other cohorts, so reproducibility remains unclear, delineating the need for additional high-quality studies that apply complementary metabolomic platforms and approaches to identify predictive signatures of AD risk and preventive targets in cognitive decline and dementia.

To conclude, the emergence of improved measurements of ageing and dietary biomarkers represents a potentially exciting new development in dementia research. The enrichment of dietary assessments through the application of novel -omics technologies provides an opportunity for greater confidence and reliability in self-reported measures, while enabling a better understanding of metabolomic responses in relation to dietary and cognitive associations. As Alzheimer’s disease and other late onset dementia forms are multi-faceted diseases, evidence-based and multi-pronged approaches are required to disentangle the respective roles of modifiable factors in disease development. It is our view that future research aimed at exploring the role of diet in brain health and dementia prevention should leverage emerging innovative high-throughput technologies, such as metabolomics, that may more sensitively and accurately inform on the functional nutriome.