Introduction

Since the early days of nutritional genomics research, there has been interest in the possibility of using the growing knowledge of gene–diet interactions to provide personalized dietary advice based on the individual genotype (Kaput and Raymond 2006). However, scientists have debated when it would be appropriate to translate the outcomes of this research into public health action. A major question in this respect is whether the scientific evidence is sufficiently strong to offer personalized nutritional advice based on genotypic information. This article brings together three voices to discuss this question from a scientific as well as an ethical perspective. The first two are contrasting scientific voices: one asks whether the science is strong enough and argues that the evidence base for translating the outcomes of nutrigenomics research into personalized nutritional advice is immature; the other asks, if not now, when? The third, an ethical commentary, considers how to proceed in the face of such uncertainty. The article provides an ethical context for the EU FP7 project Personalised nutrition: an integrated analysis of opportunities and challenges, known as Food4Me, which is attempting to provide empirical evidence of the utility of a personalized approach to nutritional advice.

Personalized nutrition: is the scientific evidence strong enough?

Recent studies of the interactions between nutrition and the genome have yielded promising results. They have revealed much about the ways in which individual genotypes modulate responses to dietary factors and have provided rich mechanistic insights into how nutrients and other food components regulate gene expression as well as cell and tissue functions (the science of nutrigenomics). In addition, technological advances have driven down the costs and improved the reliability and availability of personal genome testing (PGT). A recent survey of public awareness of and interest in PGT in the United Kingdom found that only 13 % of respondents knew about PGT (Cherkas et al. 2010). However, once it had been explained to them, 93 % of respondents claimed that they would be interested in having a free PGT “to encourage them to adopt a healthier lifestyle if found to be at high genetic risk of a disease” (Cherkas et al. 2010). Yet other researchers have found that low levels of “genetic literacy,” particularly among those most at risk of common complex diseases, are a current barrier to the communication of genotype-based risk information (McBride et al. 2010).

There is compelling evidence that each individual’s health is determined by interactions between his or her fixed genotype and nutrition (and other environmental exposures), together with the effects of stochastic events—a hypothesis conceptualized in the “health pendulum” (Mathers 2002). This phenotypic plasticity is the mechanistic basis on which lifestyle-based interventions aimed at improving health and well-being can be developed. To date, most nutritional interventions have been generic (population level), with limited attempts to stratify or personalize them. Such personalization could be achieved by considering each individual’s dietary, phenotypic, or genotypic characteristics. Since behavior change is key to any health improvement from dietary interventions, the important question is this: Will personalized nutrition produce larger, more appropriate, more sustained, and more cost-effective behavior change, and greater gains in health and well-being, than conventional dietary advice can achieve? Focusing on personalization using genotypic information, the major questions include the following:

  • Is our current understanding of diet–gene–health relationships sufficiently robust as a basis for offering useful genotype-based dietary advice?

  • If people are offered genotype-based dietary advice, are they more likely to change their eating behavior in more healthful ways? (see Bouwman 2009).

Assessment of our understanding of diet–gene–health relationships

There is convincing evidence that the risk of common diet-related diseases, such as cardiovascular disease (CVD), type 2 diabetes, osteoporosis, dementia, and some cancers, is influenced by genetic factors and that carrying specific genetic variants can modulate individual biological responses to nutrients. However, knowledge in this area is fragmentary, and very few diet–gene–health relationships have been tested for causality in human intervention studies (Joost et al. 2007).

For example, higher intakes of oily fish (or fish oil) are associated with lower risk of CVD through mechanisms that may include lowering of plasma triacylglycerol (TAG) concentrations by the long-chain polyunsaturated fatty acids eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA). However, when people are given extra EPA and DHA under controlled conditions, there is substantial inter-individual heterogeneity in the TAG response, with some individuals showing an increased, not decreased, TAG concentration (Madden et al. 2011). To obtain proof-of-principle that some of this inter-individual variation in response is due to genotype, adult participants were prospectively genotyped for apolipoprotein E (APOE) before recruitment into a randomized controlled trial (RCT) in which they were given two doses of fish oil (Caslake et al. 2008). This demonstrated that both APOE genotype and gender determined the TAG response to fish oil supplementation (Caslake et al. 2008).

The literature was comprehensively searched to establish the strength of the evidence for the impact of genotype on the fish oil–CVD risk relationship (Madden et al. 2011). This review revealed the following: (1) there is a distinct lack of information on the factors that determine inter-individual responsiveness to fish oil; (2) few diet–gene–health relationships have been confirmed in independent studies; and (3) there is a paucity of RCTs that used prospective genotyping. In addition, it was concluded that considering single genes (and gene variants) may be too simplistic (Madden et al. 2011). This and other studies demonstrate that though diet–gene–health relationships are undoubtedly important, they remain poorly understood (Joost et al. 2007).

Behavioral responses to genotype-based information

Limited empirical evidence addresses the question: “Will people change their eating behavior in healthier ways if they are offered genotype-based dietary advice?” A recent systematic review investigated the effectiveness of DNA-based advice in changing behavior with respect to diseases for which risk could plausibly be reduced by behavioral change. Only 14 papers were found, which reported the results from seven clinical studies (two papers reported on the same trial) and six analog studies, that is, studies in which participants were asked to imagine their responses to genotypic information (Marteau et al. 2010). Of these, just two studies assessed the effects on dietary behavior.

The first study tested the hypothesis that disclosing evidence of a genetic mutation in individuals with a clinical diagnosis of familial hypercholesterolemia (FH) would reduce the patient’s perception of control over the disease and adherence to risk-reducing behavior, including dietary behavior (Marteau et al. 2004). All participants were patients with a clinical diagnosis of FH; those in whom no mutation was found acted as controls. Six months later, compared with these controls, significantly (p < 0.02) fewer participants in whom a mutation had been found believed that “eating a lower fat diet would reduce my cholesterol level” (Marteau et al. 2004). In contrast, more participants with a mutation believed that “taking medication would reduce my cholesterol level” (p = 0.06) (Marteau et al. 2004). In summary, genetic testing that confirmed the diagnosis of FH seemed to weaken the participants’ belief in the effectiveness of dietary change (Marteau et al. 2004).

The Risk Evaluation and Education for Alzheimer’s Disease (REVEAL) study examined the impact of disclosure of APOE ε4 status on behavior change in the adult offspring of parents with Alzheimer’s disease (AD) (Chao et al. 2008). Because a parent suffered from the disease, all study participants were at higher-than-average AD risk, and carrying the APOE ε4 variant added further to that risk. All REVEAL participants were given a numerical estimate of their AD risk and were divided into three groups: controls, who were not given any genotypic information, and two disclosure groups, whose members were tested for APOE ε4 and told whether their status was positive or negative. One year later, participants were asked about changes in their behavior, including changes in diet and exercise, that would be expected to reduce AD risk (Chao et al. 2008). Similar proportions of positive behavior changes were reported among controls and among participants who were told that they were APOE ε4 negative. However, positive behavior change was reported approximately twice as often among participants who were told that they were APOE ε4 positive (Chao et al. 2008). Nevertheless, this finding should not be interpreted as unequivocal evidence that genotype-based information can motivate positive behavior change. Participants who were APOE ε4 positive had received a higher overall numerical AD risk score, and it may have been the greater risk score, as distinct from the genotypic information, that enhanced the motivation for behavior change (Fanshawe et al. 2008).

In their systematic review, Marteau et al. (2010) drew attention to the weakness of the evidence on the effects of communicating DNA-based information on risk-reducing behavior. They stated, “Claims that receiving DNA-based test results motivates people to change their behavior are not supported by the evidence,” and they called for larger and better-quality RCTs (Marteau et al. 2010).

In summary, the evidence base for translating the outcomes of nutrigenomics research into personalized nutritional advice is immature. In addition to significant gaps in the relevant basic science, there is limited evidence that genotype-based dietary advice will motivate appropriate behavior changes and that interventions based on such advice will be more cost-effective than conventional population-level interventions (Hall et al. 2010). Filling these gaps will require larger, better-designed RCTs.

Personalized nutrition: if not now, when?

It is sometimes asked whether personal genetics is ready for “prime time” (Haga et al. 2003; Khoury 2010). If prime time means that personal genetics should be broadly adopted and reimbursed by insurance or public health services, the answer is probably no. But that does not mean that it is not ready at all: some specific gene–diet interactions should probably be seriously considered by expert committees (see the examples below for the MTHFR and GST genes), and there are others for which the evidence of probable benefit is sufficient for it to be communicated by scientists and health professionals to the public. The goal of personalized nutrition is not to replace the official guidelines but to enhance or modify them for the individual where there is evidence to do so. This is not a new development but a practice as old as the guidelines themselves: overweight people are advised to consume fewer calories than the recommended intake; lactose-intolerant individuals are advised to avoid or limit their intake of fresh dairy products; now, we have the opportunity to consider the evidence from gene–diet interaction studies.

Nutrigenetics is part of a wider debate about personal genetics, which began with the launch of Sciona Ltd. in the United Kingdom in 2001 (Sciona 2001) and continued following the launch of companies such as 23andMe, deCODE, and Navigenics in 2007. The overriding question is one of clinical utility: can the results of nutritional genetics studies be translated into beneficial dietary advice that would not be available without the use of genetic information? To understand what may or may not be possible, it is necessary to keep personalized nutrition in its appropriate context and not to conflate it with clinical genetics, disease prediction, or disease therapy. Nutrigenetics does not use genetic information in the same way as classical genetics does; it does not calculate disease risk from association studies (as 23andMe-type services do) but uses precise information based on specific gene–diet interactions. Often, the genetic variants are functional, which means that they have effects on proteins (such as reduced enzyme activity or altered transporter levels) that have been demonstrated to modify individual responses to dietary components. This important distinction is often overlooked: nutrigenetics operates at the level of genetic influence on biological processes and is not required to provide any information beyond metabolic information (Paynter et al. 2010).
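To make this distinction concrete, a minimal sketch follows contrasting the two uses of genetic information. It is illustrative only: the function names are invented here, the effect sizes are arbitrary, and the heterozygote activity value is an assumed intermediate; only the roughly 35 % activity of the TT genotype (from the MTHFR example discussed below) comes from the literature.

```python
# Schematic contrast, not a real service or clinical tool.

# (1) Association-style risk profiling (23andMe-type services): many small
#     statistical effects aggregated into a single disease-risk score.
def disease_risk_score(risk_allele_counts, effect_sizes):
    """Additive score over per-variant risk-allele counts (0, 1, or 2)."""
    return sum(n * beta for n, beta in zip(risk_allele_counts, effect_sizes))

# (2) Nutrigenetic rule: a single functional variant with a demonstrated
#     effect on a protein, mapped directly to a dietary implication.
#     TT at ~35 % activity is from the MTHFR example below; the CT value
#     is an assumed intermediate for illustration.
RELATIVE_MTHFR_ACTIVITY = {"CC": 1.00, "CT": 0.68, "TT": 0.35}

def may_need_extra_folate(mthfr_c677t: str) -> bool:
    """Flag genotypes whose reduced enzyme activity may warrant more folate."""
    return RELATIVE_MTHFR_ACTIVITY[mthfr_c677t] < 0.5
```

The first function says nothing about diet; the second says nothing about disease risk, only about a metabolic response that diet can address.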

Healthy eating is not a straightforward proposition in the modern world, and expert committees charged with making dietary recommendations must do so in the context of complex and incomplete information. To declare that it is too early to include genetics in nutritional advice is not simply to adopt a wait-and-see attitude: it is to actively recommend that normal healthy people follow conventional nutritional guidelines based on epidemiological and other evidence and set aside the evidence for certain gene–diet interactions. The level of evidence for genetically informed nutritional advice should be assessed by the same standards as traditional nutritional advice, but this does not often happen. Although experts do not deny the evidence of gene–diet interactions, the question is this: is it of a sufficient level to be used now? The apparently cautious “it’s too early” approach amounts to deciding that the evidence supporting the following generic daily recommendations (FSA 2007), encoded for concreteness in the sketch after the list, is of higher quality than any of the “genes and nutrition” evidence:

  • 200 μg folic acid

  • 40 mg vitamin C

  • No more than 6 g salt

  • At least five portions of a variety of fruit and vegetables

  • No more than 11 % of energy from saturated fat
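Written down as data, these targets show what the generic baseline looks like before any personalization. In the sketch below, all names and the checking logic are hypothetical; the figures are those listed above. Personalization would then amount to adjusting such baseline values for the individual.

```python
# The FSA generic daily targets listed above, encoded as simple data.
# All names and the example intake are hypothetical; figures are from the list.
GENERIC_TARGETS = {
    "folic_acid_ug":            (200, "min"),
    "vitamin_c_mg":             (40,  "min"),
    "salt_g":                   (6,   "max"),
    "fruit_veg_portions":       (5,   "min"),
    "saturated_fat_pct_energy": (11,  "max"),
}

def check_intake(intake):
    """Return the nutrients where a day's intake misses the generic targets."""
    flags = []
    for nutrient, (target, kind) in GENERIC_TARGETS.items():
        value = intake[nutrient]
        if (kind == "min" and value < target) or (kind == "max" and value > target):
            flags.append(f"{nutrient}: {value} vs {kind} {target}")
    return flags

# Hypothetical day's intake.
print(check_intake({
    "folic_acid_ug": 150, "vitamin_c_mg": 60, "salt_g": 8,
    "fruit_veg_portions": 3, "saturated_fat_pct_energy": 12,
}))
# ['folic_acid_ug: 150 vs min 200', 'salt_g: 8 vs max 6',
#  'fruit_veg_portions: 3 vs min 5', 'saturated_fat_pct_energy: 12 vs max 11']
```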

Some authors feel that until genetics-based advice has been proven beneficial, the standard guidelines should be followed, and “proven” often means by the gold standard, the randomized clinical trial. Though some studies have reported clinical benefits of lifestyle interventions in diseased or higher-risk individuals—for example, in preventing type 2 diabetes (Perreault et al. 2012)—no single element of the recommendations from the Food Standards Agency (FSA) of the United Kingdom has been proven by clinical trial to prevent or delay disease in healthy people. The evidence for these recommendations has been derived from epidemiological observations, small intervention trials, and clinical trials of biomarkers. The FSA recommends specific limits on salt and saturated fat intake with the aim of preventing hypertension and an imbalanced low-density/high-density lipoprotein profile, which are, among others, risk factors for CVD. The FSA recommendations are ultimately, of course, aimed at disease prevention, but although lowering dietary salt and saturated fat has positive effects on hypertension and lipid profiles, as demonstrated in clinical trials in healthy populations, there are no actual trial data that “prove” a consequent reduction in disease from these dietary interventions (Furberg 2012; Mitka 2012).

The lack of data on ultimate disease causality and prevention should not be a surprise. Nutrition is highly complex, and its effects on long-term health begin even before birth. It is not possible to take a nutritional element in isolation and test its efficacy in disease prevention as if it were a new drug or surgical procedure. The same is true for nutrigenetics, which is sometimes wrongly held to a higher standard than ordinary nutritional advice (e.g., Haga et al. 2003; Wood 2008); this results in disregarding high-quality evidence for several gene–diet interactions. The most widely studied is the interaction among the MTHFR C677T polymorphism, folic acid, and homocysteine. The following has been reliably demonstrated (Homocysteine-Lowering Trialists’ Collaboration 2005):

  • The 677T version of the enzyme has only about 35 % of the activity of the 677C version.

  • A low folate status leads to high homocysteine levels in TT individuals. (For example, 200 μg folic acid per day has been shown in many trials to be insufficient to maintain homocysteine levels below the risk level in this genetic group.)

  • It is accepted that increasing folate intake to 400–600 μg per day will keep homocysteine levels below the risk level in most TT individuals (a rule made concrete in the sketch after this list).

  • There is no reliable evidence of harm, and these levels are well below the advised upper limits.

  • Homocysteine is accepted as an independent risk factor for cardiovascular and other diseases; though causality has not been proven, the evidence is strong (Wald et al. 2006).
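A minimal sketch of how such genotype-conditional advice could be encoded follows. The function and variable names are invented here, and the sketch illustrates the logic rather than providing a clinical tool; the intake figures are those cited in the list above.

```python
# Illustrative only: map an MTHFR C677T genotype to a daily folate target,
# using the figures discussed above (200 ug generic guideline; 400-600 ug
# reported to keep homocysteine below the risk level in most TT carriers).

GENERIC_FOLATE_UG = 200          # generic daily recommendation
TT_FOLATE_RANGE_UG = (400, 600)  # suggested range for TT individuals

def folate_advice(mthfr_c677t: str) -> str:
    """Return a folate intake suggestion for an MTHFR C677T genotype."""
    genotype = mthfr_c677t.upper()
    if genotype == "TT":
        low, high = TT_FOLATE_RANGE_UG
        return f"Consider {low}-{high} ug folic acid per day."
    if genotype in ("CC", "CT"):
        return f"The generic guideline of {GENERIC_FOLATE_UG} ug per day applies."
    raise ValueError(f"Unrecognized genotype: {mthfr_c677t!r}")

print(folate_advice("TT"))  # Consider 400-600 ug folic acid per day.
```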

For a healthcare practitioner, to reject this evidence is to recommend that individuals with two copies of the slow enzyme need only 200 μg folic acid per day despite the fact that their homocysteine levels are likely to remain high for years or decades. It is to decide that chronic high homocysteine is likely to be less harmful than 400–600 μg folic acid per day. Neither position has been proven by clinical trial in healthy individuals to prevent or delay disease, and neither is likely to be: apart from major compliance problems, such a trial would require very large numbers of participants and would have to last many years. More than ten homocysteine-lowering trials have been carried out, and they are often cited as refuting any benefit of homocysteine reduction in primary prevention. However, these were all short-term trials in older people already suffering from (mainly) CVD and taking several medications, and they measured the incidence of further cardiovascular events. None of the trials was carried out in healthy people. The only possible conclusion from these studies is that, over the trial period, there was no apparent benefit in lowering homocysteine in ill people, that is, in secondary prevention. The results are unlikely to be relevant to primary prevention, and there are also good arguments why only tentative conclusions can be drawn from these trials in any case (Wald et al. 2011). Despite this, the majority opinion in the clinical world is that the “results do not support the use of folic acid … as a preventive treatment” (Lonn et al. 2006) and that “randomized trials of vitamin therapy with folate, vitamin B6, vitamin B12 … demonstrated that none … are effective for preventing cardiovascular disease … in the general population” (Tice 2010).

In comparison, various salt-lowering trials produced similarly negative results on disease prevention, but the common medical response is that, since the RCTs were performed in patients with existing heart disease, the results “… although applicable to heart failure patients, lack public health relevance: primary prevention” (Alderman 2010). This is inconsistent: there are no fundamental differences between the homocysteine- and salt-lowering trials, yet the interpretations of their outcomes are radically different.

Another very well-studied gene–diet interaction involves cruciferous vegetables, consumption of which was associated with reduced lung cancer risk in GSTT1- and GSTM1-null individuals, but not in individuals with working copies of both genes (Brennan et al. 2005). Other authors have demonstrated gene–diet-dependent effects on reduced DNA damage (Palli et al. 2004), reduced prostate cancer risk (Steinbrecher et al. 2010), and increased levels of GST alpha (Lampe et al. 2000). The interactions among genotype, cruciferous vegetables, and lung cancer risk have also been confirmed in a systematic analysis (Lam et al. 2009). In a recent review (McCann et al. 2010), the authors assessed the evidence and made the following statement:

It can be concluded from the majority of these analyses that cruciferous vegetables are likely to play an important role in cancer prevention, the strength of which may be dependent to some extent upon exposure to other carcinogens and genotypes for GSTs.

But in their overall conclusion, they wrote:

However, we do not believe that nutrigenetics is a doorway to individualized genotyping for risk assessment and dietary counseling … Finally, it should be noted that regardless of one’s genotype a balanced diet high in fruits, vegetables, and whole grains and low in meat and fats may be beneficial for overall health and well-being and prevention of numerous diseases … the public health message of consumption of a healthy diet should not be influenced by knowledge of one’s genetic makeup.

The McCann et al. (2010) article reviews several gene–diet interactions in addition to those between GSTs and cruciferous vegetables, and not all of the underlying studies gave consistent results; even so, the intended meaning of the final sentence is not completely clear. One interpretation is that, having assessed all the evidence, the authors believe it is still not sufficiently reliable for GST gene–cruciferous vegetable interactions to be incorporated into any sort of nutritional advice communicated to the public. If this is the correct interpretation, the next question is what level of evidence would be sufficient to inform GST-null individuals about the probable increased cancer protection that a particular type of vegetable provides for carriers of their genotype.

In summary, the use of genotypic information in personalized nutrition has been met with some skepticism for various reasons, including exaggerated health claims, mistaken interpretations, genuinely exploitative products, and the difficulty of proving cause and effect (as in any type of nutrition research). Nutrition research and candidate-gene association studies have produced many inconsistent results over the decades. This is not a reflection of the overall quality of the research; rather, it stems from the complexity of the effects of nutrition on long-term health. The situation, though, is improving: genotyping costs have decreased dramatically, and the increasing inclusion of genetics in nutritional studies over the last few years has been one of the factors helping to bring more clarity (Grimaldi 2010).

Nutrigenetics is part of the information that contributes to personalized nutrition as a whole. Where there is supporting evidence, it should be added to other phenotypic information (such as health status, ethnicity, and gender), and genetic evidence should be assessed at the same level as phenotypic evidence. Preliminary studies suggest that including genetic information may be useful in long-term weight loss (Arkadianos et al. 2007), and a recent randomized controlled trial reported that genotype-based personalized dietary advice was better understood and more likely to be followed than general dietary advice (Nielsen and El-Sohemy 2012). There is good evidence of some clinical and personal utility with respect to genotype-based personalized nutrition, and it should be made more widely available so that individuals can make their own decisions.

Can ethics help to solve this problem?

Plea for a precautionary approach

A common ethical approach to situations where there is doubt about the right choice of action is to suggest a cost–benefit analysis. How can such an analysis be applied in the case under discussion? Costs and benefits cannot be calculated in general; they must be estimated for a specific action whose consequences can be foreseen. A cost–benefit analysis in the present case must therefore relate to an understanding of the specific consequences of gene–nutrition or lifestyle–nutrition interactions and of the specific nutritional advice that may influence those consequences.

In many cases, we have no, or only limited, knowledge of the consequences of alternative actions. The precautionary principle has often been used in handling such situations. However, this principle is understood in many ways; it has been widely discussed, and its usefulness has been questioned. Can it be a helpful tool in personalizing nutrition? A common and basic understanding of the principle is that we should exercise caution and avoid actions whose risks we cannot foresee. However, when the risks are unknown, it is difficult to apply the notion of risk avoidance: how can we avoid unknown risks? To address this problem, it has been suggested that it is useful to distinguish among three situations: (1) where the risk level is well known; (2) where there is uncertainty about the level and character of the risks involved; and (3) ignorance, where the risks are unknown (COMEST 2005).

It has rightly been argued that in the case of ignorance, precautions are irrelevant: we cannot know whether something is safe without some experience of risk levels (Wildavsky 1988). Wildavsky argues that, under ignorance, taking small risks followed by stepwise evaluation is a safer course than avoiding risk altogether.

One interpretation of the precautionary principle that may be useful here is that of prudent housekeeping (Boehmer-Christiansen 1994). In prudent budgeting, predictions carry a certain range of uncertainty, and it is good strategy to underestimate income and other benefits and to overestimate expenses and risks within that range. The result is a cautious estimate of the balance between benefits and risks, which also allows steps to be taken to modify that balance by increasing the benefits and diminishing the risks as appropriate. On this interpretation, the precautionary principle is reasonably understood as a complement to, rather than a substitute for, cost–benefit analysis (Dana 2003).
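As a minimal sketch of this prudent-housekeeping rule, suppose the benefits and risks of offering a piece of personalized advice can each be bounded within an uncertainty range; the cautious balance then pairs the lower bound of the benefit with the upper bound of the risk. All names and figures below are hypothetical.

```python
# Precautionary cost-benefit balance: within the stated uncertainty ranges,
# underestimate the benefit and overestimate the risk, and proceed only if
# the cautious balance is still positive. Names and figures are hypothetical.

def cautious_balance(benefit_range, risk_range):
    benefit_low, _ = benefit_range  # underestimate income/benefits
    _, risk_high = risk_range       # overestimate expenses/risks
    return benefit_low - risk_high

# Hypothetical example: benefit estimated between 3 and 8 units, risk
# between 1 and 2 units, on some common utility scale.
balance = cautious_balance(benefit_range=(3.0, 8.0), risk_range=(1.0, 2.0))
if balance > 0:
    print(f"Proceed, cautiously (balance = {balance}).")  # balance = 1.0
else:
    print(f"Hold back; first raise benefits or reduce risks ({balance}).")
```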

Personalized nutritional advice is understood by its recipient as a prediction (based on probability and limited knowledge) that following this advice gives a better chance of improving his or her health than following general advice about healthy eating. The short history of personalized nutrition indicates that the advice offered has often been questionable, either because the knowledge base was too limited or because the advice promised too much or was difficult to understand or apply. The articles that follow in this section of the special issue examine further details related to this discussion.

The above considerations show that to date nutrigenomics has been able not only to create expectations but also to offer important pieces of new knowledge; however, our understanding of the whole picture of diet–gene interactions is still fragmentary. For example, knowledge is limited about the interactions among different genes and about the net effects on health outcomes of multiple gene variants interacting with nutrition. There is good evidence that diet (and other lifestyle exposures) has an impact on epigenetic factors, though how this impact affects health remains poorly understood (Mathers et al. 2010). In addition, the important step from knowledge about a diet–gene interaction to the development of clear advice on changes in nutritional intake or other behavior has proven complex. The precautionary introduction of variables to be used for personalized dietary advice therefore calls for a careful selection of single nucleotide polymorphisms, biomarkers, or other factors with sufficiently strong evidence that appropriate behavioral change is likely to produce positive health effects.

Much remains to be elucidated about the importance of food for well-being in a wide sense and about individual responses to tailor-made dietary advice, and this lack of knowledge creates a complex ethical situation (Görman 2006). It also makes it important to observe carefully the behavioral responses to the outcomes of personal genome testing and of personalized dietary advice based on genotypic information. It may be appropriate to undertake such monitoring initially with only a limited amount of advice, followed by a stepwise evaluation of increasingly complex advice. A precautionary approach should therefore involve adjusting the advice to account for unforeseen behavioral and psychological effects.

Plea for respect for autonomy

Estimations such as risk–benefit analyses are carried out by experts (scientists, medical doctors, or other authorities) for the benefit of the recipients of the medical advice. Giving advice based upon these analyses may thus be perceived as a way of telling people what they ought to do. This may involve a certain amount of paternalism, expressing an attitude of superiority over others. From an ethical point of view, this is a theoretically interesting but also questionable way of dealing with a situation in which advice is given to others to support them in improving their lives. A great many efforts in ethics as well as in applied psychology aim at suggesting alternative approaches. Three examples can illustrate this point.

In the sphere of ethical politics, value-driven documents related to interventions in the health field strongly emphasize respect for the integrity, freedom, and dignity of all human beings as a central value. One influential example is the Convention on Human Rights and Biomedicine, issued by the Council of Europe. The explanatory report of this convention makes it clear that paternalism may conflict with this value and that one ambition of the convention is to restrain paternalistic approaches (Council of Europe 1997).

In ethical theory, similar questions are often brought up by means of the concept of autonomy, that is, the capacity to be one’s own person. In the influential moral psychology of Immanuel Kant, motivation, individual freedom, and autonomy belong closely together. Kant understood individual freedom as a situation in which a person is bound only by their own will, not by the will of someone else. Such freedom is crucial for individual autonomy (Kant 2003; Schneewind 1998). In modern medical ethics, respect for individual autonomy, understood as the right of each person to decide for themselves, is commonly considered an important value (Beauchamp and Childress 2008).

In psychology, the influential ideas of Carl Rogers and his followers are based upon an understanding of the individual human being as a self-structuring, value-driven organism. The person-centered therapy derived from his theories focuses on finding and mirroring this capacity in the individual instead of trying to impose the thoughts of others. As Rogers wrote, “A person cannot teach another person directly; a person can only facilitate another’s learning” (Rogers 1951). Today, person-centered therapy is a widely used approach in psychotherapy, often described as characterized by genuineness, unconditional positive regard, and empathetic understanding. Several studies indicate that a person-centered approach to behavioral change is effective and beneficial (Cooper et al. 2010).

With respect to the personalization of nutrition, it is evident that values such as integrity and autonomy are relevant, since the goal of dietary advice is a change in lifestyle. These values may be respected by developing methods of nutritional advice that focus on supporting the autonomous choice of each person. Advice can help a person understand the causality and the benefits of adjusting lifestyle and nutrition to individual characteristics. However, the person who receives the advice is the only one who can integrate such a choice within their personal value system. The significance of autonomy, trust, and trustworthiness dealt with here is discussed further in the following articles of this special issue.

Conclusion

There is convincing evidence that common diet-related diseases are influenced by genetic factors, but knowledge in this area is fragmentary, and few diet–gene–health relationships have been tested for causality. The evidence that genotype-based dietary advice will motivate appropriate behavior changes is also limited. However, traditional nutritional advice is not always based upon demonstrated causality either; it, too, rests partly on observational epidemiological studies. In several specific cases of gene–diet interaction, it may be more beneficial for identifiable groups of individuals with specific genotypes to follow personalized nutritional advice rather than general dietary recommendations. From an ethical perspective, a precautionary approach is to be recommended, in which personalized dietary advice is offered only for variables with sufficiently strong evidence of health effects, followed by a stepwise evaluation of unforeseen behavioral and psychological effects. When offering such advice, paternalism should be restrained, and the focus should be on supporting the autonomous choice of each individual.