Whilst, as mentioned above, many of the dietary requirements to support bone health in the athlete are likely to be largely the same as those supporting bone health in the general population, there are some dietary/nutritional challenges specific to the athlete. The remainder of this review will focus upon what we consider to be the most pertinent, namely: energy availability, low carbohydrate availability, protein intake, vitamin D intake and dermal calcium and sodium losses. The review will also briefly cover the effects of feeding around exercise on bone metabolism.
Energy Availability
Energy availability can be described as the amount of ingested energy remaining to support basic bodily functions and physiological processes, including growth, immune function, locomotion and thermoregulation, once the energy needed for exercise has been utilised [25]. For a good overview of the myriad effects of low energy availability in the athlete, we direct the reader to the recent review by Logue et al. [26]; herein, we focus specifically on the potential effects of low energy availability on bone. One of the major problems in identifying athlete populations at risk of low energy availability, and in establishing causal links between low energy availability and bone health, is the significant difficulty of collecting accurate data on energy intake and energy expenditure (particularly during more intermittent types of exercise) [27].
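Operationally, energy availability is calculated as dietary energy intake minus exercise energy expenditure, divided by lean body mass, which yields the kcal·kgLBM−1·day−1 values used throughout this section. The sketch below is a minimal illustration of this calculation; the athlete's values are hypothetical.

```python
def energy_availability(energy_intake_kcal: float,
                        exercise_energy_expenditure_kcal: float,
                        lean_body_mass_kg: float) -> float:
    """Energy availability in kcal per kg of lean body mass (LBM) per day."""
    return (energy_intake_kcal - exercise_energy_expenditure_kcal) / lean_body_mass_kg

# Hypothetical example: a runner with 52 kg of LBM consuming 2600 kcal on a
# day with 900 kcal of exercise energy expenditure.
ea = energy_availability(2600, 900, 52)
print(f"Energy availability: {ea:.1f} kcal/kgLBM/day")  # ~32.7, below the
# ~45 kcal/kgLBM/day generally taken to represent energy balance
```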
The low energy availabilities experienced by some athletes can have adverse effects on bone [28], including acute bony injuries and longer-term reductions in bone mass and strength. Many highly active individuals, particularly elite and recreational endurance athletes, might have difficulty matching their dietary energy intake to their exercise energy expenditure, resulting in low energy availability [29, 30]. It is clear that this issue can affect male as well as female athletes [31].
Ihle and Loucks [32] were among the first to directly investigate the effects of low energy availability on bone metabolism in healthy young women, using both dietary manipulation and exercise to induce three reduced energy availabilities (30, 20 and 10 kcal·kgLBM−1·day−1) in an independent-groups design, compared with an energy-balanced control at 45 kcal per kilogram of lean body mass (LBM) per day (kcal·kgLBM−1·day−1). The more moderate restrictions of energy availability (30 and 20 kcal·kgLBM−1·day−1) suppressed bone formation [as determined by osteocalcin (OC) and carboxy-terminal propeptide of type 1 procollagen (P1CP) concentrations] without affecting bone resorption [as determined by N-terminal telopeptide (NTX) concentrations]: OC concentrations were reduced by 0.9 ± 0.3 ng·mL−1 at 30 kcal·kgLBM−1·day−1 and 2.4 ± 0.5 ng·mL−1 at 20 kcal·kgLBM−1·day−1, and P1CP concentrations were reduced by 16 ± 8 ng·mL−1 at 30 kcal·kgLBM−1·day−1 and 28 ± 8 ng·mL−1 at 20 kcal·kgLBM−1·day−1. The most severe reduction in energy availability (10 kcal·kgLBM−1·day−1) produced the dual effect of reducing bone formation (OC −2.3 ± 0.5 ng·mL−1; P1CP −48 ± 13 ng·mL−1) and increasing bone resorption (NTX +17 ± 4 nM BCE/mM Cr). Although the relevance of some of these markers of bone metabolism has since been questioned (they would not be considered the optimal markers of bone resorption and formation to use today [33]), this paper has been instrumental in raising awareness of the potential problems for bone when energy availability is low.
It is common for athletes to experience low energy availabilities of a similar order of magnitude to those used by Ihle and Loucks [32]. Indeed, Thong et al. [34] reported that amenorrhoeic athletes have energy availabilities of ~16 kcal·kgLBM−1·day−1, which formed part of the rationale for the recent studies conducted by our research group [35, 36]. In the first of these studies [35], reducing energy availability to 15 kcal·kgLBM−1·day−1 over 5 days decreased bone formation [as determined by total procollagen type 1 N-terminal propeptide (P1NP)] and increased bone resorption [as determined by C-terminal telopeptide (β-CTX)] in women, but not in men. That said, examination of the individual data showed that some men did respond to low energy availability with a decrease in bone formation. Whilst in no way conclusive, this raises the possibility that low energy availability also decreases bone formation in men, but that a lower level of energy availability is required to produce this response than in women. This would be an interesting avenue for future research.
One of the issues with examining the effects of reduced energy availability on bone metabolism in athletes and athletic populations in the laboratory is that low energy availability is usually achieved via both a reduction in dietary intake and an increase in exercise energy expenditure. Whilst this is probably ecologically relevant, it does not allow us to determine whether the effect of low energy availability on bone results more from dietary restriction or from high exercise energy expenditure (or whether the route makes no difference). Recently, we examined the effects of 3 days of low energy availability, again at 15 kcal·kgLBM−1·day−1, achieved by either diet or exercise alone, on bone turnover markers in active, eumenorrhoeic women [36]. Low energy availability achieved through dietary energy restriction decreased bone formation, with no concomitant change in bone resorption, whereas low energy availability achieved through exercise alone did not significantly alter bone metabolism. Taken together, these results might suggest a short-term bone-protective effect of the mechanical loading induced by exercise, even when that exercise results in low energy availability. They also underline the need for the athlete to focus on adequate dietary intake during hard training periods.
Given the potential for low energy availability to negatively influence the short-term responses of bone, it seems sensible to suggest that maintaining this state over longer periods would have more serious consequences. This raises an important, but as yet unanswered, question: is it the magnitude of the reduction in energy availability that matters (i.e. is there a threshold below which bone is negatively affected), or is it continuous low energy availability over time that harms bone health? Certainly, bone metabolism seems to recover relatively quickly from short-term low energy availability [37], whilst several cross-sectional studies have shown that those maintaining low energy availability over time have lower bone mass, poorer bone structure and/or an altered bone metabolic profile compared with those who do not experience low energy availability [11, 38,39,40,41]. Added to this is the evidence from the many studies conducted since 2007 relating to the female athlete triad [25, 42]. More recently, the same research group has suggested the potential for a similar syndrome in male athletes (referred to as the male athlete triad; see Tenforde et al. [43]), which mirrors the suggestion, made within the relative energy deficiency in sport (RED-S) paradigm [44, 45], that impaired bone health occurs as a result of low energy availability. Whilst further discussion of the male athlete triad and RED-S is vitally important and would be highly relevant herein, these topics are covered more extensively in another article within this supplement.
Whilst it might seem sensible to advise the athlete that maintaining an energy availability of 45 kcal·kgLBM−1·day−1 over time is necessary to optimise bone health and protect against bony injuries, this is probably an unrealistic target for many athletes. Certainly, it seems unlikely that elite endurance athletes (male or female) would be able to attain these levels of energy availability, given the high energy expenditures induced by training and the limited time for refuelling that their demanding training schedules allow. Another complication is that endurance athletes might be directly opposed to maintaining a balanced energy intake, since many consider an energy deficit essential to drive the endurance phenotype. Taken together, these points highlight the difficulty of maintaining balanced energy availability for the promotion of bone health in the endurance athlete when set against the competing interest of optimising sporting performance. As such, further research is needed to identify whether there is a means to maintain bone health without compromising the training practices that optimise endurance performance. One possibility might be to periodise low energy availability into the training cycle, developing the endurance phenotype without the need for constantly low energy availability, an approach recently suggested by Stellingwerff [46].
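To illustrate the scale of the challenge, the short sketch below rearranges the energy availability calculation to ask what daily energy intake a 45 kcal·kgLBM−1·day−1 target would demand; the lean body mass and exercise energy expenditure are hypothetical values chosen purely for illustration.

```python
# Hypothetical example: the energy intake needed to hit a 45 kcal/kgLBM/day
# target on a heavy training day. Both input values are assumptions.
lean_body_mass_kg = 55            # assumed LBM of an endurance athlete
exercise_expenditure_kcal = 1500  # assumed daily exercise energy expenditure
target_ea = 45                    # kcal/kgLBM/day, taken as energy balance

required_intake = target_ea * lean_body_mass_kg + exercise_expenditure_kcal
print(f"Required intake: {required_intake:.0f} kcal/day")  # 3975 kcal/day
```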
Further research is also required to tease out the nuances of the effects of energy and nutrient availability on bone. In the laboratory, energy intake is often limited by determining habitual dietary energy intake and then cutting this intake by a certain percentage. The issue with this approach is that nutrient intake is reduced by the same relative amount, which raises the question of whether the effects on bone are wholly energy availability dependent or whether the concomitant reduction in the availability of carbohydrate, protein, calcium, vitamin D and other micronutrients also contributes to the negative impact on bone. In addition, there might be an interaction between elements of the female athlete triad and certain nutrients that could exacerbate the effects on bone. For example, iron deficiency might directly interact with reduced energy availability to further disrupt thyroid function and to suppress anabolic factors for bone formation, as recently postulated by Petkus et al. [47].
Low Carbohydrate Availability
There is evidence to suggest that some athletes (particularly endurance athletes) might benefit from either lower carbohydrate diets or low-carbohydrate/high-fat diets in terms of their performance, in addition to the proposed benefits for body composition [48, 49]. This remains a matter of some contention, however, given that carbohydrate has historically provided the largest contribution to energy intake in the athlete's diet and that low-carbohydrate diets could present a risk of a low energy availability state. Whilst no studies have directly examined the effects of low carbohydrate availability on bone health parameters in athletes, carbohydrate feeding has been shown to reduce bone turnover [50]. Bjarnason et al. [50] reported a reduction of around 50% in bone resorption marker (β-CTX) concentrations following the administration of a standard 75-g oral glucose tolerance test. Similarly, the provision of carbohydrate has been shown to attenuate the bone resorption response to acute exercise in athletes involved in an 8-day overloaded endurance training trial [51]. Sale et al. [52] also showed a modest post-exercise reduction in P1NP and β-CTX with carbohydrate feeding immediately before, during and immediately after a 120-min treadmill run in recreationally active individuals.
There is some more direct evidence, albeit from animal models in which low carbohydrate intake was combined with high fat intake, to suggest that following a low-carbohydrate diet would negatively affect bone health [53]. Bielohuby et al. [53] measured bone growth, BMD and bone turnover in growing rats fed for 4 weeks on either normal chow (9% fat, 33% protein and 58% carbohydrate) or one of two low-carbohydrate/high-fat diets (1: 66% fat, 33% protein and 1% carbohydrate; 2: 94.5% fat, 4.2% protein and 1.3% carbohydrate). Longitudinal growth, BMD and bone mechanical properties were all impaired by both low-carbohydrate/high-fat diets, which the authors suggested was potentially mediated by the observed reductions in insulin-like growth factor 1 (IGF-1) levels. Bone formation markers and the expression of transcription factors influencing osteoblastogenesis were also reduced on the low-carbohydrate/high-fat diets, which the authors suggested might indicate a lower rate of mesenchymal stem cell differentiation into osteoblasts. Conversely, in humans, albeit osteoarthritis patients and not athletes, there was no effect on bone turnover (as assessed by urinary N-telopeptide and bone-specific alkaline phosphatase concentrations) when patients were fed less than 20 g of carbohydrate per day for 1 month and then less than 40 g of carbohydrate per day for the next 2 months [54].
As such, there is some suggestion that following a low-carbohydrate diet, whether acutely, chronically or even periodically, might negatively influence the athlete's bone health, but this is by no means certain. Future research is required to determine whether low-carbohydrate dietary practices negatively impact the bone health of athletes in the longer term.
Protein Intake
Athletes are often recommended to consume more protein than the general population, in order to support the additional demands of athletic training. The recommendation for athletes is to consume between 1.2 and 1.6 g·kgBW−1·day−1, although under certain circumstances this might increase to 2.2 g·kgBW−1·day−1 [55]; both are higher than the 0.8 g·kgBW−1·day−1 recommended for the general population (a simple illustration of these amounts in absolute terms is sketched below). This creates a potential conflict, as there is a long-held belief that higher protein intakes may negatively influence bone health [56, 57], a topic recently covered in detail by Dolan and Sale [58]; herein we summarise the salient points. The 'acid-ash hypothesis' [59] suggests that animal proteins are acidic (essentially having a high potential renal acid load) and, as such, provide a significant challenge to the maintenance of acid–base balance by disrupting the body's pH, which is critical to homeostasis. The theory suggests that, in order to protect the homeostatic state, the body increases the availability of alkaline minerals, such as calcium, most of which are stored within bone tissue. Indeed, around 99% of the calcium stored within the body is held in the bone, so any requirement for the release of calcium into the circulation to counteract increased acidity is likely to result in the resorption of bone [59]. The calcium released from bone to counteract a high potential renal acid load is also associated with increased urinary calcium losses, along with lower BMD and an increased rate of bone loss [60]. Taken together, these findings would suggest that, under the acid-ash hypothesis, an athlete consuming a high (particularly animal) protein diet would risk demineralisation of the bone over the longer term, with potential adverse effects on bone health.
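Before weighing this hypothesis, the sketch below puts the recommendations cited above into absolute terms, as flagged earlier in this paragraph; the body mass is a hypothetical value and the per-kilogram ranges are those given in the text.

```python
# Hypothetical example: daily protein targets for a 70 kg athlete, using the
# g/kg body weight recommendations cited above.
body_mass_kg = 70  # assumed body mass

general = 0.8 * body_mass_kg        # general population: 56 g/day
athlete_low = 1.2 * body_mass_kg    # athlete lower bound: 84 g/day
athlete_high = 1.6 * body_mass_kg   # athlete upper bound: 112 g/day
special_cases = 2.2 * body_mass_kg  # certain circumstances: 154 g/day

print(f"General population: {general:.0f} g/day")
print(f"Athlete range: {athlete_low:.0f}-{athlete_high:.0f} g/day")
print(f"Certain circumstances: {special_cases:.0f} g/day")
```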
Taken alone, however, this theory does not provide a fully balanced account of the potential influences of a high protein intake on bone. The main negative effect proposed by the acid-ash hypothesis relies upon the assumption that the calcium used to neutralise the high potential renal acid load resulting from animal protein consumption, and any excess calcium subsequently excreted in the urine, comes from the bone. This might not be the case: Kerstetter et al. [61] have shown that higher protein intakes increase the amount of calcium absorbed from foods, so the increased urinary calcium seen with high animal protein intakes might well reflect this increased calcium availability instead. A further consideration is that dietary acid load could just as easily be influenced by a reduction in the intake of alkaline foods, such as fruits and vegetables, as by an increase in the intake of acidic foods, such as animal proteins. This would compound the issue, given that alkaline foods are also rich in a wide range of micro- and phyto-nutrients that are important for bone health [21]. It is therefore possible that the poorer bone outcomes reported in those consuming an acidic diet [60] were not due to high protein intake, but rather to a shortage of nutrient-rich fruits and vegetables. This lends further support to the point made in Sect. 2 that athletes should consume fruits and particularly green leafy vegetables to support their bone health.
It is equally important to consider the possibility that protein is, in fact, beneficial rather than harmful to bone (for a review, see Dolan and Sale [58]). Protein is an important constituent of the structural matrix of bone [63], making up ~50% of bone tissue by volume and about a third by mass [62]. As such, athletes need to consume sufficient protein to support the increased rate of bone turnover caused by athletic training. Additionally, protein ingestion increases the production of a number of hormones and growth factors, such as IGF-1, that are involved in the formation of bone. Of further relevance for the athlete is the fact that higher protein intakes also support the development of muscle mass and function [64]; the associated increases in muscular force would likely act upon the bone to enhance bone mass and strength [65].
On the balance of the available evidence, it seems unlikely that higher animal protein intakes, in the amounts recommended to athletes, are harmful to bone health. This is evidenced by the results of a number of studies (albeit not in athletes per se) that have been statistically combined in high-quality meta-analyses (summarised by Rizzoli et al. [66]). It might, however, be sensible to recommend that athletes maintain adequate calcium intake during periods of higher protein consumption to guard against any negative effects on the bone. A small positive effect of protein on BMD and fracture risk has been identified, suggesting that the protein intakes of athletes, which usually exceed the recommended daily allowance, might ultimately benefit the bone, although this requires further specific research.
Vitamin D Intake
Numerous studies over the last 5–10 years have identified athlete groups with deficient or insufficient levels of circulating vitamin D [67], although the specific definitions of vitamin D deficiency and insufficiency have been debated. Whilst there is broad agreement that vitamin D deficiency is defined as a serum 25-hydroxyvitamin D [25(OH)D] level below 25 nmol·L−1 [68, 69], there is no consensus as to what constitutes insufficiency or, indeed, optimal vitamin D status. The Institute of Medicine report [70] on dietary reference intakes for vitamin D suggested that 25(OH)D levels of 40 nmol·L−1 were sufficient to cover the requirements of ~50% of the US population, with levels of 50 nmol·L−1 covering the requirements of at least 97.5%.
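A practical complication when comparing studies is that serum 25(OH)D is reported in two different units, nmol·L−1 and ng·mL−1 (1 ng·mL−1 ≈ 2.5 nmol·L−1). The minimal helper below converts between them for the thresholds discussed in this section, using the standard conversion factor for 25(OH)D.

```python
NMOL_PER_NG = 2.496  # standard conversion factor for 25(OH)D

def ng_ml_to_nmol_l(ng_ml: float) -> float:
    """Convert a serum 25(OH)D concentration from ng/mL to nmol/L."""
    return ng_ml * NMOL_PER_NG

# The thresholds discussed above, expressed in both units:
for ng_ml in (10, 16, 20):  # ~25 (deficiency), ~40 and ~50 nmol/L
    print(f"{ng_ml} ng/mL = {ng_ml_to_nmol_l(ng_ml):.0f} nmol/L")
```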
Given the well-established role of vitamin D in calcium and phosphorus regulation in the body, and thus its importance for bone, it is highly likely that athletes who are deficient in vitamin D would be at a greater risk of low bone mass and bone injuries [71], such as stress fractures.
Whilst the causes of vitamin D deficiency in the general population are clearly multifactorial, the main cause in the athletic population is most likely reduced exposure of the skin to ultraviolet B radiation, the major source of vitamin D [72, 73]. Whilst this seems fairly obvious in relation to athletes who largely train and compete indoors and those who live and train at latitudes furthest from the equator, it might also be relevant to those who train and compete outside but who have to wear a significant amount of equipment (e.g. jockeys) or who choose to use high sun protection factor sunscreen or sunblock (a choice rightly driven by the protection of the athlete's skin from damage).
A direct relationship between serum vitamin D levels and musculoskeletal outcomes is relatively clear [69] and makes sense given the important role of vitamin D in calcium and phosphorus metabolism. Miller et al. [74] examined vitamin D concentrations in 53 patients with radiographically confirmed stress fractures; 44 of these patients had serum vitamin D levels below 40 ng·mL−1 (~100 nmol·L−1). Similarly, Maroon et al. [75] showed that vitamin D levels were significantly lower in professional American Football players who had suffered at least one bone fracture than in players with no fractures. In an intervention study, female Navy recruits receiving 2000 mg of calcium plus 800 IU of vitamin D per day had a 20% lower incidence of stress fracture than recruits receiving a placebo [76]. Whilst not directly causal, intakes of low-fat dairy products and the major nutrients in milk (calcium, vitamin D and protein) were associated with greater bone gains and lower stress fracture rates in young female runners [77]. Interestingly, a higher potassium intake was also associated with greater gains in hip and whole-body BMD.
It seems relatively clear that avoiding vitamin D deficiency and insufficiency is important for the athlete to protect their bone health. In theory, this is relatively straightforward: achieving serum vitamin D levels above 50 nmol·L−1 through dietary supplementation would most likely prove protective [78], although a clear serum vitamin D target for the prevention of bone injury remains unknown.
Dermal Calcium and Sodium Losses
Athletes who undertake a high volume of prolonged exercise, particularly when that exercise is not weight bearing, are at risk of having lower BMDs [79, 80]. One potential contributor might be an increase in bone resorption mediated by the activation of parathyroid hormone in response to reductions in serum calcium levels, which, in turn, occur as the result of dermal calcium losses [81]. It is likely that the level of dermal calcium loss required to cause a decline in serum calcium concentration sufficient to activate parathyroid hormone secretion, and thus bone demineralisation, would only occur during prolonged hard exercise. Given that calcium plays an important role in many of the cellular processes that occur during exercise, the body vigorously defends serum calcium concentrations, predominantly by the demineralisation of bone, which, in turn, could lead to a reduction in bone mass over time. As such, Barry et al. [81] proposed that supplementing with calcium before or during exercise might compensate for dermal calcium losses and defend serum calcium levels, meaning that there would be no concomitant increase in parathyroid hormone release or bone resorption.
Barry et al. [81] examined whether calcium supplementation, either before or during cycling exercise, reduced the exercise-induced increases in parathyroid hormone and bone resorption (as determined by β-CTX). Twenty male endurance athletes completed a 35-km cycling time trial on three occasions having consumed either (1) 1000 mg of calcium 20 min before exercise and a placebo during exercise; (2) a placebo before exercise and 250 mg of calcium every 15 min during exercise; or (3) a placebo before and during exercise. The results showed that when 1000 mg of calcium was ingested as a single bolus prior to exercise, there was an attenuated parathyroid hormone response to the subsequent exercise bout. There was a smaller attenuation of the parathyroid hormone response when calcium was supplemented during exercise, and this did not reach statistical significance.
Following on from this work, others have shown that pre-exercise calcium consumption, this time in the form of a high-calcium meal (~1350 mg), attenuated the subsequent responses of both parathyroid hormone and bone resorption (assessed via β-CTX concentrations) to a 90-min cycling bout in competitive female cyclists [82]. Although these results suggest that pre-exercise calcium consumption/supplementation may be an effective strategy for attenuating bone resorption during exercise, the chronic effects of this nutritional strategy on BMD are yet to be researched. It is unlikely that the amount of calcium lost in sweat would be sufficient to perturb calcium homeostasis to the extent that it affects bone metabolism, unless the sweat rate was fairly high and/or the duration of sweat loss prolonged. As such, this is likely to be of primary concern for the endurance and ultra-endurance athlete, but perhaps also for any athlete who uses dehydration practices to 'make weight'. This latter possibility has not been explored and future research is required.
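To give a feel for the magnitudes involved, the sketch below estimates dermal calcium loss from sweat rate, duration and sweat calcium concentration. All three inputs are illustrative assumptions rather than measured data, and sweat calcium concentrations in particular vary widely between individuals and conditions.

```python
# Illustrative only: every input below is an assumption, not a measured value.
sweat_rate_l_per_h = 1.5       # assumed sweat rate during hard exercise
duration_h = 2.0               # assumed duration of sweating
sweat_calcium_mg_per_l = 40.0  # assumed sweat calcium concentration

dermal_loss_mg = sweat_rate_l_per_h * duration_h * sweat_calcium_mg_per_l
print(f"Estimated dermal calcium loss: {dermal_loss_mg:.0f} mg")  # 120 mg
# Whether a loss of this size lowers serum calcium enough to trigger
# parathyroid hormone secretion depends on the factors discussed above.
```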
In line with this, there is also the possibility that the challenge to fluid and sodium homeostasis that would occur under these circumstances might influence bone metabolism and health. To our knowledge, this has not been directly or well studied in relation to the athlete, but there is some evidence from the osteoporosis-focussed literature suggesting that bone might be negatively affected by hyponatraemia. Verbalis et al. [83] used a rodent model of the syndrome of inappropriate antidiuretic hormone secretion to show that 3 months of hyponatraemia (serum sodium ~30% lower than in normonatraemic controls) significantly reduced the BMD of excised femurs and reduced both trabecular and cortical bone, purportedly via an increase in bone resorption and a decrease in bone formation. The same paper also reported a cross-sectional analysis of human adults from the Third National Health and Nutrition Examination Survey, showing that mild hyponatraemia was associated with significantly increased odds of osteoporosis, in line with the rodent data. This might be explained by novel sodium signalling mechanisms in osteoclasts resulting in the release of sodium from bone stores during prolonged hyponatraemia [84].