Is bone quality associated with collagen age?
Leeming, D.J., Henriksen, K., Byrjalsen, I. et al. Osteoporos Int (2009) 20: 1461. doi:10.1007/s00198-009-0904-3
The World Health Organization defines osteoporosis as a systemic disease characterized by decreased bone tissue mass and microarchitectural deterioration, resulting in increased fracture risk. Since this definition was formulated, a substantial body of data has shown that these two factors do not cover all risks for fracture. Other independent clinical factors, such as age, as well as qualitative changes in bone tissue, are believed to play an important role. The term “bone quality” encompasses a variety of parameters, including the extent of mineralization, the number and distribution of microfractures, the extent of osteocyte apoptosis, and changes in collagen properties. The major mechanism controlling these qualitative factors is bone remodeling, which is tightly regulated by osteoclast and osteoblast activity. We focus on the relationship between bone remodeling and changes in collagen properties, especially the extent of one posttranslational modification. In vivo, measurement of the ratio between native and isomerized C-telopeptides of type I collagen provides an index of bone matrix age. Current preclinical and clinical studies suggest that this urinary ratio provides information about bone strength and fracture risk independent of bone mineral density and that it responds differently according to the type of therapy regulating bone turnover.
Keywords: Antiresorptives · Bone quality · Isomerization · Osteoporosis · Type I collagen
In the description of osteoporosis presented in 1990 at the Consensus Development Conference, the disease was defined as a decrease in bone mineral density (BMD) and a microarchitectural deterioration of bone tissue leading to an increase in bone fragility and susceptibility to fracture. Bone mineral density values at least 2.5 standard deviations below the mean levels in young adults are used to identify persons with osteoporosis. Although numerous studies have shown that measurement of BMD alone is not sensitive enough to identify the majority of women who will sustain a fracture, the definition of osteoporosis has not changed [3–8]. Recently, a model combining BMD with clinical risk factors was proposed to improve the assessment of fracture probability. This model, however, did not include factors related to bone remodeling and bone matrix quality.
Bone mineral density is currently the most important determinant of fracture risk [10, 11]. Yet, several studies have shown that up to 50% of persons who experience fracture have a BMD value above the level of 2.5 standard deviations below the reference-population mean, that is, a T score ≥ −2.5 [3, 4, 12]. Older persons can have up to a tenfold increased 10-year fracture risk in comparison with younger individuals with the same BMD: for example, an 80-year-old person with the same BMD as a 50-year-old one has a fivefold higher risk of hip fracture. Thus, in individuals with comparable BMD, fracture risks are not the same. In addition, more than 50% of all incident fractures occur in women with osteopenia, as defined by a BMD T score greater than −2.5 but no higher than −1; at-risk women in this group will not be detected by applying the World Health Organization BMD definition of osteoporosis.
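The WHO thresholds above can be expressed as a simple T-score classification. The following is a minimal sketch; the function name, variable names, and the example reference values are illustrative, not taken from any particular densitometer's reference database:

```python
def who_bmd_category(bmd: float, young_mean: float, young_sd: float) -> str:
    """Classify a BMD measurement using the WHO T-score thresholds.

    The T score is the number of standard deviations the measured BMD
    lies above or below the young-adult reference mean. Thresholds
    follow the WHO definitions discussed in the text: T <= -2.5 is
    osteoporosis; -2.5 < T <= -1 is osteopenia.
    """
    t_score = (bmd - young_mean) / young_sd
    if t_score <= -2.5:
        return "osteoporosis"
    elif t_score <= -1.0:
        return "osteopenia"
    return "normal"

# Hypothetical example: measured BMD 0.62 g/cm^2 against a young-adult
# mean of 1.00 g/cm^2 with SD 0.12 gives T ~ -3.2, i.e., osteoporosis.
category = who_bmd_category(0.62, 1.00, 0.12)
```

Note that, as the text stresses, this classification captures only the density component of fracture risk; two individuals in the same category can have very different fracture probabilities.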
The bisphosphonate alendronate increases hip and spine BMD, lowering the risk of vertebral fracture; nevertheless, a meta-analysis showed that the change in spine BMD accounted for only 16% of the vertebral fracture reduction. Studies of the selective estrogen receptor modulator (SERM) raloxifene demonstrated that an increase in femoral neck BMD accounted for only 4% of vertebral fracture risk reduction. Calcitonin treatment results in only a 1.2% increase in BMD of the lumbar spine, but it is associated with a significant 33% reduction in vertebral fractures. Finally, changes in BMD associated with teriparatide (parathyroid hormone) account for less than 40% of its effect in reducing vertebral fracture risk. Together, these data demonstrate that changes in BMD with osteoporosis treatments only partially explain fracture risk reductions and that additional independent factors contribute to the clinical efficacy of these therapies.
Bone quality has been extensively discussed in the literature [17–21]. However, the definition of bone quality remains imprecise, and in vivo assessment of its various components is nearly impossible, as such assessment relies mainly on invasive techniques, which consequently are not generally applicable in clinical trials. In this review, we use the term “bone quality” to mean the quality of the bone matrix, thus comprising mineralization, posttranslational changes in the collagen molecule that occur with age, and microscopic tissue damage, all of which are affected by bone remodeling. Recent developments in the field of biochemical marker research have indicated that age-related changes in bone matrix molecules are associated with bone fracture resistance independent of BMD and may thus allow noninvasive monitoring of some aspects of bone matrix quality [17, 18, 22–29]. In this review, we emphasize the likely connection between bone matrix quality and bone remodeling, highlighting the importance of bone matrix age. Because determination of the age-related α to β isomerization in the C-telopeptide of type I collagen (αα/ββ CTX ratio) is currently the only method available for indexing mean bone age noninvasively [22, 23, 27], we discuss in detail the association between this index, bone matrix quality, and bone remodeling.
What is bone matrix quality?
Bone matrix is composed of mineral and organic matrix, mostly type I collagen. Bone matrix quality thus comprises the organization and extent of mineralization, structural changes in type I collagen molecules, and microscopic tissue damage occurring in this tissue. The central events regulating these properties are controlled by the rate of bone remodeling [21, 30–33]. The mineralization of the bone matrix depends on the age of the bone, with older bone being more mineralized because of alterations in bone-remodeling rates. How best to increase bone matrix strength depends on the existing level of mineralization of a given bone: if mineralization is initially low, an increase in mineralization would most likely benefit bone strength, whereas in an already highly mineralized bone a further increase would most likely decrease strength [34–36]. The specific effects of changes in bone mineralization are unknown, although one study has indicated that microcracks preferentially occur in the highly mineralized areas of bone, i.e., old bone. Therefore, monitoring changes in bone mineralization, along with other parameters related to bone turnover, is essential to fully characterize bone matrix quality.
Microcracks also accumulate in bone with slow remodeling, i.e., as the age of the bone increases [38–40]. This accumulation is associated with loss of structural properties such as bone stiffness and energy absorption [41, 42]. Although we are aware of the possible effects of the number of microcracks and the degree of mineralization on bone matrix quality, further discussion of these properties is beyond the scope of this review, in which we focus on the effect of posttranslational changes on bone matrix quality.
Posttranslational changes with age
Bone remodeling rates control aging of the bone matrix
To maintain optimal quality of bone tissue, continuous remodeling of the bone matrix is essential, and the rate of remodeling controls the age of the bone matrix [17, 21, 22, 30–33, 43]. The important contribution of bone remodeling as an independent determinant of fracture risk in both untreated and antiresorptive-treated subjects has been demonstrated using systemic biochemical markers [7, 8, 54, 55]. Naturally occurring, rare genetic disorders of bone turnover have also revealed the effects of disrupted bone remodeling on fracture risk and bone matrix quality, as changes in bone remodeling cause local alterations in bone tissue age (Fig. 2c). Osteopetrosis and pycnodysostosis are characterized by absent or low osteoclastic bone resorption, high bone mass, and poor bone matrix quality, leading to increased fracture risk [56, 57]. Conversely, in sclerotic diseases (e.g., sclerosteosis and Van Buchem disease), which result primarily from increased bone formation, the phenotype is associated with improved bone strength and lower fracture risk [58–60]. Similar findings have been reported in patients with autosomal dominant osteopetrosis type I, a high bone mass phenotype [61–63] associated with increased bone strength [63, 64]. In Paget’s disease, patients have a locally high remodeling rate and increased fracture risk, indicating impaired bone matrix quality. These rare genetic disorders suggest that the expected association between high bone mass and improved bone strength is seen only when a certain level of osteoclastic bone resorption is maintained.
In summary, alteration of bone remodeling, particularly of bone resorption, leads to pathological conditions characterized by changes in bone matrix quality parameters, such as mineralization and posttranslational modifications, and consequently to increased fracture risk. This underlines the need to monitor bone remodeling and changes in bone matrix quality when assessing the effects of novel interventions on fracture risk.
Effects of osteoporotic treatments on bone matrix quality
A recent study in skeletally mature female beagle dogs compared treatment with clinically relevant doses of alendronate, risedronate, raloxifene, or vehicle for 1 year. The bisphosphonates decreased the αα/ββ CTX ratio by 29–56% in vertebral bone, compared with vehicle. In contrast, raloxifene did not change the age of the collagen matrix. Interestingly, the rate of bone turnover was significantly correlated with the αα/ββ CTX ratio as well as with other age-related protein modifications, with a higher rate of bone remodeling associated with lower bone collagen maturation. This finding indicates that administration of bisphosphonates leads to a more aged collagen profile in vertebral trabecular bone than in bone from raloxifene-treated dogs, probably because of the stronger suppression of bone remodeling by the bisphosphonates. Other studies of the effect of SERMs on bone matrix quality parameters confirm the absence of increased aging in both cortical and trabecular bone from femurs [82, 83]. Several preclinical histomorphometric investigations have also shown that bisphosphonates reduce bone turnover to a much greater extent than SERMs and HRT [84, 85]. In summary, preclinical data highlight the important finding that clinically relevant doses of drugs that markedly suppress bone remodeling lead to increased aging of the bone matrix.
A 2-year study of healthy postmenopausal women evaluated the effect of antiresorptive treatment on the systemic αα/ββ CTX ratio as a measure of bone age. Participants were treated with bisphosphonates, raloxifene, or HRT for up to 24 months. The bisphosphonates alendronate and ibandronate induced a significantly more aged bone profile than raloxifene and HRT (38–52% vs. 3–15% reduction in the αα/ββ CTX ratio, respectively). These data indicate that the interventions have different effects on bone age, probably because of their different effects on bone remodeling. However, another study reported no change in the αα/ββ CTX ratio in patients treated with alendronate for 2 years, possibly because of a lesser reduction in bone remodeling. In a similar manner, the effect of oral calcitonin on the αα/ββ CTX ratio was assessed in postmenopausal women after 1 and 3 months of therapy. Bone resorption was reduced by 30%, but the αα/ββ CTX ratio was unchanged, suggesting that calcitonin does not affect the age profile of bone matrix. This is most likely because calcitonin decreases bone resorption but either does not reduce bone formation or suppresses it markedly less than other antiresorptive agents [87–89]. These data indicate that calcitonin does not eliminate the signaling between osteoclasts and osteoblasts, an effect referred to as “uncoupling” [90, 91]. Thus, bone formation is not greatly decreased, as is seen with other drugs, and bone remodeling continues.
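The treatment comparisons above are expressed as percent change of the urinary αα/ββ CTX ratio from baseline. The arithmetic can be sketched as follows; the function name and all marker values are hypothetical, for illustration only:

```python
def pct_change_ratio(alpha_baseline: float, beta_baseline: float,
                     alpha_followup: float, beta_followup: float) -> float:
    """Percent change in the urinary alpha/beta CTX ratio from baseline.

    alpha (native) and beta (isomerized) CTX are measured in the same
    urine sample; their ratio indexes the mean age of the resorbed
    bone matrix. A negative percent change means the ratio fell, i.e.,
    the resorbed collagen is, on average, older (more isomerized).
    """
    ratio_baseline = alpha_baseline / beta_baseline
    ratio_followup = alpha_followup / beta_followup
    return 100.0 * (ratio_followup - ratio_baseline) / ratio_baseline

# Hypothetical example: alpha CTX falls from 2.0 to 1.0 units while
# beta CTX stays at 4.0, halving the ratio (a -50% change), the kind
# of shift reported here for bisphosphonate treatment.
change = pct_change_ratio(2.0, 4.0, 1.0, 4.0)
```

The point of using a ratio is that absolute CTX levels fall with any antiresorptive, whereas the ratio changes only if the age distribution of the resorbed matrix shifts.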
The retrospective risedronate and alendronate cohort study included women with osteoporosis who received either risedronate or alendronate and was carried out to compare fracture risk reduction in the two groups. In the risedronate group, the risk of nonvertebral and vertebral fractures was 18% and 43% lower, respectively, than in the alendronate group. A trend was observed as early as 3 months after the initiation of therapy; at 6 months, the difference was significant. Similar results were found in a meta-analysis of six trials of risedronate or alendronate therapy, in which the relative risk reduction for nonvertebral fractures was greater for risedronate than for alendronate. Although there are methodological limitations in the design of these analyses, the results suggest that risedronate may be more effective than alendronate in decreasing fracture risk, possibly because it reduces bone turnover to a lesser extent than alendronate and thereby preserves the bone quality properties of the matrix through a smaller increase in bone tissue age [77, 78, 94–96].
Bone remodeling is a key event in the maintenance of bone tissue mass, age, and quality. Assessment of the different components of bone matrix quality is currently hindered by the lack of specific, precise, and, importantly, noninvasive tests [23, 27]. However, the preliminary data discussed here suggest that the urinary αα/ββ CTX ratio can provide valuable information on the age, and thus the quality, of bone matrix and on its changes during treatment. The relationship between changes in and absolute levels of the αα/ββ CTX ratio, on the one hand, and the incidence of fracture in women receiving osteoporotic treatments, on the other, should be investigated, as associations may differ from those observed in untreated persons. Monitoring changes in the αα/ββ CTX ratio may then become a useful biological tool to assess the potential detrimental effects of some therapies, as there has been concern that sustained long-term suppression of bone turnover may increase bone fragility by compromising some aspects of bone matrix quality. It is interesting to note the renewed interest in drugs that modulate bone remodeling toward a more steady state instead of leading to oversuppression. It appears that a drastic reduction in bone turnover in pursuit of BMD gains compromises the quality of bone.
We propose that there is a relationship between the αα/ββ CTX ratio and the strength of the bone matrix; however, additional data are still needed to demonstrate the causality of this relationship. Whether the αα/ββ CTX ratio is a better marker than other urinary markers for evaluating fracture risk reduction has yet to be tested in clinical studies with fractures as the primary endpoint. However, it has been demonstrated that the ratio between the two biochemical markers αα CTX and ββ CTX provides a different picture than either marker alone [23, 27]. It indexes the mean age of the bone matrix being resorbed, in contrast to other bone remodeling markers, which are rather indices of mean bone turnover, which in turn affects bone matrix age.
We acknowledge funding from the Danish Ministry of Science, Technology and Innovation.
Conflicts of interest
Leeming DJ, Henriksen K, Byrjalsen I, Qvist P, Madsen SH, and Karsdal MA are employees of Nordic Bioscience.