Cancer Causes & Control, Volume 19, Issue 5, pp 527–535

Vitamin D insufficiency among African-Americans in the southeastern United States: implications for cancer disparities (United States)


  • Kathleen M. Egan
    • Vanderbilt University Medical Center and Vanderbilt-Ingram Cancer Center
    • Moffitt Cancer Center and Research Institute
  • Lisa B. Signorello
    • Vanderbilt University Medical Center and Vanderbilt-Ingram Cancer Center
    • International Epidemiology Institute
  • Heather M. Munro
    • International Epidemiology Institute
  • Margaret K. Hargreaves
    • Meharry Medical College
  • Bruce W. Hollis
    • Medical University of South Carolina
  • William J. Blot
    • Vanderbilt University Medical Center and Vanderbilt-Ingram Cancer Center
    • International Epidemiology Institute
Original Paper

DOI: 10.1007/s10552-008-9115-z

Cite this article as:
Egan, K.M., Signorello, L.B., Munro, H.M. et al. Cancer Causes Control (2008) 19: 527. doi:10.1007/s10552-008-9115-z



To determine the prevalence and predictors of vitamin D insufficiency among black and white adult residents of the southern US.


We conducted a cross-sectional analysis of serum 25(OH)D levels using baseline blood samples from 395 Southern Community Cohort Study (SCCS) participants. Participants were African-American and white adults aged 40–79 years who enrolled in the study between 2002 and 2004. We defined hypovitaminosis D as serum 25(OH)D levels ≤15 ng/ml.


Hypovitaminosis D prevalence was 45% among blacks and 11% among whites. Vitamin D intake from diet and supplements was associated with modest increases in circulating 25(OH)D (0.5–0.7 ng/ml per 100 IU increment), but hypovitaminosis D was found in 32% of blacks with intake ≥400 IU/day. Body mass index (BMI) was a strong predictor of risk for hypovitaminosis D among black women (OR = 6.5, 95% CI 1.7–25.1 for BMI ≥30 kg/m2 vs. 18–24.9 kg/m2). UVR exposure estimated by residential location was positively associated with 25(OH)D levels among all groups except white women.


Hypovitaminosis D was present in a substantial proportion of the African-American population studied, even in the South and among those meeting recommended dietary guidelines. Vitamin D should continue to be studied as a target for ameliorating racial cancer disparities in the US.


Keywords: Vitamin D · Racial/ethnic disparities · African-Americans · Neoplasms · Epidemiology


In the past decade, increasing attention has been paid to the health effects of vitamin D. It has long been known that vitamin D plays a critical role in the maintenance of blood calcium and phosphorus, with deficiencies leading to rickets in children and osteomalacia in adults [1]. However, evidence has emerged that vitamin D deficiency might also contribute to other chronic diseases, including several types of cancer. The majority of cancers for which vitamin D has been implicated are more prevalent among African-Americans, a group also known to be at high risk for vitamin D deficiency [2–5]. Improving vitamin D status among African-Americans may therefore be a potential means for reducing the excess cancer burden among blacks in the US.

The full requirement for vitamin D can be obtained from sun exposure. Specific wavelengths in sunlight convert a cholesterol metabolite in the skin to provitamin D, which is then converted by successive hydroxylations in the liver and kidney to 1,25(OH)2D (calcitriol), the biologically active steroidal form of vitamin D and the principal ligand of the vitamin D receptor (VDR). African-Americans are among the groups (along with the obese [6] and the elderly [7]) known to be at high risk for vitamin D deficiency, because heavy melanization blocks the initial conversion of dehydrocholesterol to cholecalciferol in the skin [8]. It is estimated that a deeply pigmented black individual requires 10–50 times the sun exposure of a fair-skinned white individual to achieve the same production of vitamin D [9]. US blacks are also reported to consume less milk and other foods fortified with vitamin D, as well as fewer nutritional supplements containing vitamin D [10]. Reduced endogenous synthesis combined with lower dietary intake contributes to the high rates of hypovitaminosis D and frank vitamin D deficiency in African-Americans.

Relatively few data are available on the prevalence of vitamin D deficiency among blacks living in the South, where sunlight driven vitamin D production might be expected to exceed that in other areas of the US. We examined serum levels of 25(OH)D, the major circulating form of vitamin D, and determinants of these levels in African American and white subjects enrolled in the Southern Community Cohort Study (SCCS).


Study population

The SCCS is a prospective cohort investigation initiated in 2001 to investigate the causes of cancer and other chronic diseases in socioeconomically disadvantaged and African-American populations historically excluded from large-scale health studies [11]. Adult subjects aged 40–79 are recruited in-person at community health centers (CHCs) and also by mail across rural and urban areas in the Southeast (Alabama, Arkansas, Florida, Georgia, Kentucky, Louisiana, Mississippi, North Carolina, South Carolina, Tennessee, Virginia, and West Virginia). Enrollment in the cohort is ongoing; over 75,000 subjects are currently enrolled, with African-Americans comprising about two-thirds of the participants.

For the present study, subjects were sampled only from SCCS participants enrolled at CHCs. CHCs provide basic health services primarily to indigent or uninsured individuals, so that approximately 60% of CHC-enrolled SCCS participants have an annual household income less than $15,000, and about one-third have less than a high school education. From participants enrolled from March 2002 through October 2004 who donated a blood sample (n = 12,162), 396 were randomly selected for the present cross-sectional study. A 2 × 2 × 3 × 3 factorial design was employed, with 11 individuals selected within each of the 36 strata defined by race (black/white), sex, smoking status (current/former/never), and BMI (18–24.99 kg/m2, 25–29.99 kg/m2, 30–45 kg/m2). This design provided a balanced distribution across the factors, enhancing power to detect associations with vitamin D.
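The stratified sampling frame described above can be sketched in a few lines; the stratum labels come from the paper, while the enumeration itself is only an illustration of how the 36 strata and 396-subject total arise.

```python
# Sketch of the 2 x 2 x 3 x 3 factorial sampling design: 11 participants
# per stratum across race, sex, smoking status, and BMI group.
from itertools import product

RACE = ("black", "white")
SEX = ("male", "female")
SMOKING = ("current", "former", "never")
BMI_GROUP = ("18-24.99", "25-29.99", "30-45")  # kg/m^2

strata = list(product(RACE, SEX, SMOKING, BMI_GROUP))
PER_STRATUM = 11

print(len(strata))                # 36 strata
print(len(strata) * PER_STRATUM)  # 396 subjects in total
```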

The SCCS was approved by Institutional Review Boards at Vanderbilt University and Meharry Medical College, and all participants provided written informed consent.

Data and blood collection

Baseline information was collected using a structured computer-assisted in-person interview conducted at the time of enrollment. Topics covered in the interview included demographics, health history, anthropometrics, tobacco and alcohol use, physical activity, and average diet over the past year. The dietary information was collected using a food frequency questionnaire (FFQ) including foods, beverages, and nutritional supplements developed specifically for the SCCS to include culturally and geographically appropriate foods [12].

Venous blood samples (20 ml) were collected by phlebotomists at the CHCs, excluding participants who were pregnant, weighed less than 100 pounds, or reported infection with hepatitis or HIV. Blood was collected from approximately 55% of the eligible participants, refrigerated, and shipped cold, normally on the same day as collection, to Vanderbilt University, where the samples were centrifuged and stored at −86°C to await analysis.

Measurement of serum vitamin D

A serum aliquot was retrieved for each of the 396 selected subjects for blinded assay. The vitamin D measurements were performed at the Medical University of South Carolina, using a radioimmunoassay method that measures serum levels of 25(OH)D. The assay is associated with high intra-assay reliability (intrapair coefficient of variation: 6–8%) [13]. The specimens had been stored for a period of up to 3.0 years (median 1.6 years) after blood collection. Measured 25(OH)D is not affected by storage times as long as 14 years (unpublished data). No sample had been previously thawed.

Other measures of vitamin D exposure

Dietary intake of vitamin D was estimated from the reported intake of foods and supplements representing major dietary sources of the vitamin (liquid milk, cold cereal, tuna, eggs, multivitamins, and calcium supplements) in the baseline SCCS FFQ. Direct individual measures of sun exposure were not obtained from subjects. As a surrogate, we used ultraviolet radiation (UVR) measurements taken from the UV station geographically closest to the subject’s residence as an estimate of ambient UVR exposure. These measures were obtained from the Environmental Protection Agency, which monitors ground-level UV radiation at 58 sites across the US; 10 of these UVR measurement sites were included in this study. UVR measurements are converted to an index that estimates the erythemal (and vitamin D producing) intensity at given locations during the solar noon hour, with scores ranging from 0 (none) to 11+ (extreme). We used the average of the UVR scores recorded at the monitoring station during the 3-month period (91 days including day of enrollment) preceding each participant’s enrollment in the study. The number of measurements used to calculate the mean UVR score ranged from 72 to 91 (median: 90) because measurements were unavailable for some days at some stations. Mean UVR residential scores ranged from 1.04 to 8.52 (median: 5.43) among the subjects. We considered employment as a possible index of outdoor exposures. The majority of subjects (62%), however, reported being unemployed, and few of the working subjects had potentially outdoor occupations (17 men and 1 woman reported construction work).
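The residential UVR score described above can be sketched as a windowed average that tolerates missing station days; the function name and toy data are illustrative, not from the study.

```python
# Hedged sketch of the residential UVR score: the mean of daily UV-index
# readings at the nearest monitoring station over the 91 days up to and
# including enrollment, skipping days with no measurement (the paper
# reports 72-91 usable days per subject).
from datetime import date, timedelta

def mean_uvr(daily_uvr: dict, enrollment: date, window_days: int = 91) -> float:
    """Average available UV-index readings over the window ending at enrollment."""
    days = [enrollment - timedelta(days=i) for i in range(window_days)]
    readings = [daily_uvr[d] for d in days if d in daily_uvr]
    if not readings:
        raise ValueError("no UVR measurements in window")
    return sum(readings) / len(readings)

# Toy example: a constant index of 5.0 with one missing station day.
enroll = date(2003, 7, 1)
station = {enroll - timedelta(days=i): 5.0 for i in range(91)}
del station[enroll - timedelta(days=10)]    # simulate a missing measurement
print(round(mean_uvr(station, enroll), 2))  # 5.0 from 90 usable days
```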

Statistical analysis

For the purposes of this analysis, we defined hypovitaminosis D as serum 25(OH)D levels ≤15 ng/ml (terming levels of 8–15 ng/ml “insufficient” and levels <8 ng/ml “deficient”), consistent with many earlier investigations [4, 14–16]. Mean 25(OH)D levels were compared across groups using two-sample t-tests. We used multiple linear regression models to explore the influence of UVR, dietary intake, and other factors on serum 25(OH)D levels. These models were constructed separately according to gender and race and included covariates for age (continuous), education (12+ vs. <12 years), body mass index (BMI, continuous), estimated dietary vitamin D intake (continuous), employment status (working vs. not working, with housewives considered nonworkers), alcohol drinking within the past month (any vs. none), current cigarette smoking (yes/no), and residential UVR (continuous score as described above). Using logistic regression, we also considered the predictive value of these factors on risk for hypovitaminosis D.
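The study's serum 25(OH)D categories reduce to a simple threshold rule; a minimal sketch (function names are ours):

```python
# The paper's cutpoints: deficient (<8 ng/ml), insufficient (8-15 ng/ml),
# normal (>15 ng/ml); hypovitaminosis D = deficient or insufficient.
def classify_25ohd(level_ng_ml: float) -> str:
    if level_ng_ml < 8:
        return "deficient"
    if level_ng_ml <= 15:
        return "insufficient"
    return "normal"

def hypovitaminosis_d(level_ng_ml: float) -> bool:
    return level_ng_ml <= 15

# The two black-subject medians from Table 1 fall in different categories.
print(classify_25ohd(14.2), classify_25ohd(17.0))  # insufficient normal
print(hypovitaminosis_d(14.2), hypovitaminosis_d(17.0))  # True False
```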

For the purposes of multivariate modeling (both linear and logistic as described above), we also explored the influence of season of blood collection (summer: June–September; winter: December–March; spring/fall: April, May, October, November) on vitamin D status, but found this variable to be nearly interchangeable, and highly collinear, with UVR score. Because of this collinearity, UVR score and season of blood collection could not both be included in the models, and we chose to use UVR score because it reflects both the time of year and the general geographical location of the participant. The final results were not materially different in models that substituted season of blood collection for UVR score.


Table 1 and Fig. 1 show the distributions of serum 25(OH)D and other characteristics of the 395 subjects (one sample was lost due to lab error). Mean 25(OH)D levels were substantially lower in blacks than whites, both for men (19.0 vs. 28.5 ng/ml, t-test p < 0.001) and for women (15.6 vs. 25.6 ng/ml, t-test p < 0.001). The upper quartile of the distribution of 25(OH)D levels among blacks was only slightly above the lower quartile cutpoint among whites. Mean levels were also modestly higher in men than women, by about 22% in blacks (t-test p = 0.005) and 11% in whites (t-test p = 0.05). Whereas 25(OH)D levels had a normal distribution in whites (Shapiro-Wilk test for normality p = 0.07), levels in blacks departed significantly from normality (p < 0.001), with the distribution concentrated at low values, reflecting low vitamin D stores in a large proportion of the black subjects (Fig. 1).
Table 1

Descriptive characteristics and serum 25(OH)D levels (ng/ml) in a sample of 395 participants enrolled in the Southern Community Cohort Study




                                             Males                                   Females
                                             Black (n = 99)     White (n = 99)      Black (n = 99)     White (n = 98)

Serum 25(OH)D levels (ng/ml)
    Median (Q1–Q3)                           17.0 (12.6–24.1)   27.8 (20.5–36.3)    14.2 (8.9–20.2)    25.9 (17.3–31.9)
    % Normal (>15)                           …                  …                   …                  …
    % Insufficient (8–15)                    …                  …                   …                  …
    % Deficient (<8)                         …                  …                   …                  …
Mean (S.D.) age (years)                      50.3 (8.3)         53.8 (9.2)          51.6 (9.7)         54.7 (10.0)
Mean (S.D.) body mass index^b (kg/m2)        28.0 (5.2)         28.3 (5.6)          28.7 (5.9)         28.2 (6.0)
Mean (S.D.) average daily dietary
vitamin D intake (IU)^a                      226 (216)          234 (221)           241 (227)          304 (248)
Mean (S.D.) residential UVR score            5.3 (1.9)          5.3 (2.1)           5.3 (1.8)          5.4 (2.0)
Season of enrollment^c
    % In summer                              …                  …                   …                  …
    % In spring/fall                         …                  …                   …                  …
    % In winter                              …                  …                   …                  …
% Current smoker^b                           …                  …                   …                  …
% Using alcohol in past month                …                  …                   …                  …
% Currently working                          …                  …                   …                  …
% With <high school education                …                  …                   …                  …

Abbreviations: Q1, quartile 1; Q3, quartile 3; IU, International Units; UVR, ultraviolet radiation

^a From both food and supplement sources

^b Matching factor

^c Summer: June–September; Spring/Fall: April, May, October, November; Winter: December–March
Fig. 1

Distribution of serum 25(OH)D (ng/ml) according to race

Overall, 28% of subjects were classified as having hypovitaminosis D, with large differences observed between blacks and whites (Table 1). The crude proportion of subjects with hypovitaminosis D was four times higher in blacks (45%) than in whites (11%) (p < 0.001), with a disparity seen for both males and females. Frank vitamin D deficiency (<8 ng/ml) was observed in 20 blacks (10%) and three whites (1.5%) (Fisher’s exact p < 0.001), and occurred three times more often in black females (15 of 99; 15%) than black males (5 of 99; 5%). Average daily intake of vitamin D (from food and supplements) was marginally significantly higher among white than black women (t-test p = 0.06), but showed little racial variation among men (t-test p = 0.80 comparing blacks and whites). Because of their comparable geographic distributions, estimated residential (ambient) levels of UVR were similar for blacks and whites, and the study’s enrollment of both blacks and whites from the same community health centers year-round resulted in only minor differences in the season of enrollment/blood collection by race (Table 1).

Serum concentrations of 25(OH)D were generally higher among subjects enrolled from May–October than from November–April (Fig. 2), except among white females, in whom there was no seasonal variation even when finer seasonal stratification was employed (mean 25(OH)D in May–August was 25.1 ng/ml vs. 25.9 ng/ml in November–February, p = 0.76). Results from multivariate linear regression models further demonstrated the effect of sunlight on circulating 25(OH)D levels (Table 2), with ambient UVR associated with significantly increased levels in all groups except white females. The strongest association was for black males, where a one-unit increase in UVR score was associated with an increase of 2.4 ng/ml in 25(OH)D. Each 100 IU increment of dietary vitamin D intake predicted serum level increases of 0.5–0.7 ng/ml consistently across the groups, although these findings were not always statistically significant. The 25(OH)D levels were inversely related to BMI, but with significant declines with increasing BMI only for white males and black females. Smoking was also inversely linked to serum 25(OH)D, especially among men, while alcohol consumption was linked to higher serum levels among whites and lower serum levels among blacks, although none of these associations reached statistical significance except for smoking among white males. For men, but not women, having a high school or greater education was associated with significantly lower 25(OH)D levels, on the order of 4–6 ng/ml. Subjects who were currently working tended to have lower serum 25(OH)D, with the exception of black males (the employed group most likely to perform construction work: 10/32 = 31%, vs. 7/40 = 18% of white men). For blacks, factors in the model explained 32% (for males) and 18% (for females) of the variation in serum 25(OH)D levels (based on the adjusted R2). In contrast, for white women the model had essentially no explanatory power.
Fig. 2

Mean serum 25(OH)D levels (ng/ml) in relation to season of enrollment

Table 2

Linear regression predictors of serum 25(OH)D levels



                                             Black male (n = 99)    White male (n = 98)    Black female (n = 96)    White female (n = 96)
                                             Estimate^a  p value    Estimate^a  p value    Estimate^a  p value      Estimate^a  p value

Residential UVR score (per 1 unit)           …           …          …           …          …           …            …           …
Dietary vitamin D intake (per 100 IU)        …           …          …           …          …           …            …           …
Body mass index (per 1 kg/m2)                …           …          …           …          …           …            …           …
Current cigarette smoking (yes vs. no)       …           …          …           …          …           …            …           …
Alcohol use in past month (yes vs. no)       …           …          …           …          …           …            …           …
Education (≥high school versus less)         …           …          …           …          …           …            …           …
Currently working (yes vs. no)               …           …          …           …          …           …            …           …
Age (per 1 year)                             …           …          …           …          …           …            …           …
Adjusted model R-squared                     …                      …                      …                        …

Abbreviations: UVR, ultraviolet radiation; IU, International Units

^a Parameter estimate from linear regression model including all covariates shown in this table, interpreted as the average change in serum 25(OH)D level (ng/ml) associated with the unit increment shown for each variable

The crude lower concentration of serum 25(OH)D observed in blacks (Table 1) was not explained by racial differences in UVR exposure, dietary intake or the other variables shown in Table 2. In sex-specific linear regression models including both blacks and whites, black race was associated with 25(OH)D levels approximately 10 ng/ml lower for both males (10.3 ng/ml) and females (9.3 ng/ml).

The odds of hypovitaminosis D were about eight times higher for blacks than whites after adjusting for ambient UVR, diet, and the set of other covariates (multivariate odds ratio, OR, from logistic regression = 8.0; 95% CI 4.5–14.5; p < 0.001). For black women (the subgroup with the highest prevalence of hypovitaminosis D—54%), adequate dietary intake had only a marginal effect on vitamin D status. Among black women with estimated dietary intake ≥400 IU (the recommended intake for adults aged ≥50 years [17]), 41% had 25(OH)D levels ≤15 ng/ml. For black men with dietary intake ≥400 IU, the prevalence of hypovitaminosis D was 23%, so that 32% of African-Americans overall at this level of dietary intake were categorized as having hypovitaminosis D.

Using multivariate logistic regression, predictors of hypovitaminosis D were examined among black subjects (data not shown in table). A one-unit increase in ambient UVR score was associated with a statistically significant 30–40% reduction in risk (OR = 0.6, 95% CI 0.5–0.8 for black men; OR = 0.7, 95% CI 0.5–0.9 for black women). Vitamin D from diet was also protective: a 100 IU increase in intake was linked to a 20% lower risk of hypovitaminosis D in both males and females (OR = 0.8, 95% CI 0.6–1.0 for black men; OR = 0.8, 95% CI 0.7–1.0 for black women). For women only, each 1 kg/m2 increase in BMI was associated with a 10% increased risk (p = 0.01). The OR for hypovitaminosis D was 6.5 (95% CI 1.7–25.1) for obese (≥30 kg/m2) versus healthy-weight (18–24.9 kg/m2) black women. Insufficient or deficient 25(OH)D levels were found in 76% of obese black women (but only 15% of obese white women). Other factors related to hypovitaminosis D among the African-American subjects with marginal significance included education among men (OR = 2.8, 95% CI 1.0–8.0 for ≥ high school education) and alcohol use among women (OR = 2.7, 95% CI 0.9–8.3 for use in the last month).
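As a back-of-envelope illustration (ours, not from the paper) of how per-unit odds ratios like those above combine: a per-unit OR scales multiplicatively with the size of the change.

```python
# A per-unit odds ratio applied over k units multiplies k times: OR**k.
# Example inputs echo the per-unit ORs reported above, but the scaled
# figures are illustrative arithmetic, not study estimates.
def scale_or(per_unit_or: float, units: float) -> float:
    """Odds ratio implied by a `units`-sized change, given a per-unit OR."""
    return per_unit_or ** units

# BMI OR of 1.1 per 1 kg/m^2 over a 6-unit difference:
print(round(scale_or(1.1, 6), 2))  # 1.77
# Dietary OR of 0.8 per 100 IU over a 300 IU difference:
print(round(scale_or(0.8, 3), 2))  # 0.51
```

Note that such extrapolation assumes the log-odds relationship is linear across the whole range, which the categorical obese-versus-healthy-weight estimate (OR = 6.5) suggests is not exactly the case here.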


The present study documents a large disparity in the prevalence of hypovitaminosis D between blacks and whites in this population of relatively low-income adults in the southeastern US. The findings presented may be underestimates of the true prevalence of vitamin D inadequacy in all subgroups studied: although the serum 25(OH)D level optimal for health remains controversial [18, 19], others [2] have suggested that the cutoff for classifying deficiency should be 20 ng/ml, and a recent evaluation of studies on bone mineral density and other health outcomes linked to vitamin D suggests that the minimum recommended level should be raised to at least 30 ng/ml [20]. Using this higher threshold of 30 ng/ml, a majority of white (62%) and nearly all of the black (89%) subjects in this study would be considered at risk for vitamin D related disorders.

Although we lacked specific information on individual sun exposures, ambient UVR was an important and highly significant predictor of vitamin D stores, except in white women. The association with ambient UVR held despite the crudeness of the index, which was based on measurements taken at the single monitoring station geographically closest to the participant’s residence but often a substantial distance away. Associations we observed with employment status, education, and perhaps some of the other variables assessed likely reflect individual sun exposures not captured by the ambient UVR indices. For example, among men, higher education (which was strongly inversely related to vitamin D levels) is likely associated with access to predominantly indoor, higher-paying jobs.

Casual direct exposure of the skin to sunlight can generally provide sufficient vitamin D in the summer months. However, vitamin D photosynthesis is attenuated at higher latitudes, especially in winter months, when little synthesis can occur through skin exposure throughout much of the US [21]. Previous studies have documented a high prevalence of vitamin D inadequacy among blacks living in temperate climates [3, 4, 14]. However, the current study documents poor vitamin D status even among blacks living in the southern US (mean latitude of subject residence: 33°N), where endogenous vitamin D production is possible for a greater part of the year.

We found modest contributions to serum 25(OH)D from diet that were approximately the same for blacks and whites. Vitamin D occurs naturally in a limited number of foods (mainly oily fish and eggs), and fortified dairy foods, cold cereals, and increasingly other foods provide additional quantities. In national surveys, dairy food consumption is typically lower among African-Americans compared to whites [10], which has been attributed to higher rates of lactose intolerance among blacks as well as cultural preferences [22, 23]. In addition, blacks are reported to use nutritional supplements less frequently than whites [24]. In our study sample, reported milk consumption was somewhat lower among blacks, but multivitamin use was similar (33% of blacks and 36% of whites reported multivitamin use at least 2–3 times per week). Nutritional supplementation, consumption of vitamin D-fortified juices and grain products, adherence to dietary strategies that minimize malabsorption symptoms, and controlled use of UV-light exposure have all been advocated to maintain optimal serum 25(OH)D in African-Americans and other high-risk groups [25–27]. It is noteworthy that a considerable prevalence of hypovitaminosis D was observed in our study even for African-Americans with seemingly adequate dietary intake by current standards.

The associations we detected between serum 25(OH)D level and dietary vitamin D intake estimated from the FFQ provide some validation of the dietary and nutritional supplement information provided by SCCS participants. We recognize, however, that dietary intake estimated using this method involves non-trivial measurement error. This, coupled with the misclassification in individual sun exposure (the primary source of vitamin D), likely accounts for our prediction models explaining at most 32% (in black males) of the total variance in circulating 25(OH)D levels. In white females, our prediction models held little explanatory power. Although speculative, it may be that individual behaviors including sun tanning and use of sun-blocking agents, for which we lacked information, are more important determinants of serum vitamin D concentrations in white women than in other subjects.

In previous work evaluating the relationship between body mass index [4] or percent body fat [28] and circulating 25(OH)D, the inverse association was reported to be stronger for white than black women. In contrast, we found a stronger relationship for black women, among whom obesity was related to a more than 6-fold risk for hypovitaminosis D. Obesity has been noted in previous studies to be a risk factor for vitamin D deficiency and secondary hyperparathyroidism in African-Americans [14]. The inverse association with obesity may reflect more restricted sun exposure in the obese, the sequestering of vitamin D in adipose tissue, or reduced biosynthesis in the skin [6].

A strength of the current analysis is its base in the large SCCS cohort, in which a number of factors, including enrollment site, socioeconomic status, and to some extent lifestyle factors, are by design similar by race [11], facilitating more direct comparisons between blacks and whites. Although the vitamin D levels we observed are applicable to the SCCS study population, they may not generalize to all persons living in the southeastern US. A large proportion of SCCS participants are of low income and many are unemployed. The impact of these characteristics is not clear, but could result in greater access to midday sun exposures that increase vitamin D stores. The fact that mean 25(OH)D levels measured in our subjects are in line with those reported in other large or population-based US samples (e.g., in black female (18.1 ng/ml), white female (30.4 ng/ml), black male (20.9 ng/ml), and white male (33.2 ng/ml) participants in NHANES III [29], and in white female controls sampled from the Nurses’ Health Study [30] (27.1 ng/ml)) does suggest, however, that our subjects are likely representative of the underlying population.

Animal and in vitro studies are strongly supportive of a potential cancer protective influence for vitamin D. A wide variety of normal and cancer cells, including those of the breast, colon, and prostate, express the VDR [31], and these same tissues have also been shown to possess specific hydroxylases that convert 25(OH)D to calcitriol [32, 33], suggesting paracrine and autocrine functions for vitamin D in tissues not directly involved in mineral homeostasis. Calcitriol promotes cell differentiation and apoptosis [34] and counteracts the growth-signaling effects of insulin-like growth factors [35]. Gene expression profiling studies also suggest direct genoprotective influences for vitamin D, including functions in DNA repair, phase II metabolism of xenobiotics, and the induction of genes that limit accumulation of reactive oxygen species in the cell [34]. Vitamin D has also been shown in vitro to reduce tumor invasion and angiogenesis, suggesting functions in metastasis and tumor control [36–38].

Mortality rates for cancers at many sites (including colon, rectum, breast, prostate, and ovary) are higher at northern latitudes, where populations are exposed to the least amount of natural light [39, 40]. A number of epidemiologic studies have also inversely linked cancer risk with dietary sources or blood levels of vitamin D or with cumulative sun exposure [41, 42], and recent studies have linked prostate and breast cancer risk to genetic variation in the VDR [43–45]. The strongest evidence from seroepidemiologic studies has accumulated for colorectal cancer [46–48], although randomized intervention trials have produced mixed results [49, 50]. There is, however, growing evidence that vitamin D may be an important factor in colorectal cancer incidence and in the disparate rates between black and white Americans. African-Americans have the highest incidence and mortality rates from colorectal cancer among all US ethnic groups [51] and worse overall and stage-specific survival rates compared to whites [52]. Inequalities in healthcare delivery may contribute to these patterns [53–56], but a poorer outcome in blacks has been noted even among persons with equal access to health care and receiving equivalent treatments [54, 55]. Although these health disparities are likely to be multifactorial in cause, given the mounting evidence for a chemopreventive role of vitamin D in colorectal cancer [46, 47] and the potentially beneficial influence of adequate vitamin D levels on survival [46, 47, 57], at least some of the noted disparity could be related to lower vitamin D stores among US blacks. Vitamin D may likewise contribute to the notably higher incidence and mortality from prostate cancer among African-Americans [58, 59].

While further research will help clarify the influence of vitamin D on cancer and other chronic diseases, vitamin D is likely to become an increasingly important target for improving health and reducing racial health disparities in the US. Our planned follow-up of the SCCS cohort for cancer incidence and mortality will provide a unique opportunity to examine the association between vitamin D and cancers that disproportionately affect blacks. While the results reported herein cannot be used to directly estimate risk of cancer according to vitamin D status, the findings clearly show distinctive racial disparities in vitamin D levels and in their correlates, and offer additional incentive for further research into vitamin D's potential as a cancer-preventive agent.


The Southern Community Cohort Study is funded by a grant from the National Cancer Institute (R01 CA092447).

Copyright information

© Springer Science+Business Media B.V. 2008