
The EARs, RDAs, and AIs for vitamin D are shown in Table 5-3 by life stage group. The identical EARs across age groups are notable and, as discussed below, reflect the concordance of serum 25OHD levels with the integrated bone health outcomes as well as the lack of an age effect on the simulated dose–response. Studies used to estimate these values have been included in Chapter 4 in the review of potential indicators.

While at the outset the consideration of vitamin D requirements recognizes that humans are physiologically capable of obtaining vitamin D through exposure to sunlight, the estimation of DRIs for vitamin D immediately requires a range of related considerations, from factors that affect sun exposure and vitamin D synthesis to public health recommendations regarding the need to limit sun exposure to avoid cancer risk. Just as importantly, the available data have not sufficiently explored the relationship between total intake of vitamin D per se and health outcomes. In short, a dose–response relationship between vitamin D intake and bone health is lacking. Rather, measures of serum 25OHD levels, as a biomarker of exposure (i.e., intake), are more prevalent.

After considering the available evidence, including data published after the 2009 analysis by the Agency for Healthcare Research and Quality (Chung et al., 2009), hereafter referred to as AHRQ-Tufts, the committee concluded:

  • A dose–response relationship can be simulated based on serum 25OHD measures. That is, serum 25OHD levels can reflect intake, and there are studies that relate bone health outcomes to serum 25OHD levels, as described in Chapter 4.

  • Newer data provide the ability to link vitamin D intakes to the change in serum 25OHD level under conditions of minimal sun exposure, thereby reducing the confounding introduced by the effect of sun exposure on serum 25OHD concentrations. These data also provide an approach for estimating dietary reference values related to intakes that will achieve targeted serum 25OHD concentrations, albeit without regard to the contributions from sun exposure.

Generally, association studies that use a biomarker of exposure in relation to health outcomes can present challenges when establishing reference values. Such measures are not necessarily valid or reliable markers, and they can be subject to considerable confounding by a host of variables. In the case of vitamin D, there are certain factors that allow more confidence in using this measure in the estimation of reference values. Specific deficiencies of vitamin D lead to recognized, measurable deficiency states with adverse effects on the indicator of interest, in this case bone health as evidenced by rickets and osteomalacia. The next consideration is whether the biomarker is an accurate reflection of intake. In the case of serum 25OHD concentrations, despite the lack of clarity about the impact of a number of variables on serum 25OHD concentrations, the measure can be reasonably associated with total intake when sunlight exposure is minimal.

On this basis, serum 25OHD concentrations were used to simulate a dose–response relationship for bone health. Next, the available data— notably those obtained under conditions of limited sun exposure—were integrated in order to estimate a total intake that would result in the desired serum 25OHD relative to measures of bone health. This step-wise process for simulating a dose–response relationship for vitamin D considered, first, the relevance to this study of the confounding introduced by 25OHD assay methodologies and related measurement problems, including “assay drift.” Next, the data from three bodies of evidence described in Chapter 3—the relationship between calcium absorption and serum 25OHD levels; serum 25OHD levels and bone health in children; and serum 25OHD levels and bone health in older adults—were summarized and used to specify a dose–response curve for serum 25OHD. Interestingly, concordance of serum 25OHD levels and bone health for median requirements emerged across all age groups. Finally, the relationship between changes in vitamin D intake and changes in serum 25OHD concentrations was considered.

In considering serum 25OHD levels as reported by various studies, the committee was aware of the so-called “assay drift” associated with longitudinal comparison of assay results collected in the National Health and Nutrition Examination Survey (NHANES), as well as the large inter-laboratory variation worldwide (Carter et al., 2010) and the differences in performance characteristics between the various antibody-based and liquid chromatography (LC)-based assays. Although a consistent assay bias was recognized within the NHANES data for certain time periods (2000–2006), this assay drift, as described in Chapter 3, is small in comparison with the inter-laboratory variation or the methodological differences observed in data from the Vitamin D External Quality Assessment Scheme (DEQAS) (Carter et al., 2010).

Accordingly, for the purposes of this study, a correction of data based on knowledge of assay drift was neither practical nor necessary for the determination of DRI values. The NHANES assay drift applies to certain data analyzed within a known time frame (2004–2006), but at the same time other data using similar methods might have experienced drift that was unknown and therefore could not be accounted for or corrected. Moreover, the dispersion of serum 25OHD levels across the range of vitamin D intakes is very large, as exemplified by data from Millen et al. (2010).

Although methodological issues contribute to uncertainty in comparing data among studies, the differences in serum 25OHD over time due to assay drift are relatively small and thus inconsequential when viewed relative to other sources of biological variation. In essence, assay drift is considered to be a component of the noise within the signal, and one of the contributors to uncertainty. But for DRI purposes it did not require re-evaluation or normalization of data. Regarding NHANES data specifically as they were used by the committee as a basis for the intake assessment (Chapter 7), the ramifications of “assay drift” are more significant for longitudinal comparisons, which were not a component of the intake assessment.

The evidence presented in Chapter 4 allows the following conclusions about serum 25OHD concentrations relative to DRI development:

  • Calcium absorption

    Given that an identified key role of vitamin D is to enhance calcium absorption, evidence regarding the level of serum 25OHD associated with maximal calcium absorption is relevant to establishing a dose–response relationship for serum 25OHD level and bone health outcomes. As outlined in Chapter 4, for both children and adults there was a trend toward maximal calcium absorption between serum 25OHD levels of 30 and 50 nmol/L, with no clear evidence of further benefit above 50 nmol/L.

  • Rickets

    In the face of adequate calcium, the risk of rickets increases below a serum 25OHD level of 30 nmol/L and is minimal when serum 25OHD levels range between 30 and 50 nmol/L. Moreover, when calcium intakes are inadequate, vitamin D supplementation to the point of serum 25OHD concentrations up to and beyond 75 nmol/L has no effect.

  • Serum 25OHD level and fracture risk: Randomized clinical trials using adults

    Because available trials often administered relatively high doses of vitamin D, serum 25OHD concentrations varied considerably. Although some studies suggested that serum 25OHD concentrations of approximately 40 nmol/L are sufficient to meet bone health requirements for most people, findings from other studies suggested that levels of 50 nmol/L and higher were consistent with bone health. Given that causality has been established between changes in serum 25OHD levels and bone health outcomes, information from observational studies can be useful in determining the dose–response relationship.

  • Serum 25OHD level and fracture risk: Observational studies using adults

Melhus et al. (2010) found that serum 25OHD levels below 40 nmol/L predicted modestly increased risk of fracture in elderly men, but there was no additional risk reduction above 40 nmol/L, suggesting maximum population coverage at 40 nmol/L. In contrast, Ensrud et al. (2009) observed that men with 25OHD levels below 50 nmol/L had greater subsequent rates of femoral bone loss, and there was no additional benefit from serum 25OHD concentrations higher than 50 nmol/L, suggesting maximum population coverage at 50 nmol/L. Still other studies suggested that somewhat higher serum 25OHD concentrations were needed to provide maximum population coverage. For example, Cauley et al. (2008), in a prospective cohort study, reported that serum 25OHD concentrations in the range of 60 to 70 nmol/L were associated with the lowest risk of hip fracture; above this level, risk was reported to increase, but not significantly. Looker and Mussolino (2008), using NHANES data, found that, among individuals with serum 25OHD levels above 60 nmol/L, the risk of hip fracture was reduced by one-third. The van Schoor et al. (2008) study reported that in more than 1,300 community-dwelling men and women ages 65 to 75 years, serum 25OHD levels less than or equal to 30 nmol/L were associated with a greater risk of fracture. Cauley et al. (2010) noted that men in the MrOS cohort with serum 25OHD levels less than 50 nmol/L experienced a significant increase in hip fracture risk that was attenuated somewhat when hip BMD was considered.

  • Osteomalacia from postmortem observational study

Data from the work of Priemel et al. (2010) have been used by the committee to support a serum 25OHD level of 50 nmol/L as providing coverage for at least 97.5 percent of the population. The data, however, do not allow specification of a serum 25OHD level above which half of the population is protected from osteomalacia and half is at risk; rather, the evidence indicated that even relatively low serum 25OHD levels were not associated with the specified measures of osteomalacia, most likely owing to the impact of calcium intake. This is consistent with a number of studies, both from trials and from observational work, indicating that vitamin D alone appears to have little effect on bone health outcomes; it is most effective when coupled with calcium.

The wide variation in the precise relationship of serum 25OHD levels to any specific bone health outcome is evident in the discussion above and in the conclusion of the 2007 AHRQ report (Cranney et al., 2007; hereafter referred to as AHRQ-Ottawa) that a specific threshold serum 25OHD level could not be established for rickets. Nonetheless, the committee found a striking concordance of the data surrounding serum 25OHD levels across several of the specific outcomes and across age groups, which, in turn, allows an estimation of serum 25OHD concentrations that are consistent with an EAR- and RDA-type reference value when the indicators of bone health are integrated (see Figure 5-1). As shown above, the levels range between 30 and 50 nmol/L, with 40 and 50 nmol/L consistent with an EAR- and an RDA-type value, respectively. Further, the higher level of 75 nmol/L proposed by some as “optimal,” and hence consistent with an RDA-type reference value, is not well supported.

The congruence of the data links serum 25OHD levels below 30 nmol/L with the following outcomes: increased risk of rickets, impaired fractional calcium absorption, and decreased bone mineral content (BMC) in children and adolescents; increased risk of osteomalacia and impaired fetal skeletal outcomes; impaired fractional calcium absorption and an increased risk of osteomalacia in young and middle-aged adults; and impaired fractional calcium absorption and fracture risk in older adults. Similarly, for all age groups, there appears to be little causal evidence of additional benefit to any of these indicators of bone health at serum 25OHD levels above 50 nmol/L, suggesting that this level is consistent with an RDA-type reference value in that it appears to cover the needs of 97.5 percent of the population. For some bone health outcomes, such as BMD in adults, the results of the available RCT(s) show a negative relationship between serum 25OHD level and outcome, and the available observational studies yield mixed results. In addition, for several of these specific outcomes, the RCTs that show benefit for what is generally a single tested dose of supplemental vitamin D do not allow inference about intermediate serum 25OHD levels between those achieved with placebo and with the tested dose. When evaluating the congruence of the data, the committee therefore looked at the lowest effective dose and the achieved serum 25OHD level. Uncertainty does exist for the selected serum 25OHD levels consistent with an EAR- and RDA-type level; this uncertainty stems from the wide range of effects and relationships and the lack of a relevant dose–response relationship.

Overall, when the data are examined for an EAR-type serum 25OHD concentration—that is, a median value, a level above which approximately one-half the population might meet requirements and below which one-half might not—the data do not specifically provide such information, although this value can be concluded to lie between 30 and 50 nmol/L for all age groups. This is likely due to the unique inter-relationship between calcium and vitamin D. At lower levels of vitamin D, calcium appears to compensate, and adequate calcium intake can overcome marginal vitamin D nutriture. Calcium appears to be the more critical nutrient in the case of bone health and therefore has an impact on the dose–response relationship. In short, calcium, or the lack thereof, may “drive” the need for vitamin D.

In the case of vitamin D—or more precisely serum 25OHD concentrations—the data, especially for adults, do not lend themselves readily to the usual DRI model, which is based on the assumption that data concerning a median intake will be at least as available as data concerning coverage for most of the population. The standard model specifies, based on the assumption of a normal distribution for requirements, that the average or median requirement (i.e., the EAR) is used to calculate the RDA. This unanticipated situation is primarily evident for adults, for whom it is not possible to estimate the level of 25OHD in serum at which 50 percent of the population is at increased risk of osteomalacia. Rather, in this case, the data allow a better estimation of the serum 25OHD level that likely covers most persons in the population. In children and adolescents, however, and to some extent in adults, the integration of these indicators as shown in Figure 5-1 enables an approximation of a level of serum 25OHD at which the risk of adverse bone health outcomes increases, although there is uncertainty associated with this value given the limitations of the data at present. Thus, for children and adolescents, a serum 25OHD level of 40 nmol/L—the middle of the 30 to 50 nmol/L range across which risk to the population increases—was selected to serve as the targeted level for a median dietary requirement. For adults, the evidence that most are covered by a serum 25OHD level of 50 nmol/L is used as the starting point, and a value of 40 nmol/L is estimated as the targeted level for a median dietary requirement.

Overall, as shown in Figure 5-1, the data suggest that 50 nmol/L can be set as the serum 25OHD level that would cover the needs of 97.5 percent of the population. A serum 25OHD level of 40 nmol/L is consistent with the median requirement. The lower end of the requirement range is consistent with 30 nmol/L, and deficiency symptoms may appear at levels less than 30 nmol/L, depending upon a range of factors. What remains is to ascertain the level of vitamin D intake that would achieve these levels of 25OHD in serum.
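These reference concentrations can be restated compactly. The short sketch below is purely illustrative (the function name and category wording are ours, not the committee's); it simply encodes the 30, 40, and 50 nmol/L reference levels described above:

```python
def classify_serum_25ohd(level_nmol_l: float) -> str:
    """Map a serum 25OHD concentration (nmol/L) onto the reference levels
    described in the text: 30 nmol/L (lower end of the requirement range),
    40 nmol/L (median requirement), and 50 nmol/L (covers ~97.5 percent of
    the population). Category labels are illustrative only."""
    if level_nmol_l < 30:
        # Below the requirement range; deficiency symptoms may appear,
        # depending on a range of factors.
        return "below requirement range (risk of deficiency)"
    if level_nmol_l < 40:
        return "within requirement range, below median requirement"
    if level_nmol_l < 50:
        return "meets median requirement, below RDA-type level"
    return "meets RDA-type level (covers ~97.5 percent of population)"


print(classify_serum_25ohd(45))  # -> "meets median requirement, below RDA-type level"
```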

As diet is not necessarily the only source of vitamin D for the body, it would be ideal if the relative contribution made by sunlight to the overall serum 25OHD levels could be quantified, thereby clearing the path to better estimate total intakes of the nutrient needed to maintain a specified serum 25OHD level associated with the health outcome. In fact, however, the examination of data related to dietary recommendations about vitamin D is complicated by the confounding that sun exposure introduces, especially because the factors that affect sun exposure—such as skin pigmentation, genetics, latitude, use of sunscreens, cultural differences in dress, etc.—are not clearly measured and controlled for in research studies and in some cases not fully understood. Further, and just as critically, vitamin D requirements cannot be based on an accepted or “recommended” level of sun exposure as a means to meet vitamin D requirements, because existing public health concerns about sun exposure and skin cancer preclude this possibility. The absence of studies to explore whether a minimal-risk ultraviolet B (UVB) exposure relative to skin cancer exists to enable vitamin D production has been noted (Brannon et al., 2008).

Instead, the best remaining approach is to describe the relationship between total intake and serum 25OHD levels under conditions of minimal sun exposure. In doing so, the committee assumed that the outcomes would reflect only the very small component attributable to sun exposure that occurs naturally in free-living individuals in winter in the northern hemisphere. This approach to DRI development requires that persons who use the DRI values for health policy or public health applications adjust their considerations relative to adequacy of the diet based on whether the population of interest is minimally, moderately, or highly exposed to sunlight. As mentioned previously, the potential contribution from body stores remains unknown and thus introduces uncertainty. Further, the application of the DRIs relative to assessing the adequacy of vitamin D intake/exposure for the population (foods, supplements, and sun exposure) would benefit from consideration of the serum 25OHD concentrations in the population of interest.

The committee examined information from controlled trials in younger and older adults and in children that could be used in the simulation to describe the relationship between vitamin D intake and changes in serum 25OHD concentrations. Of interest was the condition of minimal sun exposure, which occurs in northern latitudes and in Antarctica during their respective winters. The focus was on clinical trials in Europeans or North Americans in which baseline total intake was measured or could be reliably estimated using peer-reviewed published data on baseline intakes of the population studied. In this way, the total intake of vitamin D (baseline plus supplement) was known or could be reliably estimated at latitudes greater than 50°N during late fall (October) through early spring (April) or in Antarctica during its fall (March) through its winter (October). These studies are summarized in Table 5-4. Studies were required to report measured serum 25OHD levels as means or medians with estimates of variance (standard deviation [SD], confidence interval [CI], or interquartile range). Some studies in the United States at 40°N to 46°N were identified that met all inclusion criteria except that of latitude. These are also included in Table 5-4.

In reviewing these studies, most of which were published in the past 2 years, the committee noted the variability in the declines in serum 25OHD levels during the winter seasons in the respective hemispheres and the existence of a non-linear response to doses of vitamin D. These are discussed below prior to the description of the simulated dose–response analysis.

Winter season change in serum 25OHD levels across age groups As shown in Figure 5-2, the serum 25OHD levels of the placebo groups in the studies conducted with children (Viljakainen et al., 2006) and with younger, middle-aged, and older adults (Cashman et al., 2008, 2009; Smith et al., 2009) decreased over a wide range during the winter season at each latitude. In one study where participants started the season with a lower baseline serum 25OHD level (i.e., 36 nmol/L), the concentration decreased only slightly (i.e., to 34 nmol/L) (Smith et al., 2009). However, in other studies where participants began the season with higher baseline serum 25OHD levels (i.e., 57 and 66 nmol/L), the serum 25OHD levels decreased more (i.e., to 34 and 43 nmol/L, respectively) (Viljakainen et al., 2006; Cashman et al., 2008, 2009). In short, the decline in serum 25OHD levels in the placebo arm of these studies appears to be greatest when initial serum 25OHD levels are higher. A slightly higher intake of vitamin D (approximately 10 to 150 IU/day more than in the other studies) in the study with the lowest baseline serum 25OHD levels (Smith et al., 2009) may have accounted for the attenuated reduction in serum 25OHD level.

A similar trend exists across many of the studies with a placebo group, as summarized in Table 5-4. Declines of 3 to 13 nmol/L in serum 25OHD level are reported for those with baseline levels from 36 to 47 nmol/L. Larger declines in serum 25OHD levels of 8 to 62 nmol/L are reported for those with baseline levels of 64 to 96 nmol/L. However, considerable variability exists in the seasonal decline in serum 25OHD level in winter months, as demonstrated by the increases of 1 nmol/L in some participants with baseline serum 25OHD levels of 33 nmol/L at latitudes above 50°N (Larsen et al., 2004), and increases of 4.6 to 10.8 nmol/L from a baseline of 48.9 to 61.9 nmol/L in some participants at latitudes above 42°N (Harris et al., 2002; Nelson et al., 2009).

These observations suggest that the assumption of minimal sun exposure was met. Further, they suggest that during the winter season small intakes of vitamin D may play a role in attenuating the winter decline in serum 25OHD levels in those with lower baseline serum 25OHD levels. They also suggest that the kinetics of vitamin D turnover or mobilization from stores may differ in those who have lower baseline serum 25OHD levels. Further, the greater decline of serum 25OHD levels in those with higher baseline levels could also represent, at least in part, regression to the mean. At this time, it is not possible to clarify which of these possibilities occurs.

Non-linear response to vitamin D dosing The available data suggest a non-linear response of serum 25OHD above baseline levels to doses of vitamin D for all age groups. A non-linear response to doses of vitamin D (total or IU/kg) is also reported in mice (Fleet et al., 2008) and rats (Anderson et al., 2008; Fleet et al., 2008), demonstrating the biological plausibility of a non-linear response of serum 25OHD concentrations to vitamin D intake. It is noted that AHRQ-Ottawa and Heaney et al. (2003) reported a linear relationship between serum 25OHD levels and vitamin D dosing, ranging from 0.7 nmol/L per 40 IU (Heaney et al., 2003) to 1 to 2 nmol/L per 100 IU (AHRQ-Ottawa). Notably, AHRQ-Ottawa found heterogeneity that remained after adjusting for dose. However, in the studies considered by the committee, there is a steeper rise in serum 25OHD levels when vitamin D dosing is less than 1,000 IU/day; a slower, more flattened response is seen when doses of 1,000 IU/day or higher are administered. In short, regardless of baseline intakes or serum 25OHD levels, under conditions of dosing the increment in serum 25OHD above baseline differs depending upon whether the dose is above or below 1,000 IU/day. This is evidenced by examining several studies in young, middle-aged, and older adults.

Smith et al. (2009) in Antarctica found a low serum 25OHD level of 37 nmol/L in men and women during the winter season (June to September). The rise in serum 25OHD levels with doses of 400, 1,000, and 2,000 IU/day after 13 and 20 weeks was 2.1, 0.8, and 0.54 nmol/L per 40 IU/day, respectively. In two other studies, conducted at latitudes of 52°N to 55°N during winter, the rise in serum 25OHD levels in response to 200, 400, or 600 IU of vitamin D per day was examined in young and older individuals with baseline serum 25OHD levels of 37 to 42 nmol/L. The average rise in serum 25OHD levels was equivalent to approximately 2.3 nmol/L for an intake of 40 IU of vitamin D3 per day, without a difference due to age (Cashman et al., 2008, 2009). Others also found that age does not influence the change in serum 25OHD level in response to vitamin D intake (Harris and Dawson-Hughes, 2002). When the dose is 1,000 IU/day or higher, the rise in serum 25OHD level in individuals of all ages is approximately 1 nmol/L for a 40 IU/day intake, which is similar to the response to vitamin D intake found in the AHRQ-Ottawa analysis.
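As a rough arithmetic illustration (our own calculation, not the committee's), this flattening is what a logarithmic dose–response curve of the form y = b ln(intake) predicts; the committee's combined fit, described below, estimated b = 9.9 under minimal sun exposure:

```python
import math

b = 9.9  # nmol/L per unit of ln(IU/day), from the combined regression described below

# Increment in serum 25OHD gained from each additional 40 IU/day at
# several starting doses; the marginal response shrinks as the dose rises.
for dose in (400, 1000, 2000):
    increment = b * math.log((dose + 40) / dose)
    print(f"{dose:>5} IU/day: +{increment:.2f} nmol/L per additional 40 IU/day")
# Prints approximately 0.94, 0.39, and 0.20 nmol/L: a steeper response
# below 1,000 IU/day and a flatter one above, as described in the text.
```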

A regression analysis of the relationship between serum 25OHD level and total intake of vitamin D during the winter season at latitudes above 49.5°N or in Antarctica, a period of low sun and UVB exposure, was carried out for each of three age groups—children and adolescents, young and middle-aged adults, and older adults. This approach differs from others, such as that reported by Heaney et al. (2003), in that total vitamin D intake, and not just a supplemental dose of vitamin D, was considered, and in that it reflects a non-linear response to total intake rather than the linear response published previously. The interest for this report was an approach that would be relevant to determining the intake needed to achieve the serum 25OHD levels consistent with an EAR- and RDA-type value. The regression analysis using a mixed effects model was preceded by a log transformation of the total vitamin D intake data because the log transformation was the best curvilinear fit. The model controlled for the effect of study clustering by including study as a random effect. Controlling for study effect using a random effect was needed because the intraclass correlation of the variance due to study effect compared with the total variance was very high: approximately 95 percent overall, with about 88 percent for children and adolescents, 95 percent for young and middle-aged adults, and 96 percent for older adults. The regression was set for a y-intercept of 0 nmol of 25OHD per liter of serum, consistent with the biological reality that an achieved serum 25OHD level cannot be negative. Baseline serum 25OHD level did not have a significant effect and was, therefore, not included in the analysis.
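A minimal sketch of such an analysis is given below, assuming the data are arranged as one row per study arm. The numbers are synthetic placeholders, not the committee's data, and statsmodels' MixedLM is simply one way to fit a mixed effects model with study as a random effect:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Toy data: achieved serum 25OHD (nmol/L) versus total vitamin D intake
# (IU/day), clustered by study. Values are synthetic, generated to follow
# roughly 9.9 * ln(intake) plus a study-specific offset.
df = pd.DataFrame({
    "study":  ["A", "A", "A", "B", "B", "B", "C", "C", "C"],
    "intake": [200, 600, 1000, 150, 400, 800, 250, 700, 1200],
    "serum":  [55.9, 65.8, 71.9, 47.2, 57.9, 63.8, 55.1, 64.3, 70.8],
})
df["ln_intake"] = np.log(df["intake"])

# Fixed effect of ln(total intake) with the fixed intercept suppressed
# ("0 +"), mirroring the committee's choice of a zero y-intercept;
# study enters as a random effect to control for clustering.
model = smf.mixedlm("serum ~ 0 + ln_intake", df, groups=df["study"])
fit = model.fit()
print(fit.params["ln_intake"])  # slope; lands near the 9.9 used to build the toy data
```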

The outcome is presented in Figure 5-3. Importantly, age did not significantly affect the response of serum 25OHD level to log vitamin D intake. Neither the main effect of age (p = 0.162) nor the interaction term between age and the log of total vitamin D intake (p = 0.142) was significant. Thus, there was no effect of age in the response of serum 25OHD level to total intake among the three age groups—children and adolescents, young and middle-aged adults, or older adults. This finding suggests that across ages under conditions of minimal sun exposure, similar intakes of vitamin D result in similar serum 25OHD concentrations, as shown in Figure 5-4.

Because there was no age effect in the response of serum 25OHD level to total intake of vitamin D, a single, combined regression analysis with study as a random effect was carried out. This resulted in the predictive equation for achieved serum 25OHD in nmol/L of y = 9.9 ln (total vitamin D intake), with a lower predicted CI of y = 8.7 ln (total vitamin D intake) and an upper predicted CI of y = 11.2 ln (total vitamin D intake), as specified in Figure 5-4.
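Applied to candidate intakes, the equation reproduces the predicted serum levels discussed below (the function wrapper is ours; the coefficients are those just stated):

```python
import math

def predicted_serum_25ohd(total_intake_iu):
    """Committee's simulated dose-response under minimal sun exposure:
    returns (lower CI, mean, upper CI) for achieved serum 25OHD in nmol/L."""
    ln_intake = math.log(total_intake_iu)
    return 8.7 * ln_intake, 9.9 * ln_intake, 11.2 * ln_intake

for intake in (400, 600):
    lo, mean, hi = predicted_serum_25ohd(intake)
    print(f"{intake} IU/day -> mean {mean:.0f} nmol/L (predicted CI {lo:.0f} to {hi:.0f})")
# 400 IU/day -> mean 59 nmol/L (predicted CI 52 to 67)
# 600 IU/day -> mean 63 nmol/L (predicted CI 56 to 72)
```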

The committee also analyzed the achieved serum 25OHD level relative to total vitamin D intake at latitudes between 40°N and 49°N during the winter (data shown in Table 5-4 above), for which the assumption of minimal sun exposure may not be as fully met as at latitudes above 49.5°N or in Antarctica during the winter. The approach was the same as that described above for the simulated dose–response at the higher latitudes. The intraclass correlation was large, approximately 80 percent, and study effect was again included as a random effect in the mixed effects model. Age did not affect achieved serum 25OHD level relative to log total vitamin D intake (p = 0.09 for the main effect and p = 0.6 for the interaction of age and log total vitamin D intake), although the data available for children were limited to one study. Therefore, a combined analysis of all age groups at the lower latitudes was conducted. The predicted achieved serum 25OHD level was y = 12.3 ln (total vitamin D intake), which explained 45 percent of the within-study variability and 96.6 percent of the between-study variability. The predicted lower and upper CIs for achieved serum 25OHD levels were y = 10.1 ln (total vitamin D intake) and y = 14.5 ln (total vitamin D intake). There was a significant difference between lower and higher latitudes (p < 0.001 for the main effect and p = 0.021 for the interaction of latitude and ln [total vitamin D intake]). Compared with the simulated dose–response at higher latitudes, the achieved serum 25OHD level at lower latitudes was 24 percent greater for the same total intake. Of note, less of the within-study variance at lower latitudes was explained by total vitamin D intake (45 percent) compared with that explained for the higher latitudes (72 percent). Taken together, these results suggest that sun exposure may be more than minimal at lower latitudes, as anticipated. Thus, the committee used the simulated dose–response at the higher latitudes, ensuring minimal sun exposure and as little contribution from endogenous production as the evidence allows.
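The 24 percent figure follows directly from the ratio of the two fitted coefficients, as the brief comparison below illustrates (function names are ours; coefficients are those reported above):

```python
import math

def serum_higher_latitude(intake_iu):  # above 49.5 degrees N, or Antarctica, in winter
    return 9.9 * math.log(intake_iu)

def serum_lower_latitude(intake_iu):   # 40 to 49 degrees N in winter
    return 12.3 * math.log(intake_iu)

print(f"{12.3 / 9.9 - 1:.0%} greater response at lower latitudes")  # -> 24%
for intake in (400, 600):
    print(intake, round(serum_higher_latitude(intake)), round(serum_lower_latitude(intake)))
# 400 IU/day -> 59 vs. 74 nmol/L; 600 IU/day -> 63 vs. 79 nmol/L
```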

Given the lack of an age effect in the response of achieved serum 25OHD levels to total intake of vitamin D, the intake to achieve the EAR-type value of 40 nmol/L was the same across all groups. An intake of 400 IU/day is associated with a predicted mean circulating 25OHD level of 59 nmol/L in children and adolescents, young and middle-aged adults, and older adults, with a lower predicted CI of approximately 52 nmol/L. An intake of 600 IU/day predicts a mean serum 25OHD level of 63 nmol/L in children, adults, and older adults, with a lower predicted CI of 56 nmol/L. Although this suggests that intakes of 400 and 600 IU/day would over-shoot the targeted serum 25OHD concentrations, there is considerable uncertainty in this simulated dose–response relationship that needs to be taken into account. These uncertainties include: (1) the large inter-study variance, which is most pronounced in older persons; (2) predicted lower CIs for each age group resulting in an achieved serum 25OHD level of 36 to 46 nmol/L for a 400 IU/day intake and 38 to 49 nmol/L for a 600 IU/day intake (as shown in Figure 5-4), even though there is no significant age effect; (3) the uncertainties in the comparability of the serum 25OHD levels measured with different assays across these studies; and (4) the uncertainty surrounding the predicted CIs of this relationship. Given these limitations and uncertainties, the committee selected the estimated intakes in a fashion that would err on the side of the specified intake “overshooting” the targeted serum value, to ensure that the specified levels of intake achieved the desired serum 25OHD levels of 40 and 50 nmol/L. This approach is used despite possible contributions to serum 25OHD from sun exposure that could not be taken into account.

The DRIs for adequacy for vitamin D have been introduced previously in Table 5-3. The rationale for each is presented in the discussions below.

Data are not sufficient to establish an EAR for infants less than 1 year of age, and therefore an AI has been developed. Unlike the case for calcium, the content of human milk does not shed light on the vitamin D requirements of infants, as breast milk is not a meaningful source of vitamin D.

The AI for the 0 to 6 months and 6 to 12 months life stage groups is set at 400 IU of vitamin D per day. Data are very limited beyond the conclusion that maintaining serum 25OHD concentrations above 30 nmol/L, and more likely closer to 50 nmol/L, appears to adequately cover the needs of the majority of infants and support normal bone accretion. There are no data to suggest that older infants would benefit from higher intakes.

Intakes in the range of 400 IU/day appear consistent with maintenance of the desirable serum 25OHD concentrations. There are no reports of a clinical deficiency in infants receiving 400 IU of vitamin D per day, and an intake of 400 IU/day appears to maintain a serum 25OHD level generally above 50 nmol/L in infants (Greer et al., 1982; Rothberg et al., 1982; Ala-Houhala, 1985; Ala-Houhala et al., 1988; Greer and Marshall, 1989; Hollis and Wagner, 2004). There are differences in the volume of milk or formula intake during this 12-month period, with newborns taking in less than older infants. The AI of 400 IU/day, therefore, represents an overall intake for the first year of life, and may vary across the life stages; it also assumes early introduction of a supplement for breast-fed babies. In the case of exclusive formula feeding, there is an assumption of a gradual increase in intake to 800 to 1,000 mL/day during infancy, which for most standard formulas provides about 400 IU/day. Note is made of the case reports concerning the development of rickets among dark-skinned infants who are exclusively breast-fed and not provided a vitamin D supplement (see Chapter 8).

For these life stage groups, ensuring normal, healthy bone accretion is central to the DRI values. The requirement distribution developed using serum 25OHD concentrations and the intakes estimated to achieve such concentrations are the basis for the reference values.

For very young children in this life stage group, virtually no data are available to link vitamin D nutriture directly to measures related to bone health outcomes. AHRQ-Ottawa examined the relationship between vitamin D and rickets in children 0 to 5 years of age but found no studies that evaluated BMC, BMD, or fractures in comparison with measures of vitamin D intake. Likewise, AHRQ-Tufts found no studies that update AHRQ-Ottawa.

AHRQ-Ottawa did consider serum 25OHD concentrations in the context of the onset of rickets in newborns through children 5 years of age and identified serum concentrations below 27.5 nmol/L as being consistently associated with rickets. However, many of the relevant studies were from developing countries where calcium intake is low; for these studies, the onset of rickets was associated with higher levels of 25OHD in serum, likely due to the low calcium intakes. Specker et al. (1992) concluded that serum concentrations of approximately 27 to 30 nmol/L place the infant at an increased risk for developing rickets, although the measure is not diagnostic of the disease.

Although the prevention of rickets can be a factor in establishing reference values, it is important to seek measures that are consistent with favorable bone health outcomes. Maximizing calcium absorption, especially for this life stage group, is therefore a reasonable parameter to take into account. Here, as with rickets, serum 25OHD measures are the only data available; there are no direct measures of vitamin D intake. Abrams et al. (2009) conducted calcium absorption studies in 251 children ranging in age from 4.9 to 16.7 years and found that children with serum 25OHD levels of 28 to 50 nmol/L had higher fractional calcium absorption than children with serum 25OHD levels at or greater than 50 nmol/L; fractional calcium absorption did not increase with serum 25OHD concentrations above 50 nmol/L, suggesting, at the least, that maximal calcium absorption is reached by 50 nmol/L. The findings are consistent with the conclusions reached previously concerning serum 25OHD levels associated with maximum population coverage. Further, as rickets in populations that are not calcium deficient occurs at serum 25OHD levels below 30 nmol/L, it is reasonable to assume that 40 nmol/L is associated with an average requirement.

Serum 25OHD concentrations of 40 to 50 nmol/L would ideally coincide with bone health benefits such as positive effects on BMC and BMD. AHRQ-Ottawa found fair evidence that circulating 25OHD levels are associated with a positive change in BMD and BMC in studies in older children and adolescents; the serum 25OHD concentrations varied from 30 to 83 nmol/L. A study conducted by Viljakainen et al. (2006) reported that vitamin D intakes of 200 and 400 IU/day in adolescent girls were associated with positive BMC measures at serum 25OHD levels of 50 nmol/L and above. This is consistent with conclusions inferred from calcium absorption studies and, in turn, with the ability to cover the requirements of nearly all in the population. A relatively wide range of total vitamin D intakes reportedly achieved serum 25OHD concentrations between approximately 40 and 60 nmol/L, but most intakes were between about 350 and 600 IU/day. The variability in the data cannot be readily attributed to differences in sun exposure because the studies were all conducted in northern locations, primarily during winter months.

Taken as a body of evidence, and in the absence of measures that directly relate total intake to health outcomes, the information concerning serum 25OHD concentrations associated with rickets prevention, calcium absorption, and positive effects on BMC measures is consistent with the discussions above concerning a requirement distribution based on serum 25OHD concentrations. It supports the conclusion that an average requirement for vitamin D for these life stage groups is associated with the achievement of serum 25OHD concentrations of 40 nmol/L, and that the requirements of nearly all children and adolescents are covered when serum 25OHD concentrations reach 50 nmol/L. These findings are applicable to all children and adolescents from 1 through 18 years of age.

The analysis described above indicates that an intake of vitamin D of 400 IU/day achieves serum concentrations of 40 nmol/L, and this intake is therefore set as the EAR for persons 1 to 3 years, 4 to 8 years, 9 to 13 years, and 14 to 18 years of age. As this requirement distribution appears to be normally distributed, the assumption of another 30 percent to cover nearly all the population (i.e., 97.5 percent) is appropriate and consistent with a serum 25OHD level of approximately 50 nmol/L as the target for an RDA value. Based on the same analysis relating serum 25OHD levels to intake, an intake of 600 IU/day is set as the RDA. These reference values assume minimal sun exposure.

For these life stage groups, bone maintenance is the focus. The requirement distribution based on serum 25OHD concentrations and the intakes estimated to achieve such concentrations are the basis for the reference values. As described below, the available data have provided more information about intakes and serum 25OHD levels consistent with an RDA value than they have for an EAR value.

Data relating bone health outcomes to vitamin D intake are generally limited for adults 19 to 50 years of age. Although bone mass measures are, of course, studied in this population, consideration of the dose–response relationship between vitamin D and bone health is not usually included in such studies. In fact, there are no randomized trials in this age group, and whatever data are available come from association studies. The results are inconsistent, in part because of the confounding inherent in observational studies.

Serum 25OHD concentrations relative to calcium absorption, therefore, provide an important basis for DRI development for vitamin D for these life stage groups. The conclusions described above indicating that calcium absorption is maximal at serum 25OHD concentrations between 30 and 50 nmol/L with no consistent increase in calcium absorption above approximately 50 nmol/L are informative in estimating the relevant EAR and RDA values for vitamin D for these life stage groups.

In contrast, although data from a very recent study (Priemel et al., 2010) based on postmortem analysis of the relationship between serum 25OHD levels and osteomalacia and re-examined by the committee (as described above) suggest a serum 25OHD level that would cover the needs of approximately 97.5 percent of the population, they also reveal that a level of serum 25OHD consistent with an average requirement is somewhat elusive. That is, serum 25OHD levels of approximately 40 nmol/L to even 30 nmol/L might be expected to be consistent with coverage for no more than half of the population (i.e., a mean/median value). But, in the Priemel et al. (2010) report, even at serum 25OHD levels well below 30 nmol/L, more than half of the population studied failed to demonstrate osteomalacia as defined histologically in the study. In essence, these data, which admittedly have limitations, suggest that for some adults the need for vitamin D is extremely low. This is likely due to the very strong interrelationship between calcium and vitamin D; it may even suggest that calcium is the “driver” nutrient relative to bone health, and that calcium is able to more readily overcome lower levels of vitamin D for the purposes of bone health, while vitamin D is likely unable to compensate for a lack of calcium. This finding underscores the uncertainties that are introduced by the calcium–vitamin D interrelationship.

For the purposes of ensuring public health in the face of uncertainty and providing a reference value for stakeholders, a prudent approach is to begin the consideration of the DRIs for these age groups with the level of 25OHD in serum that is consistent with coverage of the requirement of nearly all adults in this age range, that is, 50 nmol/L. Taken together with the data on calcium absorption and BMD, and assuming a normal distribution of requirements (there being no evidence that the distribution is not normal), a serum level of 40 nmol/L can be set as consistent with a median requirement. This modified approach is bolstered by—and consistent with—the relationship between serum 25OHD levels and calcium absorption, in which serum 25OHD levels of between 30 and 50 nmol/L were consistent with maximal calcium absorption. Based on these considerations as well as the intake versus serum response analysis described above, an EAR of 400 IU/day and an RDA of 600 IU/day are established for adults 19 to 50 years of age. These DRI values assume minimal sun exposure.

For persons in these life stage groups of 51 through 70 years and >70 years, the ability to maintain bone mass and reduce the level of bone loss is the primary focus for DRI development. Evidence related to fracture risk becomes central. For this reason, DRIs for adults >70 years of age are discussed first, followed by DRIs for adults 51 through 70 years of age.

The discussions above concerning serum 25OHD levels in relation to bone health indicate that several newer studies have helped to elucidate a relationship between serum 25OHD concentrations and bone health benefits based on measures of calcium absorption and osteomalacia for a wide age range of adults. These data, when used for the purposes of DRI development—coupled with the approximation of intake associated with serum 25OHD concentrations derived from the simulation analysis carried out by the committee—provide a basis for an EAR for young and middle-aged adults of 400 IU/day of vitamin D, consistent with a serum 25OHD concentration of 40 nmol/L, and for an RDA of 600 IU/day, consistent with a serum 25OHD concentration of 50 nmol/L. However, for adults more than 70 years of age, the number of unknowns associated with the physiology of normal aging, coupled with the variability around the average requirement that such factors may introduce—all of which may affect the estimation of the RDA (the level of intake needed to cover 97.5 percent of the population)—calls for a closer examination of the level of intake appropriate for an RDA value.

For this life stage group (>70 years), the reduction in fracture risk is the most important indicator of interest, not only because of the actual event, but also because of the high mortality and morbidity associated with fractures. The factors that may have an impact on fracture risk range from functional status to neurological, metabolic, and physical determinants. Such factors increase the uncertainties about vitamin D nutriture. Changes such as impaired renal function, less efficient synthesis of vitamin D in skin, lower endogenous production of active vitamin D, and increased PTH, as well as age-related changes in body composition, affect the daily requirement for vitamin D. Moreover, a sizeable proportion of this population can be categorized as frail compared with other age groups, and the concerns for bone health are accordingly increased. Increased rates of institutionalization also come into play. Although there is insufficient evidence to point to any one of these factors as a contributor to increasing the variability at which 97.5 percent coverage of the population occurs, when taken as a group of unknowns, it would be inappropriate to ignore the concern in considering the level of vitamin D commensurate with an RDA for this group.

For this reason, the level of uncertainty should be taken into account in specifying the RDA for vitamin D for persons more than 70 years of age. There are very few data relevant to adjusting for such uncertainty. There are no dose–response data that would allow comparison, for adults more than 70 years of age, of the effects of an intake of 600 IU of vitamin D per day with those of a higher intake such as 800 or 1,000 IU/day. Moreover, the evidence for fracture risk in relation to vitamin D intake for this older life stage is confounded by study protocols that do not allow separation of the effect of calcium from that of vitamin D; as discussed previously, there is reasonably compelling evidence that calcium alone in this age group can modestly reduce the risk of fracture. Therefore, it is not surprising that the inclusion of calcium with vitamin D treatment generally, albeit not consistently, reduces the risk of fractures among the oldest adults, especially when vitamin D nutriture is considered in the context of serum 25OHD concentrations (Tang et al., 2007; Avenell et al., 2009; AHRQ-Tufts). Even the 10 trials that examined vitamin D alone (Lips et al., 1996; Peacock et al., 2000; Meyer et al., 2002; Trivedi et al., 2003; Avenell et al., 2004; Harwood et al., 2004; Grant et al., 2005; Law et al., 2006; Lyons et al., 2007; Smith et al., 2007), when pooled by Avenell et al. (2009), showed no statistically significant effect on fracture risk. As shown in Table 5-5, which is focused on studies with subjects more than 70 years of age and on vitamin D intakes as opposed to serum 25OHD concentrations, such studies are generally non-significant for fracture risk on the basis of both vitamin D alone and vitamin D with calcium. The exception is Trivedi et al. (2003), which examined vitamin D supplementation and fracture risk in a population of men and women of average age 75 years. In any case, interpretation of these data is complicated by the unknowns surrounding the background intake of vitamin D over and above the supplemented dose.

The large study (n = 2,686) carried out by Trivedi et al. (2003) included more men than women (suggesting that the included population was actually at lower risk for fracture than would have been the case if the study had focused predominantly on women) and was longitudinal (5 years), including repeat measures on the same individuals. The amount of vitamin D used for treatment was the equivalent of 800 IU/day, although it was administered as a 100,000 IU dose every 4 months for the duration of the study. Although this dosing regimen may somewhat limit the applicability of the study for DRI purposes, the dose is not as large as the 500,000 IU once-yearly dose used by others (e.g., Sanders et al., 2010). Under these circumstances, the work of Trivedi et al. (2003) is helpful in taking uncertainty into account.

The reason not to dismiss the effect of 800 IU of vitamin D per day as an aberration—even in the face of data generally not supportive of an effect of vitamin D alone on fracture risk for the oldest adults, and despite the lack of dose–response data—is that persons more than 70 years of age are a very diverse group. This group is undergoing a number of physiological changes with aging that could have an impact on, and increase the variability around, an average requirement, particularly in light of the known and high variability of these physiological changes among aging individuals. If this is assumed to be the case, then the RDA for persons more than 70 years of age would likely be higher due to this variability. In addition, there is insufficient evidence to provide assurance that 600 IU/day of vitamin D is as effective as 800 IU/day. Comparing the projected RDA based on the simulation analysis (600 IU/day) with the available evidence indicating benefit at 800 IU of vitamin D per day, taking the uncertainties into account results in an RDA approximately one-third higher than the simulation analysis suggests. Overall, this is a small increase that is not known to increase the possibility of adverse events, while providing a certain level of caution for this particularly vulnerable and potentially frail segment of the population. This approach is predicated on caution in the face of uncertainties, and it is anticipated that newer data will help to clarify the uncertainties surrounding the level of intake of vitamin D that could be expected to cover 97.5 percent of persons over the age of 70 years.

The EAR of 400 IU/day and RDA of 800 IU/day for this life stage group, consistent with the DRIs for other life stage groups, assume minimal sun exposure.

A question in establishing an EAR and RDA for this life stage group is the relevance of vitamin D in affecting bone loss due to the onset of menopause. Men in this life stage group have not yet reached the levels of bone loss and fracture rates associated with aging as manifested in persons more than 70 years of age and, unlike their female counterparts, they are not experiencing significant bone loss due to menopause. However, a portion—in fact perhaps the majority—of women in this life stage group are likely to be experiencing some degree of bone loss due to menopause.

As discussed above for adults more than 70 years of age, the available data do not suggest that median requirements increase with aging, resulting in support for an EAR of 400 IU/day, the same as for younger adults. Likewise, the EAR for both women and men in the 51 through 70 year life stage group is set at 400 IU of vitamin D per day.

With respect to women 51 through 70 years of age, fracture risk is lower than it is later in life and, as such, the situation is not entirely congruent with that for adults more than 70 years of age. Further, findings for this age group are at best mixed, and are generally not supportive of an effect of vitamin D alone on bone health. Although the AHRQ analyses of studies using vitamin D alone found the results to be inconsistent for a relationship with reduction in fracture risk, more recent studies have trended toward no significant effects (Bunout et al., 2006; Burleigh et al., 2007; Lyons et al., 2007; Avenell et al., 2009b). For those studies showing benefit for BMD with a vitamin D and calcium combination, interpretation is confounded by the effects of calcium, especially because calcium alone appears to have at least a modest effect on BMD. The report from the WHI (Jackson et al., 2006), a very large randomized trial, has limited applicability to the question of the effect of vitamin D on bone health among women because of relatively high levels of calcium intake (baseline mean calcium intake of approximately 1,150 mg/day at randomization plus a 1,000 mg/day supplement) and the confounding due to hormone replacement therapy. Given these data, plus the inability to extrapolate to this life stage group the variability seen in the requirements of persons 70 or more years of age, the RDA for women 51 through 70 years of age is set at 600 IU of vitamin D per day, the same level as that for younger adults. With respect to men 51 through 70 years of age, there is also no basis to deviate from the RDA set for younger adults. The available evidence for men is extremely limited, and there are no data to suggest that bone health is enhanced by vitamin D intake among men in this life stage group. An RDA of 600 IU/day is established for these men.

The DRIs for these two life stage groups assume minimal sun exposure.

Pregnancy The EAR for non-pregnant women and adolescents is appropriate for pregnant women and adolescents based on: (1) AHRQ-Ottawa's finding of insufficient evidence on the association of serum 25OHD level with maternal BMD during pregnancy and (2) the one available RCT (Delvin et al., 1986) and 14 observational studies reviewed in Chapter 4 regarding vitamin D deficiency and genetic absence of the vitamin D receptor (VDR) or 1α-hydroxylase, which all demonstrate no effect of maternal 25OHD level on fetal calcium homeostasis or skeletal outcomes. Of the limited number (i.e., four) of observational studies that suggest an influence of maternal serum 25OHD levels on the offspring's skeletal outcomes later in life (so-called developmental programming), one study reports associations consistent with an EAR-type value of approximately 40 nmol/L below which negative fetal skeletal outcomes were reported (Viljakainen et al., 2010), and another reports an RDA-type value of 50 nmol/L late in gestation above which reduced skeletal BMC was not seen in offspring at 9 years of age (Javaid et al., 2006). In addition, development of the fetal skeleton without dependence on maternal vitamin D is biologically plausible, as indicated by studies in animal models in rats, mice, pigs, and sheep (see review in Chapter 3). Finally, there is no evidence that the vitamin D requirements of pregnant adolescents differ from those of non-pregnant adolescents.

The EAR is thus 400 IU of vitamin D per day for pregnant women and adolescents. Likewise, the RDA values for non-pregnant women and adolescents are applicable, providing an RDA of 600 IU/day for each group.

Lactation The EAR for non-lactating women and adolescents is appropriate for lactating women and adolescents based on evidence from RCTs (Rothberg et al., 1982; Ala-Houhala, 1985; Ala-Houhala et al., 1988; Kalkwarf et al., 1996; Hollis and Wagner, 2004; Basile et al., 2006; Wagner et al., 2006; Saadi et al., 2007), consistent with observational data (Cancela et al., 1986; Okonofua et al., 1987; Takeuchi et al., 1989; Kent et al., 1990; Alfaham et al., 1995; Sowers et al., 1998), showing that increased maternal vitamin D intakes increase maternal serum 25OHD levels but have no effect on the serum 25OHD levels of breast-fed infants unless the maternal intake of vitamin D is extremely high (i.e., 4,000 to 6,400 IU/day) (Wagner et al., 2006). Observational studies report no relationship between maternal serum 25OHD levels and BMD (Ghannam et al., 1999) or breast milk calcium content (Prentice et al., 1997). Also, there is no evidence that lactating adolescents require more vitamin D or higher serum 25OHD levels than non-lactating adolescents. The EAR is thus 400 IU of vitamin D per day for lactating women and adolescents. Likewise, the RDA values for non-lactating women and adolescents are applicable, providing an RDA of 600 IU/day for each group.
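For convenience, the reference values established in this section can be gathered into a single summary structure (a hypothetical aid reflecting only the values stated above; all values are IU of vitamin D per day and assume minimal sun exposure):

```python
# Vitamin D DRIs as established in this section (IU/day, minimal sun
# exposure assumed). Infants receive an AI rather than an EAR/RDA.
VITAMIN_D_DRIS = {
    "0 to 12 months":          {"AI": 400},
    "1 through 18 years":      {"EAR": 400, "RDA": 600},
    "19 through 50 years":     {"EAR": 400, "RDA": 600},
    "51 through 70 years":     {"EAR": 400, "RDA": 600},
    "more than 70 years":      {"EAR": 400, "RDA": 800},
    "pregnancy and lactation": {"EAR": 400, "RDA": 600},
}

print(VITAMIN_D_DRIS["more than 70 years"]["RDA"])  # -> 800
```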