Let’s start by reviewing the Dietary Reference Intakes (DRIs) established by the Institute of Medicine.
1. The Estimated Average Requirement (EAR) is the daily intake value estimated to meet the requirement of half of the apparently healthy individuals in a life stage or gender group.
2. The Recommended Dietary Allowance (RDA) is the average daily intake level that is sufficient to meet the nutrient requirement of nearly all (97 to 98 percent) individuals in a life stage and gender group. The RDA is intended to be used as a goal for daily intake by individuals.
3. The Tolerable Upper Intake Level (UL) is the highest level of daily nutrient intake that is likely to pose no risk of adverse health effects for almost all individuals in the specified life stage group. The UL applies to chronic daily use.
For men ages 19 and above, the EAR for iron is 6 mg, and for women ages 19 to 50 it is 8.1 mg. This means, generally, that iron consumption of 6 mg per day for men and 8.1 mg per day for women would leave half of each group with a deficient iron status. To determine the RDA, the Institute of Medicine adds two standard deviations of the requirement to the EAR. Thankfully, this is not a review of statistics, but the standard deviation is a measure of how much each individual data point varies from the average, or mean – a plus-or-minus value. The Institute of Medicine assumes the requirement follows a normal distribution, so about 95 percent of a population falls within two standard deviations (2 times the plus-or-minus value) of the mean. Setting the RDA at two standard deviations above the EAR therefore covers everyone up to that cutoff – the 95 percent in the middle plus the roughly 2.5 percent below it, which is where the 97 to 98 percent figure comes from – but I digress.
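The EAR-plus-two-standard-deviations rule above can be sketched in a few lines of Python. The standard-deviation value below is a made-up placeholder for illustration (the Institute of Medicine actually models the iron requirement with a skewed distribution, so this simple formula does not reproduce the published iron RDAs); the point is the coverage arithmetic under the normal-distribution assumption.

```python
from math import erf, sqrt

def rda_from_ear(ear_mg, sd_mg):
    """General DRI rule of thumb: RDA = EAR + 2 standard deviations
    of the requirement distribution."""
    return ear_mg + 2 * sd_mg

def normal_coverage(z):
    """Fraction of a normal distribution falling below z standard
    deviations above the mean (the standard normal CDF)."""
    return 0.5 * (1 + erf(z / sqrt(2)))

ear_men = 6.0          # mg/day, EAR for men 19+ (from the text)
sd_guess = 1.0         # mg/day, placeholder SD, NOT an IOM value

print(rda_from_ear(ear_men, sd_guess))   # 8.0 mg/day with this assumed SD
print(normal_coverage(2))                # ~0.977, i.e. about 97-98% covered
```

Note that the 95 percent figure describes the band within plus-or-minus two standard deviations; a cutoff at plus two standard deviations covers about 97.7 percent, because the 2.5 percent tail below minus two standard deviations is also included.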
The RDA for iron for men ages 19 and above is 8 mg, and for women ages 19 to 50 it is 18 mg. For men, that is not much more than the level that meets the needs of only half the group, so the RDA clearly sits at the low end of consumption. The weakness of the RDA is that this daily intake goal is so low. The large difference between men and women in iron recommendations appears only at this low end. At the high end of consumption, the UL for iron for both men and women ages 19 and above is 45 mg per day. Again, the UL is not a maximum; it is the intake level that is likely to pose no risk of adverse health effects.
The RDA represents 18% of the UL for men and 40% of the UL for women, so there is a large safety margin for supplemental iron. The fact that 45 mg a day, chronically consumed, is not expected to produce an adverse effect suggests that our fears over a few milligrams of supplemental iron are potentially unfounded. Even if we were to treat the UL as a maximum, the midpoint between the RDA and the UL – 26.5 mg per day for men and 31.5 mg per day for women – would still lie in the zero-risk range.
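The arithmetic above can be checked in a few lines; all values come straight from the text, and nothing else is assumed.

```python
# RDA values (mg/day) from the text, and the shared UL of 45 mg/day.
rda = {"men 19+": 8, "women 19-50": 18}
ul = 45

for group, r in rda.items():
    share = r / ul * 100        # RDA as a percentage of the UL
    midpoint = (r + ul) / 2     # halfway between RDA and UL
    print(f"{group}: RDA is {share:.0f}% of the UL; midpoint {midpoint} mg/day")
```

Running this reproduces the figures in the paragraph: 18% and 26.5 mg for men, 40% and 31.5 mg for women.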
According to the Community Nutrition Mapping Project hosted by the Agricultural Research Service of the United States Department of Agriculture, only 37% of children aged one through eight years and males aged nine through 18 years consumed diets providing at least 100% of the RDA for iron. Only 20% of females aged nine through 64 years consumed at least 100% of the RDA for iron, suggesting that supplemental iron is critical for postmenopausal women.
Understanding the relationships among these intake values should help you see that there is considerable safety in iron consumption, especially in adult males and postmenopausal women.