Anthropogenic carbon dioxide emissions may increase the risk of global iron deficiency

Certain cereal grains and other crop plants have been shown to have lower iron concentrations when grown under elevated CO2. This study by researchers from Massachusetts, USA, examined diets from 152 countries to investigate which groups of people might be most at risk of iron deficiency as a result of increasing CO2 emissions, on the basis of current dietary composition, the current global prevalence of iron deficiency, and projected CO2 emissions up to the year 2050.

The study has three main aims: 1) to determine the quantity and source of dietary iron supply by country, 2) to quantify the predicted decrease in iron consumption per capita by country, and 3) to identify (based on the above and on current anaemia prevalence) the countries most vulnerable to iron losses. The ultimate goal of the study is to “identify high-risk regions where high prevalence of contemporary iron deficiency overlaps with a large vulnerability to the eCO2 effect”. The authors make the reasonable assumption that due to the hidden nature of the effect, consumers would be unlikely to have made corrective adjustments to their diets to counteract falling iron content.

The authors begin by outlining the consequences and impacts of iron deficiency, the factors affecting the iron content of food plants (the main dietary source of iron for most of the world’s population), and the evidence that food crops grown at elevated CO2 (eCO2) of 550 ppm show iron content reduced by 4–10%.

To answer their research questions the researchers integrated food supply data from multiple sources to estimate the source and quantity of dietary iron intake in 152 countries, in particular for the groups identified as most vulnerable to the effects of iron deficiency: children aged 1-5 and women of childbearing age (15-49). They used data from the elevated CO2 studies (Myers, 2014, 2015) and some straightforward statistical modelling (weighted averages and distribution analysis) to estimate the likely changes to iron content in food portions under eCO2.

They then integrated this information on changes to dietary intake with data on current dietary iron supply and current global prevalence of anaemia (the best available proxy for global prevalence of iron deficiency, although anaemia is not caused solely by iron deficiency, and not all iron deficiency manifests as anaemia), to categorise the countries by risk on the basis of two criteria. Countries were first classified by their current anaemia rates as either “moderate/severe” (>20% of the population anaemic) or “mild” (<20% anaemic). Secondly, they were classified into one of three categories (statistically determined “tertiles”) based on the projected dietary iron losses.

The final classification of a country’s vulnerability to eCO2-related iron losses was made by combining these classifications thus: countries with moderate/severe anaemia and in the highest (worst) tertile for iron losses were classed as high risk; high anaemia, mid-tertile iron-loss countries were classed as moderate risk; low anaemia, mid-tertile iron-loss countries as low risk; and the risk of low-tertile iron-loss countries (regardless of current anaemia levels) was defined as “little to none”.
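The two-step classification above can be sketched as a simple decision rule. This is an illustrative reconstruction only: the function name, threshold handling, and the treatment of the one combination the summary does not spell out (mild anaemia with highest-tertile losses) are assumptions, not the authors’ code.

```python
def classify_risk(anaemia_prevalence: float, loss_tertile: int) -> str:
    """Combine current anaemia prevalence (% of population) with the
    projected iron-loss tertile (1 = smallest losses, 3 = largest losses)
    into a vulnerability category, following the scheme described above."""
    if loss_tertile == 1:
        # Lowest-tertile losses: "little to none", regardless of anaemia.
        return "little to none"
    severe = anaemia_prevalence > 20.0  # "moderate/severe" anaemia threshold
    if loss_tertile == 3 and severe:
        return "high"
    if loss_tertile == 2:
        return "moderate" if severe else "low"
    # Mild anaemia with highest-tertile losses is not specified in the
    # summary; "moderate" is a placeholder assumption here.
    return "moderate"

# Invented example values, for illustration only:
print(classify_risk(35.0, 3))  # a high-anaemia, highest-tertile country
print(classify_risk(10.0, 2))  # a mild-anaemia, mid-tertile country
```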

The key findings of the paper were:

  • 57% of children aged 1-5 (334 million) and 60% of all childbearing-aged women (1.06 billion) live in countries at high risk from eCO2-related iron losses (primarily in South and East Asia, North and East Africa).
  • Across all countries, eCO2-related dietary iron losses ranged from 1.5% to 5.5%.
  • The largest losses were located in India, the Middle East and North Africa, with the heterogeneity in risk being attributed to the differences in the prevalent sources of dietary iron in each country: countries in which the majority of dietary iron is derived from cereal grains and flours would suffer the highest iron losses under eCO2, while countries deriving most of their iron from animal sources (generally wealthier and less anaemic countries in North America, Europe and Australasia) or from plants that do not appear to be affected by the eCO2 effect (e.g. some in sub-Saharan Africa and South America) would suffer the lowest eCO2-related iron losses.
  • In countries with a low or moderate risk from eCO2-related iron losses, individuals consuming vegetarian diets are at a higher risk than the national average.
  • It is apparent from the data that, as with so many effects of CO2 emissions and climate change, the populations most vulnerable to eCO2-related iron losses are broadly those in the lowest-income countries, which have the fewest options and resources for adapting to these effects.

While the authors outline in some detail the limitations of their study (including that they have not been able to link projected iron losses to actual increases in iron deficiency, and the assumption that diets will not change despite projected increases in meat consumption globally), and the further analysis that would be desirable, the findings of this paper are undoubtedly concerning. The authors propose several solutions for high-risk countries: firstly, technology-driven solutions (e.g. governmental or industrial food fortification programmes); secondly, wider nutrition-based education programmes to enable more informed dietary choices by consumers; and thirdly (or rather, simultaneously with the other two), better treatment of diseases such as malaria and parasitic infections, which limit iron absorption. Ultimately, however, the authors present their findings as further motivation to mitigate CO2 emissions.


Iron deficiency reduces capacity for physical activity, lowers IQ, and increases maternal and child mortality, impacting roughly a billion people worldwide. Recent studies have shown that certain highly consumed crops—C3 grains (e.g., wheat, rice, and barley), legumes, and maize—have lower iron concentrations of 4–10% when grown under increased atmospheric CO2 concentrations (550 ppm). We examined diets in 152 countries globally (95.5% of the population) to estimate the percentage of lost dietary iron resulting from anthropogenic CO2 emissions between now and 2050, specifically among vulnerable age-sex groups: children (1–5 years) and women of childbearing age (15–49 years), holding diets constant. We also cross-referenced these with the current prevalence of anemia to identify most at-risk countries. We found that 1.4 billion children aged 1–5 and women of childbearing age (59% of global total for these groups) live in high-risk countries, where the prevalence of anemia exceeds 20% and modeled loss in dietary iron would be in the most severe tertile (>3.8%). The countries with the highest anemia prevalence also derive their iron from the fewest number of foods, even after excluding countries consuming large amounts of unaccounted wild-harvest foods. The potential risk of increased iron deficiency adds greater incentive for mitigating anthropogenic CO2 emissions and highlights the need to address anticipated health impacts via improved health delivery systems, dietary behavioral changes, or agricultural innovation. Because these are effects on content rather than yield, it is unlikely that consumers will perceive this health threat and adapt to it without education.


Smith, M. R., Golden, C.D. and Myers, S.S. (2017). Potential rise in iron deficiency due to future anthropogenic carbon dioxide emissions, GeoHealth, 1, 248–257

Read the full paper here.

See also the FCRN summaries of these related papers:

Constraints to nitrogen acquisition of terrestrial plants under elevated CO2

Increasing CO2 threatens human nutrition


Chapter 7 of our online resource Foodsource on the connection between food and health.

11 Oct 2017
Photo: Flickr, Prelude 2000, Iron Kettle, Creative Commons License 2.0