A study of 641 pregnant women in Ireland found high rates of iron deficiency, particularly in the third trimester, even though the participants were generally healthy and lived in a high-resource setting.
In a recent study published in The American Journal of Clinical Nutrition, researchers examined changes in iron status among primiparous women in high-resource settings and proposed an early-pregnancy ferritin threshold that predicts iron deficiency in the final trimester.
Background
Iron deficiency is common among pregnant women because iron requirements rise to support fetal development and maternal health. Maternal iron deficiency can lead to adverse pregnancy outcomes such as postpartum depression, preterm birth, low birth weight, and small-for-gestational-age birth. It can also impair fetal iron accumulation, resulting in long-term neurodevelopmental effects and an early onset of iron deficiency after birth.
However, prospective, well-powered analyses of how iron status changes across pregnancy are limited. It remains unclear what degree of iron depletion in early pregnancy compromises iron status by the end of pregnancy, and a global consensus on pregnancy-specific thresholds for many iron biomarkers is lacking. The World Health Organization (WHO) defines iron deficiency as ferritin <15 μg/L, whereas recent United Kingdom guidelines suggest a threshold of <30 μg/L.
About the study
In the present prospective cohort study, researchers assessed changes in iron status during pregnancy and identified a ferritin cut-off at gestational week 15 that predicts iron deficiency at week 33 of gestation.
Researchers analyzed data from the IMproved PRegnancy Outcomes via Early Detection (IMPROvED) consortium, which recruited participants across Ireland, the United Kingdom, the Netherlands, and Sweden from November 2013 to August 2017. The study included primiparous females aged ≥16 years with singleton, low-risk pregnancies. Participants provided data on height, weight, smoking history, alcohol intake, and nutrient supplementation before and during early pregnancy, allowing the researchers to analyze the effects of these factors on iron status during pregnancy.
Participants provided serum samples at their initial antenatal appointment (between gestational weeks 11 and 13) and at weeks 15, 20, and 33 of gestation. Enzyme-linked immunosorbent assays (ELISA) measured serum iron and inflammatory biomarkers. Iron biomarkers included soluble transferrin receptor (sTfR), ferritin, and total body iron (TBI); inflammatory markers were α1-acid glycoprotein (AGP) and C-reactive protein (CRP). Ferritin <15 μg/L indicated iron deficiency.
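As a rough illustration (not the study's code), the sketch below shows how iron-deficiency status could be flagged from a ferritin value under the WHO cut-off (<15 μg/L) versus the UK guideline cut-off (<30 μg/L); the participant values are hypothetical.

```python
# Minimal sketch: flag iron deficiency from serum ferritin under two cut-offs
# discussed in the article. Values below are made up for illustration.

WHO_FERRITIN_CUTOFF = 15.0   # μg/L, WHO definition of iron deficiency
UK_FERRITIN_CUTOFF = 30.0    # μg/L, recent UK guideline definition

def iron_deficient(ferritin_ug_l: float, cutoff: float = WHO_FERRITIN_CUTOFF) -> bool:
    """Return True if the serum ferritin value falls below the chosen cut-off."""
    return ferritin_ug_l < cutoff

# Hypothetical ferritin values (μg/L) for one participant across study visits
visits = {"week 15": 42.0, "week 20": 21.0, "week 33": 11.0}
for visit, ferritin in visits.items():
    print(visit,
          "| WHO deficient:", iron_deficient(ferritin),
          "| UK deficient:", iron_deficient(ferritin, UK_FERRITIN_CUTOFF))
```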
Multivariable logistic regressions were used to estimate odds ratios (ORs). The study excluded women with anemia (hemoglobin <110 g/L) at the initial antenatal visit, women with moderate-to-severe hypertension at consultation (above 160/100 mm Hg), and women with comorbidities such as diabetes mellitus, systemic lupus erythematosus, renal disease, sickle cell disease, antiphospholipid syndrome, and human immunodeficiency virus (HIV)/acquired immunodeficiency syndrome (AIDS).
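The article names multivariable logistic regression but not its implementation; the following Python sketch, run on simulated data with illustrative variable names, shows how odds ratios of this kind are typically derived.

```python
# Minimal sketch of a multivariable logistic regression yielding odds ratios.
# Data and column names are simulated/illustrative, not the study's dataset.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 629
df = pd.DataFrame({
    "iron_deficient_wk33": rng.integers(0, 2, n),      # outcome: ferritin <15 μg/L at week 33
    "iron_supplement_early": rng.integers(0, 2, n),     # multivitamin containing iron
    "smoker_early_pregnancy": rng.integers(0, 2, n),
    "bmi": rng.normal(24, 4, n),
})

X = sm.add_constant(df[["iron_supplement_early", "smoker_early_pregnancy", "bmi"]])
model = sm.Logit(df["iron_deficient_wk33"], X).fit(disp=False)

# Exponentiated coefficients give odds ratios with 95% confidence intervals
odds_ratios = np.exp(model.params).rename("OR")
conf_int = np.exp(model.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})
print(pd.concat([odds_ratios, conf_int], axis=1))
```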
Results
Among 629 participants, 98% were Caucasian and 81% were Irish. Multivitamins were taken by 30% of participants before pregnancy and by 56% during early pregnancy, and over 73% of the multivitamins used contained 15 to 17 mg of iron. Iron deficiency prevalence increased across pregnancy: using the ferritin cut-off of <15 μg/L, prevalence was 4.5%, 14%, and 51% at gestational weeks 15, 20, and 33, respectively. Using the ferritin cut-off of <30 μg/L, the corresponding rates were 21%, 44%, and 84%, while applying the sTfR threshold of >4.4 mg/L yielded prevalence rates of 7.2%, 13%, and 61%.
At week 33, iron deficiency prevalence based on ferritin <15.0 μg/L was significantly lower than that based on sTfR >4.40 mg/L (51% vs. 61%; OR, 0.7). The TBI threshold of <0 mg/kg generated deficiency rates lower than those obtained using sTfR or ferritin. Ferritin below 60.0 μg/L at week 15 predicted iron deficiency at week 33 [area under the curve (AUC), 0.8].
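A cut-off with an AUC of 0.8 implies a receiver operating characteristic (ROC) analysis of week-15 ferritin against week-33 deficiency; the sketch below, run on simulated values, illustrates how such an evaluation works (it is not the authors' analysis).

```python
# Minimal sketch: evaluate an early-pregnancy ferritin cut-off as a predictor of
# week-33 iron deficiency using a ROC curve. All values are simulated.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(1)
n = 629
ferritin_wk15 = rng.lognormal(mean=3.6, sigma=0.6, size=n)   # μg/L at week 15
# Simulate week-33 deficiency as more likely when week-15 ferritin is low
p_deficient = 1 / (1 + np.exp(0.06 * (ferritin_wk15 - 60)))
deficient_wk33 = rng.binomial(1, p_deficient)

# Lower ferritin should predict deficiency, so score with the negated value
auc = roc_auc_score(deficient_wk33, -ferritin_wk15)
fpr, tpr, thresholds = roc_curve(deficient_wk33, -ferritin_wk15)

# Youden's J statistic picks the threshold that best separates the groups
best = np.argmax(tpr - fpr)
print(f"AUC: {auc:.2f}, candidate ferritin cut-off: {-thresholds[best]:.1f} μg/L")
```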
Iron supplements (as part of multivitamins) taken before or during early pregnancy reduced the risk of iron deficiency throughout gestation, including the final trimester (OR, 0.6). Researchers observed a trend toward lower ferritin in women who smoked in early pregnancy. Final-trimester iron deficiency rates were similar in women with CRP ≤5 mg/L and in women with CRP >5.0 but ≤10 mg/L and AGP ≤1 g/L. However, in women with CRP >10 mg/L, ferritin underestimated iron deficiency compared to sTfR. These findings indicate that the conventional inflammation threshold (CRP >5.0 mg/L) may be unsuitable during pregnancy.
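The inflammation logic described above can be summarized as a simple decision rule; the function below is an illustrative sketch using the cut-offs cited in the article, not the study's algorithm.

```python
# Illustrative decision rule (not the study's algorithm): when CRP exceeds
# 10 mg/L, ferritin may be falsely elevated and can mask deficiency, so the
# ferritin-based call is deferred to sTfR. Cut-offs are those cited above.
def iron_deficiency_call(ferritin_ug_l: float, stfr_mg_l: float, crp_mg_l: float) -> str:
    ferritin_deficient = ferritin_ug_l < 15.0
    stfr_deficient = stfr_mg_l > 4.4
    if crp_mg_l > 10.0:
        # High inflammation: rely on sTfR rather than ferritin
        return "deficient (sTfR)" if stfr_deficient else "not deficient (sTfR)"
    return "deficient (ferritin)" if ferritin_deficient else "not deficient (ferritin)"

# Hypothetical participant whose inflammation masks a low iron state
print(iron_deficiency_call(ferritin_ug_l=22.0, stfr_mg_l=5.1, crp_mg_l=14.0))
```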
Conclusion
The study showed that pregnancy substantially depletes maternal iron stores, even among high-resource, iron-supplemented populations.
Although the cohort was generally healthy and low-risk, four in five women had ferritin <30 μg/L by the third trimester. Early screening for iron deficiency during pregnancy is recommended, with a target serum ferritin above 60.0 μg/L in early pregnancy. Iron-containing supplements, typically multivitamins, can protect against iron deficiency throughout pregnancy.
In this study, inflammation rates were higher than expected for a healthy population. The impact of high inflammation rates during pregnancy on iron status needs further evaluation.