Rationale
Adequate dietary iron intake during early life is crucial for several developmental processes, particularly in the brain. Exclusively breastfed term infants have access to sufficient levels of iron for the first 4–6 months of life, during which 20–40% (~50–100 mg) of total body iron at birth is recycled from excess haemoglobin (Hb) and stored for later use [1], with an additional ~0.15 mg per day absorbed from breastmilk, which contains around 0.5 mg/L of iron [2]. Foetal iron demand is highest during the third trimester [3]; thus, pre-term and low birthweight infants are generally considered at higher risk of neurological deficits arising from iron deficiency and routinely receive supplemental iron during the neonatal period [4].
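As a rough consistency check of these figures (assuming, for illustration only, a typical breastmilk intake of ~0.8 L/day and a fractional iron absorption of ~40%; neither value is drawn from the sources cited above):

0.5 mg/L × 0.8 L/day × 0.40 ≈ 0.16 mg/day,

which accords with the ~0.15 mg/day absorption estimate.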
The final 18 months of the first 1000 days (conception to 2 years of age) represent a critical window of nutrition-dependent development, primarily in the central nervous system [5]. Iron supports myelination, the establishment and consolidation of neurotransmitter pathways and the substantial metabolic needs of rapidly proliferating neural networks [6]. Insufficient supply of iron during this period has marked and well-characterised effects on neurodevelopment [7], including weaker cognitive, motor and social development compared to iron-replete infants [8]. By comparison, relatively little is known about the effects of chronic dietary iron overexposure during this period, though emerging concerns regarding delayed neurotoxic effects of iron-mediated oxidative stress have sparked new debate among biochemists, nutritionists and paediatricians about appropriate intake during this critical window [9–11].
Historically, most high-income countries implemented both targeted food fortification programs, such as the addition of inorganic iron to infant formula [12], and broader open-market fortification of staple cereal products [13], to address the high prevalence of iron deficiency (ID) and iron deficiency anaemia (IDA) in both infants and the wider population during the mid-twentieth century. By 2011, the proportion of children aged 6–59 months meeting the diagnostic criterion for IDA (Hb < 110 g/L [14]) in participating high-income countries had dropped to < 15% [15]. Aggressive infant formula fortification regulations have likely contributed to this decrease: the American Academy of Pediatrics (AAP) Committee on Nutrition (CoN) has supported the use of formula containing 10–12 mg/L of iron for over three decades [16]. So too has the increased availability of complementary foods, including iron-fortified infant cereals and staple foods [10]. The vast majority of infants and toddlers in the USA already receive adequate nutrient intake through diet alone [17]. In high-income European countries, guidelines for formula fortification are more conservative: non-breastfed term infants are advised to consume preparations containing 4–8 mg/L of iron up to 6 months of age, with no set levels for children aged 6–24 months, reflecting a lack of evidence supporting optimal iron concentrations and acknowledged concerns regarding long-term adverse outcomes arising from excessive brain iron levels [18]. Both North American and European guidelines are substantially more hawkish than those issued by the World Health Organization (WHO) in 2016, which apply to all countries regardless of gross domestic product. The WHO recommends daily oral supplementation for children aged 6–23 months with 10–12.5 mg of elemental iron for no longer than three consecutive months a year, and only in areas where IDA prevalence is > 40% and malaria is not endemic [19].
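To put these regimes on a common footing (again assuming, for illustration only, a formula intake of ~0.8 L/day, a figure not taken from the cited guidelines): AAP-endorsed formula at 10–12 mg/L delivers roughly 10 × 0.8 ≈ 8 to 12 × 0.8 ≈ 9.6 mg of iron per day; European-recommended formula at 4–8 mg/L delivers roughly 3.2–6.4 mg/day; and the WHO regimen delivers 10–12.5 mg/day directly, but only for up to three consecutive months a year in high-prevalence settings.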
By contrast, the AAP CoN also recommends augmenting intake with direct supplementation to 12 months, followed by the introduction of multivitamin preparations to 36 months 'if iron needs are not being met' [16]. Herein lies a major outstanding question: while daily iron intake from complementary foods is substantially more variable than regular formula consumption, does the consumption of iron-rich foods, particularly those containing highly bioavailable haem, raise average intake to the point where fortified formula and/or oral supplementation becomes unnecessary? Predictive modelling of a scenario in the USA in which legislation requiring open-market fortification of staple foods is abolished suggests a modest increase in IDA prevalence in young children, though still well below 10% [20]. Screening for IDA and IDA risk (i.e. altered haematological markers indicative of ID without anaemia) every 12 months is recommended by the AAP to identify infants in need of adjuvant supplementation [16]. We argue that this approach has limited clinical utility, as there is no global consensus regarding cut-off levels for the non-Hb markers used to identify ID [21]. Twelve-month screening intervals provide limited information on time trends that would indicate iron stores are being depleted, and ID itself is asymptomatic and may not be pathological if markers remain stable over a set period. Concerning the latter point, an interesting theory proposed by Quinn [22], and supported by recent trials in Kenya [23, 24], where bacterial gastroenteritis is endemic, posits that ID during the 6–24-month critical window is an evolutionary mechanism that limits the proliferation of pathogenic iron-dependent gut bacteria.
There is no question that adequate dietary iron intake is critical for normal neurodevelopment and overall healthy growth, though reassessment of long-standing policies promoting potentially excessive intake in high-income settings is well overdue. A recent systematic review and meta-analysis of 35 trials involving over 40,000 infants aged 4 to 23 months lacked sufficient statistical power to identify any neurodevelopmental benefit of daily iron supplementation [25], and the longest prospective cohort study assessed to date (n = 437; 43% attrition) reported that children fed 'low'-iron (2.3 mg/L) formula from 6 to 12 months outperformed those receiving AAP CoN-comparable 12.7 mg/L fortified preparations on all measures of neurodevelopment at 10 years, including spatial memory, visual-motor integration and intellectual ability [26]. Although far from conclusive, these findings raise some concerns about potential delayed adverse long-term health outcomes beyond those characterised in IDA. This systematic review and meta-analysis will examine the effects of supplementary iron intake, via direct supplementation or food fortification, on haematological indices of iron stores, growth, neurodevelopment and adverse health effects in non-IDA infants. The results will help to determine whether more research focussing on iron-replete infants in contemporary high-income settings is needed to inform revised nutritional guidelines. The ultimate goal is to establish ideal intake levels that sustain infant IDA prevalence at its current low rate while minimising any potential long-term risks arising from overexposure to iron during this critical window of neurodevelopment.