Am Fam Physician. 2002;66(7):1217-1225
The prevalence of nutritional iron deficiency anemia in infants and toddlers has declined dramatically since 1960. However, satisfaction with this achievement must be tempered because iron deficiency anemia in infants and toddlers is associated with long-lasting diminished mental, motor, and behavioral functioning. Additionally, the prevalence of iron deficiency anemia in one- to three-year-old children appears to be increasing. The exact relationship between iron deficiency anemia and these developmental effects is not well understood, but the effects do not occur until iron deficiency becomes severe and chronic enough to produce anemia. At that point, treatment with iron can reverse the anemia and restore iron sufficiency, yet the poorer developmental functioning appears to persist. Therefore, intervention should focus on the primary prevention of iron deficiency. In the first year of life, measures to prevent iron deficiency include completely avoiding cow's milk, starting iron supplementation at four to six months of age in breastfed infants, and using iron-fortified formula when not breastfeeding. Low-iron formula should not be used. In the second year of life, iron deficiency can be prevented by providing a diversified diet rich in sources of iron and vitamin C, limiting cow's milk consumption to less than 24 oz per day, and giving a daily iron-containing vitamin. All infants and toddlers who did not receive primary prevention should be screened for iron deficiency. Screening is performed at nine to 12 months of age, six months later, and at 24 months of age. The hemoglobin/hematocrit level alone detects only patients whose iron deficiency is severe enough to cause anemia. Screening by erythrocyte protoporphyrin level or red-cell distribution width identifies earlier stages of iron deficiency. A positive screening test is an indication for a therapeutic trial of iron, which remains the definitive method of establishing a diagnosis of iron deficiency.
A high prevalence of iron deficiency anemia in U.S. infants was first widely noted in the 1930s. Thirty years later, when the prevalence rate had not dropped,1 iron deficiency began to be seen as a significant public health problem. Because nutritional factors were the cause in the vast majority of these cases, iron deficiency anemia began to be referred to as nutritional anemia.1 It soon became standard to screen all infants between nine and 12 months of age for iron deficiency by screening for anemia. Anemia became the commonly employed marker for iron deficiency, and the hematocrit level became the screening test.
This universal screening strategy, in conjunction with an increase in the popularity of breastfeeding, iron fortification of infant formulas and cereals, the start of the Special Supplemental Food Program for Women, Infants, and Children (WIC) in 1972, and education of physicians and the public, was extremely successful. By the mid-1980s, a dramatic decline in the prevalence of iron deficiency anemia was noted across the socioeconomic spectrum throughout the United States.2 In 1971, 23 percent of nine- to 36-month-old children in an inner-city clinic in New Haven, Conn., had a hemoglobin level below 9.8 g per dL (98 g per L).1 By 1984, the rate in the same clinic had dropped to 1 percent.3 A study of middle-class nine- to 23-month-old infants in a private practice in Minneapolis found that a 7.6 percent prevalence of anemia between 1969 and 1973 decreased to 2.8 percent between 1982 and 1986.4
This achievement in the reduction of iron deficiency anemia in children has been considered a major success story. By 1987, leaders in the field suggested that routine screening for iron deficiency was no longer indicated except in known high-risk populations and should be replaced by selective screening based on the individual patient's risk for iron deficiency.5 It appeared that although the medical community needed to remain vigilant and identify pockets of continued elevated prevalence, iron deficiency anemia was no longer the public health threat that it had been. Unfortunately, iron deficiency anemia is still with us, and its developmental effects appear to be long-lasting.
Mental, Motor, and Behavior Effects
The association between iron deficiency anemia and diminished mental, motor, and behavioral development in infants is not a recent discovery. A possible link was noted in the late 1970s,6 and subsequent studies of 12- to 23-month-old infants in the past two decades confirmed those findings.7–10 By the 1990s, the association between iron deficiency anemia and lower developmental test scores was well-established but may not have received the expected amount of attention in the United States because of the shrinking prevalence of iron deficiency anemia. In recent years, it has become clear that these effects are long-lasting despite correction of the iron deficiency anemia.
Mental, motor, and behavior effects develop only when iron deficiency is severe enough to cause anemia.7,8 In studies using Bayley Scales of Infant Development,7 infants with iron deficiency anemia receive lower scores on mental and motor tests, including gross and fine motor coordination,11 and demonstrate affective differences, such as wariness, fearfulness, and unhappiness.9 These findings have been confirmed by a variety of studies in different cultural settings.12 Further study of the behavior component found activity differences, with the anemic infants being less playful, tiring more easily, and preferring to be held.10 These mental and motor effects are not detectable on routine physical examination; it is not known if the behavior changes are noticeable.
Treatment with iron, with subsequent complete resolution of the anemia and the iron deficiency, does not correct all of the behavior effects.10 Furthermore, the lower mental and motor test scores associated with iron deficiency anemia persist.7–9 In the longest trial to date, children reevaluated at 11 to 14 years of age demonstrated functional impairment in school despite complete correction of the iron deficiency anemia they had as infants.13
These children were more likely to have repeated a grade, to have reduced arithmetic achievement and written expression, and to show differences in motor function, spatial memory, and selective recall. In addition, their behavior was more likely to be characterized as problematic by parents and teachers.
Iron Deficiency Anemia After the First Year of Life
Historically, the prevention of iron deficiency anemia has focused on the first 12 months of life. It appears that toddlers deserve the same degree of attention because of the risk of developmental effects from iron deficiency anemia and because the prevalence of iron deficiency anemia between one and three years of age may be greater than was formerly thought.
Two large-scale studies, the Third National Health and Nutrition Examination Survey (NHANES III)14 and the Third Report on Nutrition Monitoring in the United States (1988–1991),15 reported the prevalence of iron deficiency anemia in one- to two-year-olds to be 3 percent, and in one- to three-year-olds to be 15 percent.
A more recent study, conducted in an urban setting with an equal mix of lower and middle socioeconomic groups, noted that 10 percent of one- to three-year-olds had iron deficiency anemia.15 Severe cases of iron deficiency anemia (hemoglobin level less than 6 g per dL [60 g per L]) have been reported in this age range as well.16 In a longitudinal study of toddlers, 12-month-olds were noted to be receiving nearly 100 percent of the recommended daily allowance (RDA) for iron, but by 18 months of age, the intake of iron had declined to a level well below the recommended amount.17
These findings might have been anticipated because one- to three-year-olds have the lowest daily iron intake of any age group across the lifespan.15 At one year, breastfeeding or iron-fortified formula is often replaced with cow's milk, non–iron-fortified cereals enter the diet, and juices reduce the child's appetite for solid food.
Primary Prevention
The primary prevention of iron deficiency anemia in infants and toddlers hinges on healthy feeding practices. In infants, the introduction of cow's milk in the first year of life is the greatest dietary risk factor for the development of iron deficiency and iron deficiency anemia.18–20 Cow's milk is low in iron, and its iron is poorly absorbed.21 In addition, it decreases the absorption of iron from other dietary sources.21 Therefore, the strict avoidance of cow's milk in the first 12 months of life is essential in preventing iron deficiency anemia.
Breastfeeding is the ideal feeding practice for many well-documented reasons, including lowering the risk of iron deficiency anemia. Although breast milk is low in iron content, about 50 percent of the iron is bioavailable to the infant.12 Yet, exclusive breastfeeding after four to six months puts infants at risk for iron deficiency. Therefore, some form of dietary iron supplement that provides 1 mg elemental iron per kg per day is recommended for term infants starting at four22,23 to six12,20 months of age. Iron-fortified cereal can help meet this requirement24; however, many cereal-fed infants still develop iron deficiency anemia.25
To prevent iron deficiency, another option is a daily oral iron supplement, using ferrous sulfate drops26 or infant vitamin drops with iron. Vitamin drops contain 10 mg of elemental iron per dropper, which is the RDA for children six months to six years of age.26,27 Iron supplementation via drops or iron-fortified cereal should be continued throughout the period of breastfeeding. Breastfed preterm and low-birth-weight infants require supplementation at a dosage of 2 mg of oral elemental iron per kg per day, starting at two to four weeks of age.26 Infants weighing less than 1,500 g (3 lb, 4 oz) need higher dosages (3 mg per kg per day for 1,000 g [2 lb, 3 oz] to 1,500 g and 4 mg per kg per day for less than 1,000 g).12 All supplemental iron preparations, especially those for adults, should be stored out of the reach of children to prevent fatal poisonings.
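To make the per-kilogram arithmetic above concrete, the following minimal sketch selects a daily supplemental elemental iron dose from the dosages quoted in the text. The function name and parameters are hypothetical, and the sketch is illustrative arithmetic only, not a clinical decision tool.

```python
def daily_elemental_iron_mg(weight_kg: float,
                            birth_weight_g: float,
                            preterm_or_lbw: bool) -> float:
    """Illustrative daily supplemental elemental iron dose for a
    breastfed infant, using the per-kg dosages quoted above."""
    if birth_weight_g < 1000:
        mg_per_kg = 4.0   # infants weighing less than 1,000 g: 4 mg per kg per day
    elif birth_weight_g < 1500:
        mg_per_kg = 3.0   # 1,000 g to 1,500 g: 3 mg per kg per day
    elif preterm_or_lbw:
        mg_per_kg = 2.0   # preterm or low birth weight: 2 mg per kg per day
    else:
        mg_per_kg = 1.0   # term infants: 1 mg per kg per day from four to six months
    return mg_per_kg * weight_kg

# Example: a hypothetical 6-kg term infant would need about 6 mg of
# elemental iron per day from drops or iron-fortified cereal.
```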
Infants started on formula at birth and those switched from breast milk to formula should receive iron-fortified formula.28 Term and preterm infants (weighing more than 1,000 g) who are fed iron-fortified formulas are able to maintain iron sufficiency without additional iron supplementation.26,29 Vitamins given to these infants should not contain iron.
Low-iron formulas (less than 6.7 mg per L of iron) place infants at risk for iron deficiency anemia while offering no advantage over standard iron-fortified formulas with respect to gastrointestinal side effects.30 Controlled31 and double-blinded crossover32 trials show no difference between low- and standard-iron formulas in the frequency of fussiness, cramping, colic, regurgitation, flatus, or stool characteristics (except a darker color with standard iron-fortified formulas). Moreover, iron given at higher dosages to treat known iron deficiency anemia in 12-month-old infants caused no more gastrointestinal side effects than placebo.33
In the second year of life, cow's milk continues to cause problems in maintaining iron stores, and its consumption should be limited to less than 24 oz per day,34 with some clinicians calling for a stricter limit of 16 oz per day. Mothers who wish to continue breastfeeding after 12 months of age should be encouraged to do so, and iron supplementation should be maintained in some form. If breastfeeding is stopped before 24 months, a recent suggestion has been to substitute iron-fortified formula for cow's milk because of the negative effects of cow's milk on iron status.22 This may not be practical for many parents.
Other preventive measures for toddlers include encouraging a diversified diet rich in sources of iron and vitamin C, continuing use of cereals fortified with iron instead of more advertised cereals, avoiding excessive juice intake, and giving an iron-containing vitamin.15
Secondary Prevention
SCREENING
Infants with one or more risk factors (Table 1) should be screened for iron deficiency. Of these risks, the introduction of cow's milk in the first year of life is the most potent dietary factor for the development of iron deficiency.20 Poverty also significantly increases the risk for iron deficiency anemia, leading to the recommendation for continued routine screening of all infants from lower socioeconomic backgrounds.24 Forgoing screening might be considered only when it is certain that an infant has received primary prevention.
Table 1. Risk Factors for Iron Deficiency in Infants

Diet
- Cow's milk ingestion
- Low-iron formula
- Breastfeeding without iron supplementation

Prenatal/perinatal
- Anemia during pregnancy
- Poorly controlled diabetes
- Low birth weight
- Prematurity
- Multiple gestation

Socioeconomic
- Low socioeconomic background
- Recent immigration from a developing country

Other
- Qualified for but not receiving WIC assistance
- Rate of weight gain greater than average
After 12 months, any toddler who was at risk as an infant but not screened needs to be tested at that time for iron deficiency. Other toddlers at risk (e.g., past history of iron deficiency anemia, cow's milk consumption of more than 24 oz per day, diet low in iron and vitamin C, or recent immigration from a developing country) should be screened between 15 and 18 months and at 24 months.34 A positive screening test requires confirmation with a therapeutic trial of iron. A negative screen provides an opportunity to intervene with primary prevention.
The ideal screening test would be capable of identifying iron deficiency in the absence of anemia. This would allow for the treatment of iron deficiency in the pre-anemic stage, preventing iron deficiency anemia and its associated mental, motor, and behavior effects. No such test is widely used at this time. The standard test has been the hemoglobin (or hematocrit) level, which leads to the diagnosis only if the iron deficiency is severe enough to cause anemia. This approach has been called into question because the developmental consequences of iron deficiency anemia suggest that identification of iron deficiency before anemia would be preferable.35
The serum ferritin level, transferrin saturation, and erythrocyte protoporphyrin level also can be used in the diagnosis of iron deficiency (Figure 1).36 Of these, the erythrocyte protoporphyrin measurement has the advantages of lower cost and office-based availability. Clinical studies have demonstrated its effectiveness as a screening tool.37,38 While an elevated erythrocyte protoporphyrin level is not as specific for iron deficiency as other markers, the decline in the prevalence and severity of lead toxicity makes an elevated erythrocyte protoporphyrin level a likely positive screen for iron deficiency.
Practices with a large number of infant and toddler patients at risk for iron deficiency or a high prevalence of iron deficiency anemia may find it helpful to invest in an office hematofluorimeter to measure erythrocyte protoporphyrin. As a screening test, it will miss some cases of iron deficiency even in the presence of anemia, making the combination of erythrocyte protoporphyrin and hemoglobin measurement a more effective screening strategy.
If erythrocyte protoporphyrin measurement is not an option, obtaining a red-cell distribution width (RDW) with the hemoglobin measurement could be a consideration.14 An elevated RDW is believed to be an early indicator of iron deficiency and might prompt a therapeutic trial of iron (Figure 2) to confirm the diagnosis.12 This is an attractive approach because the complete blood count (CBC) with red blood cell indexes alone could be used to screen for iron deficiency and iron deficiency anemia. However, this use of RDW is not currently standard practice, and cut-off values for RDW are instrument-specific and must be known by the ordering clinician.
If obtaining a CBC or an erythrocyte protoporphyrin level is impractical, screening solely with hemoglobin should not be abandoned. It is better to discover a patient who has developed iron deficiency anemia than to miss the diagnosis, as severity and chronicity of the condition may worsen the outcome. Another reason to keep hemoglobin as part of the screening strategy is that a baseline hemoglobin level is ultimately necessary in the confirmation of the diagnosis of iron deficiency.39 When the erythrocyte protoporphyrin level is elevated or the hemoglobin is low (less than 11 g per dL [110 g per L]), a therapeutic trial of oral iron is the gold standard to establish the diagnosis of iron deficiency.40 Black infants normally have slightly lower hemoglobin levels, and a cutoff of 10.7 g per dL (107 g per L) defines anemia in this population.26
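The screening logic just described can be summarized in a short sketch. The hemoglobin cutoffs are the ones quoted above; the function and its parameters are hypothetical, and the judgment of whether the erythrocyte protoporphyrin level is elevated is left to the laboratory's reference range.

```python
def positive_screen(hemoglobin_g_per_dl: float,
                    ep_elevated: bool,
                    black_infant: bool = False) -> bool:
    """Positive screen for iron deficiency per the cutoffs quoted above:
    an elevated erythrocyte protoporphyrin (EP) level or a hemoglobin
    below the anemia cutoff indicates a therapeutic trial of iron."""
    anemia_cutoff = 10.7 if black_infant else 11.0  # g per dL
    return ep_elevated or hemoglobin_g_per_dl < anemia_cutoff
```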
A therapeutic trial of iron is the preferred approach to diagnosing iron deficiency because it is more reliable and less expensive than obtaining an iron panel.41,42 In children, a recent infection can transiently depress the hemoglobin level.43,44 Therefore, testing should be delayed in an infant or toddler who has had an infection within the previous two weeks.26 If the therapeutic trial of iron is negative, a work-up for the etiology of the anemia is indicated.
Other hematopoietic markers are being evaluated for their potential to simultaneously screen for and diagnose iron deficiency in infants and toddlers. The serum circulating transferrin receptor assay is a relatively new test, and the most recent test of iron status to be suggested is reticulocyte hemoglobin content.45 Neither modality is widely available, and both need more clinical study.
TREATMENT
After a positive screening test for iron deficiency and a diagnosis confirmed by a therapeutic trial of iron, the infant or toddler should complete a course of iron therapy. Elemental iron, at a dosage of 3 mg per kg, is given orally (usually as ferrous sulfate syrup, which is 20 percent elemental iron) once daily before breakfast.21,26 Absorption is improved if it is ingested with a source of vitamin C, such as orange juice. Total length of treatment is three months, including the one-month therapeutic trial of iron.26
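As a worked example of this dosing arithmetic, using only the figures quoted above (3 mg per kg of elemental iron, with ferrous sulfate being 20 percent elemental iron), the following sketch computes the corresponding syrup dose. The function is hypothetical and illustrative, not a prescribing tool.

```python
def ferrous_sulfate_dose_mg(weight_kg: float) -> float:
    """Daily ferrous sulfate dose delivering 3 mg per kg of elemental
    iron, given that ferrous sulfate is 20 percent elemental iron."""
    elemental_iron_mg = 3.0 * weight_kg
    return elemental_iron_mg / 0.20

# Example: a hypothetical 10-kg toddler needs 30 mg of elemental iron,
# supplied by 150 mg of ferrous sulfate once daily before breakfast.
```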