Clinical Nutrition ESPEN

Volume 10, Issue 3, June 2015, Pages e118-e123

Original article
Identifying factors predicting iron deficiency in United States adolescent females using the ferritin and the body iron models

https://doi.org/10.1016/j.clnesp.2015.03.001

Summary

Background & aims

Iron deficiency is the most prevalent nutritional deficiency in the United States, affecting 9–16% of female adolescents. Although its primary purpose is to detect iron deficiency, primary care screening consists of a hemoglobin or hematocrit laboratory test. This method is simple and inexpensive, but it tests for anemia and is neither sensitive nor specific for iron deficiency. Alternative methods for diagnosing iron deficiency using the ferritin and body iron models are not widely utilized. The study objective was to compare iron deficiency risk factors among adolescent females, as defined by the ferritin and body iron models, to better characterize those who may benefit from iron deficiency testing rather than the current anemia-based screen.

Methods

This cross-sectional study of female adolescents aged 12–21 years utilized National Health and Nutrition Examination Survey (NHANES) 2003–2006 data. Anemia was defined by standard hemoglobin cutoffs. The ferritin model defines iron deficiency through transferrin saturation, ferritin, and erythrocyte protoporphyrin laboratory testing. The body iron model calculates iron status with a formula based on soluble transferrin receptor and ferritin. Bivariate and multivariable analyses examined associations between questionnaire responses and iron deficiency as defined by each model.

Results

Among 1765 participants, 2.7% were anemic. Iron deficiency prevalence was 13.1% and 9.1% by the ferritin and body iron models, respectively. Depending on the model, anemia-based screening had a sensitivity of 15.6–18.8% for iron deficiency. Multivariable associations with ferritin model iron deficiency included age, race/ethnicity, activity level, and medroxyprogesterone acetate injection. Age and food insecurity were significant in the body iron model.

Conclusions

Universal anemia-based screening misses the majority of iron-deficient adolescent females. The common risk factor identified here, adolescent age, may inform both preventive care guidelines on age-based screening and prospective studies of adolescent iron deficiency risk factors.

Introduction

Iron deficiency (ID) is the most common form of nutritional deficiency in the United States (US) [1], [2]. It is estimated that 9–16% of US female adolescents are iron deficient, while 2–5% are anemic [1], [3], [4]. Because no simple, inexpensive test for ID exists, primary care screening is based on testing for anemia, which has low sensitivity and specificity for the detection of ID [2], [5], [6]. This is unfortunate, as even non-anemic iron-deficient adolescents experience significant morbidity, which is easily corrected with iron supplementation [7], [8], [9].
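To make the limits of an anemia-based screen concrete, the short sketch below shows how sensitivity and specificity for ID are computed from a simple cross-tabulation of anemia status against an ID reference standard. The counts are illustrative placeholders, not data from this study.

    # Illustrative placeholder counts, not data from this study.
    def screen_performance(anemic_id, nonanemic_id, anemic_noid, nonanemic_noid):
        """Sensitivity and specificity of anemia as a screen for iron deficiency."""
        sensitivity = anemic_id / (anemic_id + nonanemic_id)
        specificity = nonanemic_noid / (nonanemic_noid + anemic_noid)
        return sensitivity, specificity

    # Hypothetical cohort: 200 iron-deficient girls of whom 36 are anemic,
    # and 1500 iron-replete girls of whom 30 are anemic.
    sens, spec = screen_performance(36, 164, 30, 1470)
    print(f"sensitivity = {sens:.1%}, specificity = {spec:.1%}")
    # sensitivity = 18.0%, specificity = 98.0%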

Risk assessment tools can select individuals who require more extensive laboratory evaluation. To develop such a tool for adolescent ID screening, risk factors must be identified with data from that population. Most studies of risk factors for ID and anemia group adolescents with older reproductive-age women, typically defined as 12–49 years of age [1], [10], [11], [12]. However, the care of adolescent females cannot be approached in the same way as that of older women.

A challenge in developing an ID risk assessment tool is that current ID laboratory testing relies on multiple markers that, in combination, determine iron status [6]. The ferritin model, which uses serum ferritin, erythrocyte protoporphyrin, and transferrin saturation to define ID, was used from the mid-1980s to 2006 to determine ID in the US population [4], [6], [13], [14]. In 2003, Cook et al. presented the body iron model in response to the need for a reliable method to both assess and quantify iron status [6], [15]. Like the ferritin model, the body iron model does not provide a single, simple office-based test for ID; instead, it uses a formula based on ferritin and soluble transferrin receptor. Although comparison of ID prevalence as defined by the two models found fair to good agreement, the body iron model produces lower prevalence estimates, better prediction of anemia, and less inaccuracy from inflammation [4].
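As a rough illustration of how the two definitions operate in practice, the sketch below implements the commonly cited Cook et al. body iron calculation and a two-of-three abnormal-marker rule for the ferritin model. The cutoff values are placeholders, and the thresholds actually applied in NHANES are age-specific, so this is a sketch under those assumptions rather than the study's implementation.

    import math

    def body_iron_mg_per_kg(stfr_mg_l: float, ferritin_ug_l: float) -> float:
        """Body iron (mg/kg) per the widely cited Cook et al. (2003) formula.

        Assumes soluble transferrin receptor (sTfR) in mg/L and ferritin in ug/L;
        the ratio is formed with sTfR converted to ug/L. Negative values indicate
        a tissue iron deficit, commonly taken as iron deficiency under this model.
        """
        ratio = (stfr_mg_l * 1000.0) / ferritin_ug_l
        return -(math.log10(ratio) - 2.8229) / 0.1207

    def ferritin_model_deficient(ferritin_ug_l, transferrin_sat_pct, ep_ug_dl_rbc,
                                 ferritin_cut=12.0, ts_cut=16.0, ep_cut=70.0):
        """Ferritin model: iron deficient if at least 2 of 3 markers are abnormal.

        Cutoffs are illustrative placeholders, not the study's age-specific values.
        """
        abnormal = [ferritin_ug_l < ferritin_cut,
                    transferrin_sat_pct < ts_cut,
                    ep_ug_dl_rbc > ep_cut]
        return sum(abnormal) >= 2

    # Example: low ferritin with elevated sTfR gives negative body iron (deficient),
    # and the same profile is flagged by the ferritin model.
    print(body_iron_mg_per_kg(stfr_mg_l=8.0, ferritin_ug_l=10.0))  # about -0.66
    print(ferritin_model_deficient(10.0, 12.0, 85.0))              # True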

Evaluation of both ID models and their associated risk factors, through the lens of their utility for developing a pediatric screening tool, would fill a gap in current ID screening. Therefore, the objective of this study was to compare ID risk factors among adolescent females as defined by the ferritin and body iron models, using the National Health and Nutrition Examination Survey (NHANES) dataset. Our hypothesis was that the two models would share enough overlapping risk factors to support a clinical prediction tool that selects adolescents at high risk of ID for more costly laboratory testing or even empiric iron therapy. This implementation science-based study has the potential to replace the current low-sensitivity anemia-based screen (hemoglobin) used daily in US pediatric primary care to test adolescents for ID.

Section snippets

Data source

NHANES is a program of studies designed to assess the health and nutritional status of the US population [16]. The survey combines household interviews with physical examinations conducted in mobile examination centers by the National Center for Health Statistics, Centers for Disease Control and Prevention (CDC). NHANES includes demographic, socioeconomic, dietary, and health-related questions. Participants are selected via a stratified, multistage probability sampling design intended to represent the civilian, non-institutionalized US population.
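As one possible starting point, the sketch below assembles NHANES public-use files with pandas, assuming the SAS transport (XPT) files have already been downloaded from the CDC website. The laboratory file name is a placeholder to be checked against the NHANES documentation; SEQN, RIAGENDR, and RIDAGEYR are the standard demographic variable names.

    import pandas as pd

    # Assumed local copies of NHANES 2003-2004 public-use files (SAS transport format).
    # DEMO_C.XPT is the demographics file; the laboratory file name below is a
    # placeholder -- confirm the actual file and variable names in the NHANES docs.
    demo = pd.read_sas("DEMO_C.XPT", format="xport")
    labs = pd.read_sas("IRON_LABS_C.XPT", format="xport")  # placeholder name

    # Merge on the respondent sequence number (SEQN), the NHANES participant key.
    merged = demo.merge(labs, on="SEQN", how="inner")

    # Restrict to females aged 12-21 years (RIAGENDR == 2 is female;
    # RIDAGEYR is age in years at screening).
    adolescent_females = merged[(merged["RIAGENDR"] == 2)
                                & merged["RIDAGEYR"].between(12, 21)]
    print(len(adolescent_females))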

Demographics

A sample of 1765 adolescent females aged 12–21 years participating in NHANES 2003–2006 with the necessary laboratory data to define ID using both the ferritin model and the body iron model were included in this analysis. This sample represents 6.3% of all NHANES study participants (unweighted N = 28,127) and 65.4% of all adolescent women (unweighted N = 2700) in the selected years. Among the sample, 65.8% were self-described white, 14.6% black, 9.7% Mexican, 4.2% Hispanic and 5.7% other. Almost

Discussion

This analysis of a nationally representative cohort of female adolescents aged 12–21 years confirms that current anemia-based screening, with a sensitivity of only 15.6–18.8%, greatly underestimates ID and is inadequate as a widely used screening tool. Survey-based risk factors for adolescent ID showed some variability by the model used to define its presence and the prevalence of ID. However, adolescent age was a significant predictor of ID in both the ferritin model and the body iron model,

Conclusions

Anemia-based screening is widely utilized in primary care but fails to detect the majority of iron-deficient adolescent females. Currently, universal laboratory testing for adolescent ID is impractical in the clinical setting. Identification of adolescent age as a common risk factor between the ferritin and body iron models may inform age-based recommendations on ID screening as well as prospective studies on optimizing adolescent ID screening.

Conflict of interest

None declared.

Acknowledgments

Dr. Sekhar's research is supported by the Eunice Kennedy Shriver National Institute of Child Health and Human Development of the National Institutes of Health under BIRCWH award number K12HD055882, “Career Development Program in Women's Health Research at Penn State.” The content is solely the responsibility of the authors and does not necessarily represent the official views of the NIH.

References (35)

  • A.C. Looker et al. Prevalence of iron deficiency in the United States. JAMA (1997)

  • P.B. Devaki et al. Effects of oral iron(III) hydroxide polymaltose complex supplementation on hemoglobin increase, cognitive function, affective behavior and scholastic performance of adolescents with varying iron status: a single centre prospective placebo controlled study. Arzneimittelforschung (2009)

  • J. Parker et al. Adjusting National Health and Nutrition Examination Survey sample weights for women of childbearing age. National Center for Health Statistics. Vital Health Stat (2013)

  • Centers for Disease Control and Prevention. Iron deficiency – United States, 1999–2000. MMWR Morb Mortal Wkly Rep (2002)

  • Oregon Evidence-based Practice Center. Screening for iron deficiency anemia in childhood and pregnancy: update of the 1996 U.S. Preventive Services Task Force review (2006)

  • S.E. Cusick et al. Iron-status indicators. Pediatrics (2008)