Adolescent girls who donate blood are at greater risk for iron deficiency and anemia and should take extra steps to keep their bodies’ iron stores up to recommended levels, researchers say.
The researchers recommend that these girls consider oral iron supplements, wait longer between donations, or donate platelets or plasma instead of whole blood.
“We’re not saying that eligible donors shouldn’t donate. There are already issues with the lack of blood supply,” says Aaron Tobian, professor of pathology, medicine, oncology, and epidemiology at the Johns Hopkins University School of Medicine and co-lead author of the paper, which appears in Transfusion.
“However, new regulations or accreditation standards could help make blood donation even safer for young donors.”
250 milligrams of iron
Each year, an estimated 6.8 million people in the United States donate blood, according to the American Red Cross, and adolescents are an increasing part of the donor pool, thanks to high school blood drives. In 2015, donors aged 16 to 18 made about 1.5 million blood donations.
Although donation is largely safe, adolescents are at a higher risk for acute donation-related problems, such as injuries from fainting during donation, the researchers say.
Blood donation can also increase the risk of iron deficiency: each whole blood donation removes about 200 to 250 milligrams of iron. Because adolescents typically have lower total blood volumes, donating the same amount of blood as adults means a proportionally larger iron loss. Girls are at even greater risk of iron deficiency than boys because of blood loss during monthly menstruation.
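The figures above can be checked with a back-of-the-envelope calculation. The sketch below is illustrative only: the hemoglobin concentration, donation volume, and total blood volumes are assumed typical values, not numbers from the study.

```python
# Illustrative estimate of iron lost per whole blood donation.
# Assumptions (not from the study): ~3.34 mg of iron per gram of
# hemoglobin, a ~500 mL donation, hemoglobin of 13.5 g/dL, and total
# blood volumes of ~3.5 L (adolescent) vs. ~5 L (adult).

IRON_PER_G_HB_MG = 3.34  # approximate mg of iron per gram of hemoglobin

def iron_lost_mg(donation_ml: float, hb_g_per_dl: float) -> float:
    """Iron removed by one donation, in milligrams."""
    hb_grams = (donation_ml / 100) * hb_g_per_dl  # volume in dL times g/dL
    return hb_grams * IRON_PER_G_HB_MG

loss = iron_lost_mg(500, 13.5)   # ~225 mg, within the 200-250 mg range

# The same donation is a larger fraction of a smaller blood volume:
adolescent_fraction = 500 / 3500  # roughly 14 percent of total volume
adult_fraction = 500 / 5000       # 10 percent of total volume

print(f"Iron lost per donation: {loss:.0f} mg")
print(f"Share of blood volume, adolescent: {adolescent_fraction:.1%}")
print(f"Share of blood volume, adult: {adult_fraction:.1%}")
```

Under these assumed values, a single donation falls squarely in the reported 200 to 250 milligram range, and the proportional volume loss is noticeably larger for the adolescent.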
Previous studies show that being younger, being a girl, and donating more often are all associated with lower serum ferritin levels (a marker for total body iron levels) in blood donors. No previous study using nationally representative data, however, had compared the prevalence of iron deficiency and associated anemia between donor and non-donor populations, specifically in adolescents.
The researchers analyzed data from the National Health and Nutrition Examination Survey, a long-running US Centers for Disease Control and Prevention study that, between 1999 and 2010, collected blood samples and asked participants about their blood donation history.
The findings show that about 10.7 percent of the adolescents in the study had donated blood within the previous 12 months, compared with about 6.4 percent of adults. Mean serum ferritin levels were significantly lower among blood donors than among non-donors in both the adolescent (21.2 vs. 31.4 nanograms per milliliter) and the adult (26.2 vs. 43.7 nanograms per milliliter) populations.
The prevalence of iron deficiency anemia was 9.5 percent among adolescent donors and 7.9 percent among adult donors, both low numbers but still significantly higher than the 6.1 percent seen among non-donors in both age groups. Further, 22.6 percent of adolescent donors and 18.3 percent of adult donors had absent iron stores.
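To gauge the size of these differences, the reported percentages can be converted into crude prevalence ratios. This is a simple calculation on the published figures, with no adjustment for confounders; the study's own estimates may differ.

```python
# Crude prevalence ratios of iron deficiency anemia, donors vs.
# non-donors, computed from the percentages reported in the article.
non_donor_prevalence = 6.1  # percent, both age groups

adolescent_ratio = 9.5 / non_donor_prevalence  # adolescent donors vs. non-donors
adult_ratio = 7.9 / non_donor_prevalence       # adult donors vs. non-donors

print(f"Adolescent donors: {adolescent_ratio:.2f}x the non-donor prevalence")
print(f"Adult donors: {adult_ratio:.2f}x the non-donor prevalence")
```

On these unadjusted numbers, adolescent donors show roughly one and a half times the anemia prevalence of non-donors.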
Collectively, the researchers say, the findings highlight the vulnerability of adolescent blood donors to associated iron deficiency.
Tobian and biostatistician Eshan Patel, the study’s co-lead author, note that some federal policies and regulations are already in place to protect donors of all ages from iron deficiency. For instance, donors undergo hemoglobin screening, must meet a minimum weight, and must wait eight weeks between whole blood donations.
Source: Johns Hopkins University