What Type of Salt Should You Buy? Rethinking 1924 Food Fortification Policy in 2024

by Jessica Samuels

For 100 years, food fortification, the practice of deliberately increasing the content of vitamins and minerals in a food, has been essential to combating public health crises. These practices have continued into the modern era, but because overconsumption of nutrients has been linked to toxicity and disease, public health officials should continue to reflect on the benefits and risks of food fortification today.

History of Food Fortification

In the United States, food fortification (also known as enrichment) began in 1924 to address endemic goiter, an enlargement of the thyroid gland. A physician in Cleveland suggested adding iodine to salt, since salt was so commonly consumed that it could reliably increase iodine intake. After some persuasion, the Michigan State Medical Society studied the safety of iodized salt and launched the world’s first food fortification campaign. This was the first time food was deliberately manufactured with an eye towards addressing disease. However, while some members of the salt industry were excited by the potential to improve public health through their product, others were not. The Morton Salt Company argued that furnishing iodine to the populace properly belonged to large pharmaceutical companies. But the results were overwhelming: The incidence of goiter among children in Michigan decreased from 35% to 2.6%.

Next, in 1933, the fortification of milk with vitamin D was driven by the American Medical Association’s Committee on Foods to reduce the incidence of rickets, a type of bone disease. Since rickets primarily affected young children, milk was the ideal vehicle for fortification: it was seen as a staple food for children, as well as for pregnant and lactating women. In addition, information about fortified milk could easily be given by physicians to mothers during routine appointments.

In the 1940s, officials were concerned about the poor nutritional status of young men enlisting for service during World War II. This fear produced the Enrichment Act of 1943—also known as the first War Food Order—which required that all grain products crossing state lines be enriched with thiamin, niacin, and riboflavin. The mandatory order was later repealed in 1946.

Most recently, after folic acid (vitamin B9) was found to prevent neural tube defects in developing fetuses, the U.S. Public Health Service recommended that all women of reproductive age get 400 micrograms (mcg) of folic acid per day. In 1998, FDA began requiring that manufacturers add folic acid to already enriched cereal grain products, and neural tube defects in newborns have since decreased significantly.

Since the repeal of the Enrichment Act of 1943, FDA has not mandated fortification. And since 1952, FDA has enforced standards of identity, which establish labeling guidelines for enriched foods to ensure that the characteristics of a specific food are consistent with consumer expectations. More controversially, however, FDA has regularly limited the type and amount of nutrients that can be added to a food. Additionally, FDA has published guidance expressing that food fortification is appropriate when narrowly tailored to (1) correct a dietary insufficiency, (2) restore nutrient levels, (3) provide a balance of nutrients, and (4) prevent nutrient inferiority in a food that replaces a traditional food in the diet.

If FDA is interested in limiting food fortification, are manufacturers making the right choice by continuing to enrich their products?

Food Fortification for the Modern Diet

The addition of iodine to salt; vitamin D to milk; thiamine, niacin, riboflavin, and iron to flour; and folic acid to cereal solved major public health crises. Adding these safe nutrients to common foods saved millions from nutrient deficiency diseases without any action by the consumer. Many individuals, especially less-resourced individuals and those with limited diets, have come to rely on food fortification without necessarily realizing it.

However, excessive intake of micronutrients can cause toxicity. Concerns about overfortification have become more salient given the rise of vitamin and supplement consumption. For example, high total folate intake has been linked with increased incidence of several cancers, including an increased risk of type II endometrial cancer in postmenopausal women. Patients suffering from diseases may not report their fortified food and supplement intake to physicians, increasing the risk of drug-nutrient interactions or interference with therapies. Calcium, for example, has been added to numerous products, and its intake has been promoted as a major health benefit. But excess calcium has well-known adverse effects, having been linked to an increased incidence of prostate cancer in men and an increased risk of cardiovascular events in women.

Additionally, it is difficult to determine which foods should be fortified given the diversity of diets. Common diets today include those shaped by lifestyle, religion, and heritage. For example, in 2016 FDA approved folic acid fortification of corn masa flour, since it is a staple food for many people of Latin American descent living in the U.S. and replaces foods to which folic acid is already added. For vegans, FDA has approved fortification of plant-based milk products to meet substantially equivalent values of nutrients in enriched cow’s milk. But it is difficult to determine, let alone enforce, ideal fortification given changing food trends and the expanding diversity of food choices.

Some reasons for enriching food may not apply to all populations. Supplements allow individuals to tailor nutrient intake to their own needs: children, pregnant women, and healthy adults require different levels of nutrients like iodine and folic acid, and some consumers are well informed about how critical nutrition is and can choose what foods to eat. However, this individualized approach to addressing nutrient deficiencies may not be accessible to all communities, including those that rely on convenient, cheap food that lacks sufficient nutrients without enrichment.

The potential dangers of fortification are more pronounced in the modern era. But from a public health perspective, perhaps it is necessary to risk nutrient toxicity for some in order to support the nutrient needs of others.

Jessica Samuels is a third-year dual degree law and public health student (J.D./MPH 2025). Her research interests include genetics, environmental health sciences, novel biotechnologies, and the FDA regulatory process. She has previously published work on the accuracy of ultrasound in predicting malignant ovarian masses. At HLS, Jessica is co-president of the Harvard Health Law Society.

 

