For 100 years, food fortification, the practice of deliberately increasing the content of vitamins and minerals in a food, has been essential to combating public health crises, and the practice has continued into the modern era. Because overconsumption of nutrients has been linked to toxicity and disease, however, public health officials should continue to weigh the benefits and risks of food fortification today.
History of Food Fortification
In the United States, food fortification (also known as enrichment) began in 1924 to address endemic goiter, an enlargement of the thyroid gland. A physician in Cleveland suggested adding iodine to salt, since salt was so widely consumed, as a way to increase iodine intake. After some persuasion, the Michigan State Medical Society studied the safety of iodized salt and launched the world's first food fortification campaign. This was the first time food was deliberately manufactured with an eye toward addressing disease. However, while some members of the salt industry were excited by the potential to improve public health through their product, others were not. The Morton Salt Company argued that furnishing iodine to the populace was a task that properly belonged to large pharmaceutical companies. But the results were overwhelming: the incidence of goiter among children in Michigan decreased from 35% to 2.6%.