We might not always realize it, but a lot of the stuff we're putting into our mouths has been meticulously engineered by Big Brother to turn us into robust, super-human specimens. Sure, it kind of sounds like the plot of a corny sci-fi flick—but we'd be nothing more than rickets-stricken piles of rotting teeth without it.
The conspiracy theorists and tin-foil-hat-wearers can say what they will about government nutrient initiatives, and their wild ramblings are generally harmless—until voting time rolls around, that is. So as the curious, science-minded folks we know you are, you'll want to arm yourselves with a thorough knowledge of the good that Big Bro's help can actually do.
Here's a rundown of three of the United States' longest-running, most important health initiatives.
Water fluoridation may be something we consider man-made now, but the original water fluoridation was totally, 100 percent Gaia-grown. The first known case of natural fluoridation came to light in 1901, when Dr. Frederick McKay of Colorado noticed some bizarre staining that seemed to be related to the water supply. But it wasn't just staining—decay rates were lower, too.
It wasn't until 1925, when UK dentist Norman Ainsworth published a groundbreaking study about fluoride, that we really began to understand what was behind these mysteriously pristine choppers. After examining over 4,000 children, Ainsworth discovered that children living in the areas where mottled teeth were most prominent also tended towards far lower rates of decay. Then, in 1931, an American chemist named H.V. Churchill became concerned that the results of the study were due to aluminum in the drinking water. However, after analyzing water samples from areas in which staining was endemic, the only common chemical factor turned out to be higher levels of fluoride. What's more, it was eventually determined that at 1 ppm, fluoride still had the beneficial effects without the less desirable symptom of teeth staining.
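To get a feel for what that 1 ppm figure actually means, here's a quick back-of-the-envelope sketch. The conversion of 1 ppm to 1 mg per liter is standard for dilute water solutions; the 2-liters-a-day drinking figure is our own illustrative assumption, not something from the studies above.

```python
# Rough arithmetic: what does drinking water fluoridated at 1 ppm
# deliver per day? For dilute aqueous solutions, 1 ppm is about 1 mg/L.
# The litres-per-day figure below is an assumed example value.

PPM_TO_MG_PER_L = 1.0  # 1 ppm ≈ 1 mg of fluoride per litre of water

def daily_fluoride_mg(concentration_ppm: float, litres_per_day: float) -> float:
    """Milligrams of fluoride ingested per day from drinking water alone."""
    return concentration_ppm * PPM_TO_MG_PER_L * litres_per_day

if __name__ == "__main__":
    # At 1 ppm, someone drinking roughly 2 L a day ingests about 2 mg.
    print(daily_fluoride_mg(1.0, 2.0))  # -> 2.0
```

In other words, the "optimal" historical level worked out to a couple of milligrams a day for a typical drinker—tiny amounts, which is why the concentration, not the dose, was the knob health authorities tuned.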
As there had been no negative health effects stemming from the naturally fluoridated water, health authorities in the US came to the conclusion that areas with low fluoride should also be able to benefit from this happy accident—by artificially fluoridating their waters. And it wasn't long at all before they started seeing results. Grand Rapids, Michigan became the first town in the world to be artificially fluoridated in 1945, and soon after, studies showed that tooth decay levels in Grand Rapids' children were nearly half those of children in neighboring Muskegon. And what do you know—before long, the fine people of Muskegon started artificially fluoridating their water, too. The American Dental Association has vocally endorsed the practice ever since.
Currently, about 72.5 percent of Americans live in areas with fluoridated drinking water, but that doesn't mean this dentists' dream hasn't had its opponents over the years. Water fluoridation has been linked to nearly every possible disease known to man, as well as a few that don't quite seem to exist yet. Plus, it certainly didn't help that in 1964, the film Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb included a scene that spoke to the hearts of many a conspiracy theorist by describing fluoridation as "the most monstrously conceived and dangerous Communist plot we've ever had to face." Of course, it didn't matter that the film was a satire poking fun at the very same theorists who were so inspired by its words.
But as far as science goes, there's never been any real, widely accepted evidence that artificially fluoridated water poses a health problem. Quite the contrary, in fact—it's saved many of us from a lifetime of stained, rotting chompers.
Ever since the 1920s, Americans have been shoveling the majority of their necessary iodine quota into their mouths with every salty, savory bite of food they consume. Because that's when iodized salt got the first half of its name tacked on in an effort to reduce the prevalence of goiter—the enlargement of the thyroid gland. People certainly weren't getting it elsewhere—potassium iodide (the form of iodine we find in our salt) supplies an essential micronutrient, one our bodies can't synthesize on their own. We're totally and wholly dependent on the crap we stuff down our throats to provide us with the micronutrients we need.
And we really need iodine. Currently, iodine deficiency is the world's leading cause of preventable intellectual disability. About 30 percent of the world's population has an iodine-deficient diet, and this isn't something you're only going to find in developing countries. Take Europe, for instance—iodized salt isn't nearly as common across the pond, a factor that helps explain why Europe accounts for nearly one-fifth of the world's cases of iodine deficiency.
So in 1924, after noticing markedly high rates of iodine-deficiency illnesses like goiter around the Great Lakes and the Pacific Northwest—where the soil is naturally iodine-poor—the United States government took a page out of Switzerland's book (the Swiss were already churning out iodized salt) and approached the Morton Salt Company about fortifying its salt with iodine. It was pretty easy to narrow down the cause of the goiters, too, since about 90 percent of cases can be blamed on a lack of dietary iodine. The government looked to salt, specifically, for no reason other than the fact that it's one of the most universal ingredients imaginable.
Of course, all that was back in the 20s—the Gatsby era. These days, thanks to advances in travel and food storage, most of the food we eat in the United States isn't locally grown anymore, so any food that comes from more iodine-starved soils would generally be counterbalanced by food grown in other locales. In fact, we're getting more iodine on average than we actually need; the FDA recommends 150 micrograms of the stuff per day, and US-dwelling men get about 300 micrograms per day while women get around 210.
Given iodine's importance, though, Uncle Sam isn't taking any chances. You'd need about 1,100 micrograms in a day to hit the Tolerable Upper Intake Level and 2 million micrograms to actually overdose, so the benefits far outweigh the risks.
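The microgram figures above are easier to appreciate as salt. Here's a rough sketch of the arithmetic—note that the 45 µg of iodine per gram of iodized salt is an assumed fortification level (a commonly cited US figure), so treat the outputs as ballpark estimates rather than dietary guidance.

```python
# Ballpark arithmetic for the iodine figures in the text. The
# fortification level below is an assumption, not an official spec.

IODINE_UG_PER_G_SALT = 45.0  # assumed iodine content of US iodized salt (µg/g)

def grams_of_salt_for(iodine_ug: float) -> float:
    """Grams of iodized salt needed to supply a given iodine dose."""
    return iodine_ug / IODINE_UG_PER_G_SALT

rda_salt = grams_of_salt_for(150)      # daily recommendation: 150 µg
upper_salt = grams_of_salt_for(1_100)  # Tolerable Upper Intake Level
print(round(rda_salt, 1), round(upper_salt, 1))  # -> 3.3 24.4
```

Under that assumption, the daily recommendation fits in a few grams of table salt, while hitting the upper limit would take well over 20 grams a day—far saltier eating than most people could stomach, which is why salt made such a safe delivery vehicle.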
By 1938, the AMA Council of Foods and Nutrition had become reasonably comfortable endorsing food fortification—assuming the science was there to back it up. It was also around this time that the American diet was becoming more and more dependent on a brilliant new invention: refined flour. With an increasingly industrialized US came the need to extend bread's shelf-life, which was cut short by the fatty acids in the grain's germ. The germ would start reacting as soon as it was exposed to oxygen, so by simply removing it, producers were left with a batch of beautiful, white flour that was totally immune to going rancid. Everybody wins—or so they thought. Unfortunately, people at the time knew nothing of the vitamins, micronutrients, and amino acids that made wheat germ essential to bodily health.
So it wasn't long before pellagra (a vitamin deficiency disease caused by a chronic lack of niacin) started running rampant across the United States, and that certainly made health officials take notice. Because with pellagra come the four delightful D's: diarrhea, dermatitis, dementia, and last but certainly not least, death. And that was just the most prevalent condition. Soon, other deficiency diseases started cropping up from the micronutrient loss, including a neurological disorder caused by thiamin deficiency and a red, swollen tongue brought on by riboflavin deficiency. So it quickly became painfully apparent that there was a problem.
To counteract these torturous effects, bakers started adding high-vitamin yeasts to their products along with synthetic B-vitamins. Then in 1942, just four years after food producers first started de-germing their wheat, 75 percent of the white bread being made in the US was fortified with thiamin, niacin, iron, and riboflavin—relegating many of these conditions to relics of a past era.
After World War II, though, the FDA decided against mandating flour enrichment. Instead, the agency established two standards of identity (SOIs) for bread, allowing companies to put out both enriched and non-enriched loaves for the masses.
And though that double SOI rule still applies today, the qualifications for carrying that "enriched" label have certainly evolved. In 1998, the Centers for Disease Control and Prevention took the recommendation of the US Public Health Service and made folic acid fortification mandatory in enriched grains. Studies had shown that folic acid supplementation was highly beneficial in minimizing the number of babies born with neural tube defects, and as folic acid fortification went up, instances of neural tube defects fell by one-third between 1995 and 2002.
Now you know why they call it "Wonder" Bread.