September 2014 (Volume 92)
Quarterly Article
Howard Markel
When reviewing major public health initiatives over the past century, it is amazing to see how often beliefs, policies, and scientific fact have changed. A policy or notion deemed acceptable by one generation might be denigrated and avoided by a subsequent one; a product or practice once considered safe, if not entirely healthy, might later emerge as something harmful to our individual or collective health.
Take the simple, white, crystalline substance that has graced, and continues to grace, almost every dining table in the world: salt.
At this late date, it may be hard to believe, but table salt was a key player in one of the most successful public health campaigns of the early 1920s.1 In the United States, the Great Lakes states, as well as those in the Northwest, were considered to be part of the “goiter belt” because the soil (and, hence, the vegetables and produce grown there) was so iodine deficient. Iodine, of course, is essential to normal thyroid function and good health, and its absence from the diet causes an enlargement of the gland, or goiter, and, more seriously, hypothyroidism. Such patients are easily recognized by their slow movements, listlessness, decreased concentration, and lower thresholds for work. Iodine deficiency is even more serious for young children, who are at risk of developing serious brain damage (cretinism) from an iodine-poor diet.
In fact, goiter was one of the most significant metabolic health problems among young Americans in the early 20th century. For example, in 1918, as the nation mobilized for World War I, Simon Levin, an examining physician for the draft board of Houghton County, Michigan, rejected nearly 30% of the potential US Army recruits because of goiters and hypothyroidism.2 In addition, a statewide study of Michigan schoolchildren during the same era determined that 47.2% of those examined exhibited signs and symptoms of goiter and serious thyroid dysfunction.3
There was little debate over the severity of the situation in the goiter belt. The real problem was how to get iodine into the diets of those who needed it most without incurring the wrath of food manufacturers, doctors, politicians, and the public at large. Some suggested giving children sodium iodide syrup 2 to 3 times a year, but the syrup proved both unpalatable and difficult to administer consistently. A plan in Rochester, New York, to add iodine to the city’s drinking water fell flat as well.
Then in 1922, David Murray Cowie, a pediatrician at the University of Michigan, joined the goiter battle. He, too, had seen his share of goiter patients and children with hypothyroidism so severe that they were deemed intellectually disabled. Dr. Cowie’s “Eureka moment” came one evening while reading a long monograph from the Swiss health authorities (another part of the world where the soil is iodine deficient). The report described a plan to add, as a goiter preventive, a specific amount of sodium iodide or potassium iodide to each package of table and cooking salt sold in that country. Cowie correctly reasoned that since everybody consumed salt every day, if he could convince the salt manufacturers to change their product, a major public health problem might be easily solved. This turned out to be not so simple, however, and it took another 2 years of intense lobbying, cajoling, and educating to even launch the idea.
Cowie conferred with virtually every salt manufacturer in Michigan and beyond, explaining the medical and public health rationale of his scheme. He also hired a chemist from the Dow Chemical Company to demonstrate that a small amount of iodide (about 0.01% of the total content) did nothing to alter the taste or function of the salt.
After winning the support of the “salt men,” Cowie had to convince a large number of doctors who worried that adding iodine to the daily diet might make the situation worse and lead to hyperthyroidism and even heart damage. Working with a number of distinguished physicians, he presented reams of data to the contrary and was able to push through a referendum from the Michigan State Medical Society endorsing the use of iodized salt. More important, Cowie and his colleagues embarked on a “barnstorming” trip, giving health lectures all over the state. The topic most often requested was how to prevent goiter with a few grains of salt. On May 1, 1924, the first boxes of iodized salt appeared on grocers’ shelves all over the state. Incredibly, the salt manufacturers voluntarily produced every box; no laws or regulations were required.
Morton Salt Company advertising blotter, circa 1925. From the collection of the University of Michigan Center for the History of Medicine.
Within months, more than 90% of the table salt sold in Michigan was iodized. Seeing how popular this new product was, the Morton Salt Company of Chicago rolled out its iodized salt product in September and distributed it across the nation. Other US salt manufacturers followed suit.
The success of Cowie’s campaign is difficult to deny. By 1935, the incidence of goiter in Michigan alone dropped by 74% to 90%, with the greatest decrease among children who had continuously used iodized salt for at least 6 months. Studies conducted in other goiter belt locations showed similar reductions.
Unfortunately, 1924 was a long time ago. Medicine then knew next to nothing about the harms of excess salt consumption, and Dr. Cowie could not have imagined the burgeoning food-manufacturing and -packaging industry that would develop in the decades following his public health victory. In moderation, of course, sodium is essential to the normal physiological workings of every human body. But we have long since gotten away from the moderate consumption of salt, especially with the advent and ease of prepared, processed foods and “fast foods” that are loaded with salt and make up a huge and decidedly deleterious portion of the American diet.
Today, the average American consumes 3,300 mg of sodium each day, an amount that exceeds the US Food and Drug Administration (FDA) recommendation of 2,300 mg per day (roughly the amount of sodium in 1 teaspoon of table salt) for healthy adults. For people over the age of 50, African Americans, and patients with hypertension, diabetes, kidney disease, and/or heart disease, the FDA recommends less than 1,500 mg/day.
The combination of excess salt consumption with hypertension, a leading cause of strokes and heart and kidney disease, is especially harmful to the public’s health. Some 67 million Americans have been diagnosed with the “silent killer,” including two-thirds of all Americans over the age of 60. Extrapolating from a number of well-developed population health studies, experts suggest that excess salt consumption causes the deaths of 40,000 to 90,000 people in the United States each year, at an annual cost of nearly $20 billion.
In 2003, the United Kingdom’s Food Standards Agency (FSA) began to address its own national salt crisis by compiling a list of popular processed foods, dividing them into distinct categories, and setting salt reduction targets for each. It also worked with stakeholders to create “front of the package” labeling with nutritional information that would help consumers make healthier choices. At the same time, the FSA orchestrated a national education program on ways to reduce individual salt consumption. Perhaps most important, the FSA asked food producers to meet these targets gradually and voluntarily, so that consumers would not notice the reduced salt.4
A recent study in the BMJ reports that from 2003 to 2011, salt consumption in Britain plummeted an impressive 15%. During that same period, there was a 42% decrease in deaths from strokes, a 40% decrease in deaths from heart attacks, and a significant drop in blood pressure. (It should be noted that cigarette smoking and cholesterol levels also declined during this period; produce consumption and body mass index rose; and better ways of treating heart disease and hypertension were developed, making salt reduction a likely, but not the only, contributor to these trends.)5
In the United States, well-funded opponents have rebelled at the mention of any legal restrictions on what we put in our mouths, no matter how dangerous. Such activists are often aided and abetted by major food companies, which have huge bankrolls with which to lobby legislators against any regulation of our food chain. Despite plenty of warnings from distinguished and respected health authorities, ranging from the American Heart Association to the Institute of Medicine, the FDA has found the issue of regulating the salt content in food to be a complicated political morass. More recently, the FDA and the US Department of Agriculture have begun advocating a voluntary plan, modeled on the British approach, for food companies to cut sodium.
The goal, of course, is to reverse a dietary trend and save tens of thousands of lives and tens of billions of dollars every year. At the supermarkets and restaurants we frequent, we Americans also need to demand foods that are not chock-full of salt and to demonstrate that concern with our pocketbooks. Indeed, we all must play an active role in working toward reducing our individual salt intake to that tiny teaspoon (or less) a day. Let’s hope that the food companies take a lesson from Cowie’s “salt men” of nearly a century ago and join the pursuit of a healthy population. But this time we need to remove salt from the equation.
In the September 2014 issue of The Milbank Quarterly, we offer several ways to improve the population’s health through adept policy suggestions and demonstrable evidence.
The issue begins with our op-ed columnists making observations on a variety of topics ranging from the global eradication of polio and childhood lead poisoning to out-of-pocket expenses for the health care consumer and electronic medical records.
Newly added to our original contributions and preceding each article’s abstract are “Policy Points,” a 2- to 3-bullet-point synopsis of the article’s implications for explicating or advancing a particular set of health policies. Just as the article’s abstract presents, in abbreviated form, the scholarly design and results of a study, the Policy Points will alert legislators, their aides, policymakers, and implementers to the ramifications of the Quarterly’s peer-reviewed studies.
We then proceed to a study by Genevieve Pham-Kanter on the potential for financial conflicts of interest to influence the members of the FDA’s advisory committees during the drug approval process. Not surprisingly, she found that those committee members with exclusive financial relationships with sponsoring firms tended to be more biased toward voting for the sponsor. Yet she also found that members with nonexclusive financial relationships with both the sponsor and its competitors did not appear to be as biased. Following this article is a commentary by David Rothman on the corrosive effects that conflicts of interest can have on the medical marketplace.
In their article, Denise Lillvis, Anna Kirkland, and Anna Frick document the debate over vaccines by analyzing the politics and outcomes of state vaccine legislation between 1998 and 2012. They suggest that vaccine politics are entering another phase, one in which immunization supporters may be able to counter the increase in opt-out rates, particularly in states with recent outbreaks of childhood infectious diseases and those with politicians favoring science-based policies.
Next, Vicki Freedman and Brenda Spillman study the disability and care needs of older Americans by analyzing the National Health and Aging Trends Study, a new national panel study of more than 8,000 Medicare enrollees. Given the number of elderly Americans with substantial late-life care needs, especially those with few economic resources, Freedman and Spillman recommend developing policies to improve long-term services, reduce unmet needs, and benefit both older adults and those who care for them.
Jon Christianson, Caroline Carlin, and Louise Warrick contribute an article on the dynamics of community health care consolidation and the acquisition of physician practices as they played out in Minnesota from the late 1970s to the present. Their study is an excellent illustration of why the acquisition of physician practices will be difficult to reverse in the current health care environment. Provider revenue uncertainty, they argue, is a key factor driving consolidation, with public and private attempts to control health care costs contributing to that uncertainty.
Our final article, by Douglas Conrad, David Grembowski, Susan Hernandez, Bernard Lau, and Miriam Marcus-Smith, presents some of the emerging lessons from regional and state innovation efforts to enact value-based payment reforms in their health care delivery systems. One of the key lessons they uncovered is that multistakeholder, value-based payment reform requires the presence of a trusted, widely respected “honest broker” that can convene and maintain the ongoing commitment of health plans, providers, and purchasers.
All these studies merit our attention and deliberation. And at the risk of overextending the metaphor of the need to restrict our sodium intake, none of us should take these papers with a grain of salt.
Author(s): Howard Markel
Volume 92, Issue 3 (pages 407–412). DOI: 10.1111/1468-0009.12064. Published in 2014.
Howard Markel is the editor-in-chief of The Milbank Quarterly. He is also the George E. Wantz Distinguished Professor of the History of Medicine and director of the Center for the History of Medicine at the University of Michigan. An acclaimed social and cultural historian of medicine, Dr. Markel has published widely on epidemic disease, quarantine and public health policy, addiction and substance abuse, and children’s health policy. From 2006 to 2016, he served as the principal historical consultant on pandemic preparedness for the U.S. Centers for Disease Control and Prevention. From late April 2009 to February 2011, he served as a member of the CDC director’s “Novel A/H1N1 Influenza Team B,” a real-time think tank of experts charged with evaluating the federal government’s influenza policies on a daily basis during the outbreak. The author or co-author of ten books and over 350 publications, he is editor-in-chief of The 1918–1919 American Influenza Pandemic: A Digital Encyclopedia and Archive. He received his AB (summa cum laude) and MD (cum laude) from the University of Michigan and a PhD from the Johns Hopkins University. He completed his internship, residency, and fellowship in general pediatrics at the Johns Hopkins Hospital. In 2008, he was elected a member of the Institute of Medicine of the National Academy of Sciences.