Pellagra and the Four Ds

2014 marks the 100th anniversary of the start of the war on pellagra, a war that lasted nearly 25 years before victory could finally be declared. You have not heard of the war on pellagra? The celebration is not on your calendar? You’re not alone.

Why did it take so long? Was the science so intractable, like the current “war” on cancer? No. It was politics and pigheadedness that were the obstacles.

What is Pellagra?

It was in 1914 that Dr. Joseph Goldberger, a Hungarian-born, U.S.-educated epidemiologist, was assigned by the U.S. Public Health Service to investigate an epidemic of a disease disabling and killing hundreds of thousands of U.S. citizens, especially in the southern part of the United States. The disease was pellagra, a word with Italian roots meaning sour or rough skin, a reference to the dermatitis that is one of the hallmarks of the disease. Pellagra, we know today, is a nutritional deficiency disease, caused by diets poor in niacin (or its equivalent, nicotinamide). Niacin was the third of the B complex vitamins to be identified and so was given the designation “vitamin B3”. Niacin is critically important for essentially every cell in our bodies. It is part of the biochemical machinery that captures and channels the energy produced when sugars and fats are “burned”. That energy powers all cell work, from muscle contraction to nerve function to the simple everyday maintenance of cell integrity. Thus it is easy to understand how a deficiency of this vital molecule could produce total body disease.

Pellagra was characterized by the “four Ds” – dermatitis, diarrhea, dementia, and death. It is a perfect example of a point made in earlier posts in this blog: most tissues need most nutrients, and a deficiency of virtually any nutrient impairs virtually every function of the body. This is in contrast to the popular belief that one nutrient is good for the skin, another for memory, another for the eyes, another for the immune system – on and on – a belief that is simply not accurate.

What is its Cause?

But pellagra is interesting and instructive for other reasons as well. Pellagra had been recognized for a couple of centuries prior to Goldberger’s work, and there were varying theories as to its cause and what to do about it. Some medical experts thought it was due to poor diet, others to infection or poor sanitation, and still others to toxins from food spoilage. Whatever its basis, pellagra prevalence had increased alarmingly in the southern U.S. at the turn of the 20th century, and something needed to be done to stop it. When Goldberger was assigned the task, the majority view in the U.S. seemed to tilt toward an infectious cause. The choice of Goldberger itself probably reflected that view, as he had earlier distinguished himself in epidemics of indisputably infectious diseases – yellow fever in Cuba, dengue fever in Texas, and typhus in Mexico City.

Epidemiologists, you know, have classically examined disease outbreaks to try to figure out how and why they happen – not just their cause, but why here? And why now? In this case Goldberger quickly recognized that, in addition to its prevalence in the rural South, pellagra was also common among the inmates of northern institutions – orphanages and mental asylums particularly. But he noted a peculiar feature of those disease pockets: the institutional staffs did not develop it. Goldberger knew from personal experience that infectious outbreaks rarely discriminate between the keepers and the kept. He himself had contracted yellow fever, dengue, and typhus when he had investigated those epidemics. The one feature by which inmates and staff clearly differed in the northern institutions was diet, suggesting to Goldberger that poor diet was the likely underlying cause.

Nutritional deficiency was not an accepted category of disease when Goldberger started work. As I have noted in earlier posts, the prevailing view in medicine at the time was that, if you ate enough to perform daily work, you were adequately nourished. The idea that not eating something could make you sick was considered nonsense. Thus Goldberger had to overcome an immense amount of disbelief and resistance, and it is hard for us today to grasp all he had to go through. For example, he got grants to improve the diets of institutional inmates, which promptly cleared the pellagra. But when the grants were used up, institutional diets reverted to their prior, inadequate status, and pellagra reappeared. No one seemed to pay attention – not the institutional officials, not the government, and certainly not organized medicine.

What was it about the diet of individuals who suffered pellagra that was the basis for the niacin deficiency? Southern sharecroppers, particularly when economic times were hard, lived on what were called the three Ms – “meal, meat, and molasses”. The “meal” was cornmeal, the basis for corn bread, grits, and other typically southern foods. The meat was not, in fact, a good source of protein, as it was fatback, providing mostly fat calories. And the molasses likewise provided mostly sugar calories.

You may wonder about the cornmeal. What we in the U.S. call corn (technically maize) is a New World plant and was the principal cereal grain used by Native Americans up and down the length of the Western Hemisphere. Surely they did not all suffer from pellagra. And, in fact, they did not. The reason is that they processed the kernels of corn differently from the way Old World immigrants did their milling.

All cereal grains have to be milled in order to remove the less edible parts, thereby producing flour, which can then be cooked in a variety of ways. The niacin necessary to prevent pellagra (along with the other B vitamins) is concentrated primarily in the germ of the corn kernel (and of the wheat kernel as well). The milling practices used by Caucasians both in America and in southern Europe (to which maize had been imported) effectively removed the niacin from the cornmeal, just as similar practices with rice removed thiamine (vitamin B1), leading to the disease beriberi (which was epidemic in Southeast Asia a decade earlier). By contrast, Native Americans soaked the corn in lime water before milling, a practice that released the niacin and allowed it to be carried over into the flour during stone grinding. Thus the European milling practice produced a cornmeal that was bereft of its niacin, whereas the Native American practice did not.

The Opposition Grows

Goldberger, convinced that diet was the culprit, conducted an experiment at a Mississippi prison farm, feeding prisoners a diet like those eaten by people who developed pellagra, and – no surprise – they developed pellagra within a few months. Medical experts claimed that it wasn’t real pellagra and found other imaginary flaws in the project. Goldberger then went on to inoculate himself, his wife, and his assistant with blood and throat scrapings from pellagra patients – a test of the infectious hypothesis. But to no effect. He transferred skin scrapings and even fecal samples to healthy volunteers. Sometimes the recipients got temporarily sick, but they did not get pellagra. It was simply impossible to “catch” the disease. Still no one paid attention.

In fact, southern politicians actively resisted the conclusion that diet was the culprit, fearing that the disease’s high prevalence in their states would cast their region in an unfavorable light if it were caused by poverty. They could accept infection (over which they had little control), but not poor diet due to socio-economic factors – for which they could be held responsible. This was not the first time politics tried to discredit science – and certainly not the last.

Victory at Last

The story does not have an altogether happy ending. Fifteen years after starting on the project, Joseph Goldberger died of kidney cancer. He had not yet been vindicated, and despite all his work there was still a prevalent view that pellagra was an infectious disease. Happily, the work was carried on by nutritional biochemists at the University of Wisconsin, who were able to demonstrate conclusively that insufficient intake of niacin or nicotinamide was the entire explanation for the pellagra problem. By that time the truth could no longer be evaded. Steps began to be taken, first at the state level and then finally by the U.S. government itself, to ensure that certain cereal products (mainly white bread flour) would be enriched with B vitamins – specifically, in this case, niacin. Doing so did not solve the underlying poverty, but it did help the inadequate diets of those trapped in it. In the United States, at least, pellagra is – fortunately – a disease of the past, and it is doubtful that most health professionals today would recognize it if a case happened to come to their attention.

Some readers who have looked more deeply into these topics may recall that there is an essential amino acid (one of the building blocks of protein) called tryptophan, which the body can convert into niacin to a limited extent (limited largely by how much tryptophan is left over after ingested tryptophan has been used to replace body protein). So, if the meat eaten by the poor sharecroppers had been of good quality, they might well have avoided the niacin deficiency, because they would have been able to make enough niacin at least to blunt the more severe manifestations of the disease.
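To put rough numbers on that conversion (using the modern convention of “niacin equivalents”, a bookkeeping device that did not exist in Goldberger’s day): nutritionists count about 60 mg of dietary tryptophan as the equivalent of 1 mg of preformed niacin, so that

niacin equivalents (mg NE) = preformed niacin (mg) + tryptophan (mg) ÷ 60

For example – and these are purely illustrative numbers, not figures from the historical record – a day’s food supplying 4 mg of niacin and 300 mg of tryptophan would provide 4 + 300/60 = 9 mg NE, still well short of the roughly 16 mg NE per day recommended for adult men today. A diet of degermed cornmeal, fatback, and molasses fell short on both sides of that equation: little preformed niacin, and too little good-quality protein to supply spare tryptophan.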

Comment

We who are the beneficiaries of these hard-won victories can too easily take nutrition for granted. It is important to reflect, occasionally, on how we got here.

The enrichment of white flour, which grew out of the effort to eradicate pellagra in this country, is a good example of adding nutrients to foods – either restoring components eliminated during processing, as in making cornmeal (technically “enrichment”), or adding components that most human beings need but may not otherwise consume in sufficient quantity (technically “fortification”). Another major addition to cereal grain products, implemented as late as 1998, was folic acid, a vitamin necessary for normal fetal development (as well as for a number of other vital activities). It was fetal development that was the principal stimulus, as folic acid deficiency is a principal cause of congenital defects of the nervous system, such as spina bifida and what are called, generically, “neural tube defects”. This most recent fortification was done without much public fanfare, although there was a tremendous amount of foot-dragging involved: the Food and Drug Administration waited a full 24 years after the first request from the National Academy of Sciences before mandating the addition of folic acid. And that was accomplished, finally, by political pressure, not by the persuasive force of the science.

As the human race becomes increasingly physically inactive (at least in North America), we can no longer afford to eat the large amounts of food that were necessary to fuel the hard physical work that prevailed as recently as our grandparents’ day. For that reason we have to be increasingly conscious of the need to consume products that are high in the necessary nutrients, simply because we cannot eat as much as we once did. That is why fortification of widely consumed products (such as bread) is an increasingly attractive means of preventing critical nutrient deficiencies. Fortification can be an emotionally charged issue, criticized by some as unwarranted governmental interference or an invasion of privacy. It is a concern that all of us need to reflect on and become informed about.
