Why the Neanderthals Lost the Race

Based on both physical and DNA evidence, anthropologists classify humans as members of the family of great apes, which includes orangutans, gorillas, and chimpanzees, as well as various human species. The branch to which we modern humans belong includes at least six species of the genus Homo whose fossil remains have been found in East Africa over the past 60+ years. There was once a tendency to think of these species as ancestors of modern humans, as if they were stages up a single branch of the tree, rising from primitive to advanced over hundreds of thousands of years, with Homo sapiens at the very tip, i.e., the “highest” form of human. In fact, the human family tree is both more complicated and more interesting. These various other members of the genus Homo are now recognized as separate twigs off the human branch of the great ape lineage. They are not so much our ancestors as our cousins. That branching continued for millennia alongside the twig that we now recognize as modern humans (Homo sapiens). Some of those other humans made it to Europe before we (Homo sapiens) did.

The migratory path of the several human groups took them north out of Africa, then through the Middle East, ultimately colonizing Europe and Western and Southern Asia. Evidence for the European in-migration of Homo sapiens is clear and points to a time about 40,000 to 50,000 years before the present. But that’s not early enough to qualify as “first.” The discovery of settlements by other species of Homo throughout Europe, extending as far east as Western Siberia (e.g., the Denisovan people, more than 50,000 years ago), indicates that they predated the arrival of Homo sapiens by thousands of years.

The best studied of these earlier migrants out of Africa are the Neanderthals, whose remains and cultural artifacts are found throughout Europe, and who clearly preceded the arrival of Homo sapiens. Anthropologists and paleontologists have puzzled over how it was that Homo sapiens, as latecomers, came to displace the Neanderthals. Many theories have been proposed, ranging from superior weapons and technology to superior intelligence, and most recently to the use of what Scientific American termed “the ultimate weapon,” cooperation.

For the most part, these explanations have not been completely satisfying. Nevertheless, one factor does seem certain: Homo sapiens effectively “swamped” the Neanderthals. There were simply many more of us than of them. But that alone does not explain the apparently complete disappearance of the Neanderthals. DNA evidence indicates that there was some limited interbreeding between resident Neanderthals and immigrant Homo sapiens. So, in one sense, some of the Neanderthal genome has survived. Interestingly, the presence of Neanderthal DNA in the modern human genome, which amounts to something like 2–3% of the total, is largely confined to modern Europeans and Asians, indicating that the interbreeding occurred after the arrival of Homo sapiens in the Middle East and/or Europe.

Still, why did the Neanderthals disappear from Europe? Recently, Leonard Greenfield, a physical anthropologist/paleontologist at Temple University, has set forth a persuasive case that vitamin D played a critical role both in shaping the evolution of modern humans and in the disappearance of the Neanderthals.
It is generally agreed that the ancestral home of all of the various Homo species was East Africa, a location that would have provided abundant vitamin D in the form of sunlight. (Solar UV-B radiation converts a precursor compound in the skin into vitamin D.) Contemporary individuals from East African tribes exhibit a vitamin D status, derived mainly from cutaneous synthesis, equivalent to what would be produced in a Caucasian by a purely oral intake of 5,000 to 8,000 IU/day. However, it is also known that solar input of vitamin D inexorably diminishes as individuals move north out of equatorial latitudes. Thus north-migrating peoples coming out of East Africa pretty much all faced some degree of vitamin D deficiency.

That fact is generally considered to be the main explanation for the rapid loss of skin pigmentation among the migrating tribes of Homo sp. The heritable mutation that led to the shift to pale skin enhanced cutaneous synthesis of vitamin D and thus partially offset the diminished solar UV-B irradiance at higher latitudes. Individuals without that change in skin pigmentation would have been even more seriously vitamin D deficient than the others, and their pelvic bone structures could have been so distorted by D-deficiency rickets that vaginal delivery of babies would have been difficult or outright impossible, leading ultimately to extinction of those tribes and families that failed to develop pale skin.

But that simply means that all migrants coming out of Africa would have had marginal to deficient vitamin D status. The farther the northward migrants got from their place of origin, the worse their vitamin D status. But that tells us nothing about why the Neanderthals, particularly, lost the race to survive in Europe. The only sources of vitamin D available to European Homo sp. would have been what little sun exposure was available and a diet rich in seafood and marine mammals. The high latitude of most of Europe and its extensive and persistent cloud cover mean that most individuals would have gotten little vitamin D by the solar route, which leaves only food. Greenfield points out that only the Homo sapiens immigrants had developed cultural practices that included fishing and/or eating the meat and fat of marine mammals. As a result, the Homo sapiens “immigrants” would have been better able to achieve and maintain a healthy vitamin D status than the Neanderthal “natives”.
But general health alone is probably not a satisfactory explanation for what appears to have been the fairly rapid extinction of the Neanderthals. There’s more to the story. Adequate vitamin D status is essential for an organism to mount an effective immune response, particularly in the face of foreign antigens to which the “natives” would have had no prior exposure. (There are many contemporary examples of populations being “wiped out” by infectious diseases with which they had had no experience, brought to them, even if unwittingly, by “discoverers” or colonizers.)

Thus it appears likely that the native Neanderthal populations declined both in numbers and in dominance simply because, unprotected by adequate vitamin D and hence with compromised immune competence, they succumbed to diseases brought to them by the invading Homo sapiens, whose vitamin D status was better and who, in addition, had inherited some degree of resistance to the diseases concerned. Also, as just noted, the invaders had dietary practices (i.e., fish eating) that better suited them to live and thrive in a vitamin D-deprived environment. Presumably, had the resident Neanderthals been able to achieve a more adequate vitamin D status, they would have been better equipped to deal with the diseases brought by the invading Homo sapiens migrant bands.

There is a moral to this story, namely that nutrition is important after all, not just for the health of individuals, but for the survival of whole populations. But there is yet another insight to be gained. We are able to discern the association between poor population-level survival and low vitamin D status in the Neanderthals, but only from our great distance in time. Individual Neanderthals with inadequate immune competence would have been prone to sicken and die, but up close, even had we been there, one could not have been certain that vitamin D status was responsible. Nor would every individual with low vitamin D status have succumbed. There is a great deal of person-to-person variability in sensitivity to, and need for, vitamin D. It’s just that, considering the population as a whole, the risk of a Neanderthal individual’s developing one of those unfamiliar diseases would have been elevated, and, as a group, Neanderthals would thus have been less competitive in a Darwinian sense. This is the explanation Greenfield puts forth, and it seems the most satisfactory of the extant explanations for the fact that the Homo sapiens population grew and prospered while the Neanderthal population, already fewer in numbers, shrank.

Further reading:
Greenfield, L.O. (2015). Vitamin D Deficiency in Modern Humans and Neanderthals. Outskirts Press, Denver, CO.


The IOM Miscalculated Its RDA For Vitamin D

Last year (2014) saw an unusual event. Two statisticians at the University of Alberta in Edmonton, Canada (Paul Veugelers and JP Ekwaru) published a paper in the online journal Nutrients (6(10):4472–5) showing that the Institute of Medicine (IOM) had made a serious calculation error in its recommended dietary allowance (RDA) for vitamin D. Immediately, other statisticians checked the Canadians’ analyses and found that, indeed, they were right. Together with my colleagues at GrassrootsHealth, I went back to square one, starting with an entirely different population, and came to exactly the same conclusion. The true RDA for vitamin D was about 10 times higher than the IOM had said. Not a small error. To understand how this might have happened and why it is important, some background may be helpful.

An RDA is technically the amount of a nutrient every member of a population should ingest to ensure that 97.5% of its members would meet a specified criterion of nutritional adequacy. For vitamin D, the IOM panel determined that the criterion for adequacy was a serum concentration of a particular vitamin D derivative (25-hydroxyvitamin D) of 20 ng/mL or higher, and that for adults up to age 70, 600 IU of vitamin D per day was the RDA.

Both of those figures provoked immediate and unprecedented dissent from a diverse group of nutritional scientists, but the disagreement centered mostly around the IOM panel’s reading and interpretation of the evidence, rather than its calculation of the RDA. The Edmonton statisticians took the dissent a step further, showing that the actual calculation was itself wrong. Here’s what seems to have happened.

What Happened
Not everyone gets the same response to a given intake of any particular nutrient; some require more than others to reach the specified target. So, while the average response to a certain dose of vitamin D may be above the target level, a substantial fraction of a population can still be below it. Thus, the RDA will always be higher than the average requirement, and for some nutrients, substantially so. As a consequence, ensuring that every member of a population receives the RDA guarantees that 97.5% of that population will be getting at least enough, while many will be getting more than they actually need.
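
For readers who like to see the arithmetic, here is a minimal sketch, in Python, of the standard relationship between the average requirement and the RDA, assuming individual requirements are roughly normally distributed. The particular numbers are hypothetical, chosen only to illustrate the principle.

```python
# Minimal sketch of the average-requirement-to-RDA relationship under a
# normal model. The numbers below are hypothetical, for illustration only.
from statistics import NormalDist

ear = 400           # hypothetical average requirement, IU/day
sd_person = 150     # hypothetical person-to-person SD of the requirement

# The RDA sits at the 97.5th percentile of the requirement distribution,
# so that only ~2.5% of people would need more than it.
z_975 = NormalDist().inv_cdf(0.975)   # about 1.96
rda = ear + z_975 * sd_person

print(f"Average requirement = {ear} IU/day; RDA ≈ {rda:.0f} IU/day")
# How far the RDA sits above the average depends entirely on the
# person-to-person spread of the requirement.
```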

The IOM panel identified a number of published studies showing the 25-hydroxyvitamin D response to various vitamin D doses. They plotted the average response in each of those studies against dose, thereby generating what is termed a “dose response curve”, i.e., a way to estimate how much of a response would be predicted for any given vitamin D intake. But, to make a long story short, because it used average responses, that curve tells us nothing about the intake requirement of the individual members of a population, and particularly of those whose response to a given dose falls in the bottom 2.5 percent. The IOM panel surely knew that the average intake required to meet or exceed 20 ng/mL was not the same as the RDA, as it would be inadequate for all those with below-average responses (about half the population). So, to catch the “weak” responders, they calculated the 95% probability range around their dose response curve, designating as the RDA the point where the bottom end of that probability range exceeded 20 ng/mL. While this might seem to have been the right approach, it was not. The panel appears to have overlooked the fact that the 95% probability range for their curve applies to the average values that would be expected from similar studies at any particular dose. The dispersion of the averages of several studies is, as every beginning student of statistics knows, much narrower than the dispersion of individual values within a study around its own average. And it’s the 2.5th percentile of the individual values from those studies, not of the study averages, that should have been used to create the relevant dose response curve.
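
The size of that gap is easy to demonstrate with a toy simulation. The sketch below (with made-up numbers, not the IOM’s actual data) draws many hypothetical studies from the same population and compares the 2.5th percentile of the study averages with the 2.5th percentile of the individual responses.

```python
# Toy simulation of the statistical slip described above: the spread of
# study *averages* is far narrower than the spread of *individual*
# responses. All numbers here are illustrative.
import random

random.seed(1)

true_mean = 28.0          # hypothetical mean 25(OH)D response at some dose, ng/mL
sd_between_people = 8.0   # hypothetical person-to-person SD of the response
people_per_study = 50
n_studies = 1000

study_means = []
all_individuals = []
for _ in range(n_studies):
    responses = [random.gauss(true_mean, sd_between_people)
                 for _ in range(people_per_study)]
    study_means.append(sum(responses) / len(responses))
    all_individuals.extend(responses)

def pctile(values, p):
    """Crude empirical percentile."""
    s = sorted(values)
    return s[int(p * (len(s) - 1))]

print("2.5th percentile of study means: %.1f ng/mL" % pctile(study_means, 0.025))
print("2.5th percentile of individuals: %.1f ng/mL" % pctile(all_individuals, 0.025))
# The study-mean bound hugs the true mean (its SE shrinks as SD/sqrt(n)),
# so a dose whose *average* clears 20 ng/mL can still leave a large
# fraction of individuals below it.
```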

It’s this latter approach that the Canadian statisticians used. They took precisely the same studies as the IOM had used and demonstrated that the intake required to ensure that 97.5% of the population would have a value of at least 20 ng/mL was 8,895 IU per day. Recall that the IOM figure was less than one tenth of that, i.e., 600 IU per day up to age 70 (and 800 IU per day thereafter). When my colleagues and I analyzed the large GrassrootsHealth dataset, we calculated a value closer to 7,000 IU per day, still a full order of magnitude higher than the IOM’s estimate, and not substantially different from the estimate of Veugelers and Ekwaru.

Why This Is A Problem
This is an important mistake, not simply because it shouldn’t have been allowed in a major policy document, but because IOM recommendations have important effects on a wide array of government programs. These include nutritional standards for the US military, for school lunch programs, for WIC, and for many others, both in the United States and in Canada.

Canada, which paid one third of the cost of generating the IOM report, is in a particularly difficult situation. Its First Nations peoples, living near the Arctic Circle, do not get vitamin D from the sun as do those of us living at more temperate latitudes; they are totally dependent upon food and supplement sources. Their ancestral diets, based largely on seals and whales, constituted a rich source of vitamin D. Those foods are much less commonly consumed today, in part because of the ready availability of low nutrient density foods flown in from the south, and in part because environmental pollution has made seal and whale products a source of dangerous toxins (as well as of necessary nutrients). The Canadian government, responsible for the health of all of its citizens, can turn only to the existing IOM recommendation (600 IU per day) to set standards for the people living in its northern territories. But, as the Edmonton statisticians noted, that number is woefully inadequate.

There is almost no public awareness of this error or its implications in the United States, but that is not true for Canada. A large nutritional health foundation located in Calgary (Pure North S’Energy Foundation) has taken out a series of half-page advertisements in Canada’s national newspaper (the Globe and Mail), alerting Canadians to the fact that the error was made and that they need more vitamin D than current policy indicates. The IOM, Health Canada, and the Canadian Ministry of Health have all been formally alerted to this problem. The Health Ministry has agreed to undertake an independent reanalysis of the calculation of the RDA, but the results are not yet available and the shape of the ministry’s action is still uncertain.

How It May Have Happened
It’s one thing to know how the mistake was made, and quite another to know how it could have happened. Here, one can only speculate, as the IOM’s processes are shrouded in secrecy. The IOM report was a massive document, and it is likely that much of the background work, such as the literature search, the drafting of the report, and the statistical calculations, was done by IOM staff members who may not, themselves, have been sufficiently expert in the vitamin D field to recognize discrepancies that might have popped up. (It is noteworthy that several of the dissenting letters submitted to scientific publications following release of the IOM report had specifically cited the fact that 600 IU per day was not sufficient to guarantee a level of 20 ng/mL.) It would then have been up to the expert panel to review and adjust this staff work. To be fair to the panel, it is important to understand that the scientific members of IOM panels are not compensated for their time and effort. They do it as a public service, and they are all busy scientists with work of their own. Still, it was their job, and one must wonder how they failed to see an error that was apparent to other equally knowledgeable, but outside, scientists.

There may be a moral here. It is widely recognized that many of the panel members, before coming together to review the evidence, had already staked out a position to the effect that, while the previous (1997) recommendation for vitamin D (200 IU per day) was probably inadequate, the actual RDA was almost certainly below 1000 IU per day. Accordingly, when the statistical calculations produced a number that matched their own expectations, they may not have been inclined to question its derivation.

There is a generally held belief that science is objective and data-driven. And to a substantial extent that is so. But science and scientists are not identical. Scientists often have strongly held opinions and, like people in general, find ways to construe the evidence to support their beliefs. When those beliefs are wrong, science, as a field, ultimately abandons them. I am confident that this IOM error will be corrected sooner or later, partly because it is demonstrably erroneous, and partly because the related set of IOM recommendations for vitamin D has not elicited a consensus in the field of vitamin D research. If the Dietary Reference Intakes produced by the IOM are important, then it is important that they be right. I can only hope that not too much human damage will occur as we wait for the needed correction to happen.

[Author’s note: if readers are aware of factual inaccuracies in what I have written here, I would be grateful if you would call them to my attention at , so that I can correct this post.]


The Paradox of Osteoporosis Irreversibility

Over 30 years ago I gave a paper at an osteoporosis meeting in Jerusalem titled “The Paradox of Irreversibility of Age-Related Bone Loss.” By “irreversibility” I meant that once the bone was lost there was not much that could be done to restore it.

Perhaps “puzzle” would have been a better word than “paradox.” From our experience with other similar situations, we would have expected that the lost bone would be restored. The underlying facts are that during the postmenopausal period bone loss occurs rapidly as estrogen levels drop to low values. Estrogen replacement therapy started at menopause prevents that loss, showing clearly that it is the estrogen deficiency that is responsible. Similarly, severe calcium deficiency also leads to bone loss, and maintaining a high calcium intake does slow that loss, and perhaps even prevent it. And, as is generally recognized, low calcium intake and low estrogen status are common in contemporary women during the post-menopausal years. These factors are the principal reasons for age-related bone loss in women.

But neither estrogen, nor calcium, nor the combination of the two, will restore the lost bone after it is gone.  This seemed puzzling because with many other nutritional and hormonal deficiencies, restoring the lost hormone or nutrient does generally return the body to its pre-deficiency state.  Examples would be hypothyroidism, which responds to thyroid hormone replacement, and iron-deficiency anemia, which responds fully to replacement of the lost iron.

The explanations for this seeming irreversibility which I offered at the time were twofold: 1) bone building (or rebuilding) requires weight-bearing or impact exercise, and physical activity generally declines after midlife, so a condition necessary for rebuilding was missing; 2) much of the lost bone is trabecular in character, i.e., the spongy latticework in the center of bones such as the vertebral bodies of the spine, and once that lattice is lost, there is no longer a scaffolding or framework on which to rebuild.

I believe that both reasons are at least partially correct, but today they seem to me far from satisfactory explanations for this puzzling irreversibility. There is, I think, a better, more complete explanation, one that can be tested (and thus proved or disproved), and one that, if correct, could revolutionize the treatment and prevention of osteoporosis.

Bone is not just calcium. It is made up, first of all, of a protein matrix within which the calcium salts are embedded.  Soak a bone in acid and you remove the calcium.  But what’s left still looks like the bone you started with, except now it’s rubbery rather than hard. It’s now all protein and no mineral. The key point is that, while bone is the body’s reservoir of calcium, that calcium is tied up as part of a structure, the largest component of which is protein. When the body needs calcium and has to make withdrawals from the skeletal reserves, it does so not by leaching the calcium from this protein-mineral complex, but by physically tearing down microscopic units of bone and scavenging the calcium that is released in the process. Inevitably, therefore, the protein matrix – the structure – goes as well.

In order to profit fully from a high calcium intake, a patient who has lost bone needs to consume enough protein to allow the body to rebuild the lost structure. Otherwise all that a high calcium intake can do is to prevent the body’s further tearing down of bone to meet the calcium needs of other body systems and tissues. That’s a good thing to do, but it is not enough. Nevertheless, it is precisely to prevent that draining of the body’s calcium reserves that a high calcium intake (whether from food or supplements) is today a vital part of the standard of care for patients with osteoporosis. Even so, the failure of nutritional replacement to rebuild lost bone is what originally set the stage for the entry of pharmaceutical agents, some of which can produce substantial bone rebuilding.

That landscape began to change a few years ago when an insightful investigator at the Tufts Nutrition Research Center on Aging in Boston noticed that a high calcium intake did, in fact, lead to bone gain if the patient’s intake of protein was high. Bess Dawson-Hughes had previously published the results of a calcium and vitamin D supplementation trial, producing a better than 50 percent reduction in fracture risk in healthy elderly Bostonians with those two nutrients alone. But, like others before her, she noted that, while high calcium intakes reduced or stopped bone loss in her treated subjects, the two nutrients didn’t lead to bone gain. They didn’t, that is, in individuals consuming usual protein intakes. However, in a subset of her treated patients who, it turns out, had protein intakes above 1.5 times the RDA (0.8 g/kg body weight per day), bone gain was dramatic (while it was zero in those with more usual – and usually thought “adequate” – protein intakes). The figure below shows the 3-year change in bone mineral density (BMD) at the hip in the calcium- and vitamin D-supplemented participants in the Tufts study. Only with the highest protein intakes was there appreciable bone gain.

[Figure: 3-year change in hip BMD, by protein intake, in the calcium- and vitamin D-supplemented Tufts participants.]

For me, it was an “Aha!” moment. Why hadn’t we thought of that? It was known that bone is 50 percent protein by volume (but only about 20 percent calcium by weight). And it was known that when bone is torn down (as with estrogen or calcium deficiency), its protein is degraded in the process. So it made sense that, to rebuild the lost bone, you would need not just calcium but fresh protein as well.

When I first heard of this result, I immediately went to our own Creighton database on calcium metabolism in midlife women (the “Omaha Nuns Project”) and looked to see whether protein intake (which we had recorded and measured) made a difference in the bone metabolism of our nuns. There it was, just as the Tufts investigator had shown. Our nuns with protein intakes below the median for the group could not retain calcium, no matter what the intake (i.e., they couldn’t build bone). By contrast, those with protein intakes above the median retained extra calcium reasonably well.

So, here were two distinct data sets, from two quite different investigations, exhibiting the same interdependence of calcium and protein. What we, and probably most clinical nutritionists, had failed to recognize was that the adult RDA for protein is just barely enough to prevent muscle loss, and is not enough to support tissue building or rebuilding. But, as already noted, when calcium deficiency leads to bone loss, the bone protein is lost as well, and it has to be rebuilt to restore the lost bone.

This mutual dependence of calcium and protein provides a good illustration of two key (and often underappreciated) aspects of nutrition. The first is that nutrients almost always act together with other nutrients. The second is what Bruce Ames of the University of California, Berkeley, has called nutritional “triage”: the body ensures that its most vital functions receive scarce nutrients first, leaving the other tissues and systems to get by on what is left over. That triage mechanism seems to be at work in adult bone rebuilding. With a limited protein intake, bone, in effect, gets the leftovers. We need a high protein intake precisely to ensure that there will be something left for bone.

Two unplanned observations such as those of Dawson-Hughes and our own Creighton group, even if they make perfect sense, would not generally be considered enough to change public policy, particularly when it comes to nutrient intake recommendations.  So, if we are to be certain that supplementing both protein and calcium will permit rebuilding of lost bone, it will be necessary to mount one or more clinical trials testing that hypothesis.

Such a trial would likely start with a group of probably several hundred postmenopausal women who had already lost bone and whose protein intakes were in the range of the current RDA, that is, about 0.8 g/kg/day. All would be supplemented with sufficient calcium to permit maximum bone building if the individuals concerned could, in fact, use the calcium efficiently. All would also receive sufficient vitamin D to ensure a serum concentration of 25(OH)D of 40 ng/mL or higher. Then half would be given a diet, probably involving a protein supplement, that would raise their protein intakes to above 1.2 g/kg/day. [Some might argue that there should be a third group, one with protein intakes at the RDA, but without the substantial calcium and vitamin D supplementation envisioned above.] In any case, trial duration would be about three to four years, and the endpoint would be the observed change in BMD over that treatment period. The predicted outcome would be that the lower protein group receiving calcium and vitamin D would show no appreciable change in BMD, while the higher protein group, also receiving extra calcium and vitamin D, would exhibit clinically significant bone gain. As outlined here, such a trial could not be blinded, mainly because the diets would be perceptibly different.
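
For a rough sense of scale, here is a back-of-the-envelope sample-size sketch for such a trial, using the standard two-sample formula. The effect size and standard deviation below are my assumptions for illustration, not measured values from any actual study.

```python
# Back-of-the-envelope sample size for a two-arm BMD trial.
# delta and sigma are assumed values, for illustration only.
from statistics import NormalDist
from math import ceil

delta = 0.02          # assumed between-group difference in BMD change, g/cm^2
sigma = 0.04          # assumed SD of individual BMD change, g/cm^2
alpha, power = 0.05, 0.80

z = NormalDist().inv_cdf
n_per_group = 2 * (z(1 - alpha / 2) + z(power)) ** 2 * (sigma / delta) ** 2
print(f"≈ {ceil(n_per_group)} women per group, before allowing for dropouts")
# Under these assumptions, roughly 63 per group; a real multi-year trial
# would inflate that considerably for attrition, which is consistent with
# the "several hundred" participants envisioned above.
```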

Even if such a trial were to start today, it would probably be at least five years before the results would be clear and actions could be taken to change official recommendations and influence individual dietary behaviors. What should one do in the meantime?

This is a matter for individual decision, but it is helpful to know that high protein intakes are safe. Their principal negative impact is on the wallet, not the body (rich protein foods tend to cost somewhat more than, for example, foods high in added sugars or other types of empty calories). For me the decision is easy; I’d opt for the high protein intake without a second thought (with, of course, adequate calcium and vitamin D as well). As a dividend, I should note that there is one food group that is both a very rich source of protein and, at the same time, the principal source of calcium in the diets of first world populations – dairy. Moreover, if consumed as milk, its cost is less than the average cost of the other foods in your grocery cart.

Even if the trial were to be successful, it would be naïve to think that would be the end of the story. Recall that the pharmaceutical industry stepped into this field 25 years ago when it appeared that nutritional therapy was not up to the task (at least as it was conceived at the time). If this protein hypothesis is correct, then better nutrition could be a much better form of prevention than pharmacotherapy. However, I suspect that the pharmaceutical industry will not back out of the field as readily as it got into it.

To be fair, resistance from big Pharma should not be surprising. After all, they’ve invested billions of dollars in helping us solve a critical health problem for an aging population. Naturally, they (and our pension funds, which are their stockholders) want to protect that investment. Still, if diet can do the job for us, few would choose a lifetime of pill taking or injections over better eating.

This blog post would be incomplete if I did not call attention to the fact that bone structure and density are designed by natural selection to resist mechanical loads, in other words, to permit a person to do physical work. In the absence of continuing mechanical loading, there is no diet, by itself, that will allow an older adult to regain the bone he/she had as a child. So, yes, calcium is important. And protein is important. But physical work is important, too. How much calcium? Probably 1,500–1,800 mg/day. How much protein? Probably at least 1.2 g/kg body weight/day. How much exercise? Probably about what the cardiovascular exercise people recommend, with special emphasis in this instance on impact exercise, such as jumping rope. Look at toddlers. Look at the impact forces to which they subject their skeletons. That’s how they grow bone.
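
For anyone who wants to translate those figures into a personal target, here is a trivial sketch; the numbers come from this post, not from any official recommendation.

```python
# Tiny illustrative calculator for the targets suggested in this post
# (1,500-1,800 mg/day calcium; at least 1.2 g protein per kg body weight).
def daily_targets(weight_kg: float) -> dict:
    """Suggested daily calcium and protein targets from this post."""
    return {
        "calcium_mg": (1500, 1800),              # range suggested above
        "protein_g": round(1.2 * weight_kg, 1),  # 1.2 g per kg body weight
    }

print(daily_targets(70.0))  # -> {'calcium_mg': (1500, 1800), 'protein_g': 84.0}
```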
