PART ONE: Defining normal – lessons from our ancestors

Part Two: Defining normal – thermostats, feedback and adaptation
Part Three: Defining normal – living on the plateau
Part Four: Defining normal – origins and resiliency

Nutrition doesn’t know what normal is.

You might think that the idea of “normal” would be pretty straightforward. We say an engine is running normally if it does what it was designed to do, runs without various kinds of hiccoughs, and doesn’t break down prematurely. In theory the same concept should apply to nutrition, where “normal” would mean getting enough of all nutrients to allow our various organs and systems to run the way they were designed and to continue to run smoothly for as long as possible.

Unfortunately, while we know what a mechanical device is designed to do, we don’t have the same assurance when it comes to our physiology. We don’t have an owner’s manual to consult. Instead, we try to find individuals in the population who appear to be healthy, assess how much of various nutrients they ingest, and consider such intakes to be adequate (i.e., “normal”). After all, they’re “healthy”. That seems sensible on the surface, but it is inherently circular: it assumes the very “normal” it sets out to define. While such individuals may not be exhibiting recognized signs of nutritional deficiency, that certainly does not mean that current intakes are optimal for long-term physiological maintenance. (A parallel is the regular changing of the oil in our cars, which has no immediately apparent effect but certainly has consequences for the future of the engine.) If, as seems increasingly likely, inadequate nutrient intake plays a causal role in the chronic degenerative diseases of aging, then we need a better way to assess what is “normal”.

It’s important to understand that “normal” in this sense does not mean that a person with an adequate intake thereby has “optimal” health. Nutrition is terribly important, but it is certainly not the only determinant of health. Rather, an “adequate” (or normal) nutrient intake is the intake above which further increases produce no further benefit to the individual – long-term or short-term. That’s conceptually straightforward, but hard to establish empirically. Among the many difficulties are these:

  1. The harmful effects of an inadequate intake may not be apparent until later in life; as a result, the requisite studies are generally infeasible;
  2. We may not know what effects to look for even if we could mount such a study; and
  3. The required evidence can come only from studies in which one group would be forced to have an inadequate (i.e., harmful) intake, which is usually ethically unacceptable.

Not being able to confront these difficulties head-on, we fall back on presuming that prevailing intakes are adequate, and we shift the burden of proof to anyone who says that more would be better. (“Better” here means, among other things, a smaller burden of various diseases later in life, an outcome which, as just noted, may not be easily demonstrable.)

Fortunately there are alternative approaches that could be used and that have clear parallels in other fields of medical physiology. This post is the first of a series in which I address these alternatives, beginning with ancestral intake.

Lessons from our ancestors

It’s important to recognize two key points when it comes to ancestral intake.

  1. Nutrients are substances provided by the environment which the organism needs for physiological functioning and which it cannot make for itself; and
  2. The physiology of all living organisms is fine-tuned to what the environment provides. This latter point is just one aspect of why climate change, for example, can be disastrous for ecosystems since, with change, the nutrients provided by the environment may no longer be adequate.

Thus, knowing the ancestral intake of human nutrients provides valuable insight into how much we once needed in order to develop as a species.

It’s helpful to recall that humans evolved in equatorial East Africa, and during our early years there (as well as during our spread across the globe) we followed a hunter-gatherer lifestyle. During those millennia, populations that found themselves in ecologic niches that did not provide what their bodies actually needed simply didn’t survive. The ones that did survive – our ancestors – were the ones whose needs were matched to what the environment provided. The principles of Darwinian selection apply explicitly to this fine-tuning of nutrient intakes to physiological functioning.

Thus, knowing how much protein or calcium or vitamin D or folate our pre-agricultural ancestors obtained from their environments gives us a good idea of how much might be optimal today. There is no proof, of course, that an early intake is the same as a contemporary requirement, because many other things besides diet have changed in the past 10,000 years. But since we have to presume that some intake is adequate, it makes more sense to start not with what we happen to get today, but with the intake we can be sure was once optimal. The burden of proof should then fall on those who say that less is safe, not on those who contend that the ancestral intake is better than the contemporary one.

How do we know what the ancestral intake of many nutrients might have been? Certainly, in some cases, we don’t know, and this approach, therefore, might not be possible for such nutrients. But, surprisingly, we do have a pretty good idea about the primitive intake of many nutrients. And when we have the data, why not use what we do know for those nutrients?

There are not very many populations today living what we might call the ancestral lifestyle, and they are often in marginal habitats which may not be representative of what early humans experienced. But that has not always been the case. Over the last 150 years there has been extensive, world-wide ethnographic study of native populations, with particular emphasis on those who have come into stable equilibrium with their environments. Reams of data on dietary intakes repose in various libraries and museums – remarkably comprehensive, and shedding priceless light on the habits and status of people we can no longer know or experience first-hand.

Take vitamin D as just one example. We know that proto-humans in East Africa were furless, dark-skinned, and exposed most of their body surface to the sun, directly or indirectly, throughout the year. We know how much vitamin D that kind of sun exposure produces in the bodies of contemporary humans, both pale and dark-skinned, and we have made direct measurements of the vitamin D status of East African tribal groups pursuing something close to ancestral lifestyles. We know also that, as humans migrated from East Africa north and east, to regions where sun exposure was not so dominant, and where clothing became necessary for protection from the elements, skin pigmentation was progressively lost, thereby taking better advantage of the decreased intensity of UV-B exposure at temperate latitudes and enhancing the otherwise reduced vitamin D synthesis in the skin.

All of these lines of evidence converge on the conclusion that ancestral vitamin D status was represented by a serum concentration of 25-hydroxyvitamin D (the generally agreed indicator of vitamin D nutritional status) in the range of 40–60 ng/mL (100–150 nmol/L). Recent dose-response studies show that achieving and maintaining such a level typically requires a daily input, from all sources combined, of 4000–5000 IU of vitamin D.
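For readers who want to check the arithmetic, here is a minimal sketch in Python. The molar conversion factor for 25-hydroxyvitamin D (1 ng/mL ≈ 2.496 nmol/L) is standard; the linear dose-response slope and the 10 ng/mL starting level are illustrative assumptions of mine, not figures from the studies cited above – real responses vary with body weight and baseline status, and flatten at higher intakes.

```python
# Back-of-envelope check of the vitamin D figures above.
# ASSUMPTIONS (illustrative, not from the post): serum 25(OH)D rises
# roughly linearly with intake at ~0.7 ng/mL per 100 IU/day, from a
# deficient baseline of 10 ng/mL. Real dose-response curves flatten
# at higher serum levels.

NG_PER_ML_TO_NMOL_PER_L = 2.496  # molar conversion for 25(OH)D


def ng_to_nmol(ng_per_ml: float) -> float:
    """Convert a 25(OH)D serum level from ng/mL to nmol/L."""
    return ng_per_ml * NG_PER_ML_TO_NMOL_PER_L


def daily_iu_for_target(target_ng_ml: float,
                        baseline_ng_ml: float = 10.0,
                        slope_ng_per_100iu: float = 0.7) -> float:
    """Estimate the all-source daily intake (IU) needed to raise the
    baseline serum level to the target, under the linear assumption."""
    return (target_ng_ml - baseline_ng_ml) / slope_ng_per_100iu * 100


if __name__ == "__main__":
    for target in (40, 60):  # the ancestral range quoted above
        print(f"{target} ng/mL = {ng_to_nmol(target):.0f} nmol/L; "
              f"~{daily_iu_for_target(target):.0f} IU/day "
              f"from a 10 ng/mL baseline")
```

With these assumptions, the lower end of the ancestral range works out to roughly the 4000–5000 IU/day quoted above, while the upper end overshoots it – a reminder that the linear slope is only a first approximation.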

Thus, using this ancestral intake criterion of “normal”, one might formulate a contemporary recommendation for vitamin D nutrient input somewhat as follows:

“We don’t know for certain how little vitamin D a person can get by on without suffering health consequences, but we do know that our ancestors had an average, effective intake in the range of 4000–5000 IU/day. We also know that this intake is safe today. Thus we judge that the most prudent course for the general population is to ensure an all-source input in the range of 4000–5000 IU/day until such time as it can be definitively established that lower intakes produce the same benefits.”


2 Responses to PART ONE: Defining normal – lessons from our ancestors

  1. Robert P. Heaney, M.D. says:

    I try to keep my 25(OH)D level at 100–125 nmol/L (40–55 ng/mL). To do this I usually take about 3000 IU per day as a supplement, but I get more from 3+ dairy servings per day and from various other fortified foods.

  2. EW says:

    Dr., what is your vitamin D level, if I may ask? I saw the YouTube video for the UCTV series, and at the end of it you seemed to indicate that levels of 125 are good, and 150 optimal? Did I understand that correctly? Currently I have worked hard to get my D level to 90. I’m thinking of getting it to 130 after watching your video, as for some reason vitamin D has given me a 91–99% improvement over the last year and a half that I’ve been on it.

    I had very, very severe laryngopharyngeal reflux (LPR) before I started D3 at 6,000 IU. Medications and surgery did not help. My doctors told me that I had to learn to live with a searing, burning throat and that it would take 10 years for science to figure out what I had. They would not send me outside of their hospital for a second opinion. In those last 5 months before I stumbled onto vitamin D through my own research, I also developed incontinence and severe shoulder bone pain.

    Within two days of starting vitamin D3, the incontinence and bone pain went away completely and have not returned. And the LPR has slowly gone away. My ENT tells me the severely burnt spot he biopsied in 2/2011 is healed. He wanted to know what I was doing, as I wasn’t on medication (PPIs). I put myself on heavy doses of probiotics and D3 and studied everything I could find on the subject. However, my throat still looks red and the acid vapor is still getting by the UES. But I’m so much better since I started D3. I am trying to strengthen my UES by taking vitamin D3 and getting sun at noon from mid-April through mid-September each year.

    Did I understand correctly that optimal levels of D are 150? I’m so grateful that I stumbled onto vitamin D3, as it has given me my quality of life back. It is not possible to thank you adequately for shedding so much light on our basic biological need for vitamin D… I live at the 38th latitude and also avoided the sun for the last 40 years, as I regrettably listened to all the news that the sun was bad. I’m so sorry I ruined my health. Now the big question that haunts me: is it possible to undo decades of vitamin D deficiency? Have I permanently ruined my health?
