We Can Thank Poor Evolutionary Design for Vitamin D Deficiencies
My doctor recently declared me deficient in vitamin D and prescribed a weekly pill. Because I take care to eat a healthy and diverse diet, I was a bit annoyed. She said it was no big deal and actually very common, the medical equivalent of a parent telling a child, “Because I said so.” Later on, I was grousing to some of my friends and many of them said they had gotten the same news. It made me wonder: What is going on with vitamin D?
A Vexing Vitamin
Truth be told, we shouldn’t really need vitamin D in our diets at all. Humans and other animals can synthesize this molecule right from cholesterol, something we always have plenty of. Doing so, however, involves a rather obnoxious biochemical pathway. The synthesis of vitamin D begins in the skin with activation of a precursor molecule by ultraviolet light from the sun. After that, the process moves to the liver for further chemical modification and then on to the kidneys for a final tweak.
This biochemical pathway is really odd and seems unnecessarily complicated. Our bodies are adept at making lots of complex molecules without stops in three different tissues. What’s more, involving the liver and kidneys makes for a huge detour. Because vitamin D and its precursors are fat-soluble molecules, they distribute in tissues throughout the body and especially accumulate in adipose, or fat tissue. It would make much more sense to house the enzymes for the synthesis of vitamin D within fat cells, which are often tucked right underneath the skin anyway, where the UV light is needed for the first step.
Another big problem with our vitamin D synthesis scheme is that, in cold climates, humans must bundle up in order to survive. Since keeping warm wasn’t an issue in Africa, where our ancestors lived for millions of years, we mostly ditched our own fur; now, in colder places, we rely on the furs of other animals to keep us warm, which blocks UV light from penetrating the skin and reduces the synthesis of vitamin D.
In warmer regions, humans tend not to cover their skin as much, but this opens them up to skin cancer and also the UV-mediated destruction of folic acid. Skin pigmentation helps mitigate those problems, but at the cost of reduced vitamin D synthesis. The result is a problematic tug-of-war: Skin cancer and folic acid deficiency on the one side and vitamin D deficiency on the other.
There is a way around this problem, of course. Many animals, including humans, can escape this paradox by simply ingesting preformed vitamin D, which has already been through the UV step. Dietary vitamin D still requires the finishing steps in the liver and kidneys, but at least it lets us stay out of the sunlight. Such supplementation has allowed humans to live in frigid climates, and it is no coincidence that Arctic diets tend toward fish and whale blubber, two very rich sources of vitamin D.
And yet, despite this, vitamin D deficiencies are still common, and archaeological evidence suggests that they have been for a long time. We know from studying skeletal remains that rickets, caused by childhood vitamin D deficiency, was a common malady in pre-agricultural human populations. The domestication of fowl helped with this, as eggs are rich in vitamin D, and meat and fish are good sources as well. But prior to that, and therefore for most of our evolutionary history, eggs, meat and fish were not in steady supply and still aren’t for many people around the world.
The reason vitamin D remains in short supply is that patterns of feast and famine can be managed effectively when it comes to calories, which are easily stored (too easily, but that’s another story), but not when it comes to vitamins. We don’t store vitamin D for a rainy day, so infrequent large doses do not compensate for weeks of deficiency. While we can get vitamin D in our diet, maintaining a regular supply is difficult. Supplements are often the best option for those who must go without sunlight for long stretches of the year.
What’s It Good for Anyway?
But why do we need vitamin D in the first place? The answer reveals even more poor evolutionary design. While vitamin D has many functions in the human body, the most important, and therefore most sensitive to deficiency, is the absorption of calcium in our intestines.
Humans are notoriously bad at extracting calcium from our food. We start off okay as babies, absorbing a respectable 60% of the calcium we are fed, but our absorption ability steadily drops as we mature. As adults, we are lucky to absorb 20% of the calcium we ingest, and by middle age it is often below 10%. You might be happy with the 300 mg of calcium in a small cup of milk, but if you’re over 50, you’re probably not absorbing more than 20 or 30 mg of it. The rest ends up in the toilet. And that’s the sad state of affairs when we are getting enough vitamin D.
Those deficient in vitamin D absorb very little of the calcium in their diets. This is why milk is typically fortified with vitamin D. If we don’t get adequate sunlight – and most of us don’t – we become desperate for calcium, even if we’re consuming plenty of it. Calcium is necessary for more than just healthy bones — it’s crucial for things like flexing our muscles — so our bodies treat the calcium in our bones as a reservoir to be tapped when the need is urgent.
When we don’t get enough vitamin D, we can’t absorb calcium, so we start pulling it out of our bones. If the bones are still growing when that happens, they become misshapen, a condition called rickets. If we’re already fully grown, the lost calcium weakens the bones and they become brittle and easily broken; that’s osteoporosis. All of this because we can’t absorb calcium, even when it’s right there for the taking.
Evolutionary Headache
The fact that vitamin D is required for calcium absorption is the most bizarre part of the whole story. All vitamin D does is signal the cells to absorb calcium. That’s it! Having a switch for calcium absorption is important because absorbing too much can also be unhealthy. But having that switch depend on another compound that often must itself come from the diet is foolish, because now there are two ways to suffer from calcium insufficiency: a lack of calcium or a lack of vitamin D.
Since wild animals don’t have the benefit of vitamin D-fortified milk in their diet, and are completely covered with thick, light-blocking fur, you might expect them to suffer from similar issues.
Nope, not at all. Other mammals, including our own dogs and cats, synthesize vitamin D just fine because they activate it in their fur instead of their skin. We lost this option when we ditched the fur. Vitamin D deficiency is a uniquely human scourge. So much for being the pinnacle of creation!
So, the bottom line is that some rather glaring design flaws in our bodies have made vitamin D deficiency so common and harmful. Evolution does not produce perfection, and nowhere is this clearer than in our demanding dietary needs. For many of us, getting a little more direct sunlight would do the trick when it comes to vitamin D, but who really wants to flirt with melanoma? We can also try to eat fish more regularly. Or whale blubber. I think I’ll just stick with the weekly pill.
Nathan H. Lents is professor of biology at John Jay College, CUNY, and the author of Human Errors: A Panorama of Our Glitches, From Pointless Bones to Broken Genes
http://blogs.discovermagazine.com/crux/2018/07/10/vitamin-d-deficiency-evolution/