Reminder that taking Vitamin D in excess without having enough Vitamin K could lead to vascular calcification, whereas sufficient levels of Vitamin K promote proper deposition of calcium into bones.
Vitamin K supplementation for the primary prevention of osteoporotic fractures: is it cost-effective and is future research warranted?
https://pubmed.ncbi.nlm.nih.gov/22398856/
I don't believe they controlled for vitamin D status. The line has always been that you need K and D at sufficient levels to get calcium to the right place, so I'm not surprised that lacking one or the other causes problems (although I do really want to see someone redo this with people with clinically validated moderate vitamin D levels).
That's not the same as controlling for vitamin D, because people's blood levels vary wildly even with supplementation. To control for vitamin D level you would need to blood test for it and supplement until the blood level is between 20 ng/mL and 50 ng/mL, ideally around 30. Depending on how they were supplementing, blood levels could still be way too low (especially for obese people) or way too high, with likely different effects on the outcome here.
The dosage of vitamin K was probably much too low. In Japan K2 is dosed at 45mg (not mcg) per day, which was found to be the minimum effective dose for osteoporosis.
A reasonable dose for coronary calcium is probably at least 5mg or more, a far higher dose than the 750mcg they used in the study.
Fitness influencers and podcasters took the Vitamin D story and ran with it to extremes. The number of people who think more is better with vitamin D or assume that their levels are severely low without checking is scary.
Taking a nominal amount of Vitamin D is probably a good idea for those of us who spend a lot of time indoors. Taking 10s of thousands of IUs every day for years is probably a bad idea for anyone who isn’t regularly checking their Vitamin D levels.
> Taking 10s of thousands of IUs every day for years is probably a bad idea for anyone who isn’t regularly checking their Vitamin D levels.
This extreme is obviously bad, but if you spend most of your time indoors and you're located somewhere that doesn't get much sun and where the government doesn't fortify food with vitamin D (as some Scandinavian countries do), I don't see the harm in taking somewhere in the low 1000s of IU per day.
I’m indoors most of the time (previous skin cancer, so I’m really careful about sun exposure these days) and I’ve been on 5000 IU daily for years, and my levels show up basically perfect on the blood test every year.
"The average content of vitamin D3 found in wild caught salmon was 988 ± 524 (mean ± SEM) IU of vitamin D3/3.5 oz, which is a typical amount that is served for dinner (Table). In contrast, farmed salmon had approximately 25% of the vitamin D content present in the flesh of wild salmon"
And everyone knows that a fish meal that size doesn't satisfy all the way to bed time.
So you can imagine a hunter-gatherer running around eating a ton of fish, berries, etc., and how many IUs that would be.
Unfortunately I have not seen any study designed in a way that could determine whether vitamin K2 supplements are useful or not.
It is known with certainty that humans convert a part of the ingested vitamin K1 into vitamin K2.
While there have been many studies that have shown beneficial effects of supplements with vitamin K2, none of them have used a correct control group.
A deficiency in vitamin K2 can have two causes, either the capacity of a human body to produce vitamin K2 is not enough to cover the necessary amount of it, or the daily intake of vitamin K1 has been too low.
In order to distinguish the two causes, the control group, which does not take vitamin K2 supplements, must have a much greater daily intake of vitamin K1, perhaps 10 times or 20 times greater.
It is possible that the relationship between vitamin K1 and vitamin K2 is like that between beta-carotene and vitamin A, where, with a high enough daily intake of beta-carotene, vitamin A is no longer needed, because the body can produce enough of it.
Unlike for beta carotene, we do not know yet whether there is a threshold of daily intake of vitamin K1 above which there is no need of vitamin K2 supplements.
It would be important to know this for sure, because vitamin K1 is extremely abundant in many green vegetables and it would be simple to reach very high daily intakes without any nutritional supplements.
Eh. I know one of the researchers whose name is on one of those Vitamin D studies. Their recommendation to me was 5000IU daily. I think there's a lot of caveats to what you're saying, and the benefits outweigh the risks.
"If you choose to take vitamin D supplements, 10 micrograms a day will be enough for most people."
"Do not take more than 100 micrograms (4,000 IU) of vitamin D a day as it could be harmful. This applies to adults, including pregnant and breastfeeding women and the elderly, and children aged 11 to 17 years."
The problem with vitamin D is that there's a lot of wrong information. It's because of this[0]:
>A statistical error in the estimation of the recommended dietary allowance (RDA) for vitamin D was recently discovered; in a correct analysis of the data used by the Institute of Medicine, it was found that 8895 IU/d was needed for 97.5% of individuals to achieve values ≥50 nmol/L. Another study confirmed that 6201 IU/d was needed to achieve 75 nmol/L and 9122 IU/d was needed to reach 100 nmol/L.
>The largest meta-analysis ever conducted of studies published between 1966 and 2013 showed that 25-hydroxyvitamin D levels <75 nmol/L may be too low for safety and associated with higher all-cause mortality
10 mcg = 400 IU. It's based on the previous (erroneous) daily recommended intake of 20 mcg per day.
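For reference, the unit arithmetic behind those numbers follows from the standard conversion factor (1 mcg of vitamin D3 = 40 IU); a minimal sketch:

```python
# Vitamin D unit conversion: 1 mcg of vitamin D3 = 40 IU.
IU_PER_MCG = 40

def mcg_to_iu(mcg: float) -> float:
    """Convert a vitamin D dose from micrograms to IU."""
    return mcg * IU_PER_MCG

def iu_to_mcg(iu: float) -> float:
    """Convert a vitamin D dose from IU to micrograms."""
    return iu / IU_PER_MCG

# Doses mentioned in the guidance quoted above:
print(mcg_to_iu(10))    # NHS "enough for most people": 400.0 IU
print(mcg_to_iu(100))   # NHS upper limit: 4000.0 IU
print(iu_to_mcg(8895))  # corrected-RDA estimate in mcg
```

This also makes it easy to see why 10 mcg (400 IU) is so far below the corrected estimates quoted from the paper.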
Also, if you look at how much vitamin D you get from sunlight then [1]:
>A minimum erythema dose (1 MED) amounts to about 30 minutes midsummer midday sun exposure in Oslo (=20 mJ/cm2). Such an exposure given to the whole skin surface is equivalent to taking between 10,000 and 20,000 IU vitamin D orally, i.e. as much as 250-500 μg vitamin D, a similar amount as that obtained when consuming 200-375 mL of cod liver oil
It makes me wonder if health authorities use a dart board to set recommended amounts for vitamins. But hey, at least they're finally changing things and 4000 IU (100 mcg) seems reasonable.
I'm just wondering how they justified keeping recommendations at 10-20 mcg a day when the sunlight study says you can get 250-500 mcg in 30 minutes.
Edit: I believe it's 30 minutes of midday sun. You're going to get a lot less vitamin D at other times of the day.
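The sunlight figures quoted above (1 MED ≈ 30 minutes of midsummer midday sun in Oslo ≈ 10,000–20,000 IU whole-body) can be turned into a back-of-the-envelope estimator. Note the linear scaling in exposure time is an assumption and an oversimplification; synthesis plateaus, and skin type, clothing, latitude, and time of day all matter:

```python
# Rough IU-equivalent of midday sun exposure, using the Oslo figures:
# 1 MED ~= 30 min whole-body midsummer midday sun ~= 10,000-20,000 IU.
# Assumes naive linearity in exposure time (an oversimplification).
MINUTES_PER_MED = 30
IU_PER_MED_LOW, IU_PER_MED_HIGH = 10_000, 20_000

def sun_iu_range(minutes):
    """Return a (low, high) IU-equivalent estimate for `minutes` of exposure."""
    meds = minutes / MINUTES_PER_MED
    return (meds * IU_PER_MED_LOW, meds * IU_PER_MED_HIGH)

print(sun_iu_range(15))  # (5000.0, 10000.0)
```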
I'm on 10k IU/day and have been for a couple of years now (my specialist prescribed it for me as part of anti-MAP treatment for Crohn's disease). He found a high correlation between taking high doses of vitamin D and recovery (but he didn't give me a paper to read on it, unfortunately).
I was once at a dinner sitting at the table with the head of the USA office of supplements and asked him how much Vitamin D one should take and he said “Get sunshine.”
I don't know. But it is my understanding that how high the sun is in the sky matters a lot, i.e., 5 pm sun is going to be a lot less effective than midday sun, because UV-B light has to pass through more of the atmosphere.
>Does it scale up to a limit?
Yes, because sunlight only really makes a precursor to vitamin D, i.e., you can't overdose on vitamin D from sunlight.
Prompt: generate a dataframe that lists: the amount of sunlight exposure by time, given the UV index, that corresponds to a given dose of Vitamin D in IUs, and milligrams (for each and/or an average patient profile)
How much vitamin D supplementation results in the same serum level as sunlight exposure?
Background: ... Majority of the adults are deficient in both vitamin D and magnesium but continue to go unrecognized by many health care professionals.
Conclusions: Vitamin D screening assay is readily available, but the reported lower limit of the normal range is totally inadequate for disease prevention. Based on the epidemiologic studies, ∼75% of all adults worldwide have serum 25(OH)D levels of <30 ng/mL. Because of the recent increase in global awareness, vitamin D supplementation has become a common practice, but Mg deficiency still remains unaddressed. Screening for chronic magnesium deficiency is difficult because a normal serum level may still be associated with moderate to severe deficiency. To date, there is no simple and accurate laboratory test to determine the total body magnesium status in humans. Mg is essential in the metabolism of vitamin D, and taking large doses of vitamin D can induce severe depletion of Mg. Adequate magnesium supplementation should be considered as an important aspect of vitamin D therapy.
If I read this right, they’re increasing the maximum non-prescription dose of vitamin D from 1000 IU/day to 2500 IU/day. This seems to be the official dose on the label; nothing prevents you from taking extra pills. There’s some discussion of methodology; the new limit seems to be a maximum known-safe dosage for someone in the 95th percentile of vitamin D intake. There’s a fairly large safety margin built into these numbers for risk tolerance.
The flaw in this "science" is that it's one-size-fits-all rather than µg/kg.
In addition, it doesn't take into account absorption, genetics, epigenetics, or much of anything except an average guess by age and gender.
Picking magic numbers and asking people to stupidly follow them isn't science, it's blowing smoke. Science would be routinely checking levels of all vitamins, minerals, and micronutrients and asking patients to adjust their intake accordingly.
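The µg/kg point can be made concrete with a tiny sketch. The per-kg rate below is purely hypothetical (chosen only so that a 70 kg adult lands on a common 2000 IU fixed dose); it is an illustration of weight-proportional dosing, not a clinical recommendation:

```python
# Hypothetical weight-based dosing vs. a one-size-fits-all dose.
# IU_PER_KG is an illustrative assumption, NOT a recommendation:
# it is picked so a 70 kg adult gets the common fixed dose of 2000 IU.
IU_PER_KG = 2000 / 70  # ~28.6 IU/kg

def weight_based_dose_iu(weight_kg):
    """Scale the dose linearly with body weight."""
    return weight_kg * IU_PER_KG

for w in (50, 70, 100):
    # A fixed 2000 IU dose over- or under-shoots at the extremes.
    print(w, "kg ->", round(weight_based_dose_iu(w)), "IU")
```

Even this ignores absorption, genetics, and body composition, which is the commenter's larger point; measuring serum levels and titrating remains the only per-individual approach.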
I had horrific migraines. Would walk into a store, and the lights would leave me completely dysfunctional within 10 minutes. Tons of triggers. Spent 18-20 hours a day in bed unable to function.
After about a year of 3000 IU daily vitamin D, my migraines were mostly gone. Lights and most other triggers didn’t bother me at all.
I did get tested at beginning and levels were quite low.
I'll add my anecdata. I'm dark skinned and an engineer, so sunlight and me are strangers at best.
I've taken 5000 IU/day (with vitamin K2) and have my blood levels tested every year. I don't remember exactly what my serum levels are, but they were on the high end for healthy individuals. Last I checked, my levels matched those of a person who spends the day in the sunlight, perhaps like an outdoor worker.
Whatever you try, if you have an annual check-up, you should get your blood levels tested to verify you reach sane levels. I think most studies and articles fail to mention how dark-skinned the subjects are.
Agreed... also anecdotal: I have an autoimmune condition (psoriasis, including on my face) that taking more Vitamin D helps with (so long as stress doesn't increase). I started taking 50,000 IU to keep it under control. [I also take the K2 supplement.] I also took a DNA test, and it indicated I had problems with Vitamin D absorption. A year later, my blood levels were tested and I was square in the normal range (still taking 50,000 IU a day). But the two data points (from the DNA and blood tests) are, I think, important considerations.
Am in Canada; my doctor is Nigerian. He mentioned that for most of his patients, esp. the darker-skinned ones, if they come in presenting symptoms of things that could be fixed by Vit D, he doesn't even bother ordering blood work anymore -- he just sends 'em to go buy a bottle of Vit D, and to call him if that doesn't work or things get worse.
Even in the US, the RDA is woefully inadequate because of the variance in requirements per individual. Also, it was miscalculated by the IOM. [0]
Anecdotally, I must take 11-12k IU/day (80k / week) of D3 with K2 MK7 and a multivitamin. My blood levels are only normal because of this "absurd" amount and I would be deficient if I followed these recommendations.
From what I understand, the levels you are taking could be dangerous depending on the individual, so such levels should only be taken after verifying your Vitamin D blood levels.
Unfortunately your doctor is clueless about Vitamin D. The Standard American Diet (SAD) contains little or no K2 so your doctor can not assume you will get it from your diet.
While vitamin D has essentially no abuse potential and very rare side effects, the evidence from "The Big Vitamin D Mistake" paper (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5541280/) clearly recommends a dosage of at least 8000 IU daily for adults and seniors.
The effects of vitamin D shortage have a disproportionate negative effect on populations with dark skin tone, but this is frequently offset by the traditional Inuit diet.
Health Canada should consider recommending 8000 IU vitamin D + vitamin K as a daily prescription to all healthy adults. Fortification of milk with cholecalciferol has been effective in Finland, so it may also be considered.
I sent this to Health Canada email, in case they did not consider the consequences.
Apparently the Bureau of Nutritional Sciences of the Food Directorate is responsible for the norms in Canada. I re-sent the request to update the recommendation there and asked whether they had reconsidered the mentioned paper.
Strange how doctors I follow tell me that in winter months I'd need 2000 IU per day of vitamin D for my latitude/sun exposure/skin color, while the recommended daily intake is about 10x lower -- so low that it may even be too low to measure the benefits. Are those FDA-like institutes so slow?
If milk was that bad, humans wouldn't have developed adult lactose tolerance 10K years ago. Being able to drink/eat milk and dairy products was clearly an advantageous adaptation because it became a very widespread mutation.
Finland fortifies milk with cholecalciferol to prevent vitamin D deficiencies.
This led to public health improvement: the evidence from "The Big Vitamin D Mistake" paper (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5541280/) clearly recommends a dosage of at least 8000 IU daily for adults and seniors.
Milk is one of the healthiest foods you can eat. It is a complete food, meaning it has all essential nutrients for human nutrition, including protein, fat, carbs, essential vitamins and minerals. If you had to pick a single food to live off for the rest of your life, milk would definitely be in the top 3.
Potatoes. They have a very wide breadth of nutrients. If you had to eat only one food, potatoes would be best, because eating them in such quantity would turn that breadth into depth.
I think milk, potatoes, broccoli would be a hard to beat top 3, maybe eggs instead of milk.
It’s utter nonsense, unless you are lactose intolerant. The mutation that allowed adults to digest milk had such a high fitness value it spread from Scandinavia through the human population like wildfire by evolutionary standards.
Lactose intolerance is not the only problem with lactose.
Those who do not have lactose intolerance split lactose into glucose and galactose, which are absorbed.
Like fructose, galactose must be converted in the liver. In people where the conversion is slower, or who consume a lot of dairy, galactose may persist longer in the body. In such cases there is an increased risk of cataracts.
We lose the ability to properly digest milk after the weaning stage.
Some have developed "lactose tolerance" but that's not saying it's great food.
We developed "lactose tolerance" in places where the winter was hard and you would get mostly grains + potatoes all winter long; in those scenarios milk (dairy products) is great! But that's "near starvation" by our modern-day understanding.
We know milk has a lot of nutrients, but it can also literally make us sick. 85%+ of the world population is lactose intolerant (the default). It contains casomorphin, which is addictive (so the calf follows its mum), hence the reactions of dairy addicts in this thread (like trying to take an addict's crack away). Dairy makes our bodies inflame, and promotes obesity.
But the dairy industry is verrrry good at promotion. So everyone and their dog knows it's "healthy and promotes strong bones" (which is bollocks).
About 1/3 of adult humans can digest milk. For the majority it is harmful. However the other third have evolved to digest milk as adults and it isn't harmful in the same way.
It is an interesting coincidence that the ability to digest milk as an adult and fluency in English are correlated.
As a Latin American adult, I offer the counter-evidence that most adults I know can drink milk with little issue, while few are fluent in English. The mutation that allows humans to drink milk beyond their infancy spread about 10 millennia ago, way before the Indo-European family of languages was a thing.
Edit: Did a little sleuthing. I'm from Mexico, and just learned that half of the Mexicans can't digest milk. Seems like we imported the mutation from the Spanish/Portuguese since the indigenous people are the least able to digest milk. I live in Northern Mexico, and while most of Mexico is a cauldron of mixed races, the further South you go, the more pure indigenous people you can find.
That's why in my social circle most people possess the lactase adaptation, but if I lived further South, fewer and fewer people would. Live and learn.
It is a correlation. Most people of European descent can drink milk, and most people of European descent have learned English. There is, however, no causal relationship between the two, so it shouldn't be a surprise that you can find exceptions either way.
>It is an interesting coincidence that the ability to digest milk as an adult and fluency in English are correlated.
Last time I checked, Scandinavia didn’t speak English, and neither do Germany, France, Spain, or the rest of Europe -- neither as an official language nor fluently. Your link between the two is silly and incorrect. First: as said, most if not all Europeans aren’t native English speakers, and the largest population of native English speakers is found in North America, specifically the US, a country that is obviously a mix of all peoples and ethnicities (even the ones who call themselves “white” are actually mixed to some degree). So if you take fluency in English as your baseline, then Americans of Asian roots, for example, shouldn’t be lactose intolerant, while some German dude who barely speaks English is?! That’s idiotic. Second: you are correlating something genetically inherited with something you can learn in your first few years, which is even sillier.
If you correlated that with ancestry instead, you might be correct: maybe the majority of Caucasians, Asians, etc. are or aren’t lactose intolerant. I have never looked at any studies in that regard, but it would at least make sense.
Consider this... Milk is intended to feed baby cows. Humans are not cows. Some cow's milk is likely not harmful for humans. But beyond that you're ingesting a cocktail of nutrients and hormones intended for a baby cow.
Put another way, just because Big Milk has normalized it doesn't mean it's a good dietary decision.
Humans are mammals, like cows, so biologically we share a great deal in common, including the production of milk for our young.
It doesn't make sense to argue against consuming something on the basis of "what it was intended for". Nature makes no such distinctions when it comes to acquisition of energy.
Fruits and vegetables are not "intended for human consumption" either, they're intended to spread the plant's seed away from other competing plants. Cooked cow muscle fibers, also known as steak or beef, are intended for movement, but they are also delicious and nutritious. Organs of ruminants and birds, such as liver, are also extremely healthy for humans to consume, but they of course were evolved with other purposes in mind.
As for "big milk", well milk is an extremely dynamic food used to produce a large variety of delicious products like cheese, butter, cream, yogurt, whey protein, cake, milk chocolate, etc. Milk consumption also pre-dates modern advertising by 9000 years or so, with earliest signs of human cultivation of milk from cows dating back to the Neolithic.
And to the extent that you choose some milk alternative over milk itself, are you just falling for the Big Almond or Big Soy industry's alternative propaganda?
Consider this... Corn is intended to feed a baby plant. Humans are not corn. Some corn on the cob is likely not harmful for humans. But beyond that you're ingesting a cocktail of nutrients and hormones intended for a baby plant.
Put another way, just because Big Corn has normalized it doesn't mean it's a good dietary decision.
Proceed at your own risk.
PS. Sorry for being so snarky, but I hope you read my silly comment with an open mind, and realize how weak your argument is as of now. You need facts to back up such extraordinary statement. It even has anthropological implications.
You're ignoring adaptation. It's not about if we can eat creatures that existed before homo sapiens, or if we can eat them raw and wild; it's about how being able to eat things affects survival and reproduction.
Take mushrooms, for example: they've existed for 700+ million years, way before homo sapiens, and while 1-2% of them are poisonous to us, we can eat others just fine. It's not like the others were healthy, it's that food wasn't available all the time, and individuals being able to eat mushrooms thrived better than those who couldn't. They kept eating mushrooms and getting sick, until someone didn't, because they were hungry.
Same with fire. Fire is like predigesting food, making nutrients more readily available for our relatively fast digestion. It will soften hard fibers, cartilage, hard meat, and coerce nutrients out of hard-to-digest plants. After it there was no need for massive jaws to munch your food for most of the day.
Cooking is likely what made humans what we are today, it was practiced by homo erectus 1.5 million years ago. Homo sapiens emerged just 300,000 years ago, so cooking with fire also existed before us, hence it's part of our natural environment.
And don't get me started on agriculture! While I agree that we're doing it wrong in many respects, agriculture enabled human civilization. Instead of everyone being a hunter-gatherer, the farmers could produce food, even a surplus, while others could produce other things (or exploit the population :). Rejecting herding, farming, fire, and technology in general is stupid; it's like declaring oneself a "sovereign citizen" but still wanting all the comforts of life paid for with taxes, like firefighters, 911 calls, electricity, waste management, clean water, roads, police, weather reports, tornado warnings, etc. You wouldn't exist today without those things you seem to reject.
Seriously, I don't know why I'm bothering to argue with you, I have the impression that you're not interested in facts, history or science, but in personal opinions and feelings.
Not any more trouble than the trouble you have believing Big Dairy's marketing is scientific fact, and that they value you and your health more than they value profits.
Fact: you're ingesting a cocktail intended for another species at a completely different moment in their lifecycle. To say nothing of the human influence on that cocktail (i.e., additional hormones given to increase production). But celebs pushing "Got milk?" means it's a smart and healthy choice? Here... have a cigarette while you're at it.
Or "Look, moms... no need to use your own milk. That's messy. That's an inconvenience. You don't need to be there... there's cow's milk. We promise you that stuff for baby cows is good for your newborn as well." Lol. Are you listening to yourself?
That's all fact.
And the fact that that's been normalized is your problem, not mine. I don't have a problem with abstract thinking. But the fact is, we don't need it. It's a distraction. Let's instead stick to fact.
A bigger part of the corn syrup thing was basically a coincidence of timing: it came out right at one of the cyclical peaks of fat vilification. So products were reformulated to have less fat (and often less salt), and surprise surprise, they tasted like crap.
I should add that cheaper corn also makes the things that eat corn cheaper. That doesn't mean those things are healthy.
The power structure / status quo is based on stability. Hunger/starvation of the masses tends to bring about instability. The point is, the food system is NOT driven by health (of the masses, or of the environment for that matter). It's driven by food X's ability to stave off starvation (and the associated unrest). That is, just because you're eating what's been normalized doesn't mean it's ideal for your health.
Cheaper === more consumption. That's basic economics. The corn subsidies drove down the price of that sweetener, and in turn those (mostly unhealthy) products fell in price. *And* it became attractive to use in other products.
Yes, of course. There are other factors. But that doesn't diminish the fact that tax dollars are used to make the taxpayers unhealthy.
This is a weird argument. No food is /intended/ to feed humans. Every single thing we eat is just some random bit of plant or animal that happens to be nutritious.
* Spinach leaves are "for" transforming sunlight into plant growth.
* Rump steak is "for" moving the back leg of a cow.
* Honey is "for" feeding bees.
Surely on the "food not for humans <===================> food for humans" scale something that is produced explicitly to feed the young of a fairly closely related mammal is closer to the "for humans" end of the scale than most food?
Please note I'm not trying to argue that that you're wrong about your classification of milk not being "for" humans, I'm trying to point out the absurdity of even trying to define food as being for or not for humans in the first place.
=================================================
Also talking about "Big Dairy" in this context is a bit silly too. You could definitely argue that milk has been heavily marketed as a "healthy" food in the United States over the past few decades, and maybe considering it to be unusually healthy relative to other foods is a mistake.
But trying to argue that humans shouldn't eat it at all while referring to "Big Dairy" is ridiculous - humans have been eating dairy for thousands of years, much longer than modern capitalism has even existed.
=================================================
And lastly, saying "it's not an argument, it's fact" is an extremely arrogant way to dodge actually having to justify what you're saying. Don't do that.
The ability to drink cow’s milk evolved relatively recently, in the past 7,500 years or so. But that ability made you so much more likely to survive and reproduce that it quickly came to dominate Europe. Obviously we don’t know what traits will be adaptive moving forward, but in the past drinking milk was incredibly advantageous.
Humans eat so many different plants and animals that are not from Africa. Everything has hormones and soy can cause issues for some people because it has phytoestrogen.
and it is trivial to get them shipped. go to amazon.com (instead of amazon.ca) and find a vendor who will ship to Canada. Customs ain't gonna check and even if they do they likely don't know 10000IU is too high -- it's not drugs, guns, or livestock so they won't care.
Vitamin K supplementation for the primary prevention of osteoporotic fractures: is it cost-effective and is future research warranted? https://pubmed.ncbi.nlm.nih.gov/22398856/
Matrix Gla protein is an independent predictor of both intimal and medial vascular calcification in chronic kidney disease https://www.nature.com/articles/s41598-020-63013-8
Matrix Gla protein https://en.wikipedia.org/wiki/Matrix_Gla_protein