Hacker News | m4burns's comments

For example, this 2019 NIH grant awarded to EcoHealth Alliance, Inc.: https://reporter.nih.gov/project-details/9819304

This is part of a series of ongoing projects that have received NIH funding over the last seven years. EcoHealth, in turn, funds the WIV (Wuhan Institute of Virology): https://www.nature.com/articles/d41586-020-02473-4


This is close to the rate of vitamin D deficiency in the general population, e.g. https://www.ncbi.nlm.nih.gov/pubmed/21310306


The study you linked was based on a threshold of 50 nmol/L, while the originally posted study used two thresholds, 30 nmol/L and 50 nmol/L. At the 50 nmol/L cutoff, the study you linked indicates a deficiency rate of 41.6% for the general population, compared to a rate of 91.3% among psychiatric patients according to the posted study.

I’d still take the posted study with a grain of salt. Vitamin D deficiency is likely to be highly correlated across individuals within a given region and time period, due to its dependence on weather, and the authors didn’t control for that. But they did say this was just a pilot, and given the effect size it’s probably worth investigating further.
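
For a rough sense of that effect size, here is a back-of-the-envelope odds ratio from the two rates quoted above (my own arithmetic, not a figure from either paper):

    # Odds ratio for vitamin D deficiency (< 50 nmol/L):
    # 91.3% among psychiatric inpatients vs. 41.6% in the general population.
    p_psych, p_general = 0.913, 0.416
    odds = lambda p: p / (1 - p)
    print(odds(p_psych) / odds(p_general))  # ~14.7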

Edit: fixed units


Good points. Here is some data on vitamin D deficiency in the UK (where the original study was conducted) that also addresses seasonality: https://www.bjfm.co.uk/prevention-of-vitamin-d-deficiency

"Low serum levels of vitamin D are found in significant numbers of all population groups in the UK: in winter 30-40% of all age groups in the general population are classed as vitamin D deficient. Even towards the end of summer 8% of adults and 13% of adolescents remain deficient."

where "deficient" is defined as serum 25(OH)D below 25 nmol/L.


Thanks for the link. Seems like the posted study’s results are very similar to the deficiency rates for the overall UK population in winter. (Their data was collected between February and April.)


A more important difference: the base rate in the grandparent post is for the US, while the psychiatric hospital admission study is for hospitals in the UK. Serum vitamin D depends significantly on sunlight exposure, and hence on latitude, so the base rate is different there.


I agree, good point. See m4burns’s comment, which includes a link to broader UK data.


Regarding the weather: most people spend most of their time indoors or in cars (does glass filter the UV rays that trigger vitamin D synthesis?), and usually wear long pants and a shirt, so only a small area of skin is exposed to the sun. It's possible that even in summer, people suffer from vitamin D deficiency.


Be careful with increased exposure to sunlight. UV is a known carcinogen. https://www.cancer.org/cancer/cancer-causes/radiation-exposu...

There are safer sources of vitamin D.


To this end, I strongly suggest checking your local UV index before heading outdoors for any extended period of time. I use an iOS app called UVLens (no affiliation), which displays your local UV index throughout the day and can give you an estimate of how long before you’ll burn (based on skin color, eye color, etc.).
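
For a sense of how that kind of estimate might be computed, here is an illustrative sketch. The UV-index-to-irradiance conversion (UV index = 40 × erythemally weighted irradiance in W/m²) is the standard WHO definition, but the minimal erythemal dose (MED) values are rough figures I'm assuming, not UVLens's actual model:

    # Rough time-to-burn estimate. The MED values (J/m^2) per
    # Fitzpatrick skin type are approximate assumptions.
    MED = {1: 200, 2: 250, 3: 350, 4: 450, 5: 600, 6: 1000}

    def minutes_to_burn(uv_index, skin_type):
        irradiance = uv_index * 0.025            # W/m^2 (UV index 1 = 0.025 W/m^2)
        return MED[skin_type] / irradiance / 60  # (J/m^2)/(W/m^2) = seconds -> minutes

    print(minutes_to_burn(8, 2))  # ~21 minutes at UV index 8, skin type II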

Provided you’re taking reasonable precautions, limited exposure to broad-spectrum UV is quite safe.


You just need to slap on some suntan lotion if it's going to be sunny, which is what I do to counter the increased risk from the meds I have to take.


Yeah, absolutely - most regions probably have a nonzero population with Vitamin D deficiency year-round. But if I had to pick a place and time at which the overall rate of Vitamin D deficiency is high, the UK in February through April seems like a safe bet.


I came here to say that the study didn't compare against the general population.

