It seems very likely that it's possible to force people into lucid dreaming. However, the external methods seem to rely on delivering cues during REM sleep, and I'd worry that in doing so you might be subtly reducing the quality of REM sleep.
Yet nearly every controlled trial of vitamin D supplementation has found limited or no effect, except maybe in the most deficient people. Many, many trials have been conducted. Vitamin D advocates always have an excuse: it's too little vitamin D; no, actually you also have to add vitamin K2; and so on.
Part of the problem with all the observational studies looking at vitamin D is that low vitamin D is a biomarker for being less healthy. People who are ill spend less time outdoors, and people who spend more time outdoors are already healthier and are also getting other benefits from their outdoor activities.
You can try to control for all of these things, but every time we actually try to test what happens if you give people Vitamin D, we find almost no benefit.
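To make that "biomarker, not cause" point concrete, here's a toy simulation (all numbers invented; this is only a sketch of the confounding story, not real trial data). Underlying health drives both vitamin D levels and the outcome, so the observational correlation looks impressive, while randomly assigning a supplement that only raises vitamin D changes nothing:

```python
# Toy simulation of why observational vitamin D findings can mislead.
# All numbers are invented for illustration; this is not real trial data.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Underlying health (e.g., time outdoors, general fitness) drives both
# vitamin D levels and the health outcome.
health = rng.normal(0, 1, n)
vit_d = 30 + 10 * health + rng.normal(0, 5, n)    # ng/mL, higher when healthier
outcome = 50 + 5 * health + rng.normal(0, 5, n)   # some health score

# Observational view: vitamin D correlates strongly with the outcome.
print("Observational correlation:", np.corrcoef(vit_d, outcome)[0, 1])

# Simulated RCT: supplementation raises vitamin D but (by assumption here)
# does not touch underlying health, so the outcome is unchanged.
treated = rng.random(n) < 0.5
vit_d_rct = vit_d + 20 * treated
print("Mean vit D, treated:    ", vit_d_rct[treated].mean())
print("Mean vit D, untreated:  ", vit_d_rct[~treated].mean())
print("Mean outcome, treated:  ", outcome[treated].mean())
print("Mean outcome, untreated:", outcome[~treated].mean())
```

Under these made-up assumptions the observational correlation is strong while the randomized arms are indistinguishable, which is roughly the pattern the real trials keep finding.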
Yes, rickets and osteomalacia (when caused by vitamin D deficiency) are both treatable with vitamin D. Randomized controlled trials clearly show this, unlike the trials for most of the other things people are pushing vitamin D for.
Everyone should probably have their blood level of vitamin D tested, but most people are not going to see dramatic changes in their health from vitamin D supplementation.
Methotrexate is the standard of care for rheumatoid arthritis and is "strongly recommended" over hydroxychloroquine (source: https://onlinelibrary.wiley.com/doi/epdf/10.1002/acr.24596 ). If I read the paper right, hydroxychloroquine may be favored in "low disease" states.
So I would certainly expect some sort of difference in the patient populations. It would suggest that those started on hydroxychloroquine would normally have less severe rheumatoid arthritis (or have some other difference that led their doctor to choose not to use methotrexate by default), although I don't know if that's the case in reality.
Not having prescribed either since I was a PCP years ago, I would just say that I agree with this line of inquiry.
If both drugs were equally good and were pretty much used at random, then they could be good instruments. But if there is confounding by indication, then it’s harder to get value out of an analysis that compares them.
It doesn't mean the conclusions are wrong (nor does my comment above); it just tempers my interpretation.
You could try to reduce the possible effect by controlling for disease severity. If adjusting for severity shrank the effect by a large amount, that would strengthen the case that the effect is really due to RA disease severity rather than the drug. Actually, I'd be a bit surprised if the authors didn't do that for an observational study.
Agreed. I didn't see it described in their methods, although it seemed that the main focus of this paper (based on length dedicated to the material) was the biological/model data rather than the EHR data.
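To sketch what confounding by indication looks like here (purely invented numbers, not the paper's EHR data): suppose severity drives both the choice of drug and the outcome, while the two drugs are identical by construction. A naive comparison then makes methotrexate look worse, and adjusting for measured severity collapses most of that gap:

```python
# Toy illustration of confounding by indication; all numbers invented.
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

severity = rng.normal(0, 1, n)                    # RA disease severity
# Sicker patients are more likely to get methotrexate (the "indication").
p_mtx = 1 / (1 + np.exp(-2 * severity))
on_mtx = rng.random(n) < p_mtx

# Outcome depends on severity only; both drugs are identical by construction.
outcome = severity + rng.normal(0, 1, n)          # higher = worse

# Naive comparison: methotrexate looks worse purely because of who gets it.
print("Naive difference (MTX - HCQ):",
      outcome[on_mtx].mean() - outcome[~on_mtx].mean())

# Adjusting for measured severity (simple linear regression) shrinks the gap.
X = np.column_stack([np.ones(n), on_mtx.astype(float), severity])
coef, *_ = np.linalg.lstsq(X, outcome, rcond=None)
print("Severity-adjusted drug coefficient:", coef[1])
```

If an adjustment like this shrinks the observed difference a lot, that supports the "patient mix, not drug" reading; if the difference survives, a real drug effect becomes more plausible, though unmeasured aspects of severity can still lurk.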
Yes, my wife is an emergency medicine doctor. She frequently ends up wearing her mask in the car and other places where it’s unnecessary until I remind her she’s wearing it.
Maybe. But this is according to a paraphrase of unverified (possibly not even American) intelligence. Both the document's authority and origin are disputed, and the WSJ can't even say which department or agency wrote this. This does not seem like any sort of official US intelligence report.
One thing is fantastically hard to ignore: the SARS2 phylogeny is rooted at a most recent common ancestor in October or November. This is as clear as day, and no evidence has ever surfaced to nudge that estimate, to indicate the virus was present prior to those dates, or to show there were existing proto-SARS2 strains. Those would have continued to circulate, and something would have been found had they existed. (N.b. there _has_ been some now-discredited science arguing it was in Europe before those dates.) The correspondence among the virus phylogeny, this report, and previous indications of a shutdown at the WIV in early to mid October 2019 is just an awful lot to say maybe to. So, what's the explanation on the other side of your "maybe"?
You're asking me to comment on the entire theory, I think. I'm only commenting on the value of an unseen intelligence report of unknown source. None of what you've said increases the validity of this report.
Everyone commenting on this post is acting immediately as if this report is 100% true, but the reality is that we know very little about the report.
Couldn't all these same points from that Twitter thread be leveled against the New York Times when they wrote about Trump's tax returns and financials? It seems to me that an inconsistent bar is being used in terms of transparency and access to the primary source. I'm not saying it's wrong to be skeptical - in fact I think it's appropriate - it's just an interesting observation on how ready we are (or aren't) to accept allegations based on the story. Ultimately what the lab leak hypothesis deserves is a trustworthy, transparent, and independent investigation into the WIV, but we may be too late in demanding that now, 1.5 years later.
This isn't hard to check. Yes, basically all research on RaTG13 did appear after the Covid pandemic, as you would naturally expect, since scientists only became interested in it because of the similarity. Nobody was interested in RaTG13 previously.
However, the evidence is pretty clear that this virus was found and published on earlier. It was originally described as "BtCoV/4991", and as far as I can find it was described and published as far back as 2016. (The name was changed to RaTG13 to reflect that it was collected in 2013 in Tongguan.)
This is absolutely wrong. Labs in all the largest cities in the US simply stopped accepting samples for anything other than COVID. Source: my wife is an RN working as a triage nurse and interacting daily with the largest testing sites in NY, CA, NJ, IL, etc.
> And the decline has not been because of a lack of testing. Since late September, 1.3 million specimens have been tested for influenza, more than the average of about one million in the same period in recent years.
My wife is also an ER doctor and agrees that there were times when they didn't perform many (or any) flu tests, mostly in the spring of 2020 and after none of the flu tests had been returning positive for a while. But this past winter people have definitely been tested for the flu. Rapid influenza tests are also usually performed first, with samples only sent to labs when confirmatory testing is desired.
Hmmm, interesting. I wonder what the source of this 1.3 million flu tests figure is. It makes no sense. Why would doctors send off flu labs at such a higher rate in a year when they're not seeing any flu?
That would have nothing to do with HIPAA. HIPAA generally only covers patient information held by health providers, payers, etc.; it only applies to a specific set of "covered entities". It's a myth that HIPAA applies to health information generally. There's even dispute about whether HIPAA applies to doctors and medical services that don't accept insurance ( https://www.lebauerconsulting.com/hipaa-is-my-cash-practice-... )
(although some states have stricter health privacy laws)
It's true that US doctors are paid more (although the gap isn't as big as it used to be). However, pay for US doctors makes up a fairly small portion of overall US medical expenditures (less than 10%). So, you could ask every doctor to work for free and not significantly change costs.
>pay for US doctors makes up a fairly small portion of overall US medical expenditures (less than 10%)
That is likely too low.
The Centers for Medicare and Medicaid Services provides a National Health Expenditure estimate annually. [1]
Physician and clinical services represented $772 billion out of about $3.8 trillion, so more like 20%.
Hospital services are the other big one: about $1.2 trillion.
US physicians are paid terrifically well relative to their counterparts almost anywhere else; this is especially true for specialists.
In fact, physicians represent about 15% - 16% of the top 1% of income earners in the US. See table 2 from this paper: https://web.williams.edu/Economics/wp/BakijaColeHeimJobsInco...
which was written using tax return data, not, e.g., self-reported income data.
"Physician and clinical services" includes far more than pay for US doctors -- it's the entire costs for "services provided in establishments operated by Doctors of Medicine (M.D.) and Doctors of Osteopathy (D.O.), outpatient care centers, plus the portion of medical laboratories services that are billed independently by the laboratories". Doctors do not get even close to all of that money paid to them.
> In fact, physicians represent about 15% - 16% of the top 1% of income earners in the US.
This may be accurate (I'm unsure), but it doesn't change the point: if US doctors were paid the same as their European counterparts, it would not make a truly significant change in overall US medical spending. This would be true even if doctor pay did make up 20% of medical expenditures, as you assert above.
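A back-of-the-envelope version of that claim, using the figures cited in this thread plus some loudly flagged assumptions (the physician-pay shares and the size of the pay cut below are illustrative, not measured values):

```python
# Rough arithmetic: how much would total US health spending fall if
# physician pay were cut toward European levels? Shares and cut sizes
# are assumptions for illustration only.
total_spending = 3.8e12          # ~$3.8T national health expenditure (CMS)

for pay_share in (0.08, 0.20):   # the <10% claim vs. the 20% upper bound above
    for pay_cut in (0.3, 0.5):   # assume European pay is ~30-50% lower
        savings = total_spending * pay_share * pay_cut
        print(f"pay share {pay_share:.0%}, cut {pay_cut:.0%}: "
              f"saves ${savings / 1e9:.0f}B, i.e. {pay_share * pay_cut:.1%} "
              f"of total spending")
```

Under the sub-10% share the savings come to a few percent of total spending; even the most generous combination of assumptions only approaches 10%.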
There’s a similar (but smaller) differential for other medical professionals too. But more generally, when I’ve done high-level comparisons of medical spending between the US and Western European countries, it seems like every single cost element is more or less proportionately higher in the US. It seems like basically everyone is spending money in roughly the same proportions, including on things like doctors’ salaries - everything is just scaled up by ~40% to ~100% in the US, depending on which country you compare it to.
And we also consume more frequently in the US, so cost is higher but so is the rate of consumption, particularly of services and products that make us feel like we have more mastery over outcomes but in fact do not result in better outcomes on the whole.