The medical industry is by and large a Godless entity, especially in modern times. The vast majority of doctors out there are like horses with blinders on. They know nothing about the healing properties of food and nutrition, and barely even consider it in their analysis. The only thing they know is prescription drugs. I can't think of another industry that has damaged the United States more than the medical and pharmaceutical industries.
And that is taking into consideration the warmongering politicians. Even they don't stand a chance against the mass murder, addiction, and mental health destruction that "doctors" cause every year.
And the left went from "health care is a right!" to "health care is a privilege of the vAcCinAtEd."
"The end won't be for everyone" makes more and more sense daily.
Eastern medicine is finally getting some attention. It's always pills this and pills that with Western medicine.
red pills are cool tho