Are doctors and health care workers going to become the world's new most hated profession? Thirty years ago, lawyers were the butt of everyone's jokes. As time went on, we moved into an era where bankers flipped the script and actually became more hated than lawyers.
I am starting to think that for our children, health care workers are going to become the new profession to hate. They are really bringing it on themselves by refusing to stand up to the conglomerates who are ruining the profession. How can you honestly respect a group of people who care more about keeping their paychecks than about the patients they are killing?
To me, this could be the most lasting and most detrimental social change to come out of this war. We really need to be able to trust our health care workers. I am saddened by the fact that, by and large, I no longer do.
BTW, the answer to the title, in case you've never heard the joke, is "A good start!"
I'm already there with doctors. Been there for over a decade now. I'm done with them. We have more power over our health than we realize. Doctors are quacks and foot soldiers for big pharma. They don't even know what it means to heal a person anymore. Btw, MEDICAL INTERVENTION is the number 1 killer in America. When you combine hospital errors and prescription drugs, the Western medical industry has been mass murdering people for a long time. Also, every time you hear cancer cited as the number 2 or 3 killer, keep in mind that for the vast majority of those cases, it wasn't the cancer that killed them; it was the chemo. So add those totals to the ones above and you'll realize the riskiest thing you can do is trust a doctor with your life.