Have to imagine a lot of things that will come to light will include how incestuous the relationship is between Big Pharma and Western medicine, and it will force a total overhaul of healthcare.
Have to also imagine that trust in vaccines will plummet to an all-time low and that more and more parents will opt out of putting their children on vaccine schedules, flu shots will also be a thing of the past, etc.
Anything else you think might drastically change or stay the same?
I believe the business will be all but closed. We'll go back to natural remedies and home care. Hospitals will be emergency only. Maybe urgent care type of stuff. Nobody will go back to doctors the way they do now. I don't even think it'll be an option, because let's face it, without big time dollars rolling in, there aren't going to be medical practices on every corner anymore.
Came here to say one word - Bankrupt.