Have to imagine a lot of what comes to light will include just how incestuous the relationship between Big Pharma and Western medicine is, and that it will force a total overhaul of healthcare.
Have to also imagine that trust in vaccines will plummet to an all-time low and that more and more parents will opt out of putting their children on vaccine schedules; flu shots will also become a thing of the past, etc.
Anything else you think might drastically change or stay the same?
"Physicians", if they exist, will become a technical position and lose the status and income we've afforded them all these years. At least we can hope.
The death grip of the FDA and the like on the field of medicine will disappear, and "peer review" will become much less important (as it's become a good ol' boy system anyway).