Experts said that such fabrications are problematic because Whisper is being used in a slew of industries worldwide to translate and transcribe interviews, generate text in popular consumer technologies and create subtitles for videos.
More concerning, they said, is a rush by medical centers to utilize Whisper-based tools to transcribe patients’ consultations with doctors, despite OpenAI’s warnings that the tool should not be used in “high-risk domains.”
The full extent of the problem is difficult to discern, but researchers and engineers said they frequently have come across Whisper’s hallucinations in their work. A University of Michigan researcher conducting a study of public meetings, for example, said he found hallucinations in 8 out of every 10 audio transcriptions he inspected, before he started trying to improve the model.
Well, let's be honest:
The humans who write those medical files, enter notes in them, and interpret situations involving your illness and treatment ALSO "make up chunks of text or even entire sentences."
And you, the Profit Unit, have absolutely NO SAY in the narrative, which becomes an authoritative story about your existence. Forever. And it cannot be changed.
"I, Profit Unit."
LOL! -- and so very true.