Experts said that such fabrications are problematic because Whisper is being used in a slew of industries worldwide to translate and transcribe interviews, generate text in popular consumer technologies and create subtitles for videos.
More concerning, they said, is a rush by medical centers to utilize Whisper-based tools to transcribe patients' consultations with doctors, despite OpenAI's warnings that the tool should not be used in "high-risk domains."
The full extent of the problem is difficult to discern, but researchers and engineers said they frequently have come across Whisper’s hallucinations in their work. A University of Michigan researcher conducting a study of public meetings, for example, said he found hallucinations in 8 out of every 10 audio transcriptions he inspected, before he started trying to improve the model.
(more)
Archived article: https://files.catbox.moe/2r66ae.pdf
I was looking for a restaurant which I knew was near a VA (Veterans Administration) Hospital. My phone map finder kept telling me where to turn to get to the "Virginia" Hospital (in North Carolina). lol
...everybody was thinking AI would grow too advanced, and destroy humanity...
...but no... we couldn't even wait that long. we're just gonna put our lives in the hands of half-baked beta versions and assume it will all work perfectly.
one time I asked an AI how to bus across my city.
the AI responded: 'Oh don't worry about the bus, I'll just come pick you up!'
Well, let's be honest:
The humans who write those medical files/enter notes in them/interpret situations involving your illness and treatment ALSO "make up chunks of text or even entire sentences."
And you, the Profit Unit, have absolutely NO SAY in the narrative, which becomes an authoritative story about your existence. Forever. And cannot be changed.
"I, Profit Unit."
LOL! -- and so very true.
ML transcription is great under ideal conditions, but with low-quality (real-world) input data it tends to hallucinate.
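For anyone who wants to see this for themselves: here's a minimal sketch using the open-source openai-whisper Python package (the audio file name and the thresholds are my own, not anything official). Each transcribed segment comes back with an avg_logprob and a no_speech_prob, and flagging segments where the model thinks there's probably no speech yet still emitted text is a rough heuristic for spotting likely hallucinations on noisy input. It's a sanity check, not a fix.

    import whisper

    # Load a small model and transcribe a (hypothetical) noisy recording.
    model = whisper.load_model("base")
    result = model.transcribe("meeting_audio.wav")

    for seg in result["segments"]:
        # avg_logprob: average decoder log-probability for the segment's tokens.
        # no_speech_prob: model's estimate that the segment contains no speech.
        # Thresholds below mirror whisper's own defaults (0.6 and -1.0), but
        # treat them as tunable assumptions, not a guarantee.
        suspicious = seg["no_speech_prob"] > 0.6 and seg["avg_logprob"] < -1.0
        flag = "SUSPECT" if suspicious else "ok"
        print(f"[{flag}] {seg['start']:7.2f}-{seg['end']:7.2f} {seg['text']}")

In my experience the "suspect" segments cluster around silence, crosstalk, and background noise, which lines up with the low-quality-input point above.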