
In my experience using AI a lot more recently, hallucinations are a non-issue if you are simply using it to transform data. AI hallucinates when it's asked to retrieve facts from its own "memory," not when it's given a task to perform along with all the necessary input.


Whisper hallucinates - or is just plain incorrect - fairly often. I have been transcribing old TV and radio programs just to get a real feeling for how inaccurate it is. For example, the TV show JAG's main character is Lieutenant Harmon Rabb, Jr. I prompt: "You are transcribing a show called JAG - Judge Advocate General - whose main characters are named Lieutenant Harmon Rabb jr, known as Harm or Rabb, [...]" Maybe 1 time out of 20 it will actually put "Rabb" or "Harm".
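One workaround when the initial prompt gets ignored is to post-process the transcript with a lookup of known names. A minimal sketch - the mis-transcription variants here are illustrative guesses, not Whisper's actual output:

```python
import re

# Map likely mis-transcriptions to the correct character names.
# These variants are assumptions for illustration; build the real
# table from errors you actually observe in your transcripts.
CORRECTIONS = {
    r"\bHarmon Rob\b": "Harmon Rabb",
    r"\bHarm and Rab\b": "Harmon Rabb",
    r"\bRab\b": "Rabb",
}

def fix_names(transcript: str) -> str:
    """Apply each regex substitution in order to repair known names."""
    for pattern, replacement in CORRECTIONS.items():
        transcript = re.sub(pattern, replacement, transcript)
    return transcript

print(fix_names("Lieutenant Harmon Rob reporting for duty."))
# Lieutenant Harmon Rabb reporting for duty.
```

It's crude, but for a show with a small, fixed cast a regex pass like this can catch the bulk of the recurring name errors.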

Even better is stuff like old call-in radio shows: the callers are perfectly understandable, but Whisper has lots of issues with them.

If there's something better than Whisper, I'd love to try it out.



