Hacker News

it's almost like AI just repeats the data it's fed, even incorrect data, without any real intelligence to determine whether the data is correct... /s

It's not simply garbage in, garbage out. There is no logic to verify or analyze the data; you are simply told what is popular in the data.



AI doesn't "just" repeat data. You can feed an LLM 100% fact-checked data and it'll still hallucinate.

It's a core problem with generative AI and it can't be solved with better data.


All of a sudden the saying "eat your own dog food" takes a twist and is no longer fun.


Unfortunately, that is also a sizable portion of the human population. AI definitely does it cheaper and at larger scale though!


I've definitely met a lot of people who fail the GPT test.



