
That tendency to hallucinate that you so conveniently downplay is a major problem. I'll take reading the reference manual myself over sifting through the output of a bullshit generator, all day.


I'm not trying to downplay it, it absolutely is a major problem.

But it's a very different problem from "outsourcing your brain". In fact, the tendency to hallucinate is the very reason you still need to use your brain when using an LLM.

As I said, it's no different from a web search or a Wikipedia read: it's one input, and you still have to think for yourself.




