Two things:

1) LLM inference does not “teach” the model anything. The weights are frozen during a forward pass; learning only happens during training or fine-tuning (see the sketch at the end of this comment).

2) I don’t think you’re using “gaslighting” correctly here. It is not synonymous with lying.

My dictionary defines gaslighting as “manipulating someone using psychological methods, to make them question their own sanity or powers of reasoning”. I see none of that in this thread.
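To make point 1 concrete, here is a minimal PyTorch sketch. The toy nn.Linear is purely illustrative, standing in for an LLM; the point is that a plain forward pass never updates the weights, since that only happens via backprop plus an optimizer step during training:

    import torch
    import torch.nn as nn

    # Toy stand-in for an LLM; the same holds for any nn.Module.
    model = nn.Linear(8, 8)
    before = {k: v.clone() for k, v in model.state_dict().items()}

    # Inference is just a forward pass: no loss, no backward(), no optimizer step.
    with torch.no_grad():
        _ = model(torch.randn(1, 8))

    # Every parameter is bit-for-bit identical afterwards.
    assert all(torch.equal(v, model.state_dict()[k]) for k, v in before.items())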

I don’t get your point here.


