1) LLM inference does not “teach” the model anything.
2) I don’t think you’re using “gaslighting” correctly here. It is not synonymous with lying.
My dictionary defines gaslighting as “manipulating someone using psychological methods, to make them question their own sanity or powers of reasoning”. I see none of that in this thread.
I don’t get your point here.