@olihawkins@mstdn.social That is, in general, a person who is hallucinating does not realize they are hallucinating. Just as, in general, a person dreaming does not realize they are dreaming. “In general” used advisedly – there are certainly clear exceptions in both cases.
@vruba @olihawkins It's technically functioning. The point is an important one — and it's one that I think a lot of people are coming to realize, by different means and in different contexts.
@emma @olihawkins@mstdn.social Agreed that lack of insight into hallucination is not essential: thus my pains over “in general”.
I’m saying something weaker, which is that being wrong and not knowing it also happens to people sometimes, and in ways that are more LLM-like than some of the fiercer LLM critics give credit for. (The range of valid criticisms of things people say about LLMs remains vast, to be clear.)
@vruba @emma @olihawkins I think “hallucination” bothers me because it implies a false perception of reality corrupting your model of the world. LLMs don’t perceive or model reality, only language (and thus I guess I agree with Oli).
@vruba @olihawkins But lack of self-awareness / not recognizing that one is hallucinating isn't an essential part of the definition of "hallucinating." It's tricky because the original post discusses the LLM's "perspective," which seems to suggest a consciousness that can indeed be self-aware, but that's not what the author meant. There is no real "perspective," but there is a mechanical epistemology that is always "correct" insofar as... (1/2)