• zarkanian@sh.itjust.works · 10 hours ago

    The interface makes it appear that the AI is sapient. You talk to it like a human being, and it responds like a human being. Like you said, it might be impossible to avoid ascribing things like intentionality to it, since it’s so good at imitating people.

    It may very well be a stepping-stone to AGI. It may not. Nobody knows. So, of course we shouldn’t assume that it is.

    I don’t think that “hallucinate” is a good term regardless. Not because it makes AI appear sapient, but because it’s inaccurate whether the AI is sapient or not.

    • MentalEdge@sopuli.xyz · edit-2 · 9 hours ago

      Like you said, it might be impossible to avoid ascribing things like intentionality to it

      That’s not what I meant. When you say “it makes stuff up” you are describing how the model statistically predicts the expected output.

      You know that. I know that.

      That’s the asterisk: the more in-depth explanation that a lot of people won’t bother getting far enough to learn about. Someone who doesn’t read that far into it can read that same phrase and assume we’re discussing what type of personality LLMs exhibit, that they are “liars”. But they’d be wrong. Neither of us is attributing intention to it or discussing what kind of “person” it is; in reality we’re referring to the fact that it’s “just” a really complex probability engine that can’t “know” anything.

      No matter what word we use, if it’s pre-existing, it comes with pre-existing meanings that are kinda right, but not quite, requiring that everyone in the discussion know things that won’t be explained every time the term or phrase is used.

      The language isn’t “inaccurate” between you and me because you and I know the technical definition, and therefore what aspect of LLMs is being discussed.

      Terminology that is “accurate” without this context does not and cannot exist, short of coming up with completely new words.

      • zarkanian@sh.itjust.works · edit-2 · 9 hours ago

        You could say “the model’s output was inaccurate” or something like that, but it would be much more stilted.