• fodor@lemmy.zip · 17 hours ago

    They are errors, not hallucinations. Use the right words, and then you can talk about the error rate and the acceptable error rate, the same way we do for everything else.

    • DarthFreyr@lemmy.world · 10 hours ago

      An “error” could be something like getting the grammar wrong, using the wrong definition when interpreting, or an unsanitized input injection. When we’re talking about an LLM trying to convince the user of completely fabricated information, “hallucination” conveys that idea much more precisely, and IMO differentiating the phenomenon from a regular mis-coded software bug is significant.

    • boaratio@lemmy.world · 14 hours ago

      But calling it an error implies that it can be fixed. I’d call it a fundamental design flaw.