• ThirdConsul@lemmy.ml
    18 hours ago

    Then I’m confused about what your point is on the Halting Problem vis-à-vis hallucinations being un-mitigable qualities of LLMs. Did I misunderstand? Was your proposed solution to “return undecided” (somehow magically bypassing the Halting Problem)?

    • skisnow@lemmy.ca
      11 hours ago

      First, there’s no “somehow magically” about it: the entire logic of the halting problem’s proof relies on being able to set up a contradiction. I’ll agree that returning “undecidable” doesn’t solve the problem as stated, because the problem as stated only allows two responses.
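      The contradiction at the heart of that proof can be sketched in a few lines of Python. (The names `halts` and `paradox` are my own illustration, not anything from the thread; `halts` is the hypothetical total decider the proof assumes for contradiction.)

```python
def halts(f, x):
    # Suppose, for contradiction, this were a total decider that
    # returns True iff f(x) halts. No such function can exist; this
    # stub just marks the assumption.
    raise NotImplementedError("a total halting decider is impossible")

def paradox(f):
    # Do the opposite of whatever halts() predicts about f(f).
    if halts(f, f):
        while True:       # predicted to halt, so loop forever
            pass
    return "halted"       # predicted to loop, so halt immediately

# Feeding paradox to itself forces the contradiction:
#   if halts(paradox, paradox) were True,  paradox(paradox) loops forever;
#   if halts(paradox, paradox) were False, paradox(paradox) halts.
# Either answer is wrong, so a two-valued total halts() cannot exist.
```

      Note the proof only rules out a decider restricted to two answers; a third “undecidable” response escapes this particular diagonalization, which is why the problem “as stated” matters.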

      My wider point is that the halting problem as stated is a purely academic one, unlikely to ever cause a problem in any real-world scenario. Indeed, the ability to say “I don’t know” to unsolvable questions is a hot topic of ongoing LLM research.