• GoodEye8@lemm.ee · 10 hours ago

    I would imagine they'd be stupid if they did use AI. I've seen people use AI to "write" technical documentation that I then had to review. That shit goes straight into the bin, because the time I spend fixing all the AI nonsense is about the same amount of time it would take me to write the document myself. It's gotten to the point where I straight up reject all AI-generated documentation, because I know fixing it is a waste of time.

    I imagine legal documents have to be at least as precise as technical documents, so if they're actually checking the output, I seriously doubt they're saving any time or money by using AI.

    • Excrubulent@slrpnk.net · 7 hours ago

      And any time I see anyone advocating this crap, it's always because it gets the job done "faster". And like, the rule is "fast, cheap, good: pick two", and this doesn't break that rule.

      Yeah, they get it done super fast, and super shitty. I've yet to see anyone explain how an LLM gets the job done better, not even the most rabid apologists.

      LLMs have zero fidelity, and information without fidelity is just noise. They are not good at doing information work. In fact, I don't see how you get information with fidelity without a person in the loop; on a fundamental, philosophical level I don't think it's possible. Fidelity requires truth, which requires meaning, and I don't think you get a machine that understands meaning without AGI.