• James R Kirk@startrek.website
    15 hours ago

    That’s all LLM software is: it has no connection to reality, and it’s bullshitting 100% of the time. The fact that it is mostly correct, or always confident, does not imply that it understands anything it’s saying.