• jol@discuss.tchncs.de
    8 hours ago

    I know, and I accept that. You can’t just tell an LLM not to hallucinate. I also wouldn’t trust that trust score at all. If there’s one thing LLMs are even worse at than accuracy, it’s maths.