• MiddleAgesModem@lemmy.world
    10 hours ago

    They think the LLM hallucination problem will be ironed out in a couple of years.

    That one is a tad more realistic than uploading human consciousness.

    • WraithGear@lemmy.world
      5 hours ago

Not unless they pivot away from the basic principles of LLMs, instead of trying to force a square peg into a round hole.