• petersr@lemmy.world · 4 hours ago

    I am not talking about the bubble. I am talking about AI being a threat to humanity on par with nuclear wipeout.

    • Saledovil@sh.itjust.works · 4 hours ago

      Well, we’re still at least one breakthrough away from AGI, and we don’t even know how things will go from there. It could be that humans are already near the maximum of what is possible intelligence-wise. As in, the smartest being possible is not that much smarter than the average human. In which case, AGI taking over the world would not be a given.

      Essentially, talking about the threat posed by ASI is like talking about the threat posed by Cthulhu: a hypothetical entity whose capabilities we simply made up.