• Perspectivist@feddit.uk · +19 / -5 · 1 day ago

    I genuinely don’t understand the people who are dismissing those sounding the alarm about AGI. That’s like mocking the people who warned against developing nuclear weapons when they were still just a theoretical concept. What are you even saying? “Go ahead with the Manhattan Project - I don’t care, because I in my infinite wisdom know you won’t succeed anyway”?

    Speculating about whether we can actually build such a system, or how long it might take, completely misses the point. The argument isn’t about feasibility - it’s that we shouldn’t even be trying. It’s too fucking dangerous. You can’t put that rabbit back in the hat.

    • XLE@piefed.social · +6 · 23 hours ago

      Sam Altman himself compared GPT-5 to the Manhattan Project.

      The only difference is it’s clearer to most (but definitely not all) people that he is promoting his product when he does it…

    • ErmahgherdDavid@lemmy.dbzer0.com · +8 / -1 · 1 day ago

      Here’s how I see it: we live in an attention economy where every initiative with a slew of celebrities attached to it is competing for eyeballs and buy-in. It adds to information fatigue and analysis paralysis. In a very real sense, if we are debating AGI we are not debating the other stuff. There are only so many hours in a day.

      If you take the position that AGI is basically not possible, or at least many decades away (I have a background in NLP/AI/LLMs and I take this view - not that it’s relevant in the broader context of my comment), then it makes sense to tell people to focus on solving more pressing issues, e.g. nascent fascism, climate collapse, late-stage capitalism, etc.

      • danzania@infosec.pub · +4 / -2 · 1 day ago

        I think this is called the “relative privation” fallacy – it is a false choice. The threat they’re concerned about is human extinction or dystopian lock-in. Even if the probability is low, this is worth discussing.

        • ErmahgherdDavid@lemmy.dbzer0.com · +9 / -1 · 1 day ago

          Relative privation is when someone dismisses or minimizes a problem simply because worse problems exist: “You can’t complain about X when Y exists.”

          I’m talking about the practical reality that you must prioritize among legitimate problems. If you’re marooned at sea in a sinking ship, you need to repair the hull before you try to fix the engines in order to get home.

          It’s perfectly valid to say “I can’t focus on everything, so I will focus on the things that provide the biggest and most tangible improvement to my situation first”. It’s fallacious to say “Because worse things exist, AGI concerns don’t matter.”

          • niartenyaw@midwest.social · +5 · 1 day ago

            And not only that: in your example of choosing to address the hull before the engine, the engine problem is at least a real, present one. When we take time to debate AGI, we are debating a hypothetical future problem over real, current problems that actually exist and aren’t getting enough attention to be resolved. And if we can’t address those, why do we think we’ll be able to figure out the problems of AGI?

            • Aceticon@lemmy.dbzer0.com · +1 · 7 hours ago

              To rephrase it as a short(ish) metaphor:

              • It would be like being marooned at sea in a sinking ship and choosing to address the risk of not having a good place to anchor when you reach the harbour, instead of repairing the hull.