• kbal@fedia.io · 38 points · 1 day ago

    Recording and analyzing all the real-time video and audio feeds of their surroundings that everyone is required to provide while using the Internet, to ensure that no children are present when they use social media.

  • chaosCruiser@futurology.today · 31 points · 1 day ago

    When an AI company goes bankrupt, its hardware will be sold to anyone interested in it. My guess is that MS and Amazon will be buying up a bunch of vacant datacenters within the next 10 years.

    That’s enterprise hardware, so it’s not really compatible with your consumer-grade gaming PC. If you’re interested in self-hosting your own cloud photos and local LLMs, you might want to look into those auctions.
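
    If you go that route, running a local model on repurposed hardware is fairly approachable. Here’s a minimal sketch, assuming Python with the llama-cpp-python package and a GGUF model file you’ve already downloaded (the model path is just a placeholder):

    ```python
    # Minimal local-LLM example using llama-cpp-python (pip install llama-cpp-python).
    # The model path below is a placeholder for whatever GGUF model you have on disk.
    from llama_cpp import Llama

    llm = Llama(model_path="./models/example-7b.Q4_K_M.gguf", n_ctx=2048)

    result = llm(
        "Q: Why buy used datacenter GPUs at auction? A:",
        max_tokens=128,
        stop=["Q:"],
    )
    print(result["choices"][0]["text"])
    ```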

    • yeeght@lemmy.world · 11 points · 22 hours ago

      This, but I also think the fallout of the AI bubble popping will be different from what people are envisioning. After the dot-com bubble collapsed, a lot of the infrastructure was sold off as surplus and repurposed, but some of it was left unused and just sat around for a while.

      It’s not completely equivalent, but imo a great example of this is fiber optics. Early on, companies and governments invested in fiber-optic networks in their local areas to get in on the bubble hype, but after the bubble burst, most of that fiber went unused or underutilized until relatively recently, when Google bought up various fiber networks around the country for Google Fiber. That’s a big part of why, over the last 10 or so years, certain states and cities got access to fiber connections before others.

      Obviously this isn’t exactly the same, but I’ll be curious to see what the “fiber” of the AI bubble turns out to be (if anything). My guess would be changes to, and hopefully improvements in, our energy infrastructure, but time will tell.

  • Emilie Easie@lemmynsfw.com · 22 points · 24 hours ago

    I listened to a YouTuber suggest that all these abandoned data centers would be great for the government to expand its mass surveillance programs, so probably that.

  • jordanlund@lemmy.world · 11 points · 22 hours ago

    The last boom/bust cycle resulted in a lot of high-tech gear getting sold off at bargain-basement prices.

    Good for new businesses, but bad for companies like Cisco trying to sell new gear into a market flooded with cheap used equipment.

  • ElectricFire@sh.itjust.works · 12 points · 23 hours ago

    I think hardware-as-a-service will be their next thing: raise the cost of parts so people buy a cheap subscription, then increase the subscription price year by year.

    • cymbal_king@lemmy.world · 7 points · 22 hours ago

      Agree. I think MS and Google selling more cheap “cloud laptops” could totally be a thing. The personal device would mostly be a screen and bare-bones components.

  • moondoggie@lemmy.world · 6 points · 20 hours ago

    Companies will never admit it. They’ll drive this shit right into the ground and keep digging.

  • SoftestSapphic@lemmy.world · 7 points · 22 hours ago

    Lol, when the AI bubble pops the hardware will most likely be destroyed to maintain artificially high hardware prices.

    At least, that’s why China ending crypto mining didn’t drastically reduce the price of graphics cards.

  • Perspectivist@feddit.uk · 11 points · 23 hours ago

    ChatGPT alone has 800 million weekly users, the vast majority of whom are normal people, not companies. The demand is there even though it hasn’t increased company profit margins the way people expected. I don’t see this computing infrastructure sitting idle anytime soon.

    • Varyk@sh.itjust.works (OP) · 17 points · 23 hours ago

      ChatGPT is constantly losing money; public, surface-level interest won’t matter much when the capital runs out and they’re still accruing significant debt without anywhere near enough revenue to cover it.

        • Varyk@sh.itjust.works (OP) · 2 points · 11 hours ago

          Nope, you’ll certainly need a source to back that speculation up.

          Half a billion people are “using” AI, and the total LLM market is only worth a few billion dollars a year. On average, users may be willing to pay up to 50 cents a month for inaccurate word association.
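
          Rough back-of-the-envelope math, assuming roughly $3 billion a year in consumer LLM spending across roughly 500 million users (both are just the ballpark figures above, not hard data):

          ```python
          # Back-of-the-envelope: implied spend per user, using the ballpark figures above.
          annual_llm_spending_usd = 3_000_000_000   # "a few billion" per year (assumed)
          users = 500_000_000                       # "half a billion people"

          per_user_per_year = annual_llm_spending_usd / users   # ~$6.00
          per_user_per_month = per_user_per_year / 12           # ~$0.50
          print(f"~${per_user_per_month:.2f} per user per month")
          ```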

          Not even a drop in the buckets these companies need to fill, given everything they’re spending just on advertising, not to mention infrastructure, utility, and upgrade costs.

          People are statistically not willing to pay enough to sustain LLMs, even if we assume the rosy predictions of a 20x larger LLM market within a decade.

          Devil’s advocate: increased AI cash flow could occur if people don’t realize their AI “search results” are paid advertisements, and considering longstanding obliviousness to targeted advertising and the recent abolishment of US consumer rights… it could happen.

            • Varyk@sh.itjust.works (OP) · 1 point · 11 hours ago

              Yes: the source and explanatory paragraph included above, in the same comment you’re referencing.

              Would you care to provide any evidence for your speculation that people are willing to pay enough for AI to sustain its costs?

              • Perspectivist@feddit.uk · 1 point · 11 hours ago

                > Would you care to provide any evidence for your speculation that people are willing to pay enough for AI to sustain its costs?

                ChatGPT alone has 800 million weekly users, and their total revenue in 2025 was 13 billion, with 70% coming from normal users. That’s a drop in the bucket though, considering they’ve committed to investing a trillion dollars in new computing capacity over the next 10 years.
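
                Taking those quoted figures at face value (not independently verified), the comparison looks roughly like this:

                ```python
                # Rough comparison using only the figures quoted above (not verified).
                weekly_users = 800_000_000
                revenue_2025_usd = 13_000_000_000
                consumer_share = 0.70
                committed_capex_usd = 1_000_000_000_000   # "a trillion dollars"
                years = 10

                consumer_revenue = revenue_2025_usd * consumer_share        # ~$9.1B
                per_user_per_month = consumer_revenue / weekly_users / 12   # ~$0.95
                capex_per_year = committed_capex_usd / years                # ~$100B/year
                print(f"~${per_user_per_month:.2f}/user/month vs ~${capex_per_year / 1e9:.0f}B/year of committed spend")
                ```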

                • Varyk@sh.itjust.works (OP) · 1 point · 10 hours ago

                  This is why you should provide a source; your numbers and associated assumptions are incorrect:

                  ChatGPT has estimated revenue of 1.3 billion, not 13 billion, and neither of those is remotely significant as a revenue stream relative to cost.

                  That’s the thrust of my opening paragraph, and you then appear to have taken up my drop-in-the-bucket analogy, so I guess we’re on the same page now.

      • FaceDeer@fedia.io · 5 points · 21 hours ago

        A major problem faced by first-mover companies like OpenAI is that they spend an enormous amount of money on basic research, initial marketing, and hardware purchases just to set up in the first place. Those expenses become debts that have to be paid off by the business later. If they were to go bankrupt and sell off ChatGPT to some other company for pennies on the dollar, that new owner would be in a much better position to be profitable.

        There is clearly an enormous demand for AI services, despite all the “nobody wants this” griping you may hear in social media bubbles. That demand’s not going to disappear, and the AIs themselves won’t disappear. It’s just a matter of finding the right price to balance things out.

    • ch00f@lemmy.world · 7 points · 23 hours ago

      I think OP is talking about all of the future data centers that are allegedly being built, even though nobody seems to know where. Nvidia has agreed to pay OpenAI $10B per gigawatt of datacenter capacity for 10 gigawatts of buildout over the next few years.

      Unlikely that will fully materialize, but that’s the current outlook.

    • Melobol@lemmy.ml · 3 points · 22 hours ago

      The free plan of ChatGPT is more than enough for most people. And when they decide to start charging for it, probably 30% of free users will switch to a different (maybe even locally run) AI.