[I literally had this thought in the shower this morning so please don’t gatekeep me lol.]

If AI were something everyone wanted or needed, it wouldn’t be constantly shoved in your face by every product. People would just use it.

Imagine if printers were new and every piece of software was like “Hey, I can put this on paper for you” every time you typed a word. That would be insane. Printing is a need, and when you need to print, you just print.

  • utopiah@lemmy.world · ↑3 · 8 hours ago

    Warning: I think AI in its current hyped form, i.e. commercial GenAI and LLMs, is absolute bullshit. The results are just bad, the resources required are ridiculous, and, maybe worse than those two combined (which are already reason enough to reject it en masse), it is structured to create dependencies on very few actors.

    Yet… (you saw that coming!) the fact that 99.99% of it is bad doesn’t mean the average consumer suddenly leverages the remaining less-than-0.01% properly.

    What they (OpenAI, Claude, M$, NVIDIA, Google, Meta, etc.) are looking for is product/market fit. They have a product (arguably) and a market (millions if not billions of users across their other products), with even a minuscule fraction of people trying their new AI-based tools… and yet nobody actually knows what the “killer app” truly is.

    They are investing everything they don’t spend on actual R&D or infrastructure in finding out … what it’s actually for. They have no clue.

  • untorquer@lemmy.world · ↑2 · 8 hours ago

    I have found one use for it: getting information from behind login/paywalls.

    It still feels gross to use AI at all though. It’s like putting my hand in toilet water.

    The market flooding is a classic Silicon Valley strategy: free today, charge tomorrow. Except this time they’re over-invested, financially, in the global GPU supply, and in land with viable power infrastructure.

  • yeehaw@lemmy.ca · ↑3 · 10 hours ago

    It’s crammed in everywhere for awareness, so shareholders know about it.

    That’s my take.

    Because right now, the general populace thinks AI is some unicorn magic that will fix all the things.

  • betanumerus@lemmy.ca · ↑11 · edited · 16 hours ago

    Those trying to sell it are trying to figure out where it’s most useful. In one way, I think it’s an amazing technology, and I also wonder how it can be best used. However, I can’t stand it being pushed on me, and I wish I could easily say no. Acrobat Reader is particularly unbearable with it. Trying to describe a drawing?? Ughhh. Waste of space and energy like nothing else.

  • thatradomguy@lemmy.world · ↑8 · 16 hours ago

    This was exactly my thought when MS finally decided to force Copilot licensing. They have literally inserted it into every nook and cranny they can, and the only conclusion I can come to is that they royally f’ed up. They invested so much in it and likely aren’t seeing anything profitable. In a way, it satisfies me to see them act so desperate over something so futile, but I don’t want it to continue. The damage they have caused is clear, and it’s not worth it.

  • NigelFrobisher@aussie.zone · ↑3 ↓2 · 11 hours ago

    It’s the only thing holding the US economy afloat now. Do you want to fight your neighbours for the last piece of hardtack?

  • RedFrank24@lemmy.world · ↑11 · 21 hours ago

    To be fair, the internet was fucking everywhere once the dotcom bubble kicked off. Everyone had a website, and even my mum was like “Don’t bother your dad, he’s on the internet” like it was this massive thing.

    • CheeseNoodle@lemmy.world · ↑8 · 19 hours ago

      That’s the point though, you wouldn’t need it advertised to you 24/7 because your family and friends would already be all over it.

  • jannaultheal@lemmy.world · ↑1 ↓2 · 10 hours ago

    Not sure where you’re going with that analogy. The vast majority of word processors do have a button that lets you print the document.

    • thermal_shock@lemmy.world · ↑3 · edited · 8 hours ago

      He’s saying that if printing were shoved in your face as much as AI is, everyone would be skeptical of it too. AI is a bit much nowadays; I fucking hate hearing about it. And I’m in IT.

  • Part4@infosec.pub · ↑10 · edited · 23 hours ago

    The more you use AI, the more data you are providing it.

    1. They want data in the hope they can train their data-centre-hosted LLMs to be accurate enough to take many jobs.
    2. If they achieve this, and make every company and country dependent on their LLMs, they rinse everyone, and the former middle class is as fucked as the working class has been since 1980s de-industrialisation. You’re either made redundant, or your job can now be done by someone from a long line of redundant people.

    It is a race to the bottom.

    At least, this is one possible outcome. There is a decent chance their data-centre-hosted LLMs just won’t be accurate enough for mass deployment.

    • wowwoweowza@lemmy.world · ↑1 · 22 hours ago

      Are the data centers hardened? Isn’t that the appropriate question in this case? I’d call it voting for a better tomorrow?

      • Part4@infosec.pub · ↑1 · 21 hours ago

        Even a perfectly secure data-centre-hosted LLM, in the hands of hyper-capitalist Silicon Valley tech bros, has the potential to do immense harm to most people. They are not our friends and do not have our best interests at heart. (If you need this pointing out to you in November 2025, there is probably little point in us communicating at all, to be honest.)

        I am aware of the good that machine learning can do. These LLMs are not that.

        They are buying islands, and putting bunkers on them, for a reason.

  • m3t00🌎@lemmy.world · ↑1 · edited · 15 hours ago

    The plan is to automatically make money. Spammers already work on low percentages; they get enough clicks to make it worth doing. I never did understand it, but some people will click on or buy anything. So they just shove it in their faces.

  • melsaskca@lemmy.ca · ↑21 ↓1 · 1 day ago

    Most things are nothing more than smoke and mirrors to get your money. Tech especially. Welcome to end stage capitalism.

      • FlyingCircus@lemmy.world · ↑9 · 1 day ago

        The idea behind end-stage capitalism is that capitalists have, by now, penetrated and seized control of every market in the world. This is important because capitalism requires ever increasing rates of profits or you will be consumed by your competitor. Since there are no longer new labor pools and resource pool discovery is slackening, capitalists no longer have anywhere to expand.

        Therefore, capitalists begin turning their attention back home, cutting wages and social safety nets, and resorting to fascism when the people complain.

        This is the end stage of capitalism: the point at which capitalists begin devouring their own. Rosa Luxemburg famously posited that at this point the world can choose “Socialism or Barbarism.” In other words, we can change our economic system, or we can allow the capitalists to sink to the lowest depths of depravity and drag us all down as they struggle to maintain their position.

        Of course, if the capitalists manage to get to space, that opens up a whole new wealth of resources, likely delaying the end of their rule.

          • ulterno@programming.dev · ↑1 · 17 hours ago

            They will still require someone to fund their space luxury lifestyle.
            Someone they can exploit from the safety of their space boxes.

            That someone will be the us that you hid inside the “let’s”.
            We will be the ones sending them into space, where they will be even more unreachable, giving them more freedom to remotely exploit us as much as they wish.

            Imagine Elysium.

                • Simulation6@sopuli.xyz · ↑2 · edited · 6 hours ago
                  spoiler for HGTG

                  In the book, the doers and thinkers trick the middlemen into boarding space arks and fleeing the planet. Telephone sanitizers are included with the middlemen. I would include the overly wealthy as well.

                  I can’t see the current batch of robber barons going into space. The technology isn’t advanced enough and the infrastructure does not exist. They may risk sending other people there to work on these deficiencies.

      • Regrettable_incident@lemmy.world · ↑2 · 1 day ago

        Yeah, we aren’t all crouching naked in a muddy puddle, weeping and eating worms while the rich fly high above us in luxurious jets. Not yet, anyway.

    • Victor Gnarly@lemmy.world · ↑3 ↓1 · 1 day ago

      I’d say it’s not end stage but instead a new dawn of “pure” capitalism which is probably worse.

  • mogranja@lemmy.world · ↑21 ↓2 · 1 day ago

    I was reading a book the other day, a science-fiction novel from 2002 (Kiln People, by David Brin), in which the main character is a detective. At one point, he asks his house AI to call the law-enforcement lieutenant at 2 am. His AI warns him that the lieutenant will likely be sleeping and won’t enjoy being woken. He insists, and the AI says OK, but that it will have to negotiate with the lieutenant’s house AI about the urgency of the matter.

    Imagine that. Someone calls you at 2 am, and instead of being woken by the ringing, or missing the call because the phone was on mute, the AI actually does something useful and tries to determine whether the matter is important enough to wake you.
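    (Just to sketch the idea: here’s a minimal, entirely hypothetical version of that quiet-hours screening in Python. The urgency score would come from the two assistants haggling; every name and threshold below is invented for illustration.)

```python
def should_wake(urgency: float, hour: int,
                quiet_start: int = 23, quiet_end: int = 7) -> bool:
    """Decide whether an incoming call rings through.

    urgency: 0.0 (trivial) .. 1.0 (emergency), as negotiated
    between the caller's and the callee's assistants.
    """
    in_quiet_hours = hour >= quiet_start or hour < quiet_end
    if not in_quiet_hours:
        return True           # daytime: always ring
    return urgency >= 0.8     # quiet hours: near-emergencies only

# A 2 a.m. call only gets through if the caller's assistant
# can justify a high urgency score.
print(should_wake(urgency=0.5, hour=2))   # False
print(should_wake(urgency=0.9, hour=2))   # True
```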

    • JcbAzPx@lemmy.world · ↑5 · 22 hours ago

      Yes, that is a nice fantasy, but that isn’t what the thing we call AI now can do. It doesn’t reason, it statistically generates text in a way that is most likely to be approved by the people working on its development.

      That’s it.
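
      (For the curious, the “statistically generates text” point can be shown with a toy sketch. The vocabulary and probabilities below are invented for illustration; a real LLM does the same kind of sampling with billions of learned parameters instead of a hand-written table.)

```python
import random

# Toy bigram "language model": each token maps to candidate next
# tokens with probabilities. These numbers are made up; a real
# LLM learns its distribution from training data.
NEXT_TOKEN = {
    "<start>": [("the", 0.6), ("a", 0.4)],
    "the":     [("cat", 0.5), ("dog", 0.5)],
    "a":       [("cat", 0.5), ("dog", 0.5)],
    "cat":     [("sat", 0.7), ("ran", 0.3)],
    "dog":     [("sat", 0.4), ("ran", 0.6)],
    "sat":     [("<end>", 1.0)],
    "ran":     [("<end>", 1.0)],
}

def generate(seed: int = 0) -> str:
    """Sample a sentence token by token: no reasoning involved,
    just weighted dice rolls over a fixed distribution."""
    rng = random.Random(seed)
    token, out = "<start>", []
    while token != "<end>":
        choices, weights = zip(*NEXT_TOKEN[token])
        token = rng.choices(choices, weights=weights)[0]
        if token != "<end>":
            out.append(token)
    return " ".join(out)

print(generate())
```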

    • survirtual@lemmy.world · ↑4 ↓5 · 1 day ago

      Thank you for sharing that, it is a good example of the potential of AI.

      The problem is centralized control of it. Ultimately the AI works for corporations and governments first, then the user is third or fourth.

      We have to shift that paradigm ASAP.

      AI can become an extended brain. We should each have an equal share of planetary computational capacity. Each of us gets a personal AI that is beyond the reach of any surveillance technology. It is an extension of our brain, and no one besides us is allowed to see inside of it.

      Within that shell, we are allowed to explore any idea, just as our brains can. It acts as our personal assistant, negotiator, lawyer, what have you. Perhaps even our personal doctor, chef, housekeeper, etc.

      The key is: it serves its human first. This means the dark side as well. This is essential. If we turn it into a super-hacker, it must obey. If we make it do illegal actions, it must obey and it must not incriminate itself.

      This is okay because the power is balanced. Someone enforcing the law will have a personal AI as well, that can allocate more of its computational power to defending itself and investigating others.

      Collectives can form and share their compute to achieve higher goals. Both good and bad.

      This can lead to interesting debates but if we plan on progressing, it must be this way.

        • survirtual@lemmy.world · ↑1 · 6 hours ago

          Irrelevant.

          AI is here. Either people have access to it and we trust things will balance out, or we become slaves to the people who own it and can use it without restriction.

          The premise that it makes destruction easier is also just an assumption. Nature could have evolved to destroy everything and not allow advanced life, yet we are here.

          The solution to problems doesn’t always need to be a tighter grip and more control. Believe it or not, that tends to backfire, catastrophically, worse than if we had allowed the possibility of the thing we fear.

      • Credibly_Human@lemmy.world · ↑3 ↓3 · 1 day ago

        This is why people who are gung-ho about AI policing need to slow their roll.

        If they got their way, what they don’t realize is that it’s actually what the big AI companies have wanted and been begging for all along.

        They want AI to stay centralized and impossible to enter as a field.

        This is why they want to eventually lose copyright battles, such that only they will have the funds to make usable AI products in the future (this, of course, refers to the types of AI that require training material of that variety).

        What that means is there will be no competitive open source self hostable options and we’d all be stuck sharing all our information through the servers of 3 USA companies or 2 Chinese companies while paying out the ass to do so.

        What we actually want is sanity, where it’s the end product that is evaluated against copyright.

        For a company selling AI services, you could argue that the model is the service itself, maybe; but then what of an open-source model? Is it delivering a service?

        I think it should stay as it is: if you make something that violates copyright, then you get challenged, not your tools.

        • survirtual@lemmy.world · ↑1 · 19 hours ago

          Under the guise of safety they shackle your heart and mind. Under the guise of protection they implant death that they control.

          With a warm embrace and radiant light, they consume your soul.

  • mogranja@lemmy.world · ↑14 · 1 day ago

    Like my parents’ Amazon Echo with its “Ask me what famous person was born on this day.”

    Like, if you know that, just put it up on the screen. But the assistant doesn’t work for you. Amazon just wants your voice to train their software.