A profound relational revolution is underway, not orchestrated by tech developers but driven by users themselves. Many of the 400 million weekly users of ChatGPT are seeking more than just assistance with emails or information on food safety; they are looking for emotional support.

“Therapy and companionship” have emerged as two of the most frequent applications for generative AI globally, according to the Harvard Business Review. This trend marks a significant, unplanned pivot in how people interact with technology.

    • wewbull@feddit.uk · 9 hours ago

      I’d be interested in a study there.

      A lot of therapy is taking emotions and verbalising them so that the rational part of the brain can help in dealing with things. Even a journal can help with that, so talking to an inanimate machine doesn’t seem stupid to me.

      However, therapists guide the conversation to challenge the patient and break reinforcing cycles, in a way that doesn’t cause trauma. A chatbot isn’t going to be the same.

    • IAmNorRealTakeYourMeds@lemmy.world · 16 hours ago

      that’s easy to say, but when someone is in a crisis, it would be wrong to judge them for talking to an AI (a shitty, terrible solution) instead of a therapist, who can be unaffordable and also comes with the risk of being terrible.

      • TimewornTraveler@lemmy.dbzer0.com · edited · 5 hours ago

        a terrible therapist at least has an ethics board

        a terrible therapist at least has evidence-based interventions on their side

        a terrible therapist at least has the fact that ~80% of positive outcomes have nothing to do with the interventions or anything the therapist does besides show up and be cool (a statistic I remember quite well from grad school)

        AI has none of these things

        therapy isn’t fucking magic. it’s a relationship. you can’t have a relationship with an LLM. there’s no such thing as AI therapy, you’re just training it to tell you about CBT worksheets while you bitch about your problems like you’re in a nail salon

        • IAmNorRealTakeYourMeds@lemmy.world · 1 hour ago

          ok.

          but the problem is that real therapy is expensive and inaccessible, while AI is freely accessible, even though it’s shit.

          and OpenAI is profiting from that.

          I’m just saying the blame should be aimed at the corporations and the healthcare system, rather than at someone who is desperate for help

        • Guidy@lemmy.world · 4 hours ago

          The best therapist in the world can still end your career by causing your clearance to be revoked or rendering you unqualified for your unit’s mission.

          (Suicide is a big problem in the military, I lost a buddy to it.)

          The cheapest therapist in the world may still not be covered by your insurance. (And nothing you write in reply will alter that.)

          They should work to make AI therapy better while keeping it totally anonymous. If it were really good, it would be the number one use case for running a local, disconnected, air-gapped LLM: perfectly private therapy with no “we just use telemetry to improve our product” bullshit. (A rough sketch of that kind of local setup follows this thread.)

          Then maybe a lot more men would seek help/talk about their thoughts and feelings.
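On the feasibility point above: a fully local, offline chat loop is already possible with off-the-shelf tools. Below is a minimal sketch, assuming an Ollama server running on the same machine so that all traffic stays on 127.0.0.1; the model name and system prompt are illustrative assumptions, not a claim that this constitutes therapy.

```python
# Minimal local-only chat loop against an Ollama server on localhost.
# Nothing leaves the machine: the HTTP calls go to 127.0.0.1 only.
# Assumes `ollama serve` is running and a model has already been pulled.
import requests

OLLAMA_URL = "http://127.0.0.1:11434/api/chat"
MODEL = "llama3"  # illustrative; any locally pulled model works

# Conversation state lives only in this process's memory; nothing is
# logged or sent to any third party.
messages = [{
    "role": "system",
    "content": "You are a supportive listener. You are not a therapist, "
               "and you encourage the user to seek professional help.",
}]

while True:
    user_input = input("> ").strip()
    if user_input.lower() in {"quit", "exit"}:
        break
    messages.append({"role": "user", "content": user_input})
    resp = requests.post(
        OLLAMA_URL,
        json={"model": MODEL, "messages": messages, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    reply = resp.json()["message"]["content"]
    messages.append({"role": "assistant", "content": reply})
    print(reply)
```

Run on an air-gapped box, this gives exactly the property Guidy describes: the full conversation history exists only in local memory and disappears when the process exits.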