• FishFace@piefed.social · 1 point · 16 days ago

    LLMs work by picking the next word* as the most likely candidate given the training data and the context so far. Sometimes the model gets into a state where appending the picked word doesn’t meaningfully change its view of the context, so the most likely next word is the same one again. Then the same thing happens again, and around we go. There are fail-safe mechanisms that try to prevent this, but they don’t work perfectly.

    *Token
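
    A minimal sketch of that failure mode, assuming a made-up probability table and plain greedy decoding (every name and number below is invented for illustration; a real model scores tens of thousands of tokens against its whole context):

    ```python
    # Toy next-token table (hypothetical numbers, not from any real model).
    # Note that the most likely word after "or" is "or" again.
    NEXT_TOKEN_PROBS = {
        "commit": {"or": 0.6, ".": 0.4},
        "or": {"or": 0.7, "rollback": 0.3},
        "rollback": {".": 1.0},
        ".": {"<end>": 1.0},
    }

    def pick_next(token, penalized=(), penalty=0.3):
        """Greedy decoding: take the argmax, optionally down-weighting
        tokens already emitted (a crude repetition penalty)."""
        probs = dict(NEXT_TOKEN_PROBS[token])
        for t in penalized:
            if t in probs:
                probs[t] *= penalty
        return max(probs, key=probs.get)

    def generate(start, max_len=12, use_penalty=False):
        out = [start]
        while len(out) < max_len and out[-1] != "<end>":
            seen = set(out) if use_penalty else ()
            out.append(pick_next(out[-1], seen))
        return " ".join(out)

    print(generate("commit"))                    # commit or or or or or ...
    print(generate("commit", use_penalty=True))  # commit or rollback . <end>
    ```

    The second call is the kind of fail-safe meant above: down-weighting already-used tokens breaks the loop, but the penalty has to be tuned, and a sufficiently confident model can still out-score it.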

    • ideonek@piefed.social · 0 points · 16 days ago

      That was the answer I was looking for. So it’s similar to the “seahorse emoji” case, but this time at some point he just glitched: the most likely next word for this sentence is “or”, and after adding the “or” it’s also “or”, and after adding the next one it’s also “or”, and after an 11th one… you may just as we’ll commit, since that’s the same context as with 10.

      Thanks!
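
      One way to read “that’s the same context as with 10”: if the model only attended to a fixed window of recent tokens, then once the window held nothing but “or”, every step would see literally identical input, so a deterministic decoder would have to emit the same thing forever. A tiny sketch (the window size of 10 is an arbitrary number for illustration; real models see their whole context window plus positional information, so this is only the intuition in miniature):

      ```python
      WINDOW = 10  # hypothetical context length, chosen only for illustration

      def context(tokens):
          # The model only "sees" the last WINDOW tokens.
          return tuple(tokens[-WINDOW:])

      tokens = ["commit"] + ["or"] * 9
      # While "commit" is still in view, each step's context differs:
      assert context(tokens) != context(tokens + ["or"])

      # Once "commit" scrolls out, the context is frozen: same input, same output.
      tokens += ["or", "or"]
      assert context(tokens) == context(tokens + ["or"])
      ```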

        • ideonek@piefed.social · 2 points · edited · 16 days ago

          Chill, dude. It’s a grammatical/translation error, not an ideological declaration. It’s an especially common mistake if your native language has grammatical gender. Everything has a “gender” in mine: “spoon” is a “she”, for example, but I’m not proposing to one any time soon. Not every hill is worth nitpicking over.

          • atomicbocks@sh.itjust.works · -1 point · 16 days ago

            This one is. People need to stop anthropomorphizing AI. It’s a piece of software.

            I am chill; you shouldn’t assume emotion from text.

            • ideonek@piefed.social · 1 point · 16 days ago

              As I explained, this is a specific example; I’m no more anthropomorphizing it than if I called my toilet paper a “he”. The monster you chose to charge is a windmill. So “chill” seems adequate.

              • ulterno@programming.dev · 0 points · 16 days ago

                Yeah. It would have been much more productive to poke at the “well”, which was turned into “we’ll”.

                • ideonek@piefed.social · 1 point · 16 days ago

                  You brought this unmistakable “I speak louder and louder on my European vacation until the waiter who doesn’t speak English finally understands me” energy to this conversation.

                • atomicbocks@sh.itjust.works · 0 points · edited · 16 days ago

                  I don’t care that this person, who may be typing English on a keyboard with a different language dictionary, misspelled some words.

                  I care that people in general keep talking about AI like it is living or capable of thinking.

                  • luciferofastora@feddit.org · 1 point · 15 days ago

                    A gendered pronoun as a result of translingual grammar bleed doesn’t make the AI living and thinking. In German, a corpse would be a “he” or a “she” too (der Leichnam or die Leiche), but I’m pretty sure it’s not living or thinking by definition.

                    You’re literally looking at what has been explained at length to be an artifact of a foreign language and attacking it for something it isn’t.

              • atomicbocks@sh.itjust.works · 0 points · 16 days ago

                To be clear, using gendered pronouns for inanimate objects is the literal definition of anthropomorphization. So “chill” does not seem fair at all.

                • MinnesotaGoddam@lemmy.world · 1 point · 15 days ago

                  “To be clear, using gendered pronouns for inanimate objects is the literal definition of anthropomorphization”

                  You really need to get over yourself. The universe does not revolve around you, nor around humans. The use of gendered pronouns for inanimate objects is not anthropomorphization.