• PiraHxCx@lemmy.ml · 2 hours ago (edited)

    I don’t know about a lot of those points. I can read French quite well but can’t speak it for shit, so I don’t know how much I actually have to “convert” in my head, since its phonetics are irrelevant to me. (And with English becoming the main online language, tons of people everywhere in the world can read and write it but not really speak it, since we’re all communicating primarily through text; my English pronunciation sucks, btw.) Anyway, about stability and adaptation: Chinese has 120k+ different characters, and the vast majority fell out of use because other ways to write the same thing became more popular, so I don’t think it works the way you described.

    Have you ever heard of Paulo Freire? He developed a very interesting literacy method and tested it with adult rural workers from poor regions; in just 2 months he got those people reading and writing (even if with grammatical mistakes), because his method is phonetic (well, there’s quite a bit more to it, but the reading/writing part is phonetic). For learning to read and write other languages, the “no sounding out” might be an advantage (like the many netizens writing in English without really speaking it), but for your own language, from what I understand, kids in Japan and China are only expected to read their local newspaper by high school, because of how many characters they need to know, while Paulo Freire got adults, whose brains are far less plastic, doing it in 2 months… phonetics and alphabets ftw :P

    edit: We both know we’re talking about Chinese and Japanese when we talk about logograms, so it’s great for them that their scripts share the same root and the symbols keep the same meaning even when they sound different. If there were other logographic languages with different roots, the advantage of symbols carrying the same meaning wouldn’t hold. But something happened with the Koreans that made them break away and build from scratch what seems to be the best and most logical writing system around.

    • zalgotext@sh.itjust.works · 56 minutes ago

      Sure, I agree that alphabet systems are initially easier to learn than logographic systems. But to achieve that, they sacrifice the consistency and lack of ambiguity of a logographic system. It’s funny you bring up Korean as an example of a good alphabet system, because I can assure you, as someone currently learning Korean, that it has its weird spelling inconsistencies and pronunciation “rules” and exceptions, just like any other alphabet system.

      And again, I’m not trying to convince you that one is better than the other. My whole point is that neither is better or worse as a whole: they each have their own strengths, weaknesses, and specific purposes, and both are perfectly functional.

      • PiraHxCx@lemmy.ml · 46 minutes ago

        A takeaway I had from this is that a non-phonetic written language works like cached memory (and you might have a lot to cache), while a phonetic one is like real-time rendering. I was reading about how Vietnam switched to its current script, and just like in Korea, and in line with Paulo Freire’s view of language, it seems the change made the language more accessible.
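        To make the analogy concrete, here’s a toy sketch in Python (the words, mappings, and “sounds” are all made up for illustration, not real linguistics data): reading a logogram is a straight symbol-to-meaning lookup, like a cache hit, while sounding out a word assembles the pronunciation from a small letter table every time, like rendering on the fly.

        ```python
        # Toy illustration of the cached vs. real-time analogy (made-up data).

        # "Cached": one big symbol -> meaning table you have to memorize up front.
        logogram_cache = {"木": "tree", "水": "water"}

        def read_logogram(symbol: str) -> str:
            # A cache hit is instant; a miss means you simply can't read the word.
            return logogram_cache.get(symbol, "???")

        # "Real-time": a tiny letter -> sound table that covers every word,
        # even ones you've never seen before.
        letter_sounds = {"t": "t", "r": "r", "e": "eh", "w": "w", "a": "ah"}

        def sound_out(word: str) -> str:
            # Render the pronunciation piece by piece from the small rule set.
            return "-".join(letter_sounds.get(ch, "?") for ch in word)

        print(read_logogram("木"))  # tree
        print(sound_out("tree"))    # t-r-eh-eh
        ```

        The tradeoff shows up directly: the cache table has to grow with the vocabulary you want to read, while the letter table stays tiny but makes you do the decoding work on every read.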

        • zalgotext@sh.itjust.works · 12 minutes ago

          You’re absolutely correct that Korea (and Vietnam too, I suppose, though I don’t know much about their language) created their alphabets to make literacy more accessible, and I think that’s awesome and a really good feature of alphabet systems. I can even see why that would make people prefer them, since accessibility is super important when you’re first learning a language.

          I think your cached vs. real-time analogy is spot on. And while you can definitely come up with scenarios where caching is better than real-time rendering, and other scenarios where real-time rendering is better than caching, it’d be difficult to argue that one is unequivocally better or worse than the other.