We all migrate to smaller websites and try not to post anything that draws attention, just to hide from the "AI" crawlers. The internet seems dead except for the few pockets we each know exist, away from the clankers.

  • ipkpjersi@lemmy.ml · 13 points · 2 hours ago

    Well, I mean, that's kind of what Lemmy is like, since it's far more niche than something like reddit. But AI crawlers will find it anyway.

    • mic_check_one_two@lemmy.dbzer0.com · 3 points · edited · 1 hour ago

      AI crawlers don't even need to crawl individual instances. If someone wanted to scrape Lemmy, it would be way more efficient to simply spin up their own instance and let federation do its thing. Federation is literally a built-in way to mass-distribute content to a bunch of different servers. So just spin up an instance, set it to not respect delete requests (so you still get the deleted posts and comments), and scrape it locally. The entire thing could be set up in like 20 minutes, and it would allow for passive data collection instead of requiring active scrapers that run constantly.

  • BodePlotHole@lemmy.world · 3 points · 1 hour ago

    I was thinking the other week that it's getting to the point where I'd consider paying a membership fee to access something like Lemmy, but with a guarantee of no AI, no bots, and no bullshit advertising.

    I know it isn’t possible, but if it was, I’d pay a small fee to have it.

  • NihilsineNefas@slrpnk.net · 4 points · 3 hours ago

    Do you think there will be safe places on the internet?

    If it's connected, it's accessible. It won't matter what human-level security we put in place when the datacenters these clankers run on have enough GPUs to brute-force their way through.

    Offline communication will make a resurgence, and will become indispensable when the resource wars the billionaires are funding reach the rest of the world.

        • Lag@lemmy.world · 1 point · 55 minutes ago

          If the person who got invited by the person you invited gets banned, your whole family dies. It’s the only way to keep people honest.

  • daniskarma@lemmy.dbzer0.com · 189 points · 23 hours ago

    I have a testing website. I have never given the address to anyone, ever. It's not linked from anything. It's just a silly HTML site living on a domain.

    It's still being pinged and probed to death by bad actors. Not necessarily AI scrapers, but it's dozens or hundreds of HTTP requests a day from random places all over the world.

    There's no dark forest. It's all lit up and under constant attack; every tree is already on fire.

    • dual_sport_dork 🐧🗡️@lemmy.world · 87 points · 22 hours ago

      That’s because it’s numerically possible to sweep through the entire IPv4 address range fairly trivially, especially if you do it in parallel with some kind of botnet, proverbially jiggling the digital door handles of every server in the world to see if any of them happen to be unlocked.

      One wonders if switching to purely IPv6 will forestall this somewhat, as the number space is multiple orders of magnitude larger. That's only security through obscurity, though, and it's certain the bots will still find you eventually. Plus, if you have a domain name, the attackers already know where you are: they can just look up your DNS record, which is what DNS records are for.

      • SkyeStarfall@lemmy.blahaj.zone · 2 points · 6 hours ago

        It's not as simple as "only security through obscurity". You could say the same thing about an encryption key of a certain length: the private key behind a public key is technically just obscurity too, but it's still impractical to search the entire range.

        IPv6 is big enough that this obscurity becomes impractical to sweep. But of course, as you said, there may be other methods of finding your address.

      • kazaika@lemmy.world · 6 points · 19 hours ago

        Servers that are meant to be secure are usually configured not to react to pings and not to give out failure responses to unauthenticated requests. That should be viable for the authenticated-only, walled-garden type of website OP is suggesting, no?

        • Cooper8@feddit.online · 2 points · 14 hours ago

          I have suggested a couple of times now that ActivityPub should implement an encryption layer for user authentication of requests and pings. It already has a system for instances vouching for each other. The situation is that users of "walled garden" instances on ActivityPub lack a means of interfacing with public-facing instances that doesn't leave the network open for scraping. I believe a pivot towards serving content to registered users only by default, built on encrypted handshakes, with the ability for servers to opt in to serving content to unregistered users, would make the whole network much more robust and less dependent on third-party contingencies like Cloudflare.

          Then again, maybe I should just be looking for a different network. I'm sure there are services in the blockchain/cryptosphere that take that approach; I just would rather participate in a network built on commons rather than financialization at its core. Where is the protocol doing both a hardened network and distributed volunteer instances?

        • dual_sport_dork 🐧🗡️@lemmy.world · 1 point · 15 hours ago

          There are several things you could do in that regard, I'm sure. Configure your services to listen only on weird ports, disable ICMP pings, rig your scripts to return timeouts instead of error messages... many of which might make your own life difficult, as well.
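As a sketch of that "don't answer strangers" approach (assuming nginx; the 444 status is an nginx-specific convention, not a real HTTP code):

```nginx
# Default catch-all: any request that doesn't match a configured
# server_name gets its connection closed with no response at all.
server {
    listen 80 default_server;
    listen 443 ssl default_server;
    server_name _;
    ssl_reject_handshake on;  # refuse TLS without presenting a certificate
    return 444;               # nginx-specific: drop the connection silently
}
```

A scanner hitting the bare IP then learns nothing, while named virtual hosts keep working for visitors who know the hostname.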

          All of these are also completely counterproductive if you want your hosted service, whatever it is, to be accessible to others. Or maybe not, if you don’t. The point is, the bots don’t have to find every single web service and site with 100% accuracy. The hackers only have to get lucky once and stumble their way into e.g. someone’s unsecured web host where they can push more malware, or a pile of files they can encrypt and demand a ransom, or personal information they can steal, or content they can scrape with their dumb AI, or whatever. But they can keep on trying until the sun burns out basically for free, and you have to stay lucky and under the radar forever.

          In my case, just to name an example, I kind of need my site to be accessible to the public at large if I want to, er, actually make any sales.

      • kossa@feddit.org · 9 points · 21 hours ago

        But one IP can host multiple websites, and may even return nothing on plain IP access. How do crawlers find out about domains and unlinked subdomains? Do they even?

          • taaz@biglemmowski.win · 10 points · edited · 19 hours ago

            Thinking about this, wouldn't the best way to hide a modern website be something along the lines of getting a wildcard domain cert (can be done with Let's Encrypt via a DNS challenge), CNAMEing the wildcard to the root domain, and then hosting the website on a random subdomain string? Am I missing something?
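A rough sketch of that idea (hypothetical helper, names are mine): a label with enough entropy can't be brute-forced, and under a wildcard cert the Certificate Transparency logs only ever show `*.root`, never the label itself.

```python
import secrets

def random_subdomain(root: str, nbytes: int = 16) -> str:
    """Return an unguessable subdomain under a wildcard-certified root.

    16 random bytes -> 32 hex chars (~128 bits of entropy), far too
    large a space for a scanner to sweep by guessing labels.
    """
    return f"{secrets.token_hex(nbytes)}.{root}"

host = random_subdomain("example.com")
```

The label still leaks through DNS queries, browser history, Referer headers, and anyone you share the link with, so this is obscurity, not access control.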

            • confusedpuppy@lemmy.dbzer0.com · 7 points · 17 hours ago

              I do something like this using wildcard certs with Let's Encrypt, except I go one step further: my ISP blocks incoming traffic on common ports, so I use an uncommon port as well.

              I'm not hosting anything important, and I don't need access to it at all times; it's mostly just for fun, for myself.

              Accessing my site ends up looking like https://randomsubdomain.registered-domain-name.com:4444/

              My logs only ever show my own activity. I'm sure there are downsides to using uncommon ports, but I mitigate that by adjusting my personal life to not caring about being connected to my stuff at all times.

              I get to have my little hobby in my own corner of the internet without worrying about bots or AI.

        • simeon@reddthat.com · 3 points · 21 hours ago

          Every SSL certificate is publicly logged (you can browse these logs, e.g. at crt.sh) and you might be able to read DNS records to find new (sub)domains. The modern internet is too focused on being discoverable and transparent to make hiding an entire service (domain + servers) feasible. But things like example.com/dhusvsuahavag8wjwhsusiajaosbsh are entirely unfindable as long as they are never linked anywhere.

          • kossa@feddit.org · 4 points · 12 hours ago

            A random subdomain on a wildcard certificate, the IP written into the hosts file to avoid DNS records, and the address only passed on by word of mouth 😅.

            Nobody said the uncrawled dark forest would be comfortable.

      • lauha@lemmy.world · +2/−1 · 17 hours ago

        I love your "multiple orders of magnitude". I don't think you appreciate or realise how much larger the IPv6 address space is :)

        • dual_sport_dork 🐧🗡️@lemmy.world · 7 points · 16 hours ago

          I wasn't going to type that many commas for the sake of brevity, but it's 340,282,366,920,938,463,463,374,607,431,768,211,456 possible addresses, i.e. 2^128. So yes, I do.

          I consider 96 orders of magnitude (in binary, anyway) to be "multiple." Wouldn't you?
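A quick check of those figures:

```python
# Sanity-check the quoted address-space sizes.
ipv4 = 2**32
ipv6 = 2**128

assert ipv4 == 4_294_967_296
assert ipv6 == 340_282_366_920_938_463_463_374_607_431_768_211_456
# The gap is exactly 2^96: the "96 orders of magnitude, in binary".
assert ipv6 == ipv4 * 2**96
```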

          • lauha@lemmy.world · +1/−1 · 8 hours ago

            No need to be defensive, I'm not insulting you, I just find it funny :) Usually people call that "dozens". But dozens of orders of magnitude really doesn't give a sense of the scale.

            You could have 8 billion inhabitants on every one of the 10^24 stars in the universe, and everyone could still have 42k addresses.
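That back-of-the-envelope figure holds up:

```python
# ~8 billion inhabitants on each of ~10^24 stars, sharing 2^128 addresses.
inhabitants = 8 * 10**9
stars = 10**24
per_person = 2**128 // (inhabitants * stars)
assert per_person == 42_535   # roughly the "42k addresses" each
```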

    • RememberTheApollo_@lemmy.world · 5 points · 17 hours ago

      I have a DDNS setup. Pretty random site name. Nonetheless, it’s been found and constantly probed. Lots of stuff from Russia, China, a few countries in Africa, and India. A smattering of others, but those are the constant IPs that are probing or attempting logins.

      • Croquette@sh.itjust.works · 2 points · 6 hours ago

        DNS only translates a string address (www.mywebsite.com) to its IP address (xxx.xxx.xxx.xxx) so that it's easier to remember.

        Bots just try a range of addresses; they don't need to know your domain name. You could have the most unintelligible domain name in the world and bots would still hit your website, because they use IP addresses directly.
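That point can be sketched in a few lines: a scanner enumerates raw addresses without ever touching DNS (using a reserved documentation range here, not a real target):

```python
import ipaddress

# No names needed: walk a block of raw IPv4 addresses directly.
# 203.0.113.0/24 is TEST-NET-3, reserved for documentation use.
net = ipaddress.ip_network("203.0.113.0/24")
targets = [str(ip) for ip in net.hosts()]  # the 254 usable hosts in the /24
```

Chaining blocks like this over the whole 32-bit space is exactly what the probing bots do.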

      • dual_sport_dork 🐧🗡️@lemmy.world · 39 points · 22 hours ago

        Almost certainly. There are only 4,294,967,296 possible IPv4 addresses, i.e. 4.3ish billion, which sounds like a lot but in computer terms really isn’t. You can scan them in parallel, and if you’re an advanced script kiddie you could even exclude ranges that you know belong to unexciting organizations like Google and Microsoft, which are probably not worth spending your time messing with.

        If you had a botnet of 8,000 or so devices and employed a probably unrealistically generous timeout of 15 seconds, i.e. four attempts per minute per device, you could scan the entire IPv4 range in just a hair over 93 days and that’s before excluding any known pointless address blocks. If you only spent a second on each ping you could do it in about six days.
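The arithmetic in that estimate checks out:

```python
total = 2**32            # every IPv4 address
devices = 8_000          # hypothetical botnet size from the example
per_device = total / devices          # ~536,871 addresses per device

days_at_15s = per_device * 15 / 86_400   # 15 s timeout per probe
days_at_1s = per_device * 1 / 86_400     # 1 s per probe

assert 93 < days_at_15s < 94   # "just a hair over 93 days"
assert 6 < days_at_1s < 6.5    # "about six days"
```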

        For the sake of argument, cybercriminals are already operating botnets with upwards of 100,000 compromised machines doing their bidding. That bidding could well be (and probably is) probing random web servers for vulnerabilities. The largest confirmed botnet was the 911 S5 which contained about 19 million devices.

        • Melobol@lemmy.ml · 13 points · 21 hours ago

          That’s amazing and scary at the same time. Thanks for putting it into perspective!

      • friend_of_satan@lemmy.world · 5 points · 18 hours ago

        If it’s https it’s discoverable by hostname.

        https://0xffsec.com/handbook/information-gathering/subdomain-enumeration/#certificate-transparency

        Certificate Transparency (CT) is an Internet security standard and open-source framework for monitoring and auditing digital certificates. It creates a system of public logs to record all certificates issued by publicly trusted CAs, allowing efficient identification of mistakenly or maliciously issued certificates.
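As an illustration of how that enumeration works in practice, here is a sketch against crt.sh's public JSON output (the URL shape and the `name_value` field are taken from its public interface; treat the details as assumptions):

```python
import json
from urllib.parse import quote

def ct_query_url(domain: str) -> str:
    # crt.sh JSON endpoint; "%25." is the URL-encoded "%." wildcard prefix.
    return f"https://crt.sh/?q=%25.{quote(domain)}&output=json"

def names_from_ct_entries(entries: list) -> set:
    # Each logged certificate's name_value may hold several
    # newline-separated hostnames.
    names = set()
    for entry in entries:
        names.update(entry["name_value"].split("\n"))
    return names

# Parsed sample shaped like a crt.sh response:
sample = json.loads('[{"name_value": "example.com\\nwww.example.com"},'
                    ' {"name_value": "mail.example.com"}]')
found = names_from_ct_entries(sample)
```

Fetching `ct_query_url("yourdomain.com")` and running the entries through `names_from_ct_entries` reveals every individually certified subdomain, which is why wildcard certs are the usual countermeasure.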

        • kossa@feddit.org · 3 points · edited · 12 hours ago

          But there can be multiple websites behind one IP address?! They would not show up when only accessing the IP. The crawlers would need to know about the domains somehow.

  • crandlecan@mander.xyz · 43 points · 23 hours ago

    Fabulous insight. I think that would make me very happy. Bring back the forests! Burn down the Nazi trees!

  • Jo Miran@lemmy.ml · 22 points · 22 hours ago

    Cyberpunk as a literary genre, and the Cyberpunk TTRPG in particular, are incredibly prophetic. In the Cyberpunk TTRPG (which predates the WWW), "the Net" is eventually condemned (as in boarded up) because of AI and is replaced by siloed networks (think extended intranets).

    • Cooper8@feddit.online · 8 points · 15 hours ago

      And of course, in the Cyberpunk TTRPG setting, much of the open internet was rendered useless by self-replicating AI malware hijacking storage, processing, and bandwidth, thanks to a zero-day exploit discovered by one egomaniacal hacker.

  • Brkdncr@lemmy.world · 10 points · 22 hours ago

    Back in the days of dial-up and BBSes this was already a problem: you would get robots trying to connect to modems by dialing every phone number possible.

    • friend_of_satan@lemmy.world · 3 points · 18 hours ago

      War dialing! Those were the days. I lived in a city where war dialing was illegal, but that didn't stop me... maybe that's just an admission of stupidity. Definitely had some cool stuff come from it, though.