

If you remember the project, I would be interested to see it!
But I’ve seen some AI-poisoning sinkholes before too, which are a novel concept as well. I have not heard of any real-world experiences with them yet.
They’re working on no-JS support too, but this had to be released without it because of the sheer number of AI crawler bots causing denial of service for normal users.
It doesn’t run against Firefox only; it runs against whatever you configure it to. Also, from personal experience, I can tell you that the majority of AI crawlers have the keyword “Mozilla” in their user agent.
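So a rule as blunt as “challenge anything claiming to be Mozilla” already covers the bulk of them. Here’s an illustrative sketch of that kind of user-agent matching (my own example, not Anubis’s actual policy format):

```go
package main

import (
	"fmt"
	"regexp"
)

// Hypothetical example rule: anything claiming to be "Mozilla" (which covers
// real browsers and most AI crawlers alike) gets a proof-of-work challenge
// before it can reach the backend.
var challengePattern = regexp.MustCompile(`Mozilla`)

func shouldChallenge(userAgent string) bool {
	return challengePattern.MatchString(userAgent)
}

func main() {
	for _, ua := range []string{
		"Mozilla/5.0 (X11; Linux x86_64) Gecko/20100101 Firefox/124.0",
		"curl/8.5.0",
	} {
		fmt.Printf("%-60s challenge=%v\n", ua, shouldChallenge(ua))
	}
}
```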
Yes, this isn’t Cloudflare, but I’m pretty sure that’s on the to-do list. If not, please open an issue on the project.
The computational requirements on the server side are literally a fraction of what the bots have to spend. A non-issue. This tool exists to combat the denial of service these bots cause by hitting high-cost endpoints, such as git blame on GitLab. My phone can do 100k SHA-256 sums per second (single-threaded), and you can safely assume any server to outperform that ARM chip, so you would need so many resources to cause denial of service through the proof of work that you might as well just flood the server with traffic instead of making it do one SHA-256 calculation.
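To put rough numbers on the asymmetry (a sketch of the general idea, not Anubis’s actual implementation): the server verifies a submitted solution with a single SHA-256, and even a crude single-threaded benchmark shows how cheap one hash is:

```go
package main

import (
	"crypto/sha256"
	"encoding/binary"
	"fmt"
	"time"
)

func main() {
	// Measure single-threaded SHA-256 throughput over small inputs, roughly
	// the size of a challenge string plus a nonce.
	var buf [40]byte
	const n = 1_000_000
	start := time.Now()
	for i := uint64(0); i < n; i++ {
		binary.LittleEndian.PutUint64(buf[32:], i)
		_ = sha256.Sum256(buf[:])
	}
	elapsed := time.Since(start)
	fmt.Printf("%d hashes in %v (%.0f hashes/s, single thread)\n",
		n, elapsed, float64(n)/elapsed.Seconds())
	// The server does only ONE of these hashes per submitted solution, so
	// checking proof of work costs next to nothing compared to serving
	// something like `git blame` on a large repository.
}
```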
And this isn’t really comparable to Tor. It’s a self-hostable service that sits between your web server/CDN and the service that is being hammered by mass crawling.
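Conceptually it’s a reverse proxy with a gate in front of the backend. A minimal sketch of that arrangement (hypothetical port numbers and placeholder checks, not Anubis’s actual code):

```go
package main

import (
	"net/http"
	"net/http/httputil"
	"net/url"
)

func main() {
	// Hypothetical backend being protected (e.g. a GitLab instance).
	backend, _ := url.Parse("http://localhost:3000")
	proxy := httputil.NewSingleHostReverseProxy(backend)

	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		// hasValidPoWCookie is a stand-in for "this client already solved
		// the challenge"; a real implementation would use a signed token.
		if !hasValidPoWCookie(r) {
			serveChallengePage(w, r) // placeholder: hand out the PoW challenge
			return
		}
		proxy.ServeHTTP(w, r) // verified clients pass through to the backend
	})

	http.ListenAndServe(":8080", nil)
}

// Placeholders for the sketch; real logic would verify a signed cookie.
func hasValidPoWCookie(r *http.Request) bool {
	_, err := r.Cookie("pow-ok")
	return err == nil
}

func serveChallengePage(w http.ResponseWriter, r *http.Request) {
	http.Error(w, "solve the proof-of-work challenge first", http.StatusForbidden)
}
```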
Edit: If you don’t like the project’s stickers, fork it and remove them. It’s an open-source project.
And Xe, who made this project, is quite a talented programmer. More likely than not you have used some of Xe’s services/sites/projects before as well.
Yes, Anubis uses proof of work, much like some cryptocurrencies do, to slow down and mitigate mass-scale crawling by making the crawlers do expensive computation.
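The general shape of the idea is hashcash-style (this is a sketch of the concept, not Anubis’s exact scheme): the client must find a nonce whose hash meets a difficulty target, which takes on the order of 2^difficulty hashes, while the server verifies the answer with one hash:

```go
package main

import (
	"crypto/sha256"
	"fmt"
	"math/bits"
	"strconv"
)

// leadingZeroBits counts the leading zero bits of a SHA-256 digest.
func leadingZeroBits(sum [32]byte) int {
	n := 0
	for _, b := range sum {
		if b == 0 {
			n += 8
			continue
		}
		n += bits.LeadingZeros8(b)
		break
	}
	return n
}

// solve grinds nonces until the hash meets the difficulty target.
// Expected client work is roughly 2^difficulty hashes.
func solve(challenge string, difficulty int) uint64 {
	for nonce := uint64(0); ; nonce++ {
		sum := sha256.Sum256([]byte(challenge + strconv.FormatUint(nonce, 10)))
		if leadingZeroBits(sum) >= difficulty {
			return nonce
		}
	}
}

// verify costs the server exactly one hash.
func verify(challenge string, difficulty int, nonce uint64) bool {
	sum := sha256.Sum256([]byte(challenge + strconv.FormatUint(nonce, 10)))
	return leadingZeroBits(sum) >= difficulty
}

func main() {
	const challenge = "example-challenge-token" // would be random per visitor
	const difficulty = 18                       // higher = slower for crawlers
	nonce := solve(challenge, difficulty)
	fmt.Println("nonce:", nonce, "valid:", verify(challenge, difficulty, nonce))
}
```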
https://lemmy.world/post/27101209 has a great article attached to it about this.
–
Edit: Just to be clear, this doesn’t mine any crypto; it just uses the same idea to slow down the requests.
And replaced the word “AI” with “Apple”. ( ͡° ͜ʖ ͡°)
You have a point here.
But when you consider the current state of web traffic, this isn’t actually the case today. For example, for the GNOME project, which was forced to start using this on their GitLab, 97% of their traffic could not complete the PoW challenge.
I.e., they now need only a fraction of the computational cost to serve their GitLab, which saves a lot of resources, coal, and, most importantly, the time of hundreds of real humans.
(Source for numbers)
Hopefully in the future we can move back to proper netiquette and just a plain old robots.txt file!
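For reference, the polite approach looks like this (GPTBot and CCBot used as example crawler names; the whole problem is that many crawlers simply ignore it):

```
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /
```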