Building on an anti-spam cybersecurity tactic known as tarpitting, he created Nepenthes, malicious software named after a carnivorous plant that will “eat just about anything that finds its way inside.”
Aaron clearly warns users that Nepenthes is aggressive malware. It’s not to be deployed by site owners uncomfortable with trapping AI crawlers and sending them down an “infinite maze” of static files with no exit links, where they “get stuck” and “thrash around” for months, he tells users. Once trapped, the crawlers can be fed gibberish data, aka Markov babble, which is designed to poison AI models. That’s likely an appealing bonus feature for any site owners who, like Aaron, are fed up with paying for AI scraping and just want to watch AI burn.
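For anyone wondering what “Markov babble” actually is: a Markov chain trained on some corpus spits out text that’s locally plausible but globally meaningless. A minimal sketch (function names and parameters are made up for illustration, not Nepenthes’ actual code):

```python
# Hypothetical "Markov babble" sketch (not Nepenthes' actual code):
# a word-level Markov chain emitting statistically plausible nonsense.
import random
from collections import defaultdict

def build_chain(text, order=2):
    # Map every `order`-word prefix to the words observed after it.
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        chain[tuple(words[i:i + order])].append(words[i + order])
    return chain

def babble(chain, order=2, length=50):
    out = list(random.choice(list(chain)))  # random starting prefix
    for _ in range(length):
        followers = chain.get(tuple(out[-order:]))
        if not followers:  # dead end: hop to a random prefix
            out.extend(random.choice(list(chain)))
            continue
        out.append(random.choice(followers))
    return " ".join(out)
```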
I hope it’s effective.
AI crawlers and sending them down an “infinite maze” of static files with no exit links, where they “get stuck”
Maybe against bad crawlers. If you know what you’re looking for and aren’t just trying to grab anything and everything, this shouldn’t be very effective. Any good web crawler has limits. This seems to be targeted at Facebook’s apparently very dumb web crawler.
Any good web crawler has limits.
Yeah. Like, literally just:
- Keep track of which URLs you’ve been to
- Avoid going back to the same URL
- Set a soft limit; once you’ve hit it, start comparing the contents of each new page with previous ones (to avoid things like dynamic URLs taking you to the same content)
- Set a hard limit; once you hit it, leave the domain altogether
What kind of lazy-ass crawler doesn’t even do that?
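For reference, those four rules fit in a couple dozen lines. A rough sketch - the limits and the fetch/extract_links helpers are placeholders, not any real crawler’s internals:

```python
# Rough sketch of the four rules above; SOFT_LIMIT, HARD_LIMIT,
# fetch() and extract_links() are placeholders.
import hashlib

SOFT_LIMIT = 500    # past this, start checking for duplicate content
HARD_LIMIT = 2000   # past this, abandon the domain entirely

def crawl_domain(start_url, fetch, extract_links):
    visited = set()      # rule 1: track which URLs you've been to
    seen_hashes = set()  # content fingerprints for the soft limit
    queue = [start_url]

    while queue and len(visited) < HARD_LIMIT:  # rule 4: hard limit
        url = queue.pop()
        if url in visited:  # rule 2: never revisit the same URL
            continue
        visited.add(url)

        body = fetch(url)
        if len(visited) > SOFT_LIMIT:
            # Rule 3: past the soft limit, skip pages whose content
            # we've already seen (catches dynamic URLs serving the
            # same page under different addresses).
            digest = hashlib.sha256(body.encode()).hexdigest()
            if digest in seen_hashes:
                continue
            seen_hashes.add(digest)

        queue.extend(extract_links(body))
    return visited
```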
The way I understand it, the hard limit to leave the domain is actually the only one of these rules that would trigger on Nepenthes: the tarpit keeps generating new linked pages full of trash, so every URL and every page is unique.
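Right - a tarpit along these lines (a hypothetical sketch, not Nepenthes’ actual implementation) mints fresh URLs for every page, so a visited-set check never fires:

```python
# Hypothetical endless-maze sketch (not Nepenthes' actual code): every
# URL resolves, and every page links to five brand-new URLs.
import hashlib
from flask import Flask

app = Flask(__name__)

@app.route("/maze/<path:token>")
def maze(token):
    # Derive deterministic child links from the current path, so the
    # graph is infinite (and effectively acyclic) but stable across
    # revisits.
    links = []
    for i in range(5):
        child = hashlib.sha256(f"{token}/{i}".encode()).hexdigest()[:12]
        links.append(f'<a href="/maze/{child}">{child}</a>')
    return "<p>(markov babble goes here)</p> " + " ".join(links)

if __name__ == "__main__":
    app.run()
```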
It might be effective initially, but they’ll figure out a way around it soon enough.
Remember those articles about “poisoning” images? That didn’t get very far either.
It’s not that we “hate them” - it’s that they can entirely overwhelm a low-volume site and cause a DDoS.
I ran a few very low-traffic websites for local interests on a rural residential line. It wasn’t fast, but it was cheap, and as these sites made no money it was good enough. Before AI, they’d get the odd badly behaved scraper that ignored robots.txt and specifically the rate limits.
But since? I’ve had to spend a lot of time trying to filter them out upstream. Like, hours and hours. ClaudeBot was the first - coming from hundreds of AWS IPs and dozens of countries, thousands of times an hour, repeatedly trying to download the same URLs, some of which didn’t exist. Since then it’s happened a lot. Some of these tools are just so ridiculously stupid, far more so than a dumb script that cycles through a list. But because it’s AI and they’re desperate to satisfy the “need” for it, they’re quite happy to spend millions on AWS costs for negligible gain and screw things up for other people.
Eventually I gave up and redesigned the sites to be static and they’re now on cloudflare pages. Arguably better, but a chunk of my life I’d rather not have lost.
I am so gonna deploy this. I want the crawlers to index the entire Mandelbrot set.
I’ll train it with lyrics from Beck Hansen and Smash Mouth so that none of it makes sense.
This is the song that never ends.
It goes on and on, my friends. Well, the hits start coming and they don’t start ending
Good
Notice how it’s “AI haters” and not “people trying to protect their IP”, as it would be if it were, say, China instead of AI companies stealing the IP.
manual and builds are here: https://zadzmo.org/code/nepenthes/
Oh I love this!
Does it also trap search engine crawlers? That would be a problem
The big search engine crawlers like Google’s or Microsoft’s should respect your robots.txt file. This trick affects those that don’t honor the file and just scrape your website even if you told them not to.
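For example, a robots.txt like this (the path is made up) tells crawlers to stay out; well-behaved ones comply, everything else walks into the trap:

```
# Hypothetical robots.txt; the trap path is made up for illustration.
User-agent: *
Disallow: /nepenthes/
```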
I imagine if those obey robots.txt, it’s not a problem.
Don’t make me tap the sign
OTOH, infinite loop detection is a well-known coding issue with well-known, freely available solutions, so this approach will only affect the lamest implementations of AI.
An infinite loop detector detects when you’re going round in circles. It can’t detect when you’re going down an infinitely deep acyclic graph, because that, by definition, doesn’t have any loops for it to detect. The best it can do is have a threshold after which it gives up.
You can detect waypoints that come up repeatedly and avoid pursuing them further - technically that isn’t called “infinite loop” detection, but I don’t know the correct name. The point is that the software isn’t a Star Trek robot that starts smoking and bricks itself when it hears something illogical.
So instead of the AI wasting your resources and money by ignoring your robots.txt, you’re going to waste your own resources and money by inviting them to increase their load on your server, but make it permanent and nonstop. Brilliant. Hey, even better, you should host your site on something that charges you based on usage - that’ll really show the AI makers who’s boss. 🤣
It’s already permanent and nonstop. They’re known to ignore robots.txt, and to remove their user agent string once it’s used to detect and block them.
And the goal is not only to prevent resource abuse, but break a predatory model.
But feel free to continue gracefully doing nothing while others take action; it’s bound to help eventually.
Hey, you don’t need to convince me, you’ve clearly already committed to bravely sacrificing your own time and money in this valiant fight. Go get ‘em, tiger! I look forward to the articles about AI being stopped coming out any day now.
There are different kinds of AI scraper defenses.
This one is an active strategy. No shit, people know that this costs them resources. The point is that they want to punish the owners of badly behaved scrapers.
There is also another kind which just blocks anything that tries to follow an invisible link that leads to a resource forbidden by robots.txt.
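A minimal sketch of that second kind in Python/Flask - the trap path and ban logic are hypothetical, not any specific project’s code:

```python
# Hypothetical honeypot sketch: ban any client that fetches the trap
# URL, which is linked invisibly and disallowed in robots.txt
# (e.g. "Disallow: /trap/").
from flask import Flask, abort, request

app = Flask(__name__)
banned_ips = set()

@app.before_request
def reject_banned():
    # Refuse every request from an address that already took the bait.
    if request.remote_addr in banned_ips:
        abort(403)

@app.route("/trap/")
def trap():
    # Only a crawler that ignored robots.txt and followed the hidden
    # link ends up here; remember its address and refuse it.
    banned_ips.add(request.remote_addr)
    abort(403)

@app.route("/")
def index():
    # Invisible to humans, but present in the markup for crawlers.
    return '<a href="/trap/" style="display:none"></a><p>Welcome!</p>'
```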
One or two people using this isn’t going to punish anything or make enough of a difference to poison the AI. That’s the same phrase all these anti-AI projects for sites and images use, and they forget that, like a vaccine, you have to have the majority of sites using your method for it to be effective. And the majority of sysadmins are not going to install what’s basically ICE from Cyberpunk on a production server.
Once again, it’s lofty claims from the anti-AI crowd, and once again it’s much ado about nothing. But I’m sure that won’t stop people from believing that they’re making a difference by costing themselves money out of spite. 😂
The only AI company that responded to Ars’ request to comment was OpenAI, whose spokesperson confirmed that OpenAI is already working on a way to fight tarpitting.
Ah yes. It’s extremely common for one of the top companies in an industry to spitefully expend resources fighting the irrelevant efforts of…
One or two people
Please, continue to grace us with your unbiased wisdom. Clearly you’ve read the article and aren’t just trying to simp for AI or start flame wars like a petulant child.
Well, luckily for them, it’s a pretty simple fix. Congrats on being a part of making them jot down a note to prevent tarpitting when they get around to it. You’ve saved the internet!
And stop pretending like you’re unbiased either. We both have our preconceived notions, and you’re no more likely to be open to changing yours than I am. In fact, given the hysterical hyperventilating anti-AI “activists” get up to, we both know you’re never going to change your mind on AI, and as such you’ll glom onto any small action you think is gonna stick it to the man, no matter whether that action has any practical effect on the push for AI or not.
Not like you could just load-balance requests for the malicious subdirectories onto non-prod hardware. It could even be decommissioned hardware.
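For what it’s worth, routing the tarpit to a spare box is a few lines of nginx - a hypothetical sketch (addresses and paths made up), not anyone’s actual config:

```
# Hypothetical nginx sketch: only the tarpit path goes to a spare or
# decommissioned box; production traffic never touches it.
upstream tarpit {
    server 10.0.0.99:8080;  # old box running the tarpit
}

server {
    listen 80;

    location /nepenthes/ {
        proxy_pass http://tarpit;
    }

    location / {
        proxy_pass http://127.0.0.1:8000;  # the actual production app
    }
}
```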
How many hobby website admins have load balancing for their small sites? How many have decommissioned hardware? Because if you find me a corporation willing to accept the liability that doing something like this could open them up to, I’ll pay you a million dollars.