Comment by marcus0x62 2 days ago
Author of a similar tool here[0]. There are a few implementations of this sort of thing that I know of. Mine is different in that the primary purpose is to slightly alter content statically using a Markov generator, mainly to make it useless for content reposters, and secondarily to make it useless to LLM crawlers that ignore my robots.txt file[1]. I assume the generated text is bad enough that the LLM crawlers just throw the result out. Other than the extremely poor quality of the text, my tool doesn't leave any fingerprints (like recursive nonsense links). In any case, it can be run on static sites with no server-side dependencies, so long as you have a way to do content redirection based on User-Agent, IP, etc.
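For anyone curious what the static-mangling step looks like, here's a minimal sketch of a word-level Markov generator in Python. This is my illustration of the general technique, not Quixotic's actual code: train a chain on the page's own text, then emit output that is statistically similar but semantically useless.

```python
import random
from collections import defaultdict

def build_chain(text, order=2):
    """Map each `order`-word prefix to the words observed after it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        prefix = tuple(words[i:i + order])
        chain[prefix].append(words[i + order])
    return chain

def generate(chain, order=2, length=200):
    """Walk the chain from a random prefix, restarting at dead ends."""
    prefix = random.choice(list(chain))
    out = list(prefix)
    for _ in range(length):
        followers = chain.get(tuple(out[-order:]))
        if not followers:  # dead end: jump to a fresh prefix
            out.extend(random.choice(list(chain)))
            continue
        out.append(random.choice(followers))
    return " ".join(out)

# Example: mangle a page's text into plausible-looking nonsense.
# "article.txt" is a placeholder for whatever source text you feed it.
page = open("article.txt").read()
print(generate(build_chain(page)))
```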
My tool does have a second component - linkmaze - which generates a bunch of nonsense text with a Markov generator and serves infinite links (like Nepenthes does), but I generally only throw incorrigible bots at it (and, as others have noted in-thread, most crawlers already set some kind of limit on how many requests they'll send to a given site, especially a small site.) I do use it for PHP-exploit crawlers as well, though I've seen no evidence those fall into the maze -- I think they mostly just look for some string indicating a successful exploit and move on if whatever they're looking for isn't present.
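The linkmaze idea is simple enough to sketch with nothing but the standard library. The handler below is my illustration (not linkmaze's actual code): every path resolves, and each page links to a few more random maze pages, so a crawler that follows them never runs out of URLs.

```python
import random
import string
from http.server import BaseHTTPRequestHandler, HTTPServer

WORDS = ["lorem", "ipsum", "dolor", "sit", "amet"]  # stand-in for Markov output

def nonsense(n=80):
    return " ".join(random.choice(WORDS) for _ in range(n))

def random_path():
    return "/" + "".join(random.choices(string.ascii_lowercase, k=8))

class MazeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Every URL resolves; each page links deeper into the maze.
        links = ""
        for _ in range(5):
            p = random_path()
            links += f'<a href="{p}">{p}</a> '
        body = f"<html><body><p>{nonsense()}</p>{links}</body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body.encode())

HTTPServer(("", 8080), MazeHandler).serve_forever()
```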
But, for my use case, I don't really care if someone fingerprints content generated by my tool and avoids it. That's the point: I've set robots.txt to tell these people not to crawl my site.
In addition to Quixotic (my tool) and Nepenthes, I know of:
* https://github.com/Fingel/django-llm-poison
* https://codeberg.org/MikeCoats/poison-the-wellms
* https://codeberg.org/timmc/marko/
0 - https://marcusb.org/hacks/quixotic.html
1 - I use the ai.robots.txt user agent list from https://github.com/ai-robots-txt/ai.robots.txt
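For the User-Agent-based redirection mentioned above, the check itself is just substring matching against that list. A hedged sketch (the file name and one-agent-per-line format here are assumptions; the ai.robots.txt repo publishes the agent list in a few formats):

```python
# Assumes "ai-agents.txt" holds one crawler User-Agent substring per line,
# extracted from the ai.robots.txt project's published list.
AI_AGENTS = [line.strip() for line in open("ai-agents.txt") if line.strip()]

def is_ai_crawler(user_agent: str) -> bool:
    ua = user_agent.lower()
    return any(agent.lower() in ua for agent in AI_AGENTS)

# A server rule would then send matching requests to the Markov-mangled
# copy of the page instead of the real one.
```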
poison-the-wellms
I gotta give props for this project name.