Comment by nottorp
Heh. Who asked those website owners to have laggy scrolling, non-existent contrast, hijack my back button, and generally run so much JavaScript that a cluster is needed client-side just to display a 3-line LLM-generated blog post?
You're painting with a pretty damn broad brush there.
Do you think that every small personal website is serving nothing but "a 3-line LLM-generated blog post"? Do you not think there are some out there with perfectly reasonable content? Much of it not even monetized?
And yet the bots are causing this problem for everyone. They are completely indiscriminate.
So before you try to dismiss this as a non-issue, maybe consider that there's more out there being affected by this than the absolute worst case you can imagine.
To be fair, the sites resorting to extreme anti-bot measures are also often not the ones that are a client-side JavaScript hell.
Thinking of the most extreme option (throwing proof-of-work checks at browsers), the main sites that jump to mind are sourcehut, the Linux Kernel Archives and so on, and the admins of all of those sites have noted that the traffic they get is far outside of expectations[0]. Not whatever blogspam ended up at the top of Google search that day.
The badly designed sites are often the ones that don't care about their bandwidth anyway.
[0]: https://drewdevault.com/2025/03/17/2025-03-17-Stop-externali...
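For context on the proof-of-work checks mentioned above, the general idea is: the server hands the browser a random challenge, the browser has to burn CPU finding a nonce whose hash meets a difficulty target, and only then is the page served. Verification on the server is a single hash, so the cost falls almost entirely on the client, which is cheap for one human pageview but expensive for a crawler fetching millions of pages. A minimal sketch, assuming SHA-256 and a leading-zero-bits difficulty target; the function names, challenge format, and difficulty value are illustrative, not taken from sourcehut or any particular deployment:

```python
import hashlib
import itertools
import os


def issue_challenge() -> tuple[str, int]:
    """Server side: hand out a random challenge and a difficulty
    (number of leading zero bits required of SHA-256(challenge + nonce))."""
    return os.urandom(16).hex(), 20  # 20 bits ~= a million hashes on average


def solve(challenge: str, difficulty: int) -> int:
    """Client side: brute-force a nonce that satisfies the target."""
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") >> (256 - difficulty) == 0:
            return nonce


def verify(challenge: str, difficulty: int, nonce: int) -> bool:
    """Server side: checking a submitted nonce costs one hash."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") >> (256 - difficulty) == 0


if __name__ == "__main__":
    challenge, difficulty = issue_challenge()
    nonce = solve(challenge, difficulty)
    assert verify(challenge, difficulty, nonce)
```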