Comment by jt2190
The “client-side problems” Siebenmann is talking about are the various anti-bot measures (CAPTCHAs, rate limiters, etc.) that operators put in place and that make the end user experience worse. Operators feel they have no choice if they want to keep their servers available, hence the “don’t care”.
He makes a statement in an earlier article [1] that I think sums things up nicely:
> One thing I've wound up feeling from all this is that the current web is surprisingly fragile. A significant amount of the web seems to have been held up by implicit understandings and bargains, not by technology. When LLM crawlers showed up and decided to ignore the social things that had kept those parts of the web going, things started coming down all over the place.
This social contract is, to me, built around the idea that a human will direct the operation of a computer in real time (largely by using a web browser and clicking links), but I think this approach is an extremely inefficient use of both the computer’s and the human’s resources (CPU and time, respectively). The promise of technology should not be to put people behind desks staring at a screen all day, so this evolution toward automation must continue.
I do wonder what the new social contract will be: Perhaps access to the majority of servers will be gated by micropayments, but what will the “deal” be for those who don’t want to collect payments? How will they prevent abuse while keeping access free?
[1] “The current (2025) crawler plague and the fragility of the web” https://utcc.utoronto.ca/~cks/space/blog/web/WebIsKindOfFrag...
> The “client-side problems” Siebenmann is talking about are the various anti-bot measures (CAPTCHAs, rate limiters, etc.)
Directly from the article:
> it's not new, and it goes well beyond anti-crawler and anti-robot defenses. As covered by people like Alex Russell, it's routine for websites to ignore most real world client side concerns (also, and including on desktops). Just recently (as of August 2025), Github put out a major update that many people are finding immensely slow even on developer desktops.
The things he links to are unrelated to anti-bot measures.
The fact is, the web is an increasingly unpleasant place to visit. Users are subject to terrible UX – dark patterns, tracking, consent popups, ads everywhere, etc.
Then along come chatbots, and when somebody asks about something, they get the answer on the spot without having to battle their way through all that crap to find what they want.
Of course users are going to flock to chatbots. If a site owner is worried they are losing traffic to chatbots, perhaps they should take a long, hard look at what kind of user experience they are serving up to people.
This is like streaming media all over again. Would you rather buy a legit DVD and wait for it to arrive in the post, then sit through an unskippable lecture about piracy, then sit through unskippable trailers, then find your way through a weird, horrible DVD menu… or would you rather download it and avoid all that? The thing that alleviated piracy was not locking things down even more; it was making the legitimate route more convenient.
We need to make websites pleasant experiences again, and we can’t do that when we care about everything else more than the user experience.