Comment by jsnell
"Why didn't they do it earlier?" is a fallacious argument.
If we accepted it, there would basically only be a single point in time where a change like this could be legitimately made. If the change is made before there is a large enough problem, you'll argue the change was unnecessary. If it's made after, you'll argue the change should have been made sooner.
"They've already done something else" isn't quite as logically fallacious, but it shows that you don't have experience dealing with adversarial application domains.
Adversarial problems, which scraping is, are dynamic and iterative games. The attacker and defender are stuck in an endless loop of move and countermove, unless one side gives up. There's no point in defending against attacks that aren't happening -- it's not just useless, but probably harmful, because every defense has some cost in friction to legitimate users.
> My guess is it had more to do with squeezing out more profit from that supposed 0.1% of users.
Yes, that kind of thing is very easy to just assert. But just think about it for like two seconds. How much more revenue are you going to make per user? None. Users without JS are still shown ads. JS is not necessary for ad targeting either.
It seems just as plausible that this is losing them some revenue, because some proportion of the people using the site without JS will stop using it rather than enable JS.
It can't lose them revenue. Serving queries is expensive, getting rid of bots yields immediate and direct savings measured in $$$
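The tradeoff in the last two comments can be put in numbers. A minimal back-of-envelope sketch, where every figure (serving cost, bot share, churn rate, ad revenue per query) is a made-up assumption for illustration; the sign of the net effect depends entirely on what you plug in:

```python
# All numbers below are hypothetical assumptions, not real figures.
cost_per_query = 0.002           # assumed serving cost per query, USD
revenue_per_human_query = 0.01   # assumed ad revenue per human query, USD

daily_nojs_queries = 1_000_000   # assumed no-JS query volume
bot_fraction = 0.30              # assumed share of no-JS traffic that is bots
human_leave_fraction = 0.50      # assumed share of no-JS humans who leave

bot_queries = daily_nojs_queries * bot_fraction
human_queries = daily_nojs_queries - bot_queries

# Blocking no-JS traffic: bots are no longer served (direct savings),
# but some departing humans take their ad revenue with them (loss).
savings = bot_queries * cost_per_query
lost_revenue = human_queries * human_leave_fraction * revenue_per_human_query
net = savings - lost_revenue

print(f"savings ${savings:.0f}, lost revenue ${lost_revenue:.0f}, net ${net:+.0f}")
```

With these particular made-up numbers the churned humans outweigh the bot savings; crank up the bot fraction or serving cost and the sign flips. The point is that "it obviously saves money" and "it obviously loses money" are both empirical claims, not logical ones.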