Comment by userbinator 9 days ago

> but on the other hand, having a rising browser engine might eventually remove this avenue for fingerprinting

If what I've seen from CloudFlare et al. is any indication, it's the exact opposite: the amount of fingerprinting and "exploitation" of implementation-defined behaviour has increased significantly in the past few months, likely in an attempt to kill off other browser engines; the incumbents do not like competition at all.

The enemy has been trying to spin it as "AI bots DDoSing" but one wonders how much of that was their own doing...
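
As a rough sketch of the kind of implementation-defined behaviour such fingerprinting can key on (illustrative probes only, not any vendor's actual detection logic), engine-to-engine differences in error text, stack traces, and Function.prototype.toString output are enough to tell engines apart:

    // Sketch: distinguishing JS engines by implementation-defined details.
    // None of these values are required by the spec to match across engines,
    // which is exactly what makes them usable as fingerprinting signals.
    function engineHints(): Record<string, string> {
      const hints: Record<string, string> = {};

      // Error message wording differs between V8, SpiderMonkey, and JavaScriptCore.
      try {
        (null as any).foo;
      } catch (e) {
        hints.nullErrorText = (e as Error).message;
      }

      // Function.prototype.toString output is only loosely specified.
      hints.fnSource = (function probe() {}).toString();

      // Stack trace format is entirely implementation-defined.
      hints.stackShape = new Error("probe").stack?.split("\n")[0] ?? "";

      return hints;
    }

    console.log(engineHints());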

SoftTalker 8 days ago

It's entirely deliberate. CloudFlare could certainly distinguish low-volume but legit web browsers from bots, just as well as they can distinguish Chrome/Edge/Safari/Firefox from bots. That is, if they cared to.

hansvm 9 days ago

Hold up, one of those things is not like the other. Are we really blaming webmasters for 100x increases in costs from a huge wave of poorly written and maliciously aggressive bots?

  • refulgentis 9 days ago

    > Are we really blaming...

    No, they're discussing the recent increase in fingerprinting / browser profiling and how it affects low-market-share browsers.

    • hansvm 9 days ago

      I saw that, but I'm still not sure how this fits in:

      > The enemy has been trying to spin it as "AI bots DDoSing" but one wonders how much of that was their own doing...

      I'm reading that as `enemy == fingerprinters`, `that == AI bots DDoSing`, and `their own == webmasters, hosting providers, and CDNs (i.e., the fingerprinters)`, which sounds pretty straightforwardly like the fingerprinters are responsible for the DDoSing they're receiving.

      That interpretation doesn't seem to match the rest of the post though. Do you happen to have a better one?

      • userbinator 9 days ago

        "their own" = CloudFlare and/or those who have vested interests in closing up the Internet.

  • jillyboel 8 days ago

    Your costs only went up 100x if you built your site poorly

    • hansvm 8 days ago

      I'll bite. How do you serve 100x the traffic without 100x the costs? It costs something like 1e-10 dollars to serve a recipe page with a few photos, for example. If you serve it 100x more times, how does that not scale up?
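
      Putting rough numbers on that (the per-request figure is the one claimed above, and the baseline traffic volume is made up for illustration), the scaling is linear, while the absolute amounts depend entirely on that per-request cost:

          // Back-of-envelope: serving cost scales linearly with request count.
          const costPerRequest = 1e-10;       // dollars per page view (figure claimed above)
          const baselineRequests = 1_000_000; // hypothetical monthly traffic

          const baselineCost = baselineRequests * costPerRequest;       // 0.0001 dollars
          const scrapedCost = 100 * baselineRequests * costPerRequest;  // 0.01 dollars

          console.log({ baselineCost, scrapedCost }); // 100x the cost, from a 100x larger request count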

      • jillyboel 8 days ago

        It might scale up, but if you're anywhere near efficient you're way overprovisioned to begin with. The compute cost should be minuscule due to caching, and bandwidth is cheap if you're not with one of the big clouds. As an example, according to dang, HN runs on a single server, and yet many websites that get posted to HN, and thus receive a fraction of the traffic, go down due to the load.
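
        A minimal sketch of the caching point, assuming a plain Node.js origin sitting behind a CDN or reverse proxy (the handler, TTL, and port here are hypothetical): a long-lived Cache-Control header lets the cache absorb repeat crawler hits so the origin barely sees them.

            // Sketch: mark mostly-static pages as cacheable so a CDN or reverse
            // proxy can answer repeat requests without touching the origin.
            import { createServer } from "node:http";

            const server = createServer((_req, res) => {
              res.setHeader("Cache-Control", "public, max-age=3600");
              res.setHeader("Content-Type", "text/html; charset=utf-8");
              res.end("<html><body>recipe page goes here</body></html>");
            });

            server.listen(8080);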

      • immibis 7 days ago

        You got 100x the traffic if your traffic was near zero to begin with.

cyanydeez 8 days ago

I don't think they're doing this to kill off browser engines; they're trying to sift browsers into "user" and "AI slop", so they can prioritize users.

This is entirely a web crawler 2.0 apocalypse.

  • nicman23 8 days ago

    man i just want a bot to buy groceries for me

    • baq 8 days ago

      That’s one of the few reasons to leave the house. I’d like dishes and laundry bots first, please.

      • dodslaser 8 days ago

        You mean dishwashers and washing machines?

  • extraduder_ire 8 days ago

    I think "slop" only refers to the output of generative AI systems. bot, crawler, scraper, or spider would be a more apt term for software making (excessive) requests to collect data.