Comment by cebert 4 days ago

I have to imagine Google added these headers to make it easier for them to identify agentic requests vs human requests. What angers me is that this is yet another signal that can be used to uniquely fingerprint users.

gruez a day ago

It doesn't really meaningfully increase the fingerprinting surface. As the OP mentioned, the hash is generated from constants that are the same for all Chrome builds. The only thing it really does is help distinguish Chrome from other Chromium forks (e.g. Edge or Brave), but there are already enough proprietary bits inside Chrome that you can easily tell it apart.
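The point about constants can be sketched concretely. Below is a hypothetical reconstruction in Python, assuming the scheme described in the reverse-engineering write-up (a SHA-1 digest of a compiled-in API key concatenated with the User-Agent string, base64-encoded); the key and function name here are placeholders, not Chrome's actual values:

```python
import base64
import hashlib

# Both inputs are build-time constants shared by every stock Chrome
# install, so the resulting header carries no per-user entropy -- it
# only identifies the build, which is gruez's point.
API_KEY = "AIzaSy-placeholder-not-a-real-key"  # hypothetical stand-in
UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36"

def validation_header(api_key: str, user_agent: str) -> str:
    """Hypothetical sketch of the reported scheme: base64(SHA-1(key + UA))."""
    digest = hashlib.sha1((api_key + user_agent).encode()).digest()
    return base64.b64encode(digest).decode()

# Deterministic: every user on the same build sends the same value.
assert validation_header(API_KEY, UA) == validation_header(API_KEY, UA)
```

Because the value is a pure function of constants, it distinguishes builds from one another but cannot distinguish one Chrome user from another.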

  • thayne a day ago

    > The only thing it really does is help distinguish Chrome from other Chromium forks (e.g. Edge or Brave)

    You could already do that with the user agent string. What this does is distinguish between Chrome and something else pretending to be Chrome. Say, a Firefox user who is spoofing a Chrome user agent on a site that blocks, or reduces functionality for, the Firefox user agent.

    • bobbiechen a day ago

      Plenty of bots pretend to be Chrome via the user agent, but if you look closely they are actually running headless Chromium. This is a very useful signal for fraud and abuse prevention.

      • thayne a day ago

        Let's ignore for the moment that this has been reverse engineered.

        If they only look at this header, then legitimate users using non-chrome browsers will get treated as bots.

        If these headers are only checked for Chrome user agents, then it would be easy to bypass the check by using headless Chromium with a user agent that spoofs Firefox or Safari.
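The bypass argument can be made concrete with a toy server-side rule. A minimal sketch in Python, assuming a hypothetical WAF policy that only demands the validation header from clients claiming to be Chrome (the function is illustrative; the header name is the one reported in the write-up):

```python
def looks_like_bot(user_agent: str, headers: dict) -> bool:
    """Toy check: require the validation header only when the client
    claims to be Chrome. The rule itself is hypothetical."""
    if "Chrome" in user_agent:
        return "x-browser-validation" not in headers
    # Non-Chrome user agents are never challenged...
    return False

# ...so headless Chromium spoofing a Firefox UA sails straight through:
assert looks_like_bot("Mozilla/5.0 (X11; Linux x86_64) Firefox/128.0", {}) is False
# while a Chrome UA without the header gets flagged:
assert looks_like_bot("Mozilla/5.0 Chrome/126.0 Safari/537.36", {}) is True
assert looks_like_bot("Mozilla/5.0 Chrome/126.0 Safari/537.36",
                      {"x-browser-validation": "anything"}) is False
```

Conversely, if the header were required from every client, legitimate Firefox and Safari users, who can never produce it, would be the ones treated as bots, which is the other half of the argument above.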

      • TechDebtDevin a day ago

        This is what I don't get. Anybody scraping at scale is using headful browsers as a fallback, so this does nothing. I will just find the browser that works and use it.

      • littlestymaar a day ago

        > This is a very useful signal for fraud and abuse prevention.

        Like people spoofing the Chrome UA in Firefox to avoid artificial performance degradation inflicted by Google on their websites...

    • TechDebtDevin a day ago

      I spoof user agents and TLS/browser fingerprints all day. These are the basics. None of this bothers me, tbh. I'm constantly running tests on lots of versions of Chrome, Firefox, and Brave and haven't really seen any impact on bot detection. I do a lot of emulation of other browsers in Chrome. PerimeterX/HUMAN seems to be the only WAF that is really good at catching this.

thayne a day ago

I'm more concerned that, whether intentional or not, this will probably cause problems for users of non-Chrome browsers: say, slowing down requests that don't have this header, responding with different content, etc.

  • userbinator a day ago

    User-agent discrimination has been happening for literally decades at this point, but you're right that this could make things worse.

    • snackbroken a day ago

      User-agent discrimination is tolerable when it's Joe Webmaster doing it out of ignorance. It is not acceptable when it's a company leveraging its dominant position in one market to gain an advantage over its competitors in another. It's not acceptable even if it isn't said company's express intent but merely a "happy accident" that gets "overlooked".

      Indeed, even for those who require a round of mental gymnastics before they concede that monopolies are, like, "bad" or whatever, GP points out precisely how this would constitute "consumer harm".

      • mook a day ago

        Tell that to Google, which intentionally slowed down Firefox even without ad blocking. (I'm talking about serving Firefox the slower fallback path for web components, not the slowdowns when ads don't load.)

qingcharles 17 hours ago

How does that work, though? I have a bunch of automated tasks I use to speed up my workflows, but they all run on top of the regular browser that I also use. I don't see how this war is winnable, not without tracking things like micro-movements of the mouse that might indicate a human, etc.