Comment by nyanpasu64 8 days ago
I suppose it does make sense that a "make curl look like a browser" program would get sponsored by "bypass bot detection" services...
Why do people even think this? Bots almost always just use headful instrumented browsers now. If a human sitting at a keyboard can load the content, so can a bot.
Security measures never prevent all abuse; they raise the cost of abuse enough to push it below an acceptable threshold. Many things work like this: cleaning doesn't eliminate dirt, it dilutes it below an acceptable threshold. Same for "repairing" and "defects", and some other pairs of things that escape me atm.
That's the same argument as for CAPTCHAs: as far as I know, no bots are protesting that they make their lives harder, but as a human, my life is much harder than it needs to be because everything needs me to prove I'm human.
Clean for data ingestion usually means complicated for data creation: optimizing for the advertisers has material cash value downstream, but customers are upstream, and making things harder for them has a material cost too.
We are talking about curl bots here. How is what you are saying relevant?
Maybe you can call a WebGL extension that isn't supported. Or better yet, do a few overdraw passes of fullscreen quads. Their bot will handle it, but it will throttle their CPU like gangbusters.
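Something along these lines, say. The sketch below is one illustrative variant of the extension idea, not any real product's check: WEBGL_debug_renderer_info is a real extension that browsers may decline to expose (getExtension returns null when unsupported), and the renderer string it reports tells you whether you're on a software rasterizer like SwiftShader, which is exactly where overdraw would land on the CPU. The function name and the idea of keying off the string are mine.

    // Illustrative probe: ask WebGL which renderer backs the context.
    // WEBGL_debug_renderer_info is real but optionally exposed; a null
    // here is itself a (weak) signal. A renderer string containing
    // "SwiftShader" means GPU work is actually running on the CPU.
    function rendererName(): string | null {
      const gl = document.createElement("canvas").getContext("webgl");
      if (!gl) return null;                      // no WebGL at all
      const ext = gl.getExtension("WEBGL_debug_renderer_info");
      if (!ext) return null;                     // extension not exposed
      return gl.getParameter(ext.UNMASKED_RENDERER_WEBGL) as string;
    }

    // Hypothetical usage:
    //   const r = rendererName();
    //   if (r && /swiftshader/i.test(r)) { /* pile on the overdraw */ }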
You are just guessing; please stop. Also, you're wrong: all serious scraping uses browsers today.
Easy. Just make a small fragment shader that produces a token in your client. No bot is going to waste GPU resources compiling your shader.
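A minimal sketch of what that could look like, against the plain WebGL 1 API: render one pixel whose color depends on a seed, read it back, and treat the bytes as the token. The seed, the 64-round mixing loop, and the hex encoding are invented for illustration; a real challenge would be issued and verified server-side, ideally with a shader that varies per session.

    // Sketch of a "shader token": compile a tiny fragment shader, render
    // a single pixel, and read the result back as a challenge token.
    function shaderToken(seed: number): string | null {
      const canvas = document.createElement("canvas");
      canvas.width = canvas.height = 1;
      const gl = canvas.getContext("webgl");
      if (!gl) return null; // no WebGL: fall back to another check

      const compile = (type: number, src: string): WebGLShader => {
        const s = gl.createShader(type)!;
        gl.shaderSource(s, src);
        gl.compileShader(s);
        // A real implementation would check
        // gl.getShaderParameter(s, gl.COMPILE_STATUS) and bail on failure.
        return s;
      };

      // Fullscreen triangle; the fragment shader does the actual "work".
      const vs = compile(gl.VERTEX_SHADER,
        "attribute vec2 p; void main() { gl_Position = vec4(p, 0.0, 1.0); }");
      const fs = compile(gl.FRAGMENT_SHADER,
        "precision mediump float; uniform float seed;" +
        "void main() {" +
        "  float x = seed;" +
        "  for (int i = 0; i < 64; i++) { x = fract(sin(x) * 43758.5453); }" +
        "  gl_FragColor = vec4(x, fract(x * 7.0), fract(x * 13.0), 1.0);" +
        "}");

      const prog = gl.createProgram()!;
      gl.attachShader(prog, vs);
      gl.attachShader(prog, fs);
      gl.linkProgram(prog);
      gl.useProgram(prog);

      const buf = gl.createBuffer();
      gl.bindBuffer(gl.ARRAY_BUFFER, buf);
      gl.bufferData(gl.ARRAY_BUFFER,
        new Float32Array([-1, -1, 3, -1, -1, 3]), gl.STATIC_DRAW);
      const loc = gl.getAttribLocation(prog, "p");
      gl.enableVertexAttribArray(loc);
      gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);
      gl.uniform1f(gl.getUniformLocation(prog, "seed"), seed);

      gl.drawArrays(gl.TRIANGLES, 0, 3);

      // Read the rendered pixel back and hex-encode it as the token.
      const px = new Uint8Array(4);
      gl.readPixels(0, 0, 1, 1, gl.RGBA, gl.UNSIGNED_BYTE, px);
      return Array.from(px).map(b => b.toString(16).padStart(2, "0")).join("");
    }

Per the rest of the thread, though, this raises cost rather than blocking anything outright: a headless browser falling back to SwiftShader still produces the token, just slowly, on the CPU.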