It's not like spam farms can't use their own build of Chromium that already mimics a real browser. Relying on client-side indicators for your bot detection will only catch the bots that don't care about being caught in the first place. Show an alert that says "welcome to my site!" to any browser connecting from a data center and you've probably filtered out most of those already.
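For context, that server-side check is about as trivial as bot detection gets: look up whether the client IP falls in a known data-center range. A minimal sketch in Python, assuming you source the CIDR ranges yourself (the ones below are documentation TEST-NET placeholders; real lists are published by the cloud providers or derivable from ASN data):

    import ipaddress

    # Placeholder ranges only; in practice, pull the published
    # cloud-provider IP lists or match on ASN instead.
    DATACENTER_NETS = [
        ipaddress.ip_network("203.0.113.0/24"),   # TEST-NET-3, stand-in
        ipaddress.ip_network("198.51.100.0/24"),  # TEST-NET-2, stand-in
    ]

    def is_datacenter_ip(addr: str) -> bool:
        """True if the client IP sits inside a known data-center range."""
        ip = ipaddress.ip_address(addr)
        return any(ip in net for net in DATACENTER_NETS)

    if is_datacenter_ip("203.0.113.7"):
        print("welcome to my site!")  # the tongue-in-cheek filter from above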
I like automating menial tasks in shitty web UIs (e.g. clearing out a list of sessions/search history/ad providers that only lets you remove a single entry at a time). Simply using Firefox already gets me flagged by a lot of these shitty bot-detection services. I've never seen them do any useful work.
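Concretely, one of those cleanup scripts is about this involved, sketched here with Playwright driving an ordinary Firefox. The URL and button selector are invented for illustration; the whole trick is clicking the first "remove" button until the list is empty:

    from playwright.sync_api import sync_playwright

    # Hypothetical URL and selector; adapt to the UI you're cleaning out.
    with sync_playwright() as p:
        browser = p.firefox.launch(headless=False)  # a plain, real Firefox
        page = browser.new_page()
        page.goto("https://example.com/account/sessions")
        # The UI only deletes one entry per click, so loop until empty.
        while page.locator("button.remove-entry").count() > 0:
            page.locator("button.remove-entry").first.click()
            page.wait_for_load_state("networkidle")
        browser.close()

That's the "bot" these services end up catching.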
The only exceptions are maybe reCAPTCHA and Cloudflare's alternative; those do seem quite good at catching actual bots, though I hate most websites that use them because in Firefox you end up clicking on boats twenty times. They're also trivially bypassed by delegating your spamming to click farms: 1000 minimum-wage workers in a faraway country can be cheaper than paying for the dev time to work around the minor nuisances of bot detection.