
Not even India, it's just piped into all our captchas


“Select all images with illegally parked cars” starts to feel quite possible


That would actually be cool, and a useful application of captchas, for once.


Captchas, at least reCAPTCHA, which is the main image-selection one, exist to enforce Google's browser monopoly and force tracking on you. You're not proving you're a human; you're being punished for not behaving how Google wants. It's vile.


This is a little-known fact and an apt characterization of reCAPTCHA that I wish I could draw even more attention to.


I don't think that's an accurate characterization, because the difficulty of the reCAPTCHA is based upon the property it's protecting. If it were just about behaving how Google wants, then someone who behaved how Google wants shouldn't get tested harder on sketchier sites.


I dunno, I frequently do recaptchas on Firefox on Linux and it hasn't been a problem for me? It does feel like they make me do the image selection more often, but I've never had any problems completing the captchas and it doesn't happen often enough to bother me.


You like wasting time working for free for Google?


Well no, but I also understand why captchas are kinda necessary.


Because Google wants us to work for free?

Because they want to punish those who dare to use Firefox without changing the user agent?


Possible, but I think the more likely explanation is that Firefox users are just higher risk. I don't think the slightly increased rate of captchas is going to drive people off of Firefox.


Do you really think hackers don't know about the User-Agent header?


No, but in whatever little bot/scraper blocking work I did, user-agent blocking was an incredibly powerful low-hanging fruit.

For sensitive material, we would often insta-ban anyone who hit us with a 'curl' user-agent and that definitely killed a lot of script-kiddies' dreams right then and there.

Obviously, a determined "hacker" would get past that and hit the honeypots, but simple filters weeded out enough scrapers that the remaining attacks could be evaluated manually by humans, roughly along the lines of the sketch below.
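For illustration only (not the commenter's actual setup): a minimal sketch of that kind of user-agent filter, assuming a Python/Flask front end; the 'curl' match and the in-memory ban list are hypothetical stand-ins.

    # Minimal sketch of a user-agent filter (assumes Flask; the "curl"
    # check and the in-memory ban list are illustrative only).
    from flask import Flask, request, abort

    app = Flask(__name__)
    banned_ips = set()  # a real deployment would use a shared store

    @app.before_request
    def block_obvious_scrapers():
        ip = request.remote_addr
        if ip in banned_ips:
            abort(403)
        ua = (request.headers.get("User-Agent") or "").lower()
        # Insta-ban anything announcing itself as curl, or sending no UA at all.
        if not ua or "curl" in ua:
            banned_ips.add(ip)
            abort(403)

    @app.route("/")
    def index():
        return "ok"

Determined scrapers just spoof the header, but as a first pass this kind of check drops a lot of low-effort traffic before it ever reaches the application.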


It feels very strange to me that someone can honestly believe that Google isn't pulling shenanigans.


Practically foolproof. A net positive for society.


I think it is unethical to make people do free work in a case like this. A captcha is there to prove whether the user is human or not.



