
Check whether the response time, the length of the "main text", or other indicators fall in the lowest few percentiles -> send the page to the heap for manual review.
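A rough sketch of that first heuristic in Python (the page dict keys and the bottom-5% cutoff are my own assumptions, not something stated above):

    import statistics

    def flag_for_review(pages, cutoff=20):
        """Collect pages whose indicators fall in the lowest few
        percentiles onto a heap of candidates for manual review.

        Assumes `pages` is a list of dicts with "url", "response_time"
        (seconds) and "main_text" keys; cutoff=20 means 20-quantiles,
        i.e. the bottom 5% on each indicator gets flagged.
        """
        text_lengths   = [len(p["main_text"]) for p in pages]
        response_times = [p["response_time"] for p in pages]

        min_text_len  = statistics.quantiles(text_lengths, n=cutoff)[0]
        min_resp_time = statistics.quantiles(response_times, n=cutoff)[0]

        review_heap = []
        for p in pages:
            if len(p["main_text"]) <= min_text_len or p["response_time"] <= min_resp_time:
                review_heap.append(p)
        return review_heap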

Does the inferred "topic" of the domain match the topic of the individual pages? If not -> manual review. And there are many more indicators.
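For the topic check, something along these lines could work; infer_topic() here is a crude keyword stand-in for whatever classifier the crawler already has, and the keyword table is invented for illustration:

    from collections import Counter

    # Invented example keywords; a real crawler would reuse its existing classifier.
    TOPIC_KEYWORDS = {
        "cooking":     {"recipe", "oven", "ingredients", "bake"},
        "programming": {"code", "compiler", "function", "bug"},
    }

    def infer_topic(text):
        """Crude keyword-overlap classifier, standing in for a real model."""
        words = set(text.lower().split())
        scores = {topic: len(words & kws) for topic, kws in TOPIC_KEYWORDS.items()}
        best = max(scores, key=scores.get)
        return best if scores[best] > 0 else "unknown"

    def pages_off_topic(domain_pages):
        """Return pages whose topic disagrees with the domain's majority topic."""
        labels = [infer_topic(p["main_text"]) for p in domain_pages]
        domain_topic = Counter(labels).most_common(1)[0][0]
        return [p for p, label in zip(domain_pages, labels) if label != domain_topic]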

Hire a bunch of student jobbers, have them search GitHub for tarpits, and let them write middleware to detect those.
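The middleware itself can stay simple: keep a hand-curated list of signatures taken from the tarpit implementations the reviewers find and drop matching requests before they are fetched. The patterns below are made-up placeholders:

    import re

    # Placeholder signatures; in practice these come from the tarpit
    # projects the reviewers dig up.
    KNOWN_TARPIT_PATTERNS = [
        re.compile(r"/page/\d{5,}"),        # endless generated pagination
        re.compile(r"(/[a-f0-9]{8}){4,}"),  # long chains of random-looking path segments
    ]

    def tarpit_middleware(fetch):
        """Wrap a fetch(url) callable; skip URLs matching known tarpit signatures."""
        def wrapped(url):
            if any(p.search(url) for p in KNOWN_TARPIT_PATTERNS):
                return None  # drop the request instead of crawling into the tarpit
            return fetch(url)
        return wrapped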

If you are doing broad crawling, you already need to do this kind of thing anyway.




> Hire a bunch of student jobbers,

Do people still do this, or do they just offshore the task?



