
Let me add more: crawling the web is costly for EVERYONE.

The more crawlers out there, the more useless traffic is being served by every single website on the internet.

In an ideal world, there would be a single authoritative index, just as we have with web domains, and all players would cooperate in building, maintaining, and improving it, so websites would not need to be constantly hammered by thousands of crawlers every day.



I already get hit by literally hundreds of crawlers, presumably trying to find grist for their AI mills.

One more crawler for a search index wouldn’t hurt.


Bandwidth is cheap. I also like seeing more traffic in the logs.


Yeah, not that cheap. There are a few articles on HN right now about small, independent websites being essentially DDoSed by crawlers. Although, to be fair, mostly AI crawlers.
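For sites in that position, the usual first line of defense is robots.txt. A minimal sketch, assuming the crawlers identify themselves honestly (the user-agent names below are common AI crawlers; aggressive bots often ignore robots.txt entirely, so server-side rate limiting is the real backstop):

```
# robots.txt — ask known AI crawlers to stay away,
# and throttle everyone else who honors Crawl-delay.
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: *
Crawl-delay: 10
```

Note that Crawl-delay is a de facto extension honored by some crawlers, not part of the robots.txt standard (RFC 9309), so it is advisory at best.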



