
Then why do they want Google's search index?


Crawling the web is costly. I assume it's cheaper to use the results from someone else's crawling. I don't know what Kagi is using to argue that they should have access to Google's index, but I'd guess it's some form of antitrust argument.


Let me add more: crawling the web is costly for EVERYONE.

The more crawlers out there, the more useless traffic is being served by every single website on the internet.

In an ideal world, there would be a single authoritative index, just as we have with web domains, and all players would cooperate in building, maintaining, and improving it, so websites would not need to be constantly hammered by thousands of crawlers every day.


I already get hit by literally hundreds of crawlers, presumably trying to find grist for their AI mills.

One more crawler for a search index wouldn’t hurt.


Bandwidth is cheap. I also like seeing more traffic in the logs.


Yeah, not that cheap. There are a few articles on HN now about small, independent websites being essentially DDoS'd by crawlers. Although, to be fair, mostly AI crawlers.


So they can work even better...?



