
Banning Edge to deal with the problem is probably not ideal.

What I am thinking about may be even less ideal.

For people actively working on these projects, how about putting the git server on a private network with VPN or SSH access?

Use a separate read-only static git server facing the public net.
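
Very roughly, the public side could just be a bare mirror that a cron-style job keeps in sync with the private origin. A minimal sketch in Python, assuming an SSH-reachable private host and a local mirror path (both names are made up):

  import subprocess

  # Hypothetical hosts: the read/write origin sits behind the VPN,
  # the mirror is the only thing exposed to the public internet.
  PRIVATE_ORIGIN = "ssh://git@git.internal.example/project.git"
  PUBLIC_MIRROR = "/srv/git/project.git"  # served read-only, e.g. over dumb HTTP

  def refresh_mirror() -> None:
      """Create the bare mirror once, then keep it in sync."""
      try:
          # Is there already a git repository at the mirror path?
          subprocess.run(["git", "-C", PUBLIC_MIRROR, "rev-parse", "--git-dir"],
                         check=True, capture_output=True)
      except subprocess.CalledProcessError:
          # First run: clone a bare mirror of the private repository.
          subprocess.run(["git", "clone", "--mirror", PRIVATE_ORIGIN, PUBLIC_MIRROR],
                         check=True)
          return
      # Subsequent runs: update all refs from the private origin.
      subprocess.run(["git", "-C", PUBLIC_MIRROR, "remote", "update", "--prune"],
                     check=True)

  if __name__ == "__main__":
      refresh_mirror()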



I was also thinking about VPNs, but the static copy still has to serve a lot of traffic, so I don't know if that's an economically viable solution. Furthermore, it creates a market for VPN credentials, but that's another issue. At least I expect that a bot with sold or stolen credentials will be easier to discover.

Anyway, why not git clone the project and parse it locally instead of scraping the web pages? I understand that scraping works on every kind of content, but given the scale, a git clone plus a periodic git fetch could save money even for the scrapers.
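
On the consumer side, a minimal sketch of "clone once, fetch periodically, parse locally" (the repository URL and checkout path are placeholders):

  import os
  import subprocess

  # Placeholder repository; any public clone URL would do.
  REPO_URL = "https://example.org/project.git"
  WORKDIR = "project-checkout"

  def sync() -> None:
      """Clone on the first run, fast-forward on later runs."""
      if not os.path.isdir(WORKDIR):
          subprocess.run(["git", "clone", REPO_URL, WORKDIR], check=True)
      else:
          subprocess.run(["git", "-C", WORKDIR, "pull", "--ff-only"], check=True)

  def recent_commits(limit: int = 20) -> list[str]:
      """Read history locally instead of scraping the web UI."""
      out = subprocess.run(
          ["git", "-C", WORKDIR, "log", f"-{limit}", "--pretty=format:%h %s"],
          check=True, capture_output=True, text=True)
      return out.stdout.splitlines()

  if __name__ == "__main__":
      sync()
      for line in recent_commits():
          print(line)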

Finally, all of this reminds me of Peter Watts's Maelstrom, in which viruses infested the Internet so much (at about this time in history) that nobody was using it anymore. [1]

[1] https://rifters.com/maelstrom/maelstrom_master.htm


Yes, this might be the best solution: put a read-only repo on GitHub or some other public-facing host.



