Hacker News
usui on Nov 5, 2020 | on: GitHub Source Code Leak
I know about that because I use robots.txt on my personal website to exclude it from archiving, but how do you automatically exclude links that were already archived?
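
As background for the question, here is a minimal robots.txt sketch of the kind of rule the comment describes (example.com is a placeholder; `ia_archiver` is the user-agent the Internet Archive's crawler has long identified itself with):

    # Served from the site root, e.g. https://example.com/robots.txt
    User-agent: ia_archiver
    Disallow: /

Historically the Wayback Machine treated such a rule retroactively, hiding snapshots that had already been archived for as long as the rule stayed in place, though the Internet Archive announced in 2017 that it was moving away from strictly honoring robots.txt.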