paraknight on Nov 5, 2020 | on: GitHub Source Code Leak
You don't have to pressure them to remove a page. As I recall, all you need to do is add a line to your robots.txt to have it excluded, and you can also just request exclusion of a page you own.
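For reference, a minimal sketch of the kind of robots.txt entry this refers to, assuming the Internet Archive's ia_archiver user agent; how strictly the Wayback Machine honors it has varied over time:

  # Ask the Internet Archive's crawler to skip the whole site
  User-agent: ia_archiver
  Disallow: /

You can narrow Disallow to a specific path (e.g. /private/) instead of / if you only want certain pages excluded.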
usui on Nov 5, 2020
I know about that because I use robots.txt on my personal website to exclude pages, but how do you automatically exclude links that were already archived?