
You don't have to pressure them to remove a page. As I recall, all you needed to do was add a line to your robots.txt to have a page excluded, and you could also simply request exclusion of a page you own.
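For reference, a minimal robots.txt sketch. This assumes the thread is about the Wayback Machine, whose crawler has historically identified itself as ia_archiver; the user-agent token and the /private/ path are illustrative, not taken from the article.

  # Hypothetical example: ask the Internet Archive crawler to skip /private/
  User-agent: ia_archiver
  Disallow: /private/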



I know about that because I use robots.txt on my personal website to exclude pages, but how do you automatically exclude links that were already archived?



