“Private links” should be covered by robots.txt. The only case where I see this happening is with those “anyone with the link” shares, and those are easy to cover.
Anything private should ideally be put behind authentication. If that isn't possible, then use robots.txt. Search engines are _meant_ to index everything that is publicly accessible and not disallowed in robots.txt.
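For example, a share host could block crawlers from its link-share paths with a couple of lines (the `/s/` prefix here is just an illustration; the real path depends on the service):

```
User-agent: *
Disallow: /s/
```

That keeps well-behaved crawlers like Googlebot and Bingbot out of those URLs, though it's advisory only and no substitute for real authentication.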
Imagine you send someone a "private" link to a file... Bing sees that and indexes it for the world to see. Not cool.