I'm pretty sure I could easily bot this though. I'm not an expert web scraper/automator by any means, and my experience with PhantomJS is probably quite out of date, but what's stopping me from running x,000's of little scripts on x,000 different VPSes (or a botnet, if I'm a bit evil), setting up a unique-looking User Agent and client environment in case of any fingerprinting, and adding my competitor to a blacklist x,000 times? I'm pretty sure I could make it all look human enough. The manual check doesn't stop me from flooding the moderation queue at the very least, and perhaps the mod gets genuinely fooled, or convinced that my competitor must be up to no good.
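To make it concrete, this is roughly the kind of throwaway script I mean, sketched with Puppeteer (the report URL, form selectors, and user-agent list are all made up for illustration, not taken from any real site):

```typescript
import puppeteer from "puppeteer";

// A couple of plausible-looking user agents to rotate through.
const USER_AGENTS = [
  "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/124.0 Safari/537.36",
  "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.4 Safari/605.1.15",
];

async function submitReport(targetDomain: string): Promise<void> {
  const browser = await puppeteer.launch({ headless: true });
  const page = await browser.newPage();

  // Pick a user agent and viewport at random so each "visitor" looks a little different.
  await page.setUserAgent(USER_AGENTS[Math.floor(Math.random() * USER_AGENTS.length)]);
  await page.setViewport({ width: 1280 + Math.floor(Math.random() * 200), height: 800 });

  // Placeholder URL and selectors; a real run would target the actual report form.
  await page.goto("https://example-search-engine.test/report");
  await page.type("#domain", targetDomain);
  await page.click("#submit");

  await browser.close();
}

submitReport("competitor.example").catch(console.error);
```

Fan that out across a few thousand cheap machines and you've flooded the queue.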
Unfortunately, you only have to compete with the real users of that specific feature, and there aren't that many of them when measured against the power of even a cheap attack.
1. Even enabling personal blacklists would be a huge benefit to power users, even if they aren't aggregated.
2. The user I replied to above suggested running thousands of VPSes or using a criminal botnet. Both of these are significant hurdles.
Also, while running thousands of VPSes isn't a significant hurdle in today's clouds if it's a one-time job, it at least becomes expensive if the blacklists are generated from active users at random times around the clock. That forces would-be abusers not just to fire up their VPSes, run a couple of queries, and shut them down, but to keep them running around the clock, or at least in 8-hour shifts to avoid triggering abuse detection.
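Rough numbers, assuming ~$5/month for a small VPS (an assumption, not a quote from any provider):

```typescript
// Back-of-the-envelope cost comparison: one-off burst vs. always-on presence.
const vpsCount = 1000;
const monthlyPricePerVps = 5; // USD, assumed small-VPS price

// One-time job: rent for a couple of hours, fire the queries, tear down.
const hourlyPrice = monthlyPricePerVps / (30 * 24);
const oneOffCost = vpsCount * hourlyPrice * 2; // ≈ $14 for two hours

// Mimicking real users at random times around the clock means paying for the full month.
const continuousMonthlyCost = vpsCount * monthlyPricePerVps; // $5,000/month

console.log({ oneOffCost: oneOffCost.toFixed(2), continuousMonthlyCost });
```

A one-off burst is pocket change; keeping the fleet alive to look like real users is a recurring bill.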
If a site shows up on more than x blacklists, manually take a look at it and decide whether to remove it or keep it.
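Something like the following, where the threshold value and data shapes are just illustrative:

```typescript
// Count how many personal blacklists each site appears on and queue anything
// over the threshold for manual review.
const REVIEW_THRESHOLD = 50; // the "x" above; a tuning knob, not a magic number

// blacklists: user id -> set of domains that user has personally blacklisted
function sitesNeedingReview(blacklists: Map<string, Set<string>>): string[] {
  const counts = new Map<string, number>();
  for (const sites of blacklists.values()) {
    for (const site of sites) {
      counts.set(site, (counts.get(site) ?? 0) + 1);
    }
  }
  return [...counts.entries()]
    .filter(([, n]) => n > REVIEW_THRESHOLD)
    .map(([site]) => site);
}
```

The manual review step is what keeps a flood of fake submissions from directly poisoning the global list; it only decides what a human looks at next.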