Rather than blocking on the User-Agent, just add some honeypots: an invisible link, for example. Scrapers tend to extract and follow every link on a page, so any client that pulls the honeypot page gets blocked.
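A rough sketch of what that looks like server-side, assuming Flask and an in-memory blocklist (the /trap-xyz path and the blocked_ips set are made up for illustration):

```python
from flask import Flask, request, abort

app = Flask(__name__)
blocked_ips = set()  # in production: a shared store, fail2ban, or firewall rules

@app.before_request
def enforce_blocklist():
    # Reject anything from an IP that already tripped a trap
    if request.remote_addr in blocked_ips:
        abort(403)

# The honeypot path. Real pages link to it invisibly, e.g.
# <a href="/trap-xyz" style="display:none" aria-hidden="true" tabindex="-1"></a>
@app.route("/trap-xyz")
def honeypot():
    blocked_ips.add(request.remote_addr)
    abort(403)
```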
Use robots.txt to disallow specific pages. Bots ignore robots.txt 99% of the time, so if a client pulls a disallowed page: block.
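Same idea, sketched as a continuation of the hypothetical Flask app above; the /do-not-crawl path is invented:

```python
# Continues the sketch above (app, blocked_ips, request, abort already defined).
ROBOTS_TXT = """\
User-agent: *
Disallow: /do-not-crawl
"""

@app.route("/robots.txt")
def robots():
    return ROBOTS_TXT, 200, {"Content-Type": "text/plain"}

# Nothing legitimate ever links here, and compliant crawlers are told to stay
# out, so any request to this path is a bot that ignored robots.txt.
@app.route("/do-not-crawl")
def robots_trap():
    blocked_ips.add(request.remote_addr)
    abort(403)
```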
Check how quickly pages are pulled. If a client passes a rate threshold: block.
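A crude sliding-window version of that check, again continuing the same hypothetical app; the window and threshold numbers are arbitrary examples:

```python
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 10
MAX_REQUESTS = 50  # no human paginates this fast; tune to your traffic

request_log = defaultdict(deque)  # ip -> timestamps of recent requests

@app.before_request
def rate_check():
    now = time.time()
    hits = request_log[request.remote_addr]
    hits.append(now)
    # Drop timestamps that have fallen out of the window
    while hits and hits[0] < now - WINDOW_SECONDS:
        hits.popleft()
    if len(hits) > MAX_REQUESTS:
        blocked_ips.add(request.remote_addr)
        abort(403)
```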
I've seen bot traffic claiming to be recent versions of Firefox, coming from residential IPs in Ukraine, pulling robots.txt. Sometimes that's one of the few clues you get.