They have an aggressive crawler that I block. Why would I let a company use my resources to gather my data just so they can sell it to others for a premium at my expense?
They use a specific UA, which is in the spirit of robots.txt: you're able to identify the crawler and allow or disallow it.
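For example, a minimal robots.txt rule targeting their published UA token (AhrefsBot is the token they document; adjust if you're blocking a different crawler):

    User-agent: AhrefsBot
    Disallow: /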
Trying to masquerade as another agent would be considered bad form, but it obviously happens a lot.
There's a similar bot, MJ12Bot, which powers Majestic's index (Majestic is comparable to Ahrefs). IIRC they have a user agent, but their crawling is distributed, so there's no fixed set of source IPs and it's impossible to verify whether a request with that UA is really them or someone else masquerading.
Good practice for bot owners is to publish a UA and crawl from known IPs that can be verified with forward and reverse DNS lookups.
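A rough sketch of that check in Python (forward-confirmed reverse DNS). The ".ahrefs.com" suffix and the sample IP are assumptions for illustration; check the operator's documentation for the actual hostnames or published IP ranges:

    import socket

    def verify_crawler_ip(ip, expected_suffixes=(".ahrefs.com",)):
        """Reverse-resolve the IP, check the hostname belongs to the bot
        operator's domain, then forward-resolve that hostname and confirm
        it maps back to the same IP."""
        try:
            hostname, _, _ = socket.gethostbyaddr(ip)  # reverse DNS (PTR)
        except socket.herror:
            return False
        if not hostname.endswith(expected_suffixes):
            return False
        try:
            _, _, forward_ips = socket.gethostbyname_ex(hostname)  # forward DNS (A)
        except socket.gaierror:
            return False
        return ip in forward_ips

    # True only if the PTR record is under the expected domain and resolves back.
    print(verify_crawler_ip("203.0.113.10"))

Anyone can spoof a UA string, but they can't make your reverse lookup of their IP return a hostname under the operator's domain that also resolves back to that IP, which is why this check works where UA matching alone doesn't.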