When the work you're doing is several layers of abstraction removed from its worst applications, it's easy to rationalize. For the authoritarian leader, this is a huge advantage of specialization and "replaceable cog in the machine"-style job standardization: no one person builds enough of the "evil" thing to feel responsible for the result, and most workers are replaceable enough that the "if I don't do it, they'll just get someone else who will" rationalization is probably correct.
> that the "if I don't do it, they'll just get someone else who will" rationalization is probably correct.
That may be correct, but from an ethical point of view, it's completely bankrupt. People who justify doing things they know to be unethical on the basis that someone else will just do it anyway are, of course, being unethical even if they are correct.
Even worse are the people who reason that since others will do it anyway, it's better that they themselves (obviously being good people) do it instead of those other, terrible people.
Everything can (and will) be weaponized, so the only realistic way to approach any new technology is as a cost/benefit analysis. If it can bring more good than harm, excellent. If it can bring more harm than good, then maybe rethink things.
The upside of this technology is that it allows a few people with neurological damage to communicate. The downside is that every government on earth would be able to read your mind. Sounds like a good trade!