"Someone else will do it" is not, has never been, and never will be valid moral reasoning. You are responsible for your own actions, not for what someone else might do if you refuse to take actions you consider immoral.
> "someone else will do it" is not, has never been, and never will be valid moral reasoning
I’m not saying someone has to do it. I’m saying that if you work on AI, it’s delusional to think you can keep it from being applied militarily. If you’re opposed to military AI, stop building AI. (But don’t pretend you made a difference. You’re absolving yourself, nothing more.)