Who should develop biological weapons? Chemical weapons? Nuclear weapons?
Ideally no one, and if the cost and expertise required are so niche that only a handful of sophisticated actors could actually do it, then in fact (by way of enforceable treaty) no one.
> Who should develop biological weapons? Chemical weapons? Nuclear weapons?
Anyone who wants to establish deterrence against superiors or peers, and open up options for handling weaker opponents.
> enforceable treaty
Such a thing does not exist. International affairs are and will always be in a state of anarchy. If at some point they aren't, then there is no "international" anymore.
> in other words, cede military superiority to your enemies?
We're talking about making war slightly more expensive for yourself to preserve the things that matter, which is a trade-off that we make all the time. Even in war you don't have to race to the bottom for every marginal fraction-of-a-percent edge. We've managed to ban antipersonnel landmines, for example, and this is an extremely similar case.
> How would you enforce it after you get nuked?
And yet we've somehow managed to avoid getting into nuclear wars.
Refusal to make or use AI-enabled weapons is not "making war slightly more expensive for yourself"; it's like giving up on the Manhattan Project because the product is dangerous.
Feels good but will lead to disaster in the long run.
If we hadn't developed nuclear weapons we would still be burning coal and would probably be even closer to death from global warming. The answer here is that government contractors should be developing the various types of weapons, as they are; people just don't think of Google as a government contractor for some reason.