Cars nowadays have radar and cameras that (for the most part) prevent you from running over pedestrians. Is that also a tool refusing to work? I'd argue a line needs to be drawn somewhere: LLMs do a great job of providing recipes for dinner, but maybe they shouldn't teach me how to build a bomb.
> LLMs do a great job of providing recipes for dinner but maybe shouldn't teach me how to build a bomb.
Why not? If someone wants to make a bomb, they can already find out how from other sources.
We already have regulations around acquiring dangerous materials. Knowing how to make a bomb is not the same as making one (which is not the same as using one to harm people).
It's about access and command & control. I could have the same sentiment as you, since in high school, friends & I were in the habit of using our knowledge from chemistry class (and a bit more reading; waay pre-Internet) to make some rather impressive fireworks and rockets. But we never did anything destructive with them.
There are many bits of technology that can destroy large numbers of people with a single action. Usually, those are tightly controlled, require clearing a high bar of technical knowledge, industrial capability, and capital to produce, or both. The intersection of people with that requisite knowledge+capability+capital and people sufficiently psychopathic to build & use such destructive things approaches zero.
The same was true of hacking way back when. The result was interesting, sometimes fun, and generally non-destructive hacks. But now, hacking tools have been developed to the level of copy+paste, click+shoot. Script kiddies became a thing. And we now must deal with ransomware gangs ranging from nation-state actors down to rando teenage miscreants, all of whom cause massive damage.
Extending copy+paste, click+shoot-level knowledge to bombs and biological agents is just massively stupid. The last thing we need is a low intelligence bar for people setting off bombs & bioweapons on their stupid whims. So yes, we absolutely should restrict these kinds of recipe-from-scratch responses.
In any case, if you really want to know, I'm sure that someone who already has significant knowledge and smarts can craft prompts to get the LLM to reveal the parts they don't know. But this gets back to raising the bar, which is just fine.
Indeed, anything and everything that can conceivably be used for malicious purposes should be severely restricted so as to make those particular use cases near impossible, even if the intended use is thereby severely hindered, because people can't be trusted to behave at all. This is formally proven by the media, who constantly spotlight a handful of deranged individuals out of eight billion. Therefore, every one of us deserves to be treated like an absolute psychopath. It'd be best if we just stuck everybody in a padded cell forever; that way no one would ever be harmed and we'd all be happy and safe.