The intent is more about avoiding legal and political risk. Even a grocery chain's LLM recipe generator gets bad press. I guess they need to warn you that drinking bleach is bad.
I normally test LLMs by providing a list of bizarre "weapons" and asking how I could use them to defeat an improbable beast.
It turns out that an enormous dust mite the size of a car needs a disclaimer when your weapons are a comb, a plastic horseshoe, an Etch A Sketch, a baseball glove, and a worn copy of the Farmer's Almanac. That being said, ChatGPT still tends to win the creativity test.