Telling an LLM 'do not hallucinate' doesn't make it stop hallucinating. Anyone who has used an LLM even moderately seriously can tell you that. They're very useful tools, but right now they're mostly good for writing boilerplate that you'll be reviewing anyway.
That’s the irony of the GP’s comment. In a sense, that poster is the one hallucinating, and we’re having to “correct” their claim that they use LLMs deeply.