LLMs hallucinate on legal questions because they hallucinate on everything.
Hallucination isn't some special, aberrant kind of error: it's the entire process by which LLMs work. Proponents and enthusiasts just call the errors "hallucinations" because it sounds better than admitting "this technology makes no distinction between correct and incorrect, and will often incorrectly correct itself when an error is pointed out".
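To make that concrete, here's a minimal sketch of the core generation loop (a toy stand-in with a made-up vocabulary and random weights, not any real model's code): the model repeatedly samples the next token from a probability distribution, and nowhere in that loop does anything consult a notion of truth.

```python
# Toy sketch of an LLM's generation loop: sample the next token from a
# probability distribution, append it, repeat. The mechanism is identical
# whether the continuation it picks happens to be true or false.
import random

VOCAB = ["The", "court", "held", "reversed", "affirmed", "in", "2019", "2021", "."]

def toy_next_token_probs(context: list[str]) -> list[float]:
    # Stand-in for a real model's softmax over logits; any distribution
    # summing to 1 will do for the sketch.
    weights = [random.random() for _ in VOCAB]
    total = sum(weights)
    return [w / total for w in weights]

def generate(prompt: list[str], max_tokens: int = 8) -> list[str]:
    tokens = list(prompt)
    for _ in range(max_tokens):
        probs = toy_next_token_probs(tokens)
        # No correctness check here, and none in a real model either:
        # only probability mass decides what comes next.
        tokens.append(random.choices(VOCAB, weights=probs, k=1)[0])
    return tokens

print(" ".join(generate(["The", "court"])))
```

A "hallucinated" case citation and a real one come out of the exact same sampling step; the label describes how we feel about the output, not a different code path.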