> The model is a pattern recognition engine, and the problem we call "hallucination" is actually that the model is recognizing patterns from data that is similar to, but not an exact match for, what we want.
But that's a terrible fit for law. In law, the differences really matter: a citation that is almost right is not weak authority, it is no authority at all.
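To make the "similar but not exact" point concrete, here is a minimal sketch that uses Python's `difflib` fuzzy matching as a crude stand-in for the model's pattern recognizer. Every case name and citation in it is invented for illustration; this is an analogy for the failure mode, not how any particular model works internally.

```python
# A toy sketch of "similar but not exact" pattern matching, using difflib's
# fuzzy string matching as a stand-in for the model's pattern recognizer.
# All case names and citations below are made up for illustration.
import difflib

# A tiny "training set" of citations the matcher has seen before.
known_cases = [
    "Smith v. Jones, 123 U.S. 456 (1987)",
    "Smith v. Johnson, 234 U.S. 567 (1991)",
    "Smyth v. Jones, 345 U.S. 678 (1979)",
]

# The user asks about a case that does not exist in the data.
query = "Smith v. Jonas"

# The matcher returns the closest pattern it knows: plausible, confident,
# and wrong. In law, that near-miss is the difference between a real
# authority and a fabricated one.
match = difflib.get_close_matches(query, known_cases, n=1, cutoff=0.3)
print(match)  # ['Smith v. Jones, 123 U.S. 456 (1987)']
```

The point of the sketch: the matcher never says "I don't know that case." It always produces the nearest pattern it has, and the closer the fake query sits to real data, the more convincing the wrong answer looks.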