
> The model is a pattern recognition engine and the problem we call "hallucination" is actually that the model is recognizing patterns from data that is similar to, but not an exact match to what we want.

But that's a terrible fit for law, where the differences really matter.


