
LLMs hallucinate on legal questions because they hallucinate on everything.

Hallucination isn't a special weird form of error: it's the entire process by which LLMs work. Proponents and enthusiasts just call the errors "hallucinations" because it sounds better than admitting "this technology makes no distinction between correct and incorrect, and will often incorrectly correct itself when an error is pointed out".
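To make the point concrete, here is a minimal sketch (toy vocabulary and a stand-in `toy_logits` function, not any real model's API) of the autoregressive loop an LLM runs: score the vocabulary, sample the next token from the resulting distribution, append, repeat. Nothing in the loop distinguishes a true continuation from a false one.

  import numpy as np

  rng = np.random.default_rng(0)
  VOCAB = ["the", "court", "ruled", "held", "in", "1987", "2003", "."]  # toy vocabulary

  def toy_logits(context: list[str]) -> np.ndarray:
      """Stand-in for a trained network: returns one score per vocabulary token."""
      return rng.normal(size=len(VOCAB))

  def sample_next(context: list[str]) -> str:
      logits = toy_logits(context)
      probs = np.exp(logits - logits.max())
      probs /= probs.sum()                  # softmax over the vocabulary
      return rng.choice(VOCAB, p=probs)     # correct and incorrect tokens are sampled by the same rule

  tokens = ["the", "court"]
  for _ in range(6):
      tokens.append(sample_next(tokens))
  print(" ".join(tokens))

Whether the sampled date is the right one or a fabricated one, the mechanism that produced it is identical; "hallucination" is just the name given to the cases where the output happens to be wrong.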


