
> Models don't have access to "reality"

This is an explanation of why models "hallucinate", not a criticism of the provided definition of hallucination.





That's a poor definition, then. It claims that a model is "hallucinating" whenever its output doesn't match a reference point it can't possibly have accurate information about. How is that a "hallucination" in any meaningful sense?


