
Is it really that fuzzy, though? If it were, would language be able to adequately grasp and model our realities? And what about the physical world itself: animals model it adequately enough to navigate it. There are significant gains to be made from modeling _enough_ of the world, without falling into the hallucinations of an LLM's purely statistical associations.

