
Isn't that reasoning a "philosophical error" in itself, though? If you make embodied experience a prerequisite, then things that cannot have embodied experience can't meet that prerequisite. That doesn't seem like a very interesting insight.

An AI literally cannot embody pain: it has no nervous system and no pain receptors, so it is excluded from understanding pain in that way by definition. It has no sensory perception of any kind, so it cannot have that kind of embodied experience. Heck, it doesn't even have a body with which to embody anything. This is obviously unsatisfactory because it seems like just a logical/rhetorical trick.

It's also no different from the case of a person with no visual apparatus (mentioned in another comment thread) and whether they can think about light and colour and so on. The fact that they are physically unable to have the same kind of experience of these things as someone else doesn't preclude them from having thoughts and experiences within the domain of their own perception.

An LLM is even more limited than AI in general, because it is literally a model of language. I don't personally think any LLM could conceivably have a theory of mind, but arguing that it cannot have one simply because of things that are, by definition, exogenous to language seems arbitrary.


