
I see what you're saying: ChatGPT doesn't have a physical relationship with the world, doesn't have agency (it's essentially paused until given input), doesn't have reward/punishment stimuli, etc.

I do think that a large portion of what seems to be missing here is trivial to add, relative to the effort of creating ChatGPT in the first place.

Side note: I'm not sure 'semantic relationship' is the right term here. Pretty sure it is specific to relationships between linguistic constructs. That wording very much triggered my "Bah, dualism!" response, as I thought you were insinuating some metaphysical bond between the mind and the world. Maybe "meaningful relationship" would serve better?



> I do think that a large portion of what seems to be missing here is trivial to add

If you really think it's trivial, then do it! I would be interested to see the results of anyone doing this. But there aren't any to see right now.

> I'm not sure 'semantic relationship' is the right term here.

It might not be; but in the cognitive science literature that term is used for more than just relationships between linguistic constructs: it also covers relationships between internal features of a model or entity and features of the external world. I think this usage is common in robotics too, and more generally in fields like mechanical engineering that are often concerned with writing software to do things like manage fuel and air flow in car engines.


Look again. There are papers that hook LLMs up to robots with vision and other sensors. The LLM is fed descriptions of the world and then emits instructions for where to go.
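
For what it's worth, the loop in those papers is thin enough to sketch. Here's a hypothetical outline in Python; query_llm, the scene format, and the action vocabulary are placeholders of my own, not any particular paper's API:

    def query_llm(prompt: str) -> str:
        """Placeholder for a call to any chat-completion model."""
        raise NotImplementedError

    def describe_scene(detections: list[dict]) -> str:
        """Turn raw detector output into the text the LLM actually sees."""
        return "; ".join(
            f"{d['label']} at ({d['x']:.1f}, {d['y']:.1f})" for d in detections
        )

    ACTIONS = {"move_to", "pick_up", "stop"}  # constrained action vocabulary

    def control_step(detections: list[dict]) -> str:
        prompt = (
            "You control a robot. Scene: " + describe_scene(detections) + "\n"
            "Reply with exactly one action from: " + ", ".join(sorted(ACTIONS))
        )
        action = query_llm(prompt).strip()
        # The LLM only ever sees and emits text; the grounding happens in
        # this translation layer, which rejects anything outside the vocabulary.
        return action if action in ACTIONS else "stop"

Arguably the "semantic relationship" to the world lives in describe_scene and the action validator rather than in the model itself.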


The thought of an LLM interacting with the world through a MUD is entertaining :)
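
It's also about the easiest embodiment to actually build, since a MUD is just plain text over a socket. A hypothetical sketch (the host, port, and query_llm are stand-ins, not a real server or API):

    import socket

    def query_llm(transcript: str) -> str:
        """Placeholder for a call to any language model; returns one command."""
        raise NotImplementedError

    def play(host: str = "mud.example.com", port: int = 4000) -> None:
        with socket.create_connection((host, port)) as conn:
            transcript = ""
            while True:
                chunk = conn.recv(4096).decode("utf-8", errors="replace")
                if not chunk:
                    break  # server closed the connection
                transcript += chunk
                # Keep only the tail so the prompt stays within context limits.
                command = query_llm(transcript[-4000:])
                conn.sendall((command + "\n").encode("utf-8"))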



