
I think it's pretty clear that we have a fancy autocomplete, but the other components are not the same. Reasoning is not just stringing together likely tokens, and our development of mathematics seems to be an externalization of some very deep internal logic. Our memory system seems to be its own thing as well, and can't easily be brushed off as a simple storage system, since it is highly associative and very mutable.
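To make that contrast concrete, here's a toy sketch (my own illustration, not from the comment): an "associative, mutable" memory next to a plain key-value store. A noisy, partial cue still retrieves the nearest stored trace, and each act of recall slightly rewrites that trace — a dict lookup does neither.

```python
# Toy illustration only: associative retrieval by similarity, with
# reconstructive (mutating) recall. All names here are made up.
import math

def cosine(a, b):
    # Cosine similarity between two feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

class AssociativeMemory:
    def __init__(self):
        self.traces = {}  # label -> feature vector

    def store(self, label, vec):
        self.traces[label] = list(vec)

    def recall(self, cue, blend=0.2):
        # Retrieve by similarity, not by exact key match.
        label, trace = max(self.traces.items(),
                           key=lambda kv: cosine(cue, kv[1]))
        # Recall is reconstructive: the cue bleeds into the stored trace.
        self.traces[label] = [(1 - blend) * t + blend * c
                              for t, c in zip(trace, cue)]
        return label

mem = AssociativeMemory()
mem.store("beach trip", [1.0, 0.9, 0.1])
mem.store("exam day",   [0.1, 0.2, 1.0])

# A distorted partial cue still finds the right memory...
print(mem.recall([0.8, 1.0, 0.0]))  # -> beach trip
# ...but recalling it has already altered the stored trace.
print(mem.traces["beach trip"])
```

A plain `dict` with the cue as the key would raise a `KeyError` here, and reading it would never change what's stored — which is the gap the comment is pointing at.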

There are lots of other parts that don't fit the ChatGPT model either: subconscious problem solving, our babbling stream of consciousness, our spatial abilities, and our subjective experience of self being the big ones.
