I think it's pretty clear that language production works something like a fancy autocomplete, but the other components of our cognition are not the same. Reasoning is not just stringing together likely tokens, and our development of mathematics seems to be an externalization of some very deep internal logic. Our memory system seems to be its own thing as well; it can't be easily brushed off as a simple storage system, since it is highly associative and very mutable.
There are lots of other parts that don't fit the ChatGPT model either: subconscious problem solving, our babbling stream of consciousness, our spatial abilities, and our subjective experience of self being big ones.