
> Studies like this should make it evident that LLMs are not reasoning at all. An AI that would reason like humans....

Humans don't reason either. Reasoning is something we do in writing, especially with mathematical and logical notation. Just about everything else that feels like reasoning is something much less.

This has been widely known at least since the stories in which Socrates made everybody look like fools. But it's also what the psychological research shows: what people feel they're doing when they reason is very different from what they're actually doing.




Well no, most people can reason without writing or speaking. I can just think and reason about anything. Not sure what you mean.

Reasoning is something like structured thought. You have a series of thoughts that build on each other to produce some conclusion (also a thought). If we assume the brain is a computer, then thoughts and reasoning are implemented in brain software by some kind of algorithm, and I think it's pretty obvious that this algorithm is completely different from what happens in LLMs. Different enough that we can safely say an LLM is not reasoning the way the brain does.

There is also a semantic argument: since we don't know exactly what humans are doing, we could stretch the word and apply it to AI as well. But I think that muddies the waters and feeds the hype, which I don't believe will deliver what it's promising.


That's not at all what the brain does though.

What the brain does is closer to activating a bunch of different ideas in parallel. Some of those activations rise to the level of awareness, some don't. Each activation triggers others by common association. And we try to make the best of that thought soup by a combination of reward neurochemicals and emotions.

A human brain is nothing at all like a computer in terms of logic. It's much more like an LLM. That makes sense because LLMs came largely from trying to build artificial versions of biological neural networks. One big difference is that LLMs always think linguistically, whereas language is only a relatively small part of what brains do.



