
> and wonder how an intelligent person can still think this, can be so absolute about it. What is "actual" reasoning here?

Large language models excel at processing and generating text, but they fundamentally operate on existing knowledge. Their creativity appears limited to recombining known information in novel ways, rather than generating truly original insights.

True reasoning capability would involve the ability to analyze complex situations and generate entirely new solutions, independent of existing patterns or combinations. This kind of deep reasoning ability seems to be beyond the scope of current language models, as it would require a fundamentally different approach—what we might call a reasoning model. Currently, it's unclear to me whether such models exist or if they could be effectively integrated with large language models.




> True reasoning capability would involve the ability to analyze complex situations and generate entirely new solutions, independent of existing patterns or combinations.

You mean like AlphaGo did with its 36th move?


Isn't that a narrow, non-generic "reasoning model" rather than something resembling the large-language-model-based AIs we use today?

The question is whether it's possible to make reasoning models generic, and whether they can be combined effectively with large language models.


Move 37.


"Their creativity appears limited to recombining known information"

There are some theories that this is true for humans also.

There are no human-created images that weren't first observed in nature in some way.

For example, devils, demons, and angels were described in terms of human body parts, or as goats with horns. Once we got microscopes and started drawing insects, art got a lot weirder, but only after those images had been observed in reality. Then humans could recombine them.


I understand your point, but it's not comparable:

Humans can suddenly "jump" cognitive levels and see higher-order patterns: Gödel seeing that mathematics could describe mathematics itself. That isn't combining existing patterns; it's seeing an entirely new level of abstraction.

The human brain excels at taking complex systems and creating simpler mental models: Newton seeing planetary motion and falling apples as the same phenomenon. This compression isn't recombination; it's finding the hidden simplicity.

Recombination adds elements together. Insight often removes elements to reveal core principles. This requires understanding and reasoning.



