
> You can scramble to submit PRs to ChatGPT to keep up with the “how many Rs in blueberry” kind of problems but it’s clear they can’t even keep up with shitposters on reddit.

Nobody does that. You can't "submit PRs" to an LLM. Although when a model picks up new pretraining data, it does absorb people discussing all the newly discovered problems, which is a neat bit of circularity.

> And your second and third points about planning and compounding errors remain challenges... probably unsolvable with LLM approaches.

Unsolvable in the first place. "Planning" is GOFAI metaphor-driven development: researchers decided, on no evidence, that humans must do "planning", and therefore that if they coded something up and called it "planning" it would give them intelligence.

Humans don't do, and don't need to do, "planning", much as they don't have, and don't need, "world models", the other GOFAI obsession.
