IME, LLMs are completely incapable of reasoning about anything remotely difficult. All you'll get are proofs by assertion, not anything rigorous you can trust.

