Yes. The biggest issue with LLMs is their tunnel vision and general lack of awareness. They lack the ability to go meta, to "take a step back" on their own, which, given how they're constructed, isn't surprising. Adjusting prompts is only a hack and doesn't solve the fundamental issue.
