Hacker News new | past | comments | ask | show | jobs | submit login

> What workflow can have occasional catastrophic lapses of reasoning, non-factuality, no memory, hallucinations, etc.?

LLMs might enable the automation of some completely new tasks that made no sense to automate before, even if their output has to be error-corrected by humans or by other programs.
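A minimal sketch of that error-correction idea: wrap an unreliable generator in a cheap programmatic check, retry a few times, and escalate to a human only when the automated check keeps failing. Everything here is hypothetical for illustration; `unreliable_llm` is a stand-in stub, not a real API.

```python
import random

def unreliable_llm(task: str) -> str:
    """Stand-in for an LLM call; occasionally returns garbage."""
    return task.upper() if random.random() < 0.7 else "???"

def validate(answer: str) -> bool:
    """Cheap automated check -- the 'computer' error-corrector."""
    return answer != "???"

def run_with_correction(task: str, retries: int = 3) -> str:
    """Retry the unreliable step; escalate to a person if it keeps failing."""
    for _ in range(retries):
        answer = unreliable_llm(task)
        if validate(answer):
            return answer  # automated check passed
    return f"[needs human review] {task}"  # the 'human' error-corrector
```

Whether this pays off depends on the validator being much cheaper than doing the task by hand, which is exactly the situation where tasks that "made no sense to automate before" become viable.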



