
Yes, fortunately these LLM things don't seem to be leading to anything that could be called an AGI. But that doesn't mean a real AGI capable of self-improvement couldn't be extremely dangerous.

