
>But for asking a clarifying question during a training class?

LLMs can barely do 2+2, and humans can't interpret the weights even when they can see them. An LLM could have all the access it wants to its own weights, and it still wouldn't be able to explain its thinking.
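
To make that concrete, here's a rough sketch (using GPT-2 via the Hugging Face transformers library as a stand-in for "a model looking at its own weights"). Dumping a weight matrix just gives you a grid of floats; nothing in the raw numbers reads as a reason for any particular output:

  from transformers import GPT2Model

  # Load a small public model and pull one weight matrix out of it.
  model = GPT2Model.from_pretrained("gpt2")

  # The first block's fused attention projection (a Conv1D in GPT-2).
  w = model.h[0].attn.c_attn.weight
  print(w.shape)    # torch.Size([768, 2304])
  print(w[:2, :4])  # a few raw floats; none of this "explains" behavior

You could hand that tensor to the model as context and it would be no better off than we are: the weights are the computation, not a description of it.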


