But we know how the LLM works, and that's exactly how the authors explain it. That also explains the weird mistakes LLMs make, mistakes that nothing with the ability to reason, or with access to a ground truth, would make.

I really do not understand how technical people can think they are sentient.



