
Truth is relevant if you put it in the loss function.
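In that spirit, here is a minimal toy sketch (not any particular lab's method) of what "putting truth in the loss function" could look like: standard cross-entropy plus a term that rewards abstaining on questions labeled unanswerable, so confidently forcing an answer is what gets penalized. All the names here (is_unanswerable, abstain_token_id, alpha) are assumptions for illustration.

    # Toy sketch only: per-example answer-token logits of shape [batch, vocab].
    import torch
    import torch.nn.functional as F

    def loss_with_truth_term(logits, targets, is_unanswerable,
                             abstain_token_id, alpha=1.0):
        # Standard answer-prediction loss.
        ce = F.cross_entropy(logits, targets)

        # Probability the model assigns to an explicit "I don't know" token.
        probs = F.softmax(logits, dim=-1)
        p_abstain = probs[:, abstain_token_id]

        # On questions flagged as unanswerable, reward abstention; confidently
        # forcing an answer there is exactly what surfaces as hallucination.
        truth_term = -(is_unanswerable.float()
                       * torch.log(p_abstain + 1e-9)).mean()

        return ce + alpha * truth_term

The point of the sketch is just that nothing in the usual objective cares whether the model knows the answer; you have to add a term (or labels) that does.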

It’s clear at this point that hallucinations happen when information is missing from the base model and we force an answer out of it anyway.

There’s nothing inherent about it, really; it’s more about the way we use these models.



